@createlex/figma-swiftui-mcp 1.0.9 → 1.1.0

package/README.md CHANGED
@@ -1,28 +1,42 @@
1
1
  # CreateLex Figma to SwiftUI MCP
2
2
 
3
- CreateLex Figma to SwiftUI MCP is a paid local runtime for CreateLex subscribers. It connects your live Figma plugin session to CreateLex and lets you inspect designs, generate SwiftUI, export assets, and write output directly into a local Xcode project.
3
+ A local MCP (Model Context Protocol) runtime that connects your Figma plugin session to Claude Code, Cursor, Windsurf, or any MCP-compatible AI tool and writes production-ready SwiftUI directly into your Xcode project.
4
4
 
5
- ## What it does
5
+ ## Generation tiers
6
6
 
7
- - Connects to the Figma to SwiftUI plugin while the plugin window is open
8
- - Authenticates against your CreateLex account and subscription
9
- - Starts the local MCP runtime and localhost bridge automatically
10
- - Generates SwiftUI and asset output for your selected Figma frame
11
- - Writes generated files into your local Xcode project
7
+ ### Tier 1 — AI-native (zero extra API cost)
8
+ Your AI IDE (Claude Code, Cursor, Windsurf, etc.) calls the MCP tools itself:
9
+
10
+ 1. `get_swiftui_generation_prompt` returns a ready-to-use system prompt + user message
11
+ 2. The AI generates SwiftUI using **its own model** (your existing subscription)
12
+ 3. `write_generated_swiftui_to_xcode` writes the file(s) to Xcode
13
+
14
+ No CreateLex tokens consumed. No extra API keys required.
15
+
16
+ ### Tier 2 — BYOK (Bring Your Own Key)
17
+ Set one of these env vars in your MCP client config and `generate_swiftui` runs locally:
18
+
19
+ | Env var | Provider | Notes |
20
+ |---|---|---|
21
+ | `ANTHROPIC_API_KEY` | Claude (recommended) | Override model with `ANTHROPIC_MODEL` |
22
+ | `HF_API_TOKEN` | Hugging Face | Default model: `Qwen/Qwen2.5-Coder-32B-Instruct`, override with `HF_MODEL` |
23
+ | `OPENAI_API_KEY` | OpenAI or compatible | Override model with `OPENAI_MODEL` (default `gpt-4o`) |
24
+ | `OPENAI_BASE_URL` | Ollama / LM Studio | Set alongside `OPENAI_API_KEY=ollama` for fully local generation |
25
+
26
+ ### Tier 3 — CreateLex hosted (subscription fallback)
27
+ If no BYOK keys are set, generation falls back to the CreateLex hosted pattern matcher. Requires an active [CreateLex subscription](https://createlex.com/figma-swiftui).
28
+
29
+ ---
12
30
 
13
31
  ## Requirements
14
32
 
15
- - An active CreateLex subscription for `figma-swiftui`
16
33
  - Node.js 18 or later
17
34
  - Figma desktop app with the Figma to SwiftUI plugin open
18
- - A local Xcode project
35
+ - A local Xcode project (optional for Tier 1 prompt-copy workflow)
19
36
 
20
37
  ## Install
21
38
 
22
- Run directly with `npx`:
23
-
24
39
  ```bash
25
- npx @createlex/figma-swiftui-mcp login
26
40
  npx @createlex/figma-swiftui-mcp start --project /path/to/MyApp/MyApp
27
41
  ```
28
42
 
@@ -30,48 +44,121 @@ Or install globally:
30
44
 
31
45
  ```bash
32
46
  npm install -g @createlex/figma-swiftui-mcp
33
- figma-swiftui-mcp login
34
47
  figma-swiftui-mcp start --project /path/to/MyApp/MyApp
35
48
  ```
36
49
 
37
- ## Login
50
+ ## MCP client configuration
51
+
52
+ Add to your Claude Code / Cursor / Windsurf MCP config:
53
+
54
+ ```json
55
+ {
56
+ "mcpServers": {
57
+ "figma-swiftui": {
58
+ "command": "npx",
59
+ "args": ["@createlex/figma-swiftui-mcp", "start"],
60
+ "env": {
61
+ "FIGMA_SWIFTUI_PROJECT_PATH": "/path/to/MyApp/MyApp",
62
+ "ANTHROPIC_API_KEY": "sk-ant-..."
63
+ }
64
+ }
65
+ }
66
+ }
67
+ ```
38
68
 
39
- `login` opens your browser and signs you in with CreateLex. When login succeeds, the runtime saves your local session to:
69
+ For Hugging Face BYOK:
40
70
 
41
- ```text
42
- ~/.createlex/auth.json
71
+ ```json
72
+ {
73
+ "env": {
74
+ "FIGMA_SWIFTUI_PROJECT_PATH": "/path/to/MyApp/MyApp",
75
+ "HF_API_TOKEN": "hf_...",
76
+ "HF_MODEL": "Qwen/Qwen2.5-Coder-32B-Instruct"
77
+ }
78
+ }
43
79
  ```
44
80
 
45
- ## Start the runtime
81
+ For fully local generation with Ollama:
82
+
83
+ ```json
84
+ {
85
+ "env": {
86
+ "FIGMA_SWIFTUI_PROJECT_PATH": "/path/to/MyApp/MyApp",
87
+ "OPENAI_API_KEY": "ollama",
88
+ "OPENAI_BASE_URL": "http://localhost:11434/v1",
89
+ "OPENAI_MODEL": "llama3"
90
+ }
91
+ }
92
+ ```
46
93
 
47
- Start the runtime with the Xcode source folder that contains your Swift files and `Assets.xcassets`:
94
+ ## MCP tools
95
+
96
+ | Tool | Description |
97
+ |---|---|
98
+ | `get_design_context` | Get the full Figma node tree, colors, typography, and layout for a selection |
99
+ | `get_swiftui_generation_prompt` | Returns a ready-to-use system prompt + user message — feed to your AI model to generate SwiftUI |
100
+ | `generate_swiftui` | Generate SwiftUI via BYOK key or CreateLex hosted (Tier 2/3) |
101
+ | `write_generated_swiftui_to_xcode` | Write a generated Swift file (plus optional DesignTokens.swift and component files) to Xcode |
102
+ | `get_project_path` / `set_project_path` | Read or update the target Xcode source directory |
103
+ | `bridge_status` | Check if the Figma plugin bridge is connected |
104
+
105
+ ### Multi-file output
106
+
107
+ `write_generated_swiftui_to_xcode` accepts an `additionalFiles` array for AI-generated companion files:
108
+
109
+ ```json
110
+ {
111
+ "structName": "LoginScreen",
112
+ "code": "import SwiftUI\n...",
113
+ "additionalFiles": [
114
+ { "name": "DesignTokens.swift", "code": "...", "dir": "shared" },
115
+ { "name": "PrimaryButton.swift", "code": "...", "dir": "components" }
116
+ ]
117
+ }
118
+ ```
48
119
 
49
- ```bash
50
- npx @createlex/figma-swiftui-mcp start --project /path/to/MyApp/MyApp
120
+ Output layout inside your Xcode source folder:
121
+
122
+ ```
123
+ FigmaGenerated/
124
+ Screens/ ← primary view files
125
+ Components/ ← reusable component files
126
+ DesignTokens.swift ← shared tokens (dir: "shared")
127
+ Manifest/ ← generation metadata
51
128
  ```
52
129
 
53
- If you omit `--project`, you can set the project path later from the plugin UI or your MCP client.
130
+ ## AI-native workflow (Tier 1 example)
54
131
 
55
- ## Before you use it
132
+ In Claude Code or any MCP-connected AI:
56
133
 
57
- 1. Log in with CreateLex
58
- 2. Make sure your subscription is active
59
- 3. Open the Figma to SwiftUI plugin in Figma and keep it open
60
- 4. Start the local runtime
61
- 5. Use your MCP client or the plugin UI to generate and sync output
134
+ ```
135
+ 1. Select a frame in Figma
136
+ 2. Ask: "Generate SwiftUI for my selected Figma frame"
137
+ 3. Claude calls get_swiftui_generation_prompt → generates code → calls write_generated_swiftui_to_xcode
138
+ 4. File appears in Xcode immediately
139
+ ```
62
140
 
63
- ## Subscription and access
141
+ Zero CreateLex tokens consumed.
64
142
 
65
- This package is public to install, but runtime access is gated by CreateLex. If your account is not logged in or does not have an active `figma-swiftui` subscription, the runtime will not start.
143
+ ## Plugin UI
66
144
 
67
- ## Product page
145
+ The plugin's **Smart Generate** button supports two modes via the dropdown:
146
+ - **Smart AI** — uses your AI IDE or BYOK key for true AI generation
147
+ - **Classic** — uses the hosted pattern matcher (requires subscription)
68
148
 
69
- Manage plans and billing at:
149
+ After generation, the **Copy AI Prompt** button copies a ready-to-use prompt you can paste into Claude.ai, ChatGPT, or any AI chat. Requires the MCP server to be running.
70
150
 
71
- - https://createlex.com/figma-swiftui
151
+ ## Login (CreateLex subscription)
72
152
 
73
- ## Support
153
+ Only required for Tier 3 hosted generation:
74
154
 
75
- For help, billing, or onboarding:
155
+ ```bash
156
+ npx @createlex/figma-swiftui-mcp login
157
+ ```
158
+
159
+ Saves your session to `~/.createlex/auth.json`.
160
+
161
+ ## Support
76
162
 
77
- - https://createlex.com/contact
163
+ - Product: https://createlex.com/figma-swiftui
164
+ - Help & billing: https://createlex.com/contact
@@ -0,0 +1,432 @@
1
+ 'use strict';
2
+
3
+ /**
4
+ * local-llm-generator.cjs
5
+ *
6
+ * BYOK (Bring Your Own Key) SwiftUI generation.
7
+ * Uses ANTHROPIC_API_KEY, OPENAI_API_KEY, or an OpenAI-compatible
8
+ * base URL (e.g. Ollama, LM Studio) to generate SwiftUI without
9
+ * touching the CreateLex hosted backend.
10
+ *
11
+ * Required env vars (at least one):
12
+ * ANTHROPIC_API_KEY — use Claude (recommended)
13
+ * OPENAI_API_KEY — use OpenAI or compatible API
14
+ *
15
+ * Optional:
16
+ * OPENAI_MODEL — model name (default: gpt-4o)
17
+ * OPENAI_BASE_URL — custom base URL for Ollama / LM Studio etc.
18
+ */
19
+
20
+ // ---------------------------------------------------------------------------
21
+ // Prompt constants
22
+ // ---------------------------------------------------------------------------
23
+
24
+ const SWIFTUI_SYSTEM_PROMPT = `You are an expert SwiftUI engineer generating production-ready iOS code from Figma design context JSON.
25
+
26
+ RULES:
27
+ 1. Output ONLY valid Swift code inside <file name="StructName.swift"> XML tags — one tag per file.
28
+ 2. Always import SwiftUI. Never import UIKit or use UIViewRepresentable.
29
+ 3. Map Figma layoutMode: HORIZONTAL→HStack, VERTICAL→VStack, NONE→ZStack/overlay.
30
+ 4. Use the struct name supplied in OUTPUT_STRUCT_NAME for the primary view.
31
+ 5. Responsive sizing: follow every _responsiveHint annotation in the node tree.
32
+ 6. For colors: use Color extension token names when a styleName is present; otherwise Color(red:green:blue:).
33
+ 7. Generate @State vars for every interactive element (TextField, Toggle, Button, Picker).
34
+ 8. Add .accessibilityLabel() and .accessibilityHint() derived from node name and text content.
35
+ 9. End every file with a #Preview { StructName() } block.
36
+ 10. Emit best-effort code for anything complex — never emit TODO comments or placeholder stubs.
37
+ 11. If reusableComponents are present, output each as a separate <file name="ComponentName.swift"> tag.
38
+
39
+ RESPONSIVE LAYOUT RULES:
40
+ - Root frame with FILL sizing → .frame(maxWidth: .infinity)
41
+ - Font sizes → @ScaledMetric var: e.g. @ScaledMetric var titleSize: CGFloat = 34
42
+ - Horizontal scrolling children → ScrollView(.horizontal, showsIndicators: false)
43
+ - Vertical root scroll behavior → wrap body in ScrollView
44
+ - ViewThatFits for HStack→VStack fallback on narrow devices when content may wrap
45
+ - VStack(spacing:) / HStack(spacing:) from Figma itemSpacing — not fixed .padding() for inter-item spacing
46
+ - .ignoresSafeArea() only for background color/image layers, never for foreground content
47
+ - Hardcoded values are acceptable only for: cornerRadius, icon sizes ≤24pt, stroke widths
48
+
49
+ DESIGN TOKENS:
50
+ If designTokens.colors or designTokens.fonts are non-empty, output a <file name="DesignTokens.swift"> containing:
51
+ extension Color { static let tokenName = Color(red: ..., green: ..., blue: ...) }
52
+ extension Font { static let tokenName = .system(size: ..., weight: ...) }
53
+ Then use these token names everywhere in the main view code.
54
+
55
+ REUSABLE COMPONENTS:
56
+ For each item in reusableComponents:
57
+ - Output a separate <file name="ComponentName.swift">
58
+ - Promote repeated values (text, colors, image names) into View parameters
59
+ - In the parent view, replace inline rendering with ComponentName(param: value)`;
60
+
61
+ // ---------------------------------------------------------------------------
62
+ // Context helpers
63
+ // ---------------------------------------------------------------------------
64
+
65
+ function cleanText(value) {
66
+ return typeof value === 'string' ? value.replace(/\s+/g, ' ').trim() : '';
67
+ }
68
+
69
+ function sanitizeName(value) {
70
+ const words = cleanText(value)
71
+ .replace(/[^a-zA-Z0-9]+/g, ' ')
72
+ .split(' ')
73
+ .filter(Boolean);
74
+ if (words.length === 0) return 'GeneratedView';
75
+ return words.map((w) => w[0].toUpperCase() + w.slice(1)).join('');
76
+ }
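As a quick illustration of the helper above (the layer names here are invented), `sanitizeName` collapses a raw Figma layer name into a PascalCase Swift type name:

```javascript
// Standalone copy of sanitizeName from above, so this snippet runs on its own.
function sanitizeName(value) {
  const words = (typeof value === 'string' ? value.replace(/\s+/g, ' ').trim() : '')
    .replace(/[^a-zA-Z0-9]+/g, ' ')
    .split(' ')
    .filter(Boolean);
  if (words.length === 0) return 'GeneratedView';
  return words.map((w) => w[0].toUpperCase() + w.slice(1)).join('');
}

console.log(sanitizeName('login / screen-01')); // 'LoginScreen01'
console.log(sanitizeName('***'));               // 'GeneratedView'
```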
77
+
78
+ function getRootNode(metadata) {
79
+ if (!metadata || typeof metadata !== 'object') return null;
80
+ if (Array.isArray(metadata.nodes)) {
81
+ return metadata.nodes.length === 1 ? metadata.nodes[0] : null;
82
+ }
83
+ return metadata.type ? metadata : null;
84
+ }
85
+
86
+ function flattenNodes(node, limit = 200) {
87
+ const results = [];
88
+ const visit = (current) => {
89
+ if (!current || typeof current !== 'object' || results.length >= limit) return;
90
+ results.push(current);
91
+ if (Array.isArray(current.children)) {
92
+ current.children.forEach(visit);
93
+ }
94
+ };
95
+ visit(node);
96
+ return results;
97
+ }
98
+
99
+ function annotateResponsiveHints(node, rootWidth) {
100
+ if (!node || !rootWidth) return node;
101
+ const width = node?.geometry?.width;
102
+ if (typeof width === 'number' && width > 0) {
103
+ const frac = width / rootWidth;
104
+ if (frac > 0.5 && frac < 1.0) {
105
+ node._responsiveHint = `width is ~${Math.round(frac * 100)}% of root canvas — use .frame(maxWidth: .infinity) with horizontal padding`;
106
+ }
107
+ }
108
+ if (Array.isArray(node.children)) {
109
+ node.children.forEach((child) => annotateResponsiveHints(child, rootWidth));
110
+ }
111
+ return node;
112
+ }
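To make the threshold concrete, here is the hint computation in isolation, for a hypothetical 312pt-wide child on a 390pt root canvas; only widths strictly between 50% and 100% of the root get a hint:

```javascript
// Same fraction test annotateResponsiveHints applies per node;
// the 312/390 example values are made up for illustration.
const rootWidth = 390;
const childWidth = 312;
const frac = childWidth / rootWidth; // 0.8
const hint = frac > 0.5 && frac < 1.0
  ? `width is ~${Math.round(frac * 100)}% of root canvas`
  : null;
console.log(hint); // 'width is ~80% of root canvas'
```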
113
+
114
+ function pruneNodeTree(root, maxNodes) {
115
+ const all = flattenNodes(root, maxNodes);
116
+ const ids = new Set(all.map((n) => n.id));
117
+
118
+ function prune(node) {
119
+ if (!node) return node;
120
+ const pruned = { ...node };
121
+ if (Array.isArray(pruned.children)) {
122
+ pruned.children = pruned.children
123
+ .filter((c) => ids.has(c?.id))
124
+ .map(prune);
125
+ }
126
+ return pruned;
127
+ }
128
+
129
+ return prune(root);
130
+ }
131
+
132
+ function extractDesignTokens(rootNode) {
133
+ const colors = new Map();
134
+ const fonts = new Map();
135
+ const radiusCounts = new Map();
136
+
137
+ function toSwiftTokenName(styleName) {
138
+ return styleName
139
+ .replace(/[^a-zA-Z0-9/]+/g, ' ')
140
+ .split(/[\s/]+/)
141
+ .filter(Boolean)
142
+ .map((w, i) => i === 0 ? w[0].toLowerCase() + w.slice(1) : w[0].toUpperCase() + w.slice(1))
143
+ .join('');
144
+ }
145
+
146
+ flattenNodes(rootNode, 500).forEach((node) => {
147
+ const style = node.style;
148
+ if (!style) return;
149
+
150
+ if (style.styleName && Array.isArray(style.fills)) {
151
+ const fill = style.fills.find((f) => f && f.visible !== false && f.type === 'SOLID' && f.color);
152
+ if (fill) colors.set(toSwiftTokenName(style.styleName), fill.color);
153
+ }
154
+
155
+ if (node.type === 'TEXT' && style.styleName && node.text?.fontSize) {
156
+ fonts.set(toSwiftTokenName(style.styleName), {
157
+ size: node.text.fontSize,
158
+ weight: node.text.fontName?.style,
159
+ });
160
+ }
161
+
162
+ if (typeof style.cornerRadius === 'number' && style.cornerRadius > 0) {
163
+ radiusCounts.set(style.cornerRadius, (radiusCounts.get(style.cornerRadius) ?? 0) + 1);
164
+ }
165
+ });
166
+
167
+ return {
168
+ colors: [...colors.entries()].map(([name, color]) => ({ name, color })),
169
+ fonts: [...fonts.entries()].map(([name, font]) => ({ name, ...font })),
170
+ commonRadii: [...radiusCounts.entries()].filter(([, count]) => count >= 3).map(([r]) => r),
171
+ };
172
+ }
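As a sanity check on the token-name mapping (the style names below are hypothetical), `toSwiftTokenName` lower-camels the first word and capitalizes the rest:

```javascript
// Standalone copy of the toSwiftTokenName helper defined inside
// extractDesignTokens above.
function toSwiftTokenName(styleName) {
  return styleName
    .replace(/[^a-zA-Z0-9/]+/g, ' ')
    .split(/[\s/]+/)
    .filter(Boolean)
    .map((w, i) => (i === 0 ? w[0].toLowerCase() + w.slice(1) : w[0].toUpperCase() + w.slice(1)))
    .join('');
}

console.log(toSwiftTokenName('Brand/Primary Blue')); // 'brandPrimaryBlue'
console.log(toSwiftTokenName('text-secondary'));     // 'textSecondary'
```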
173
+
174
+ function buildPromptContext(context) {
175
+ const root = getRootNode(context?.metadata);
176
+ if (!root) return null;
177
+
178
+ const rootWidth = root?.geometry?.width ?? 390;
179
+ const prunedRoot = pruneNodeTree(root, 200);
180
+ annotateResponsiveHints(prunedRoot, rootWidth);
181
+
182
+ const designTokens = extractDesignTokens(root);
183
+ const structName = sanitizeName(root.name || 'GeneratedView');
184
+
185
+ return {
186
+ structName,
187
+ canvasWidth: rootWidth,
188
+ canvasHeight: root?.geometry?.height ?? 844,
189
+ nodeTree: prunedRoot,
190
+ designTokens,
191
+ reusableComponents: (context?.reusableComponents?.candidates ?? []).slice(0, 10),
192
+ assetRequests: (context?.assetExportPlan?.candidates ?? []).slice(0, 10),
193
+ generationHints: context?.generationHints ?? null,
194
+ };
195
+ }
196
+
197
+ function buildUserMessage(context, generationMode) {
198
+ const promptCtx = buildPromptContext(context);
199
+ if (!promptCtx) {
200
+ throw new Error('Could not extract root node from design context');
201
+ }
202
+
203
+ return `Generate SwiftUI code for this Figma design.
204
+
205
+ OUTPUT_STRUCT_NAME: ${promptCtx.structName}
206
+ CANVAS_WIDTH: ${promptCtx.canvasWidth}
207
+ GENERATION_MODE: ${generationMode}
208
+
209
+ DESIGN CONTEXT:
210
+ ${JSON.stringify(promptCtx, null, 2)}
211
+
212
+ Output all Swift files using <file name="StructName.swift"> tags.`;
213
+ }
214
+
215
+ // ---------------------------------------------------------------------------
216
+ // Response parser
217
+ // ---------------------------------------------------------------------------
218
+
219
+ function parseClaudeResponse(text) {
220
+ const fileRegex = /<file name="([^"]+\.swift)">([\s\S]*?)<\/file>/g;
221
+ const files = [];
222
+ let match;
223
+ while ((match = fileRegex.exec(text)) !== null) {
224
+ files.push({ name: match[1], code: match[2].trim() });
225
+ }
226
+
227
+ if (files.length === 0) {
228
+ // Fallback: treat entire response as primary file code
229
+ return { code: text.trim(), designTokensCode: null, componentFiles: [] };
230
+ }
231
+
232
+ const tokensFile = files.find((f) => f.name === 'DesignTokens.swift');
233
+ const primary = files.find((f) => f !== tokensFile && !f.name.startsWith('Component') && !f.name.endsWith('Row.swift') && !f.name.endsWith('Card.swift') && !f.name.endsWith('Cell.swift'))
+ ?? files.find((f) => f !== tokensFile);
237
+ const componentFiles = files.filter((f) => f !== primary && f !== tokensFile);
238
+
239
+ return {
240
+ code: primary?.code ?? '',
241
+ designTokensCode: tokensFile?.code ?? null,
242
+ componentFiles,
243
+ };
244
+ }
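For example, feeding the parser a two-file response in the `<file>` tag format the system prompt mandates (the Swift bodies below are stand-ins) splits it cleanly:

```javascript
// Same extraction regex as parseClaudeResponse above, run on a
// hypothetical model response containing two <file> tags.
const fileRegex = /<file name="([^"]+\.swift)">([\s\S]*?)<\/file>/g;
const sample = [
  '<file name="LoginScreen.swift">import SwiftUI\nstruct LoginScreen: View { var body: some View { Text("Hi") } }</file>',
  '<file name="DesignTokens.swift">import SwiftUI\nextension Color { static let brandPrimary = Color.blue }</file>',
].join('\n');
const files = [];
let match;
while ((match = fileRegex.exec(sample)) !== null) {
  files.push({ name: match[1], code: match[2].trim() });
}
console.log(files.map((f) => f.name)); // ['LoginScreen.swift', 'DesignTokens.swift']
```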
245
+
246
+ // ---------------------------------------------------------------------------
247
+ // Anthropic generation
248
+ // ---------------------------------------------------------------------------
249
+
250
+ async function generateWithAnthropic(context, generationMode, apiKey) {
251
+ let Anthropic;
252
+ try {
253
+ ({ Anthropic } = require('@anthropic-ai/sdk'));
254
+ } catch {
255
+ throw new Error('ANTHROPIC_API_KEY is set but @anthropic-ai/sdk is not installed. Run: npm install @anthropic-ai/sdk');
256
+ }
257
+
258
+ const client = new Anthropic({ apiKey });
259
+ const userMessage = buildUserMessage(context, generationMode);
260
+
261
+ const response = await client.messages.create({
262
+ model: process.env.ANTHROPIC_MODEL || 'claude-sonnet-4-6',
263
+ max_tokens: 8192,
264
+ system: SWIFTUI_SYSTEM_PROMPT,
265
+ messages: [{ role: 'user', content: userMessage }],
266
+ });
267
+
268
+ const text = response.content
269
+ .filter((b) => b.type === 'text')
270
+ .map((b) => b.text)
271
+ .join('');
272
+
273
+ return parseClaudeResponse(text);
274
+ }
275
+
276
+ // ---------------------------------------------------------------------------
277
+ // OpenAI-compatible generation (OpenAI, Ollama, LM Studio, etc.)
278
+ // ---------------------------------------------------------------------------
279
+
280
+ async function generateWithOpenAI(context, generationMode, apiKey) {
281
+ let OpenAI;
282
+ try {
283
+ ({ OpenAI } = require('openai'));
284
+ } catch {
285
+ throw new Error('OPENAI_API_KEY is set but the openai package is not installed. Run: npm install openai');
286
+ }
287
+
288
+ const client = new OpenAI({
289
+ apiKey,
290
+ baseURL: process.env.OPENAI_BASE_URL || undefined,
291
+ });
292
+
293
+ const model = process.env.OPENAI_MODEL || 'gpt-4o';
294
+ const userMessage = buildUserMessage(context, generationMode);
295
+
296
+ const response = await client.chat.completions.create({
297
+ model,
298
+ max_tokens: 8192,
299
+ messages: [
300
+ { role: 'system', content: SWIFTUI_SYSTEM_PROMPT },
301
+ { role: 'user', content: userMessage },
302
+ ],
303
+ });
304
+
305
+ const text = response.choices[0]?.message?.content ?? '';
306
+ return parseClaudeResponse(text);
307
+ }
308
+
309
+ // ---------------------------------------------------------------------------
310
+ // Hugging Face Inference API
311
+ //
312
+ // Uses HF's OpenAI-compatible endpoint: https://api-inference.huggingface.co/v1
313
+ //
314
+ // Best models for SwiftUI generation (set HF_MODEL to override):
315
+ // Qwen/Qwen2.5-Coder-32B-Instruct — default, best open-source code model
316
+ // meta-llama/Llama-3.3-70B-Instruct — strong reasoning + code
317
+ // deepseek-ai/DeepSeek-Coder-V2-Instruct — excellent complex code
318
+ // Qwen/Qwen2.5-Coder-7B-Instruct — faster/cheaper for simpler screens
319
+ //
320
+ // Env vars:
321
+ // HF_API_TOKEN — required
322
+ // HF_MODEL — optional, defaults to Qwen2.5-Coder-32B-Instruct
323
+ // ---------------------------------------------------------------------------
324
+
325
+ const HF_INFERENCE_BASE_URL = 'https://api-inference.huggingface.co/v1';
326
+ const HF_DEFAULT_MODEL = 'Qwen/Qwen2.5-Coder-32B-Instruct';
327
+
328
+ async function generateWithHuggingFace(context, generationMode, apiKey) {
329
+ let OpenAI;
330
+ try {
331
+ ({ OpenAI } = require('openai'));
332
+ } catch {
333
+ throw new Error('HF_API_TOKEN is set but the openai package is not installed. Run: npm install openai');
334
+ }
335
+
336
+ const model = process.env.HF_MODEL || HF_DEFAULT_MODEL;
337
+ const client = new OpenAI({
338
+ apiKey,
339
+ baseURL: HF_INFERENCE_BASE_URL,
340
+ });
341
+
342
+ const userMessage = buildUserMessage(context, generationMode);
343
+
344
+ const response = await client.chat.completions.create({
345
+ model,
346
+ max_tokens: 8192,
347
+ messages: [
348
+ { role: 'system', content: SWIFTUI_SYSTEM_PROMPT },
349
+ { role: 'user', content: userMessage },
350
+ ],
351
+ });
352
+
353
+ const text = response.choices[0]?.message?.content ?? '';
354
+ return parseClaudeResponse(text);
355
+ }
356
+
357
+ // ---------------------------------------------------------------------------
358
+ // Public API
359
+ // ---------------------------------------------------------------------------
360
+
361
+ /**
362
+ * Returns the SwiftUI system prompt and a formatted user message for AI-native
363
+ * generation. AI tools (Claude Code, Cursor, Windsurf, etc.) call this to get a
364
+ * ready-to-use prompt, then generate code with their own model and call
365
+ * write_generated_swiftui_to_xcode — burning zero CreateLex tokens.
366
+ */
367
+ function buildGenerationPrompt(context) {
368
+ const promptCtx = buildPromptContext(context);
369
+ if (!promptCtx) return null;
370
+
371
+ return {
372
+ systemPrompt: SWIFTUI_SYSTEM_PROMPT,
373
+ userMessage: buildUserMessage(context, 'editable'),
374
+ outputStructName: promptCtx.structName,
375
+ assetRequests: promptCtx.assetRequests,
376
+ canvasWidth: promptCtx.canvasWidth,
377
+ };
378
+ }
379
+
380
+ /**
381
+ * Generate SwiftUI using the user's own API key (BYOK).
382
+ *
383
+ * Priority order:
384
+ * 1. ANTHROPIC_API_KEY → Claude (best quality, recommended)
385
+ * 2. HF_API_TOKEN → Hugging Face (Qwen2.5-Coder-32B by default — best open-source)
386
+ * 3. OPENAI_API_KEY → OpenAI or any OpenAI-compatible endpoint (Ollama, LM Studio)
387
+ *
388
+ * Returns: { handled, provider, code, designTokensCode, componentFiles, diagnostics }
389
+ */
390
+ async function generateWithLocalKey(context, generationMode = 'editable') {
391
+ const anthropicKey = process.env.ANTHROPIC_API_KEY;
392
+ const hfKey = process.env.HF_API_TOKEN;
393
+ const openaiKey = process.env.OPENAI_API_KEY;
394
+
395
+ if (!anthropicKey && !hfKey && !openaiKey) {
396
+ return null; // No BYOK configured — caller should try next tier
397
+ }
398
+
399
+ let parsed;
400
+ let provider;
401
+
402
+ if (anthropicKey) {
403
+ parsed = await generateWithAnthropic(context, generationMode, anthropicKey);
404
+ provider = `local-anthropic:${process.env.ANTHROPIC_MODEL || 'claude-sonnet-4-6'}`;
405
+ } else if (hfKey) {
406
+ const model = process.env.HF_MODEL || HF_DEFAULT_MODEL;
407
+ parsed = await generateWithHuggingFace(context, generationMode, hfKey);
408
+ provider = `huggingface:${model}`;
409
+ } else {
410
+ parsed = await generateWithOpenAI(context, generationMode, openaiKey);
411
+ const baseUrl = process.env.OPENAI_BASE_URL;
412
+ const model = process.env.OPENAI_MODEL || 'gpt-4o';
413
+ provider = baseUrl ? `local-openai-compatible:${model}` : `local-openai:${model}`;
414
+ }
415
+
416
+ return {
417
+ handled: true,
418
+ provider,
419
+ screenType: 'llm-generated',
420
+ code: parsed.code,
421
+ designTokensCode: parsed.designTokensCode,
422
+ componentFiles: parsed.componentFiles,
423
+ assetRequests: [],
424
+ diagnostics: [],
425
+ };
426
+ }
427
+
428
+ module.exports = {
429
+ generateWithLocalKey,
430
+ buildGenerationPrompt,
431
+ SWIFTUI_SYSTEM_PROMPT,
432
+ };
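The priority order above can be checked without a network call; this standalone sketch mirrors only the provider-string selection of `generateWithLocalKey` (the env objects passed in are fabricated):

```javascript
// Mirrors the BYOK provider priority used by generateWithLocalKey:
// Anthropic, then Hugging Face, then OpenAI-compatible; null means no BYOK.
function resolveProvider(env) {
  if (env.ANTHROPIC_API_KEY) {
    return `local-anthropic:${env.ANTHROPIC_MODEL || 'claude-sonnet-4-6'}`;
  }
  if (env.HF_API_TOKEN) {
    return `huggingface:${env.HF_MODEL || 'Qwen/Qwen2.5-Coder-32B-Instruct'}`;
  }
  if (env.OPENAI_API_KEY) {
    const model = env.OPENAI_MODEL || 'gpt-4o';
    return env.OPENAI_BASE_URL ? `local-openai-compatible:${model}` : `local-openai:${model}`;
  }
  return null; // no BYOK configured; caller falls through to the hosted tier
}

console.log(resolveProvider({ HF_API_TOKEN: 'hf_x' })); // 'huggingface:Qwen/Qwen2.5-Coder-32B-Instruct'
```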
@@ -25,6 +25,10 @@ const {
25
25
  postAuthorizedApi,
26
26
  validateRuntimeSession,
27
27
  } = require('./createlex-auth.cjs');
28
+ const {
29
+ generateWithLocalKey,
30
+ buildGenerationPrompt,
31
+ } = require('./local-llm-generator.cjs');
28
32
 
29
33
  const BRIDGE_HTTP_URL = process.env.FIGMA_SWIFTUI_BRIDGE_HTTP_URL || 'http://localhost:7765';
30
34
  const BRIDGE_WS_URL = process.env.FIGMA_SWIFTUI_BRIDGE_WS_URL || 'ws://localhost:7765/bridge';
@@ -110,6 +114,35 @@ async function tryHostedSemanticGeneration({ nodeIds, generationMode, includeOve
110
114
  includeScreenshot: false,
111
115
  });
112
116
 
117
+ // Tier 2: BYOK — use the user's own API key (Anthropic, OpenAI, or Ollama)
118
+ // Bypasses CreateLex backend entirely; no subscription token consumed.
119
+ if (process.env.ANTHROPIC_API_KEY || process.env.HF_API_TOKEN || process.env.OPENAI_API_KEY) {
120
+ try {
121
+ const byokResult = await generateWithLocalKey(context, generationMode);
122
+ if (byokResult?.handled) {
123
+ const metadata = context?.metadata;
124
+ const selection = Array.isArray(metadata?.nodes)
125
+ ? { ids: metadata.nodes.map((n) => n.id), names: metadata.nodes.map((n) => n.name) }
126
+ : { ids: [metadata?.id], names: [metadata?.name] };
127
+
128
+ if (analyze) {
129
+ return {
130
+ selection: metadata,
131
+ generated: { ...byokResult, imageCount: 0, imageNames: [] },
132
+ assetExportPlan: context?.assetExportPlan ?? null,
133
+ reusableComponents: context?.reusableComponents ?? null,
134
+ generationHints: context?.generationHints ?? null,
135
+ manualRefinementHints: context?.generationHints?.manualRefinementHints ?? [],
136
+ hosted: false,
137
+ };
138
+ }
139
+ return { ...byokResult, selection, imageCount: 0, imageNames: [], hosted: false };
140
+ }
141
+ } catch (byokError) {
142
+ console.error('[figma-swiftui-mcp] BYOK generation failed, falling back to hosted:', byokError.message);
143
+ }
144
+ }
145
+
113
146
  const metadata = context?.metadata;
114
147
  const isSingleRootNode = !!metadata && !Array.isArray(metadata?.nodes) ? true : Array.isArray(metadata?.nodes) && metadata.nodes.length === 1;
115
148
  if (!isSingleRootNode) {
@@ -387,7 +420,7 @@ server.registerTool('get_metadata', {
387
420
  });
388
421
 
389
422
  server.registerTool('get_design_context', {
390
- description: 'Return node metadata, asset export candidates, and generation hints for the current selection or an explicit node.',
423
+ description: 'Return node metadata, asset export candidates, and generation hints for the current selection or an explicit node. AI-native workflow: call this tool, then generate SwiftUI yourself using your own model (burns zero CreateLex tokens), then call write_generated_swiftui_to_xcode to save the result. Alternatively call get_swiftui_generation_prompt to get a ready-to-use system prompt + user message.',
391
424
  inputSchema: {
392
425
  nodeIds: z.array(z.string()).optional().describe('Optional list of Figma node ids. If omitted, uses the current selection'),
393
426
  nodeId: z.string().optional().describe('Optional single Figma node id'),
@@ -404,6 +437,37 @@ server.registerTool('get_design_context', {
404
437
  return jsonResult(data);
405
438
  });
406
439
 
440
+ server.registerTool('get_swiftui_generation_prompt', {
441
+ description: 'Return a ready-to-use SwiftUI system prompt and user message for AI-native generation. Call this, then feed the returned systemPrompt + userMessage to your own AI model (Claude Code, Cursor, Windsurf, ChatGPT, etc.) to generate SwiftUI — burning zero CreateLex tokens. After generation, parse <file name="X.swift"> tags in the response and call write_generated_swiftui_to_xcode with the code and any additionalFiles.',
442
+ inputSchema: {
443
+ nodeIds: z.array(z.string()).optional().describe('Optional list of Figma node ids. If omitted, uses the current selection'),
444
+ nodeId: z.string().optional().describe('Optional single Figma node id'),
445
+ maxDepth: z.number().int().min(0).max(8).default(4).describe('Maximum child depth to include'),
446
+ },
447
+ }, async ({ nodeIds, nodeId, maxDepth }) => {
448
+ const context = await callBridge('get_design_context', {
449
+ nodeIds,
450
+ nodeId,
451
+ maxDepth,
452
+ includeScreenshot: false,
453
+ });
454
+
455
+ const prompt = buildGenerationPrompt(context);
456
+ if (!prompt) {
457
+ throw new Error('Could not extract root node from design context. Ensure a single frame or component is selected.');
458
+ }
459
+
460
+ return jsonResult({
461
+ ...prompt,
462
+ instructions: [
463
+ '1. Use systemPrompt as the system message and userMessage as the user message.',
464
+ '2. Ask your AI to generate SwiftUI code.',
465
+ '3. Parse <file name="X.swift"> tags from the response — each tag is one Swift file.',
466
+ '4. Call write_generated_swiftui_to_xcode with: code (primary view), additionalFiles for DesignTokens.swift (dir:"shared") and components (dir:"components").',
467
+ ],
468
+ });
469
+ });
470
+
407
471
  server.registerTool('get_screenshot', {
408
472
  description: 'Export a PNG screenshot for a single target node or the single current selection.',
409
473
  inputSchema: {
@@ -642,9 +706,9 @@ server.registerTool('analyze_generation', {
642
706
  });
643
707
 
644
708
  server.registerTool('write_generated_swiftui_to_xcode', {
645
- description: 'Write generated SwiftUI code and optional exported images directly into the configured Xcode project.',
709
+ description: 'Write generated SwiftUI code and optional exported images directly into the configured Xcode project. Supports writing multiple files: use additionalFiles for DesignTokens.swift (dir: "shared"), reusable components (dir: "components"), or extra screen files (dir: "screens"). AI tools generating code with their own model should call this after generating SwiftUI from get_design_context or get_swiftui_generation_prompt.',
646
710
  inputSchema: {
647
- code: z.string().describe('SwiftUI source code to write'),
711
+ code: z.string().describe('SwiftUI source code for the primary view'),
648
712
  structName: z.string().optional().describe('Optional Swift struct name; if omitted it will be inferred from the code'),
649
713
  projectPath: z.string().optional().describe('Optional Xcode source folder override'),
650
714
  images: z.array(z.object({
@@ -652,8 +716,13 @@ server.registerTool('write_generated_swiftui_to_xcode', {
652
716
  base64: z.string(),
653
717
  })).default([]).describe('Optional exported image payloads'),
654
718
  selectionNames: z.array(z.string()).default([]).describe('Optional original Figma selection names for struct inference'),
719
+ additionalFiles: z.array(z.object({
720
+ name: z.string().describe('Swift filename, e.g. DesignTokens.swift'),
721
+ code: z.string().describe('Full Swift source code for this file'),
722
+ dir: z.enum(['shared', 'components', 'screens']).optional().describe('shared=project root, components=Components/ subdir, screens=same dir as primary (default)'),
723
+ })).optional().describe('Additional Swift files to write: DesignTokens.swift, reusable components, etc.'),
655
724
  },
656
- }, async ({ code, structName, projectPath, images, selectionNames }) => {
725
+ }, async ({ code, structName, projectPath, images, selectionNames, additionalFiles }) => {
657
726
  const targetDir = resolveTargetProjectPath(projectPath);
658
727
  const effectiveStructName = inferStructName({ structName, code, selectionNames });
659
728
  const result = writeSwiftUIScreen({
@@ -661,6 +730,7 @@ server.registerTool('write_generated_swiftui_to_xcode', {
661
730
  code,
662
731
  structName: effectiveStructName,
663
732
  images,
733
+ additionalFiles,
664
734
  });
665
735
 
666
736
  if (!result.ok) {
@@ -73,11 +73,12 @@ function loadProjectPath({ explicitPath } = {}) {
73
73
  return getSavedProjectPath();
74
74
  }
75
75
 
76
- function writeSwiftUIScreen({ targetDir, code, structName, images = [] }) {
76
+ function writeSwiftUIScreen({ targetDir, code, structName, images = [], additionalFiles = [] }) {
77
77
  const resolvedTarget = resolveWritableProjectPath(targetDir);
78
78
  const generatedDirs = ensureGeneratedLayout(resolvedTarget);
79
79
  const results = {
80
80
  swiftFile: null,
81
+ additionalSwiftFiles: [],
81
82
  images: [],
82
83
  manifestFiles: [],
83
84
  generatedRoot: generatedDirs.rootDir,
@@ -94,6 +95,34 @@ function writeSwiftUIScreen({ targetDir, code, structName, images = [] }) {
94
95
  }
95
96
  }
96
97
 
98
+ // Write additional files (DesignTokens.swift, component files, etc.)
99
+ if (Array.isArray(additionalFiles) && additionalFiles.length > 0) {
100
+ const componentsDir = path.join(generatedDirs.rootDir, 'Components');
101
+ for (const extra of additionalFiles) {
102
+ if (!extra.name || !extra.code) continue;
103
+ let destDir;
104
+ if (extra.dir === 'shared') {
105
+ destDir = generatedDirs.rootDir;
106
+ } else if (extra.dir === 'components') {
107
+ try {
108
+ fs.mkdirSync(componentsDir, { recursive: true });
109
+ } catch {
110
+ // ignore
111
+ }
112
+ destDir = componentsDir;
113
+ } else {
114
+ destDir = generatedDirs.screensDir;
115
+ }
116
+ const filePath = path.join(destDir, extra.name);
117
+ try {
118
+ fs.writeFileSync(filePath, extra.code, 'utf8');
119
+ results.additionalSwiftFiles.push(filePath);
120
+ } catch (error) {
121
+ results.errors.push(`Failed to write ${extra.name}: ${error.message}`);
122
+ }
123
+ }
124
+ }
125
+
97
126
  if (Array.isArray(images) && images.length > 0) {
98
127
  const assetResult = writeAssetCatalogEntries({
99
128
  targetDir: resolvedTarget,
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@createlex/figma-swiftui-mcp",
3
- "version": "1.0.9",
3
+ "version": "1.1.0",
4
4
  "description": "CreateLex MCP runtime for Figma-to-SwiftUI generation and Xcode export",
5
5
  "bin": {
6
6
  "figma-swiftui-mcp": "bin/figma-swiftui-mcp.js"
@@ -9,6 +9,7 @@
9
9
  "bin/",
10
10
  "companion/bridge-server.cjs",
11
11
  "companion/createlex-auth.cjs",
12
+ "companion/local-llm-generator.cjs",
12
13
  "companion/login.mjs",
13
14
  "companion/mcp-server.mjs",
14
15
  "companion/package.json",
@@ -39,6 +40,10 @@
39
40
  "ws": "^8.20.0",
40
41
  "zod": "^4.3.6"
41
42
  },
43
+ "optionalDependencies": {
44
+ "@anthropic-ai/sdk": "^0.52.0",
45
+ "openai": "^4.100.0"
46
+ },
42
47
  "devDependencies": {
43
48
  "@figma/plugin-typings": "*",
44
49
  "typescript": "^5.4.0"