@createlex/figgen 1.5.0 → 1.5.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -103,9 +103,9 @@ For Ollama (fully local):
 
 | Tier | How | Cost |
 |---|---|---|
-| **1 — AI-native** | AI IDE calls `write_selection_to_xcode`; plugin generates code; no extra key needed | Your existing AI subscription |
-| **2 — BYOK** | Set `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, or `HF_API_TOKEN`; MCP server calls the model | Your API key |
-| **3 — CreateLex hosted** | No keys set; falls back to hosted pattern matcher | CreateLex subscription |
+| **1 — AI-native** | AI IDE calls `figma_to_swiftui`; server exports assets and returns either a validated one-shot write or a prompt package for the IDE model to finish with `write_generated_swiftui_to_xcode` | Your existing AI subscription |
+| **2 — BYOK** | Set `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, or `HF_API_TOKEN`; MCP server calls the model and only accepts the result if it passes a fidelity check | Your API key |
+| **3 — CreateLex hosted** | No keys set; server may use hosted semantic generation, but rejects low-fidelity output and falls back to the AI-native prompt package | CreateLex subscription |
 
 ### Login (Tier 3 only)
 
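The tier precedence in the table above mirrors the server's "Tier 2 BYOK → Tier 3 hosted → AI-native prompt package" order. A minimal sketch of that resolution — `resolveTier` and the `hasSubscription` flag are illustrative stand-ins, not part of the package:

```javascript
// Sketch of the three-tier resolution described in the table above.
// `resolveTier` is a hypothetical helper; `hasSubscription` stands in for
// the Tier 3 login state.
function resolveTier(env, hasSubscription) {
  if (env.ANTHROPIC_API_KEY || env.OPENAI_API_KEY || env.HF_API_TOKEN) {
    return 2; // BYOK: the MCP server calls the model with your key
  }
  if (hasSubscription) {
    return 3; // CreateLex hosted semantic generation, behind the fidelity check
  }
  return 1; // AI-native: the IDE model finishes from the prompt package
}
```

Note that BYOK keys take precedence even for subscribed users, matching the server's attempt order.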
@@ -121,12 +121,12 @@ npx @createlex/figgen login
 ```
 1. Select a frame in Figma
 2. Ask your AI: "Generate SwiftUI for my selected Figma frame"
-3. AI calls write_selection_to_xcode
-4. Plugin generates code + exports PNG/SVG assets
-5. Files appear in Xcode immediately
+3. AI calls `figma_to_swiftui`
+4. If `oneShot=true`, files are already written to Xcode
+5. If `oneShot=false`, AI uses `generationPrompt` to generate SwiftUI and then calls `write_generated_swiftui_to_xcode`
 ```
 
-**Critical:** `write_selection_to_xcode` **overwrites the Swift file** on every call. Apply all manual refinements (adaptive layout, hidden node removal, real buttons, etc.) **after** the last MCP call, not before.
+**Critical:** `write_selection_to_xcode` **overwrites the Swift file** on every call and is now best treated as a direct-write scaffold tool, not the default high-fidelity path. Apply all manual refinements (adaptive layout, hidden node removal, real buttons, etc.) **after** the last MCP call, not before.
 
 To hide a node from export: set it invisible in Figma before running. Hidden nodes (`visible: false`) are skipped in both the asset export plan and code generation.
 
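The new steps 3–5 can be sketched as a client-side branch on the `oneShot` flag. Field names (`oneShot`, `generationPrompt`) follow the README; `nextAction` itself is a hypothetical helper, not the package's client code:

```javascript
// Hypothetical illustration of the oneShot branch in the workflow above.
function nextAction(response) {
  if (response.oneShot) {
    // figma_to_swiftui already wrote the files to Xcode.
    return { done: true };
  }
  // Otherwise, generate code from the prompt package, then finish with
  // write_generated_swiftui_to_xcode.
  const { systemPrompt, userMessage } = response.generationPrompt;
  return {
    done: false,
    messages: [
      { role: 'system', content: systemPrompt },
      { role: 'user', content: userMessage },
    ],
  };
}
```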
@@ -159,7 +159,7 @@ To hide a node from export: set it invisible in Figma before running. Hidden nod
 | Tool | Parameters | Description |
 |---|---|---|
 | `get_design_context` | `excludeScreenshot` (bool) | Full node tree + asset export plan + generation hints. Also available via `GET http://localhost:7765/design-context` |
-| `get_swiftui_generation_prompt` | — | Returns `systemPrompt` + `userMessage` to feed to any LLM manually |
+| `get_swiftui_generation_prompt` | — | Returns `systemPrompt` + `userMessage` for the preferred AI-native high-fidelity workflow |
 | `generate_swiftui` | `generationMode`, `scope` | Runs plugin generator; returns code without writing to disk |
 | `analyze_generation` | `generationMode` | Code + per-node diagnostics + refinement hints |
 
@@ -177,10 +177,10 @@ To hide a node from export: set it invisible in Figma before running. Hidden nod
 |---|---|---|
 | `get_project_path` | — | Reads saved Xcode source folder |
 | `set_project_path` | `path` | Persists Xcode project path to config |
-| `write_selection_to_xcode` | `generationMode`, `includeOverflow`, `projectPath`, `nodeIds` | **Primary tool.** Generates from Figma selection + exports PNG/SVG assets + writes Swift + asset catalog in one call |
+| `write_selection_to_xcode` | `generationMode`, `includeOverflow`, `projectPath`, `nodeIds` | Direct-write scaffold tool. Generates from Figma selection + exports PNG/SVG assets + writes Swift + asset catalog in one call |
 | `write_generated_swiftui_to_xcode` | `code`, `structName`, `images[]`, `additionalFiles[]`, `projectPath` | Writes pre-generated Swift code + manually provided image payloads |
 | `write_svg_to_xcode` | `nodeId`, `assetName` | Exports a single SVG directly to Assets.xcassets |
-| `figma_to_swiftui` | `scope`, `generationMode` | One-shot: generates code + assets + tokens + components in a single response |
+| `figma_to_swiftui` | `scope`, `generationMode` | Preferred universal entrypoint. Exports assets, then either writes validated one-shot output or returns an AI-native prompt package |
 
 ### Component analysis
 
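For reference, the preferred entrypoint from the table above takes this shape as an MCP tool invocation — parameter values here are illustrative, drawn from the documented `scope`/`generationMode` options:

```javascript
// Illustrative figma_to_swiftui invocation, per the tool table above.
const call = {
  name: 'figma_to_swiftui',
  arguments: {
    scope: 'page',              // batch-generate every top-level frame
    generationMode: 'editable', // the documented default
  },
};
console.log(JSON.stringify(call.arguments));
```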
@@ -190,8 +190,8 @@ To hide a node from export: set it invisible in Figma before running. Hidden nod
 
 ### `write_selection_to_xcode` vs `write_generated_swiftui_to_xcode`
 
-- **`write_selection_to_xcode`** — plugin generates code AND exports assets automatically. One call does everything. This is what the AI should use in the standard workflow.
-- **`write_generated_swiftui_to_xcode`** — you supply the Swift code string and base64-encoded image payloads. Use when an external LLM generated the code and you need to write it plus manually gathered assets to disk.
+- **`write_selection_to_xcode`** — plugin generates code AND exports assets automatically. Use this when you explicitly want an immediate local scaffold write and can tolerate lower fidelity than the AI-native flow.
+- **`write_generated_swiftui_to_xcode`** — you supply the Swift code string and base64-encoded image payloads. This is the preferred final write step when an external or IDE-integrated LLM generated the code from the prompt package.
 
 ---
 
@@ -316,13 +316,14 @@ function startBridgeServer(options = {}) {
   const assetCount = data.assetExportPlan?.candidates?.length ?? 0;
   const aiWorkflow = {
     IF_YOU_HAVE_MCP_ACCESS: {
-      step1: 'Call the MCP tool `write_selection_to_xcode` with generationMode="editable" (the default). It exports individual PNG assets with correct blendMode info and generates Image("name").blendMode() calls automatically.',
-      step2: 'After write_selection_to_xcode returns, ONLY refine: add GeometryReader for responsiveness, wire @State interactions. DO NOT rewrite Image("name") calls, DO NOT switch to fidelity mode unless the whole frame is a single photo with no text.',
-      mcpTool: 'write_selection_to_xcode',
+      step1: 'Call the MCP tool `figma_to_swiftui` with generationMode="editable" (the default). It exports the assets first, then either writes SwiftUI directly if the one-shot result passes a fidelity check, or returns an AI-native prompt package.',
+      step2: 'If figma_to_swiftui returns oneShot=false, use generationPrompt.systemPrompt as the system message and generationPrompt.userMessage as the user message, then write the result with `write_generated_swiftui_to_xcode`.',
+      step3: 'After code is written, ONLY refine: add GeometryReader for responsiveness, wire @State interactions, and extract reusable subviews. DO NOT rewrite Image("name") calls, DO NOT switch to fidelity mode unless the whole frame is a single photo with no text.',
+      mcpTool: 'figma_to_swiftui',
       mcpServer: 'figma-swiftui',
     },
     IF_NO_MCP_ACCESS: {
-      instruction: 'Use _generationPrompt.systemPrompt as the system message and _generationPrompt.userMessage as the user message to generate native SwiftUI code. Output individual SwiftUI views — Text(), VStack, HStack, ZStack, Image("name") — NOT a single composite Image().',
+      instruction: 'Use _generationPrompt.systemPrompt as the system message and _generationPrompt.userMessage as the user message to generate native SwiftUI code. Output individual SwiftUI views — Text(), VStack, HStack, ZStack, Image("name") — NOT a single composite Image() or placeholder scaffold.',
       rule1: 'Every asset in assetExportPlan.candidates MUST become Image("assetName").resizable() — never Rectangle() or Color.',
       rule2: 'Text nodes become Text("content") with .font(.system(size: X, weight: .bold)) — NEVER custom font names.',
       rule3: 'Output a full SwiftUI View struct with individual elements positioned using the node geometry in the design context.',
@@ -106,6 +106,59 @@ function ensureRuntimeAuthorized() {
   }
 }
 
+function countImageReferences(code, imageNames = []) {
+  if (!code) return 0;
+  if (imageNames.length === 0) {
+    return (code.match(/Image\("/g) || []).length;
+  }
+  return imageNames.reduce((count, imageName) => {
+    return count + (code.includes(`Image("${imageName}")`) ? 1 : 0);
+  }, 0);
+}
+
+function countTextLiterals(root) {
+  if (!root || typeof root !== 'object') return 0;
+  let count = 0;
+  const stack = [root];
+  while (stack.length > 0) {
+    const current = stack.pop();
+    if (!current || typeof current !== 'object') continue;
+    if (current.type === 'TEXT' && current.text?.characters) {
+      count += 1;
+    }
+    if (Array.isArray(current.children)) {
+      stack.push(...current.children);
+    }
+  }
+  return count;
+}
+
+function isLowFidelitySemanticResult({ code, context, imageNames = [] }) {
+  if (!code || typeof code !== 'string') return true;
+
+  const metadata = context?.metadata;
+  const root = Array.isArray(metadata?.nodes) ? metadata.nodes[0] : metadata;
+  const assetCandidates = context?.assetExportPlan?.candidates ?? [];
+  const imageReferenceCount = countImageReferences(code, imageNames);
+  const textNodeCount = countTextLiterals(root);
+  const primitiveCount = (code.match(/\b(Text|Image|Button|TextField|Toggle|Picker|VStack|HStack|ZStack)\b/g) || []).length;
+  const rectangleCount = (code.match(/\bRectangle\(/g) || []).length;
+
+  if (assetCandidates.length >= 3 && imageReferenceCount === 0) {
+    return true;
+  }
+
+  if (textNodeCount >= 3 && primitiveCount <= 8) {
+    return true;
+  }
+
+  if (rectangleCount > 0 && imageReferenceCount < Math.min(2, assetCandidates.length)) {
+    return true;
+  }
+
+  return false;
+}
+
 async function tryHostedSemanticGeneration({ nodeIds, generationMode, includeOverflow, analyze }) {
   if (generationMode !== 'editable') {
     return null;
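The fidelity gate added in this hunk is purely lexical — it counts `Image("…")` references, TEXT nodes, and SwiftUI primitives. Reproduced standalone, its `countImageReferences` helper behaves like this:

```javascript
// The countImageReferences helper from the hunk above, reproduced standalone
// so the rejection thresholds can be sanity-checked outside the server.
function countImageReferences(code, imageNames = []) {
  if (!code) return 0;
  if (imageNames.length === 0) {
    // No expected names: count every Image(" occurrence.
    return (code.match(/Image\("/g) || []).length;
  }
  // Otherwise count how many expected asset names the code actually wires up.
  return imageNames.reduce((count, imageName) => {
    return count + (code.includes(`Image("${imageName}")`) ? 1 : 0);
  }, 0);
}

const sample = 'VStack { Image("hero").resizable(); Image("logo") }';
console.log(countImageReferences(sample));           // 2 — every Image(" occurrence
console.log(countImageReferences(sample, ['hero'])); // 1 — only the named asset
```

This is why a hosted result that paints `Rectangle()` placeholders instead of referencing exported assets trips the `imageReferenceCount` checks and gets rejected.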
@@ -123,23 +176,27 @@ async function tryHostedSemanticGeneration({ nodeIds, generationMode, includeOve
   try {
     const byokResult = await generateWithLocalKey(context, generationMode);
     if (byokResult?.handled) {
-      const metadata = context?.metadata;
-      const selection = Array.isArray(metadata?.nodes)
-        ? { ids: metadata.nodes.map((n) => n.id), names: metadata.nodes.map((n) => n.name) }
-        : { ids: [metadata?.id], names: [metadata?.name] };
-
-      if (analyze) {
-        return {
-          selection: metadata,
-          generated: { ...byokResult, imageCount: 0, imageNames: [] },
-          assetExportPlan: context?.assetExportPlan ?? null,
-          reusableComponents: context?.reusableComponents ?? null,
-          generationHints: context?.generationHints ?? null,
-          manualRefinementHints: context?.generationHints?.manualRefinementHints ?? [],
-          hosted: false,
-        };
+      if (!analyze && isLowFidelitySemanticResult({ code: byokResult.code, context })) {
+        console.warn('[figgen] Rejecting low-fidelity BYOK semantic result; falling back.');
+      } else {
+        const metadata = context?.metadata;
+        const selection = Array.isArray(metadata?.nodes)
+          ? { ids: metadata.nodes.map((n) => n.id), names: metadata.nodes.map((n) => n.name) }
+          : { ids: [metadata?.id], names: [metadata?.name] };
+
+        if (analyze) {
+          return {
+            selection: metadata,
+            generated: { ...byokResult, imageCount: 0, imageNames: [] },
+            assetExportPlan: context?.assetExportPlan ?? null,
+            reusableComponents: context?.reusableComponents ?? null,
+            generationHints: context?.generationHints ?? null,
+            manualRefinementHints: context?.generationHints?.manualRefinementHints ?? [],
+            hosted: false,
+          };
+        }
+        return { ...byokResult, selection, imageCount: 0, imageNames: [], hosted: false };
       }
     }
-      return { ...byokResult, selection, imageCount: 0, imageNames: [], hosted: false };
-    }
   } catch (byokError) {
     console.error('[figgen] BYOK generation failed, falling back to hosted:', byokError.message);
@@ -167,6 +224,11 @@ async function tryHostedSemanticGeneration({ nodeIds, generationMode, includeOve
     return null;
   }
 
+  if (!analyze && isLowFidelitySemanticResult({ code: data.code, context })) {
+    console.warn('[figgen] Rejecting low-fidelity hosted semantic result; falling back.');
+    return null;
+  }
+
   const selection = Array.isArray(metadata?.nodes)
     ? {
         ids: metadata.nodes.map((node) => node.id),
@@ -423,7 +485,7 @@ server.registerTool('get_metadata', {
 });
 
 server.registerTool('get_design_context', {
-  description: 'Return node metadata, asset export candidates, and generation hints for the current Figma selection. PREFERRED WORKFLOW: call figma_to_swiftui instead it handles LLM generation, Image("name") asset refs, PNG export, DesignTokens, and reusable components in one shot. Only call get_design_context if you need to inspect the raw node tree before generating.',
+  description: 'Return node metadata, asset export candidates, and generation hints for the current Figma selection. For high-fidelity results across all models and IDEs, use this with get_swiftui_generation_prompt or figma_to_swiftui\'s AI-native fallback instead of relying on a one-shot scaffold write.',
   inputSchema: {
     nodeIds: z.array(z.string()).optional().describe('Optional list of Figma node ids. If omitted, uses the current selection'),
     nodeId: z.string().optional().describe('Optional single Figma node id'),
@@ -441,7 +503,7 @@ server.registerTool('get_design_context', {
 });
 
 server.registerTool('get_swiftui_generation_prompt', {
-  description: 'Return a ready-to-use SwiftUI system prompt and user message for AI-native generation. PREFERRED: use figma_to_swiftui instead — it pre-writes assets and handles LLM generation automatically. Only use get_swiftui_generation_prompt when you need the raw prompt. If you do: (a) use .font(.system(size:weight:)) never hardcode custom font names like Inter or Roboto, (b) reference every assetExportPlan entry as Image("name") not Rectangle().',
+  description: 'Return a ready-to-use SwiftUI system prompt and user message for AI-native generation. This is the preferred high-fidelity path for all models and IDEs: generate code from the prompt, then write it with write_generated_swiftui_to_xcode. Use .font(.system(size:weight:)) and reference every exported asset as Image("name"), never Rectangle().',
   inputSchema: {
     nodeIds: z.array(z.string()).optional().describe('Optional list of Figma node ids. If omitted, uses the current selection'),
     nodeId: z.string().optional().describe('Optional single Figma node id'),
@@ -768,7 +830,7 @@ server.registerTool('write_generated_swiftui_to_xcode', {
 });
 
 server.registerTool('write_selection_to_xcode', {
-  description: 'Generate SwiftUI from the connected Figma selection and write it into the configured Xcode project. THIS IS THE CORRECT TOOL TO CALL it exports real PNG assets to Assets.xcassets and generates Image("name") references automatically. Always use the default generationMode="editable" individual assets include blendModeSwiftUI values so blend modes are applied correctly on each Image(). Only use generationMode="fidelity" when the ENTIRE frame is a single photographic image or illustration with no text, no interactive elements, and no distinct layers (e.g. a full-bleed photo background).',
+  description: 'Generate SwiftUI from the connected Figma selection and write it directly into the configured Xcode project. Use this when you explicitly want an immediate local scaffold write. For the most reliable cross-model, cross-IDE high-fidelity workflow, prefer figma_to_swiftui or get_swiftui_generation_prompt plus write_generated_swiftui_to_xcode. Always use generationMode="editable" unless the ENTIRE frame is a single photographic image or illustration with no text or interactive elements.',
   inputSchema: {
     nodeIds: z.array(z.string()).optional().describe('Optional list of Figma node ids. If omitted, uses the current selection'),
     includeOverflow: z.boolean().default(false).describe('Ignore Figma clipping when generating layout'),
@@ -828,18 +890,8 @@ server.registerTool('write_selection_to_xcode', {
       includeImages: true,
     });
 
-    const images = Array.isArray(generated.images) ? generated.images : [];
-    const imageNames = images.map((img) => img.name);
-
-    // If the hosted provider returned a generic template that doesn't reference any exported
-    // assets, fall back to the plugin-generated code which has correct Image("name") calls.
-    const hostedMissedAssets =
-      imageNames.length > 0 &&
-      llmCode &&
-      !imageNames.some((n) => llmCode.includes(`Image("${n}")`));
-
-    // Prefer LLM-generated code only when it actually wires up the exported assets.
-    const finalCode = hostedMissedAssets ? (generated.code || llmCode) : (llmCode || generated.code);
+    // Prefer LLM-generated code over template-generated code
+    const finalCode = llmCode || generated.code;
 
     const effectiveStructName = inferStructName({
       code: finalCode,
@@ -850,7 +902,7 @@ server.registerTool('write_selection_to_xcode', {
       targetDir,
       code: finalCode,
       structName: effectiveStructName,
-      images,
+      images: Array.isArray(generated.images) ? generated.images : [],
      additionalFiles: llmAdditionalFiles,
     });
 
@@ -870,6 +922,8 @@ server.registerTool('write_selection_to_xcode', {
       filePath: result.results?.swiftFile ?? null,
     });
 
+    const imageNames = (generated.images ?? []).map((img) => img.name);
+
     return jsonResult(buildCompactResponse({
       result,
       structName: effectiveStructName,
@@ -894,9 +948,9 @@ server.registerTool('write_selection_to_xcode', {
 
 server.registerTool('figma_to_swiftui', {
   description:
-    'One-shot Figma-to-SwiftUI: generates production-ready SwiftUI code from the current Figma selection (or all page frames) and writes it to the Xcode project — including PNG assets, DesignTokens.swift, and reusable components. ' +
-    'With BYOK keys (ANTHROPIC_API_KEY / OPENAI_API_KEY) or a CreateLex subscription this is a TRUE single-call operation. ' +
-    'Without those, it pre-writes assets and returns a structured prompt so the calling AI can generate code and finish with one more write_generated_swiftui_to_xcode call. ' +
+    'Universal Figma-to-SwiftUI entrypoint for all models and IDEs. It inspects the current Figma selection (or all page frames), exports PNG assets, and then either: ' +
+    '(a) writes SwiftUI directly if one-shot generation passes a fidelity check, or ' +
+    '(b) returns a structured AI-native prompt package so the calling model can generate code and finish with write_generated_swiftui_to_xcode. ' +
     'Use scope="page" to batch-generate every top-level frame on the current page.',
   inputSchema: {
     nodeIds: z.array(z.string()).optional().describe('Optional list of Figma node ids. If omitted, uses the current selection'),
@@ -950,13 +1004,18 @@ server.registerTool('figma_to_swiftui', {
     // 4. Attempt LLM generation (Tier 2 BYOK → Tier 3 Hosted)
     // -------------------------------------------------------------------------
     let llmResult = null;
+    let fidelityFallback = null;
 
     // Tier 2: BYOK
     if (process.env.ANTHROPIC_API_KEY || process.env.HF_API_TOKEN || process.env.OPENAI_API_KEY) {
       try {
         const byokResult = await generateWithLocalKey(context, generationMode);
         if (byokResult?.handled) {
-          llmResult = byokResult;
+          if (isLowFidelitySemanticResult({ code: byokResult.code, context, imageNames })) {
+            fidelityFallback = `Rejected low-fidelity BYOK output from ${byokResult.provider}. Returning the AI-native prompt package instead.`;
+          } else {
+            llmResult = byokResult;
+          }
         }
       } catch (err) {
         console.error('[figgen] BYOK generation failed:', err?.message ?? err);
@@ -973,10 +1032,9 @@ server.registerTool('figma_to_swiftui', {
           context,
           generationMode,
           includeOverflow,
-          imageNames,
         });
         if (response.ok && data?.handled) {
-          llmResult = {
+          const hostedResult = {
             handled: true,
             provider: data.provider || 'createlex-hosted',
             code: data.code,
@@ -984,6 +1042,11 @@ server.registerTool('figma_to_swiftui', {
             componentFiles: [],
             diagnostics: data.diagnostics || [],
           };
+          if (isLowFidelitySemanticResult({ code: hostedResult.code, context, imageNames })) {
+            fidelityFallback = `Rejected low-fidelity hosted output from ${hostedResult.provider}. Returning the AI-native prompt package instead.`;
+          } else {
+            llmResult = hostedResult;
+          }
         }
       }
     } catch (err) {
@@ -996,14 +1059,7 @@ server.registerTool('figma_to_swiftui', {
     // -------------------------------------------------------------------------
     if (llmResult) {
       const parsed = parseClaudeResponse(llmResult.code || '');
-      const llmCode = parsed.code || llmResult.code || '';
-      // If the hosted provider returned a generic template that doesn't reference any exported
-      // assets, fall back to the plugin-generated code which has correct Image("name") calls.
-      const hostedMissedAssets =
-        llmResult.provider === 'createlex-hosted' &&
-        imageNames.length > 0 &&
-        !imageNames.some((n) => llmCode.includes(`Image("${n}")`));
-      const finalCode = hostedMissedAssets ? (generated.code || llmCode) : (llmCode || generated.code);
+      const finalCode = parsed.code || llmResult.code || generated.code;
       const effectiveStructName = inferStructName({
         code: finalCode,
         selectionNames: generated.selection?.names ?? [],
@@ -1157,6 +1213,7 @@ server.registerTool('figma_to_swiftui', {
         assetsPreWritten
           ? `🖼 ${imageNames.length} PNG asset(s) pre-written to Assets.xcassets: ${imageNames.join(', ')}.`
           : null,
+        fidelityFallback,
         'Generate SwiftUI code using the generationPrompt above (systemPrompt + userMessage).',
         'Output code using <file name="StructName.swift"> XML tags.',
         'Then call write_generated_swiftui_to_xcode with the generated code and images:[] (assets are already on disk).',
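The checklist above asks the model to emit code inside `<file name="StructName.swift">` XML tags. A minimal extractor for that envelope might look like this — illustrative only; the package's own `parseClaudeResponse` may differ:

```javascript
// Hypothetical extractor for the <file name="..."> envelope requested above.
function extractFiles(text) {
  const files = [];
  const fileTag = /<file name="([^"]+)">([\s\S]*?)<\/file>/g;
  let match;
  while ((match = fileTag.exec(text)) !== null) {
    // Capture group 1 is the file name, group 2 the code body.
    files.push({ name: match[1], code: match[2].trim() });
  }
  return files;
}

const reply = '<file name="HeroView.swift">struct HeroView: View {}</file>';
console.log(extractFiles(reply));
```

Each extracted entry maps directly onto the `code`/`additionalFiles[]` parameters of `write_generated_swiftui_to_xcode`.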
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@createlex/figgen",
-  "version": "1.5.0",
+  "version": "1.5.3",
   "description": "CreateLex MCP runtime for Figma-to-SwiftUI generation and Xcode export",
   "bin": {
     "figgen": "bin/figgen.js"