@salesforce/webapp-template-feature-micro-frontend 1.92.0 → 1.93.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/dist/.a4drules/skills/feature-react-file-upload-file-upload/SKILL.md +396 -0
- package/dist/.a4drules/webapp-cli-commands.md +88 -0
- package/dist/.a4drules/webapp-react-code-quality.md +1 -1
- package/dist/.a4drules/webapp-react-typescript.md +1 -1
- package/dist/.a4drules/webapp-react.md +1 -1
- package/dist/.a4drules/webapp-skills-first.md +1 -1
- package/dist/.a4drules/webapp.md +2 -2
- package/dist/AGENT.md +17 -13
- package/dist/CHANGELOG.md +19 -0
- package/dist/force-app/main/default/webapplications/feature-micro-frontend/package.json +3 -3
- package/dist/package.json +1 -1
- package/dist/scripts/prepare-import-unique-fields.js +17 -3
- package/dist/scripts/setup-cli.mjs +318 -67
- package/package.json +3 -3
- package/dist/.a4drules/webapp-no-node-e.md +0 -65
package/dist/.a4drules/skills/feature-react-file-upload-file-upload/SKILL.md
ADDED

@@ -0,0 +1,396 @@
---
name: feature-react-file-upload-file-upload
description: Add file upload functionality to React webapps with progress tracking and Salesforce ContentVersion integration. Use when the user wants to upload files, attach documents, handle file input, create file dropzones, track upload progress, or link files to Salesforce records. This feature provides programmatic APIs ONLY — no components or hooks are exported. Build your own custom UI using the upload() API. ALWAYS use this feature instead of building file upload from scratch with FormData or XHR.
---

# File Upload API (workflow)

When the user wants file upload functionality in a React webapp, follow this workflow. This feature provides **APIs only** — you must build the UI components yourself using the provided APIs.

## CRITICAL: This is an API-only package

The package exports **programmatic APIs**, not React components or hooks. You will:

- Use the `upload()` function to handle file uploads with progress tracking
- Build your own custom UI (file input, dropzone, progress bars, etc.)
- Track upload progress through the `onProgress` callback

**Do NOT:**

- Expect pre-built components like `<FileUpload />` — they are not exported
- Try to import React hooks like `useFileUpload` — they are not exported
- Look for dropzone components — they are not exported

The source code contains reference components for demonstration, but they are **not available** as imports. Use them as examples to build your own UI.

## 1. Install the package

```bash
npm install @salesforce/webapp-template-feature-react-file-upload-experimental
```

Dependencies are automatically installed:

- `@salesforce/webapp-experimental` (API client)
- `@salesforce/sdk-data` (data SDK)

## 2. Understand the three upload patterns

### Pattern A: Basic upload (no record linking)

Upload files to Salesforce and get back `contentBodyId` for each file. No ContentVersion record is created.

**When to use:**

- User wants to upload files first, then create/link them to a record later
- Building a multi-step form where the record doesn't exist yet
- Deferred record linking scenarios

```tsx
import { upload } from "@salesforce/webapp-template-feature-react-file-upload-experimental";

const results = await upload({
  files: [file1, file2],
  onProgress: (progress) => {
    console.log(`${progress.fileName}: ${progress.status} - ${progress.progress}%`);
  },
});

// results[0].contentBodyId: "069..." (always available)
// results[0].contentVersionId: undefined (no record linked)
```

### Pattern B: Upload with immediate record linking

Upload files and immediately link them to an existing Salesforce record by creating ContentVersion records.

**When to use:**

- Record already exists (Account, Opportunity, Case, etc.)
- User wants files immediately attached to the record
- Direct upload-and-attach scenarios

```tsx
import { upload } from "@salesforce/webapp-template-feature-react-file-upload-experimental";

const results = await upload({
  files: [file1, file2],
  recordId: "001xx000000yyyy", // Existing record ID
  onProgress: (progress) => {
    console.log(`${progress.fileName}: ${progress.status} - ${progress.progress}%`);
  },
});

// results[0].contentBodyId: "069..." (always available)
// results[0].contentVersionId: "068..." (linked to record)
```

### Pattern C: Deferred record linking (record creation flow)

Upload files without a record, then link them after the record is created.

**When to use:**

- Building a "create record with attachments" form
- Record doesn't exist until form submission
- Need to upload files before knowing the final record ID

```tsx
import {
  upload,
  createContentVersion,
} from "@salesforce/webapp-template-feature-react-file-upload-experimental";

// Step 1: Upload files (no recordId)
const uploadResults = await upload({
  files: [file1, file2],
  onProgress: (progress) => console.log(progress),
});

// Step 2: Create the record
const newRecordId = await createRecord(formData);

// Step 3: Link uploaded files to the new record
for (const file of uploadResults) {
  const contentVersionId = await createContentVersion(
    new File([""], file.fileName),
    file.contentBodyId,
    newRecordId,
  );
}
```

## 3. Build your custom UI

The package provides the backend — you build the frontend. Here's a minimal example:

```tsx
import {
  upload,
  type FileUploadProgress,
} from "@salesforce/webapp-template-feature-react-file-upload-experimental";
import { useState } from "react";

function CustomFileUpload({ recordId }: { recordId?: string }) {
  const [progress, setProgress] = useState<Map<string, FileUploadProgress>>(new Map());

  const handleFileSelect = async (event: React.ChangeEvent<HTMLInputElement>) => {
    const files = Array.from(event.target.files || []);

    await upload({
      files,
      recordId,
      onProgress: (fileProgress) => {
        setProgress((prev) => new Map(prev).set(fileProgress.fileName, fileProgress));
      },
    });
  };

  return (
    <div>
      <input type="file" multiple onChange={handleFileSelect} />

      {Array.from(progress.entries()).map(([fileName, fileProgress]) => (
        <div key={fileName}>
          {fileName}: {fileProgress.status} - {fileProgress.progress}%
          {fileProgress.error && <span>Error: {fileProgress.error}</span>}
        </div>
      ))}
    </div>
  );
}
```

## 4. Track upload progress

The `onProgress` callback fires multiple times for each file as it moves through stages:

| Status         | When                                           | Progress Value       |
| -------------- | ---------------------------------------------- | -------------------- |
| `"pending"`    | File queued for upload                         | `0`                  |
| `"uploading"`  | Upload in progress (XHR)                       | `0-100` (percentage) |
| `"processing"` | Creating ContentVersion (if recordId provided) | `0`                  |
| `"success"`    | Upload complete                                | `100`                |
| `"error"`      | Upload failed                                  | `0`                  |

**Always provide visual feedback:**

- Show file name
- Display current status
- Render progress bar for "uploading" status
- Show error message if status is "error"
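
The status stages above translate directly into display text. A minimal sketch (a hypothetical helper, not a package export — the `Progress` shape mirrors the documented `FileUploadProgress` interface):

```typescript
// Hypothetical display helper, not part of the package. The Progress shape
// mirrors the FileUploadProgress interface documented in the API reference.
type UploadStatus = "pending" | "uploading" | "processing" | "success" | "error";

interface Progress {
  fileName: string;
  status: UploadStatus;
  progress: number;
  error?: string;
}

function formatProgress(p: Progress): string {
  switch (p.status) {
    case "pending":
      return `${p.fileName}: queued`;
    case "uploading":
      return `${p.fileName}: uploading ${p.progress}%`;
    case "processing":
      return `${p.fileName}: creating ContentVersion`;
    case "success":
      return `${p.fileName}: done`;
    case "error":
      return `${p.fileName}: failed${p.error ? ` (${p.error})` : ""}`;
  }
}
```

Feed the callback's object straight into a helper like this to keep the status-to-label mapping out of your JSX.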

## 5. Cancel uploads (optional)

Use an `AbortController` to allow users to cancel uploads:

```tsx
const abortController = new AbortController();

const handleUpload = async (files: File[]) => {
  try {
    await upload({
      files,
      signal: abortController.signal,
      onProgress: (progress) => console.log(progress),
    });
  } catch (error) {
    console.error("Upload cancelled or failed:", error);
  }
};

const cancelUpload = () => {
  abortController.abort();
};
```
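
Note that a single top-level `AbortController` can only abort once: after `abort()` is called, any later `upload()` started with the same `signal` begins already cancelled. A minimal sketch of the usual fix (an assumed pattern, not a package API) is to create a fresh controller per run:

```typescript
// Sketch: wrap an upload-like function so each run gets its own
// AbortController; cancelling one run does not poison the next one.
function makeCancellable<T>(run: (signal: AbortSignal) => Promise<T>) {
  const controller = new AbortController();
  return {
    promise: run(controller.signal),
    cancel: () => controller.abort(),
  };
}
```

In a React component, keep the returned `cancel` in a ref so a Cancel button aborts only the in-flight run.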

## 6. Link to current user (special case)

If the user wants to upload files to their own profile or personal library:

```tsx
import {
  upload,
  getCurrentUserId,
} from "@salesforce/webapp-template-feature-react-file-upload-experimental";

const userId = await getCurrentUserId();
await upload({ files, recordId: userId });
```

## API Reference

### upload(options)

Main upload API that handles the complete flow with progress tracking.

```typescript
interface UploadOptions {
  files: File[];
  recordId?: string | null; // If provided, creates ContentVersion
  onProgress?: (progress: FileUploadProgress) => void;
  signal?: AbortSignal; // Optional cancellation
}

interface FileUploadProgress {
  fileName: string;
  status: "pending" | "uploading" | "processing" | "success" | "error";
  progress: number; // 0-100 for uploading, 0 for other states
  error?: string;
}

interface FileUploadResult {
  fileName: string;
  size: number;
  contentBodyId: string; // Always available
  contentVersionId?: string; // Only if recordId was provided
}
```

**Returns:** `Promise<FileUploadResult[]>`
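
Since `contentVersionId` is only present when a `recordId` was passed, callers often partition the results before updating the UI. A small sketch (hypothetical helper, not part of the package) using the `FileUploadResult` shape above:

```typescript
// Hypothetical post-processing helper, not a package export.
interface FileUploadResult {
  fileName: string;
  size: number;
  contentBodyId: string;
  contentVersionId?: string; // present only when recordId was provided
}

// Partition results into files linked to a record vs. upload-only files.
function splitByLinked(results: FileUploadResult[]) {
  return {
    linked: results.filter((r) => r.contentVersionId !== undefined),
    unlinked: results.filter((r) => r.contentVersionId === undefined),
  };
}
```

The `unlinked` bucket is the input for Pattern C's deferred `createContentVersion` step.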

### createContentVersion(file, contentBodyId, recordId)

Manually create a ContentVersion record from a previously uploaded file.

```typescript
async function createContentVersion(
  file: File,
  contentBodyId: string,
  recordId: string,
): Promise<string | undefined>;
```

**Parameters:**

- `file` — File object (used for metadata like name)
- `contentBodyId` — ContentBody ID from previous upload
- `recordId` — Record ID for FirstPublishLocationId

**Returns:** ContentVersion ID if successful

### getCurrentUserId()

Get the current user's Salesforce ID.

```typescript
async function getCurrentUserId(): Promise<string>;
```

**Returns:** Current user ID

## Common UI patterns

### File input with button

```tsx
<input type="file" multiple accept=".pdf,.doc,.docx,.jpg,.png" onChange={handleFileSelect} />
```

### Drag-and-drop zone

Build your own dropzone using native events:

```tsx
function DropZone({ onDrop }: { onDrop: (files: File[]) => void }) {
  const handleDrop = (e: React.DragEvent) => {
    e.preventDefault();
    const files = Array.from(e.dataTransfer.files);
    onDrop(files);
  };

  return (
    <div
      onDrop={handleDrop}
      onDragOver={(e) => e.preventDefault()}
      style={{ border: "2px dashed #ccc", padding: "2rem" }}
    >
      Drop files here
    </div>
  );
}
```

### Progress bar

```tsx
{progress.status === "uploading" && (
  <div style={{ width: "100%", background: "#eee" }}>
    <div
      style={{
        width: `${progress.progress}%`,
        background: "#0176d3",
        height: "8px",
      }}
    />
  </div>
)}
```

## Decision tree for agents

**User asks for file upload functionality:**

1. **Ask about record context:**
   - "Do you want to link uploaded files to a specific record, or upload them first and link later?"

2. **Based on response:**
   - **Link to existing record** → Use Pattern B with `recordId`
   - **Upload first, link later** → Use Pattern A (no recordId), then Pattern C for linking
   - **Link to current user** → Use Pattern B with `getCurrentUserId()`

3. **Build the UI:**
   - Create a file input or dropzone (not provided by the package)
   - Add a progress display for each file (status + progress bar)
   - Handle errors in the UI

4. **Test the implementation:**
   - Verify progress callbacks fire correctly
   - Check that `contentBodyId` is returned
   - If `recordId` was provided, verify `contentVersionId` is returned

## Reference implementation

The package includes a reference implementation in `src/features/fileupload/` with:

- `FileUpload.tsx` — Complete component with dropzone and dialog
- `FileUploadDialog.tsx` — Progress tracking dialog
- `FileUploadDropZone.tsx` — Drag-and-drop zone
- `useFileUpload.ts` — React hook for state management

**These are NOT exported** but can be viewed as examples. Read the source files to understand patterns for building your own UI.

## Troubleshooting

**Upload fails with CORS error:**

- Ensure the webapp is properly deployed to Salesforce or running on `localhost`
- Check that the org allows the origin in CORS settings

**No progress updates:**

- Verify the `onProgress` callback is provided
- Check that the callback function updates React state correctly

**ContentVersion not created:**

- Verify `recordId` is provided to the `upload()` function
- Check that the record ID is valid and exists in the org
- Ensure the user has permission to create ContentVersion records

**Files upload but don't appear in record:**

- Verify `recordId` is correct
- Check that the ContentVersion was created (look for `contentVersionId` in results)
- Confirm the user has access to view files on the record

## DO NOT do these things

- ❌ Build XHR/fetch upload logic from scratch — use the `upload()` API
- ❌ Try to import the `<FileUpload />` component — it's not exported
- ❌ Try to import the `useFileUpload` hook — it's not exported
- ❌ Use third-party file upload libraries when this feature exists
- ❌ Skip progress tracking — always provide user feedback
- ❌ Ignore errors — always handle and display error messages

package/dist/.a4drules/webapp-cli-commands.md
ADDED

@@ -0,0 +1,88 @@
---
description: CLI command generation rules — no node -e one-liners; safe quoting and scripts
paths:
  - "**/webapplications/**/*"
---

# A4D Enforcement: CLI Command Generation & No `node -e` One-Liners

When **generating any CLI/shell commands** (for the user or for automation), follow the rules below. The default shell is **Zsh** (macOS); commands must work there without Bash-only syntax.

---

## 1. Never Use Complex `node -e` One-Liners

**Forbidden:** `node -e` (or `node -p`, `node --eval`) for file manipulation, string replacement, reading/writing configs, or multi-line logic.

**Why:** In Zsh, `node -e '...'` **silently breaks** due to:

- **`!` (history expansion):** Zsh expands `!` in double-quoted strings → `event not found` or wrong output.
- **Backticks:** `` ` `` is command substitution; the shell runs it before Node sees the string.
- **Nested quoting:** Escaping differs between Bash and Zsh; multi-line JS in one string is fragile.

**Allowed:** A trivial one-liner only if it has **no** backticks, no `!`, no nested quotes, no multi-line code, and no `fs` usage. Example: `node -e "console.log(1+1)"`. If in doubt, **use a script file**.

---

## 2. How To Generate CLI Commands Correctly

### Run Node scripts by path, not inline code

- **Do:** `node scripts/setup-cli.mjs --target-org myorg`
- **Do:** `node path/to/script.mjs arg1 arg2`
- **Do not:** `node -e "require('fs').writeFileSync(...)"` or any non-trivial inline JS.

### For file edits or transforms

1. **Prefer IDE/agent file tools** (StrReplace, Write, etc.) — they avoid the shell.
2. **Otherwise:** write a **temporary `.js` or `.mjs` file**, run it, then remove it. Use a **heredoc with a quoted delimiter** so the shell does not interpret `$`, `` ` ``, or `!`:

   ```bash
   cat > /tmp/_transform.js << 'SCRIPT'
   const fs = require('fs');
   const data = JSON.parse(fs.readFileSync('package.json', 'utf8'));
   data.name = 'my-app';
   fs.writeFileSync('package.json', JSON.stringify(data, null, 2) + '\n');
   SCRIPT
   node /tmp/_transform.js && rm /tmp/_transform.js
   ```

3. **Simple replacements:** `sed -i '' 's/old/new/g' path/to/file` (macOS: `-i ''`).
4. **JSON:** `jq '.name = "my-app"' package.json > tmp.json && mv tmp.json package.json`.

### Quoting and shell safety

- Use **single quotes** around the outer shell string when the inner content has `$`, `` ` ``, or `!`.
- For heredocs that must be literal, use **`<< 'END'`** (quoted delimiter) so the body is not expanded.
- **Do not** generate commands that rely on Bash-only or non-POSIX behavior (`[[ ]]` works in Zsh; prefer `.` over `source` when POSIX compatibility is needed).

### Paths and working directory

- **State where to run from** when it matters, e.g. "From project root" or "From `force-app/main/default/webapplications/<appName>`".
- Prefer **explicit paths** or `cd <dir> && ...` so the command is copy-paste safe.
- This project: setup is `node scripts/setup-cli.mjs --target-org <alias>` from the **project root**. Web app commands (`npm run dev`, `npm run build`, `npm run lint`) run from the **web app directory** (e.g. `force-app/main/default/webapplications/<appName>` or `**/webapplications/<appName>`).

### npm / npx

- Use **exact package names** (e.g. `npx @salesforce/webapps-features-experimental list`).
- For app-specific scripts, **cd to the web app directory first**, then run `npm run <script>` or `npm install ...`.
- Chain steps with `&&`; one logical command per line is clearer than one giant line.

### Summary checklist when generating a command

- [ ] No `node -e` / `node -p` / `node --eval` with complex or multi-line code.
- [ ] If Node logic is needed, use `node path/to/script.mjs` or a temp script with heredoc `<< 'SCRIPT'`.
- [ ] No unescaped `!`, `` ` ``, or `$` in double-quoted strings in Zsh.
- [ ] Working directory and required args (e.g. `--target-org`) are clear.
- [ ] Prefer file-editing tools over shell one-liners when editing project files.

---

## 3. Violation Handling

- If a generated command used a complex `node -e` one-liner, **revert and redo** using a script file, `sed`/`jq`, or IDE file tools.
- If the user sees `event not found`, `unexpected token`, or garbled output, **check for**:
  - `node -e` with special characters,
  - Double-quoted strings containing `!` or backticks,
  - Wrong working directory or missing args.

**Cross-reference:** **webapp.md** (MUST FOLLOW #1) summarizes the no–`node -e` rule; this file is the full reference for CLI command generation and alternatives.

package/dist/.a4drules/webapp.md
CHANGED

@@ -1,7 +1,7 @@
 ---
 description: Core web application rules for SFDX React apps
 paths:
-  - "
+  - "**/webapplications/**/*"
 ---
 
 # Skills-First (MUST FOLLOW)

@@ -81,7 +81,7 @@ Agents consistently miss these. **You must not leave them default.**
 
 # Shell Command Safety (MUST FOLLOW)
 
-**Never use complex `node -e` one-liners** for file edits or multi-line transforms. They break in Zsh due to `!` history expansion and backtick interpolation. Use a temporary `.js` file, `sed`/`awk`, `jq`, or IDE file-editing tools instead. See **webapp-
+**Never use complex `node -e` one-liners** for file edits or multi-line transforms. They break in Zsh due to `!` history expansion and backtick interpolation. Use a temporary `.js` file, `sed`/`awk`, `jq`, or IDE file-editing tools instead. See **webapp-cli-commands.md** for full details and approved alternatives.
 
 # Development Cycle
 
package/dist/AGENT.md
CHANGED

@@ -1,11 +1,15 @@
 # Agent guide: SFDX project with React web app
 
-This project is a **Salesforce DX (SFDX) project** containing a **React web application**. The
+This project is a **Salesforce DX (SFDX) project** containing a **React web application**. The SFDX source path is defined in `sfdx-project.json` (`packageDirectories[].path`); the web app lives under `<sfdx-source>/webapplications/<appName>/`. Use this file when working in this directory.
+
+## SFDX Source Path
+
+The source path prefix is **not** always `force-app`. Read `sfdx-project.json` at the project root, take the first `packageDirectories[].path` value, and append `/main/default` to get `<sfdx-source>`. All paths below use this placeholder.
 
 ## Project layout
 
-- **Project root**: this directory — SFDX project root. Contains `sfdx-project.json`,
-- **React web app**:
+- **Project root**: this directory — SFDX project root. Contains `sfdx-project.json`, the SFDX source directory, and (optionally) LWC/Aura.
+- **React web app**: `<sfdx-source>/webapplications/<appName>/`
   - Replace `<appName>` with the actual app folder name (e.g. `base-react-app`, or the name chosen when the app was generated).
   - Entry: `src/App.tsx`
   - Routes: `src/routes.tsx`

@@ -35,7 +39,7 @@ Root **does not** run the React app. The root `npm run build` is a no-op for the
 **Always `cd` into the web app directory for dev/build/lint/test:**
 
 ```bash
-cd
+cd <sfdx-source>/webapplications/<appName>
 ```
 
 | Command | Purpose |

@@ -54,24 +58,24 @@ cd force-app/main/default/webapplications/<appName>
 
 This project includes **.a4drules/** at the project root. Follow them when generating or editing code.
 
-When rules refer to
+When rules refer to "web app directory" or `<sfdx-source>/webapplications/<appName>/`, resolve `<sfdx-source>` from `sfdx-project.json` and use the **actual app folder name** for this project.
 
 ## Deploying
 
-From **this project root
+From **this project root** (resolve the actual SFDX source path from `sfdx-project.json`):
 
 ```bash
-# Build the React app first (replace <appName> with
-cd
+# Build the React app first (replace <sfdx-source> and <appName> with actual values)
+cd <sfdx-source>/webapplications/<appName> && npm i && npm run build && cd -
 
-# Deploy web app only
-sf project deploy start --source-dir
+# Deploy web app only (replace <sfdx-source> with actual path, e.g. force-app/main/default)
+sf project deploy start --source-dir <sfdx-source>/webapplications --target-org <alias>
 
-# Deploy all metadata
-sf project deploy start --source-dir
+# Deploy all metadata (use the top-level package directory, e.g. force-app)
+sf project deploy start --source-dir <packageDir> --target-org <alias>
 ```
 
 ## Conventions (quick reference)
 
 - **UI**: shadcn/ui + Tailwind. Import from `@/components/ui/...`.
-- **Entry**: Keep `App.tsx` and routes in `src/`; add features as new routes or sections, don
+- **Entry**: Keep `App.tsx` and routes in `src/`; add features as new routes or sections, don't replace the app shell but you may modify it to match the requested design.
package/dist/CHANGELOG.md
CHANGED

@@ -3,6 +3,25 @@
 All notable changes to this project will be documented in this file.
 See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.
 
+# [1.93.0](https://github.com/salesforce-experience-platform-emu/webapps/compare/v1.92.1...v1.93.0) (2026-03-11)
+
+**Note:** Version bump only for package @salesforce/webapp-template-base-sfdx-project-experimental
+
+
+
+
+
+## [1.92.1](https://github.com/salesforce-experience-platform-emu/webapps/compare/v1.92.0...v1.92.1) (2026-03-11)
+
+
+### Bug Fixes
+
+* **template:** make setup-cli data import idempotent with Apex-based insert ([#254](https://github.com/salesforce-experience-platform-emu/webapps/issues/254)) ([76ee673](https://github.com/salesforce-experience-platform-emu/webapps/commit/76ee673f1f13c117f33550dfbb2fff6870a038dc))
+
+
+
+
+
 # [1.92.0](https://github.com/salesforce-experience-platform-emu/webapps/compare/v1.91.0...v1.92.0) (2026-03-11)
 
 **Note:** Version bump only for package @salesforce/webapp-template-base-sfdx-project-experimental
package/dist/force-app/main/default/webapplications/feature-micro-frontend/package.json
CHANGED

@@ -15,8 +15,8 @@
     "graphql:schema": "node scripts/get-graphql-schema.mjs"
   },
   "dependencies": {
-    "@salesforce/sdk-data": "^1.
-    "@salesforce/webapp-experimental": "^1.
+    "@salesforce/sdk-data": "^1.93.0",
+    "@salesforce/webapp-experimental": "^1.93.0",
     "@tailwindcss/vite": "^4.1.17",
     "class-variance-authority": "^0.7.1",
     "clsx": "^2.1.1",

@@ -38,7 +38,7 @@
     "@graphql-eslint/eslint-plugin": "^4.1.0",
     "@graphql-tools/utils": "^11.0.0",
     "@playwright/test": "^1.49.0",
-    "@salesforce/vite-plugin-webapp-experimental": "^1.
+    "@salesforce/vite-plugin-webapp-experimental": "^1.93.0",
     "@testing-library/jest-dom": "^6.6.3",
     "@testing-library/react": "^16.1.0",
     "@testing-library/user-event": "^14.5.2",
package/dist/scripts/prepare-import-unique-fields.js
CHANGED

@@ -5,7 +5,7 @@
 *
 * Usage:
 * node scripts/prepare-import-unique-fields.js
-* node scripts/prepare-import-unique-fields.js --data-dir /path/to
+* node scripts/prepare-import-unique-fields.js --data-dir /path/to/<sfdx-source>/data
 *
 * Expects data dir to contain (optional) JSON files:
 * Contact.json (Email with unique domain per run, LastName, FirstName, Phone — standard Contact)

@@ -18,8 +18,22 @@
 const fs = require('fs');
 const path = require('path');
 
-
-const
+function resolveSfdxSource() {
+  const sfdxPath = path.resolve(__dirname, '..', 'sfdx-project.json');
+  if (!fs.existsSync(sfdxPath)) {
+    console.error('Error: sfdx-project.json not found at project root.');
+    process.exit(1);
+  }
+  const sfdxProject = JSON.parse(fs.readFileSync(sfdxPath, 'utf8'));
+  const pkgDir = sfdxProject?.packageDirectories?.[0]?.path;
+  if (!pkgDir) {
+    console.error('Error: No packageDirectories[].path found in sfdx-project.json.');
+    process.exit(1);
+  }
+  return path.resolve(__dirname, '..', pkgDir, 'main', 'default');
+}
+
+const DEFAULT_DATA_DIR = path.resolve(resolveSfdxSource(), 'data');
 
 function parseArgs() {
   const args = process.argv.slice(2);
@@ -4,37 +4,56 @@
  * Use this script to make setup easier for each app generated from this template.
  *
  * Usage:
- *   node scripts/setup-cli.mjs --target-org <alias>
+ *   node scripts/setup-cli.mjs --target-org <alias>        # interactive step picker (all selected)
+ *   node scripts/setup-cli.mjs --target-org <alias> --yes  # skip picker, run all steps
 *   node scripts/setup-cli.mjs --target-org afv5 --skip-login
 *   node scripts/setup-cli.mjs --target-org afv5 --skip-data --skip-webapp-build
 *   node scripts/setup-cli.mjs --target-org myorg --webapp-name my-app
 *
 * Steps (in order):
 *   1. login — sf org login web only if org not already connected (skip with --skip-login)
- *   2.
- *   3.
- *   4.
- *   5.
- *   6.
+ *   2. webapp — (all web apps) npm install && npm run build so dist exists for deploy (skip with --skip-webapp-build)
+ *   3. deploy — sf project deploy start --target-org <alias> (requires dist for entity deployment)
+ *   4. permset — sf org assign permset (skip with --skip-permset; name via --permset-name)
+ *   5. data — prepare unique fields + sf data import tree (skipped if no data dir/plan)
+ *   6. graphql — (in webapp) npm run graphql:schema then npm run graphql:codegen
 *   7. dev — (in webapp) npm run dev — launch dev server (skip with --skip-dev)
 */
 
 import { spawnSync } from 'node:child_process';
 import { resolve, dirname } from 'node:path';
 import { fileURLToPath } from 'node:url';
-import { readdirSync, existsSync } from 'node:fs';
+import { readdirSync, existsSync, readFileSync, writeFileSync, unlinkSync } from 'node:fs';
 
 const __dirname = dirname(fileURLToPath(import.meta.url));
 const ROOT = resolve(__dirname, '..');
-
-
-const
+
+function resolveSfdxSource() {
+  const sfdxPath = resolve(ROOT, 'sfdx-project.json');
+  if (!existsSync(sfdxPath)) {
+    console.error('Error: sfdx-project.json not found at project root.');
+    process.exit(1);
+  }
+  const sfdxProject = JSON.parse(readFileSync(sfdxPath, 'utf8'));
+  const pkgDir = sfdxProject?.packageDirectories?.[0]?.path;
+  if (!pkgDir) {
+    console.error('Error: No packageDirectories[].path found in sfdx-project.json.');
+    process.exit(1);
+  }
+  return resolve(ROOT, pkgDir, 'main', 'default');
+}
+
+const SFDX_SOURCE = resolveSfdxSource();
+const WEBAPPLICATIONS_DIR = resolve(SFDX_SOURCE, 'webapplications');
+const DATA_DIR = resolve(SFDX_SOURCE, 'data');
+const DATA_PLAN = resolve(SFDX_SOURCE, 'data/data-plan.json');
 
 function parseArgs() {
   const args = process.argv.slice(2);
   let targetOrg = null;
   let webappName = null;
   let permsetName = 'Property_Management_Access';
+  let yes = false;
   const flags = {
     skipLogin: false,
     skipDeploy: false,
@@ -58,6 +77,7 @@ function parseArgs() {
     else if (args[i] === '--skip-graphql') flags.skipGraphql = true;
     else if (args[i] === '--skip-webapp-build') flags.skipWebappBuild = true;
     else if (args[i] === '--skip-dev') flags.skipDev = true;
+    else if (args[i] === '--yes' || args[i] === '-y') yes = true;
     else if (args[i] === '--help' || args[i] === '-h') {
       console.log(`
 Setup CLI — one-command setup for apps in this project
@@ -78,6 +98,7 @@ Options:
   --skip-graphql        Do not fetch schema or run GraphQL codegen
   --skip-webapp-build   Do not npm install / build the web application
   --skip-dev            Do not launch the dev server at the end
+  -y, --yes             Skip interactive step picker; run all enabled steps immediately
   -h, --help            Show this help
 `);
       process.exit(0);
@@ -87,18 +108,10 @@ Options:
     console.error('Error: --target-org <alias> is required.');
     process.exit(1);
   }
-  return { targetOrg, webappName, permsetName, ...flags };
+  return { targetOrg, webappName, permsetName, yes, ...flags };
 }
 
-function
-  if (webappName) {
-    const dir = resolve(WEBAPPLICATIONS_DIR, webappName);
-    if (!existsSync(dir)) {
-      console.error(`Error: Web app directory not found: ${dir}`);
-      process.exit(1);
-    }
-    return dir;
-  }
+function discoverAllWebappDirs(webappName) {
   if (!existsSync(WEBAPPLICATIONS_DIR)) {
     console.error(`Error: webapplications directory not found: ${WEBAPPLICATIONS_DIR}`);
     process.exit(1);
@@ -109,10 +122,23 @@ function discoverWebappDir(webappName) {
     console.error(`Error: No web app folder found under ${WEBAPPLICATIONS_DIR}`);
     process.exit(1);
   }
-  if (
-
+  if (webappName) {
+    const requested = dirs.find((d) => d.name === webappName);
+    if (!requested) {
+      console.error(`Error: Web app directory not found: ${webappName}`);
+      process.exit(1);
+    }
+    return [resolve(WEBAPPLICATIONS_DIR, requested.name)];
+  }
+  return dirs.map((d) => resolve(WEBAPPLICATIONS_DIR, d.name));
+}
+
+function discoverWebappDir(webappName) {
+  const all = discoverAllWebappDirs(webappName);
+  if (all.length > 1 && !webappName) {
+    console.log(`Multiple web apps found; using first: ${all[0].split(/[/\\]/).pop()}`);
   }
-  return
+  return all[0];
 }
 
 function isOrgConnected(targetOrg) {
@@ -124,6 +150,117 @@ function isOrgConnected(targetOrg) {
   return result.status === 0;
 }
 
+function apexLiteral(value) {
+  if (value === null || value === undefined) return 'null';
+  if (typeof value === 'boolean') return String(value);
+  if (typeof value === 'number') return String(value);
+  const s = String(value);
+  if (/^\d{4}-\d{2}-\d{2}$/.test(s)) return `Date.valueOf('${s}')`;
+  if (/^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}/.test(s)) {
+    const dt = s.replace('T', ' ').replace(/\.\d+/, '').replace('Z', '');
+    return `DateTime.valueOf('${dt}')`;
+  }
+  return "'" + s.replace(/\\/g, '\\\\').replace(/'/g, "\\'") + "'";
+}
+
+function buildApexInsert(sobject, records, refIds) {
+  const lines = [
+    'Database.DMLOptions dmlOpts = new Database.DMLOptions();',
+    'dmlOpts.DuplicateRuleHeader.allowSave = true;',
+    `List<${sobject}> recs = new List<${sobject}>();`,
+  ];
+  for (const rec of records) {
+    lines.push(`{ ${sobject} r = new ${sobject}();`);
+    for (const [key, val] of Object.entries(rec)) {
+      if (key === 'attributes') continue;
+      lines.push(`r.put('${key}', ${apexLiteral(val)});`);
+    }
+    lines.push('recs.add(r); }');
+  }
+  lines.push('Database.SaveResult[] results = Database.insert(recs, dmlOpts);');
+  const refArray = refIds.map((r) => `'${r}'`).join(',');
+  lines.push(`String[] refs = new String[]{${refArray}};`);
+  lines.push('for (Integer i = 0; i < results.size(); i++) {');
+  lines.push("  if (results[i].isSuccess()) System.debug('REF:' + refs[i] + ':' + results[i].getId());");
+  lines.push("  else System.debug('ERR:' + refs[i] + ':' + results[i].getErrors()[0].getMessage());");
+  lines.push('}');
+  return lines.join('\n');
+}
+
+/**
+ * Interactive multi-select: arrow keys navigate, space toggles, 'a' toggles all, enter confirms.
+ * Returns a boolean[] matching the input order. Falls through immediately when stdin is not a TTY.
+ */
+async function promptSteps(steps) {
+  if (!process.stdin.isTTY) return steps.map((s) => s.enabled);
+
+  const selected = steps.map((s) => s.enabled);
+  let cursor = 0;
+  const DIM = '\x1B[2m';
+  const RST = '\x1B[0m';
+  const CYAN = '\x1B[36m';
+  const GREEN = '\x1B[32m';
+
+  function render() {
+    return steps.map((s, i) => {
+      const ptr = i === cursor ? `${CYAN}❯${RST}` : ' ';
+      if (!s.available) return `${ptr} ${DIM}○ ${s.label} (n/a)${RST}`;
+      const chk = selected[i] ? `${GREEN}●${RST}` : '○';
+      return `${ptr} ${chk} ${s.label}`;
+    });
+  }
+
+  return new Promise((resolve) => {
+    process.stdin.setRawMode(true);
+    process.stdin.resume();
+    process.stdin.setEncoding('utf8');
+    process.stdout.write('\x1B[?25l');
+    console.log('\nSelect steps (↑↓ move, space toggle, a all, enter confirm):\n');
+    process.stdout.write(render().join('\n') + '\n');
+
+    function redraw() {
+      process.stdout.write(`\x1B[${steps.length}A`);
+      for (const line of render()) process.stdout.write(`\x1B[2K${line}\n`);
+    }
+
+    process.stdin.on('data', (key) => {
+      if (key === '\x03') {
+        process.stdout.write('\x1B[?25h\n');
+        process.exit(0);
+      }
+      if (key === '\r' || key === '\n') {
+        process.stdout.write('\x1B[?25h');
+        process.stdin.setRawMode(false);
+        process.stdin.pause();
+        process.stdin.removeAllListeners('data');
+        console.log();
+        resolve(selected);
+        return;
+      }
+      if (key === ' ') {
+        if (steps[cursor].available) selected[cursor] = !selected[cursor];
+        redraw();
+        return;
+      }
+      if (key === 'a') {
+        const allOn = steps.every((s, i) => !s.available || selected[i]);
+        for (let i = 0; i < steps.length; i++) {
+          if (steps[i].available) selected[i] = !allOn;
+        }
+        redraw();
+        return;
+      }
+      if (key === '\x1B[A' || key === 'k') {
+        cursor = Math.max(0, cursor - 1);
+        redraw();
+      } else if (key === '\x1B[B' || key === 'j') {
+        cursor = Math.min(steps.length - 1, cursor + 1);
+        redraw();
+      }
+    });
+  });
+}
+
 function run(name, cmd, args, opts = {}) {
   const { cwd = ROOT, optional = false } = opts;
   console.log('\n---', name, '---');
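The `apexLiteral` helper above decides, per JS value, which Apex literal syntax `buildApexInsert()` emits. Exercising the function from the diff standalone shows the mapping, including the date/datetime detection and single-quote escaping:

```javascript
// apexLiteral as added in this diff, run standalone to show the
// JS-value → Apex-literal mapping used by buildApexInsert().
function apexLiteral(value) {
  if (value === null || value === undefined) return 'null';
  if (typeof value === 'boolean') return String(value);
  if (typeof value === 'number') return String(value);
  const s = String(value);
  // Bare ISO date → Apex Date; ISO datetime → Apex DateTime (T/ms/Z stripped).
  if (/^\d{4}-\d{2}-\d{2}$/.test(s)) return `Date.valueOf('${s}')`;
  if (/^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}/.test(s)) {
    const dt = s.replace('T', ' ').replace(/\.\d+/, '').replace('Z', '');
    return `DateTime.valueOf('${dt}')`;
  }
  // Everything else is a quoted string with backslashes and quotes escaped.
  return "'" + s.replace(/\\/g, '\\\\').replace(/'/g, "\\'") + "'";
}

console.log(apexLiteral(null));                       // null
console.log(apexLiteral(true));                       // true
console.log(apexLiteral(1500));                       // 1500
console.log(apexLiteral('2025-01-31'));               // Date.valueOf('2025-01-31')
console.log(apexLiteral('2025-01-31T09:30:00.000Z')); // DateTime.valueOf('2025-01-31 09:30:00')
console.log(apexLiteral("O'Brien"));                  // 'O\'Brien'
```

Emitting typed `Date.valueOf`/`DateTime.valueOf` literals matters because `SObject.put()` rejects plain strings for date fields.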
@@ -140,25 +277,52 @@ function run(name, cmd, args, opts = {}) {
   return result;
 }
 
-function main() {
+async function main() {
   const {
     targetOrg,
     webappName,
     permsetName,
-
-
-
-
-
-
-
+    yes,
+    skipLogin: argSkipLogin,
+    skipDeploy: argSkipDeploy,
+    skipPermset: argSkipPermset,
+    skipData: argSkipData,
+    skipGraphql: argSkipGraphql,
+    skipWebappBuild: argSkipWebappBuild,
+    skipDev: argSkipDev,
   } = parseArgs();
 
-  const webappDir = discoverWebappDir(webappName);
   const hasDataPlan = existsSync(DATA_PLAN) && existsSync(DATA_DIR);
-  const doData = !skipData && hasDataPlan;
 
-
+  const stepDefs = [
+    { key: 'login', label: 'Login — org authentication', enabled: !argSkipLogin, available: true },
+    { key: 'webappBuild', label: 'Webapp Build — npm install + build (pre-deploy)', enabled: !argSkipWebappBuild, available: true },
+    { key: 'deploy', label: 'Deploy — sf project deploy start', enabled: !argSkipDeploy, available: true },
+    { key: 'permset', label: `Permset — assign ${permsetName}`, enabled: !argSkipPermset, available: true },
+    { key: 'data', label: 'Data — delete + import records via Apex', enabled: !argSkipData && hasDataPlan, available: hasDataPlan },
+    { key: 'graphql', label: 'GraphQL — schema introspect + codegen', enabled: !argSkipGraphql, available: true },
+    { key: 'dev', label: 'Dev — launch dev server', enabled: !argSkipDev, available: true },
+  ];
+
+  const selections = yes ? stepDefs.map((s) => s.enabled) : await promptSteps(stepDefs);
+  const on = {};
+  stepDefs.forEach((s, i) => {
+    on[s.key] = selections[i];
+  });
+
+  const skipLogin = !on.login;
+  const skipWebappBuild = !on.webappBuild;
+  const skipDeploy = !on.deploy;
+  const skipPermset = !on.permset;
+  const skipData = !on.data;
+  const skipGraphql = !on.graphql;
+  const skipDev = !on.dev;
+
+  const needsWebapp = !skipWebappBuild || !skipGraphql || !skipDev;
+  const webappDir = needsWebapp ? discoverWebappDir(webappName) : null;
+  const doData = !skipData;
+
+  console.log('Setup — target org:', targetOrg, '| web app:', webappDir ?? '(none)');
   console.log(
     'Steps: login=%s deploy=%s permset=%s data=%s graphql=%s webapp=%s dev=%s',
     !skipLogin,
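The new `main()` folds the picker's `boolean[]` back into per-step skip flags via a keyed lookup. A minimal sketch of that mapping with a shortened, hypothetical step list (three steps instead of the script's seven):

```javascript
// Sketch of the selection → skip-flag mapping in main(). Step labels are
// abbreviated and the list is hypothetical; the mechanics match the diff.
const stepDefs = [
  { key: 'login', enabled: true, available: true },
  { key: 'deploy', enabled: true, available: true },
  { key: 'data', enabled: false, available: false }, // e.g. no data plan found
];

// With --yes (or a non-TTY stdin) the picker is bypassed and the
// enabled defaults are used as-is, exactly like `yes ? ... : await promptSteps(...)`.
const selections = stepDefs.map((s) => s.enabled);

// Pair each step key with its selection, then invert into skip* flags.
const on = {};
stepDefs.forEach((s, i) => {
  on[s.key] = selections[i];
});

const skipLogin = !on.login; // false — the step runs
const skipData = !on.data;   // true — the step is skipped
console.log({ skipLogin, skipData }); // { skipLogin: false, skipData: true }
```

Keeping the rest of the script in terms of `skip*` flags means the interactive picker slots in without touching any of the existing step bodies.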
@@ -169,11 +333,6 @@ function main() {
     !skipWebappBuild,
     !skipDev
   );
-  if (skipData && hasDataPlan) {
-    console.log('(Data dir present; use without --skip-data to run data import.)');
-  } else if (!hasDataPlan && !skipData) {
-    console.log('(No data plan found; skipping data step.)');
-  }
 
   if (!skipLogin) {
     if (isOrgConnected(targetOrg)) {
@@ -184,6 +343,16 @@ function main() {
     }
   }
 
+  // Build all web apps before deploy so dist exists for entity deployment
+  if (!skipDeploy && !skipWebappBuild) {
+    const allWebappDirs = discoverAllWebappDirs(webappName);
+    for (const dir of allWebappDirs) {
+      const name = dir.split(/[/\\]/).pop();
+      run(`Web app install (${name})`, 'npm', ['install'], { cwd: dir });
+      run(`Web app build (${name})`, 'npm', ['run', 'build'], { cwd: dir });
+    }
+  }
+
   if (!skipDeploy) {
     run('Deploy metadata', 'sf', ['project', 'deploy', 'start', '--target-org', targetOrg], {
       timeout: 180000,
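The pre-deploy build loop labels each `run()` call with the app's folder name, derived from the absolute path via a split that accepts both POSIX and Windows separators. A quick check of that one-liner (the paths below are hypothetical):

```javascript
// Separator-agnostic "last path segment" extraction, as used for the
// `Web app install (${name})` labels. /[/\\]/ matches '/' or '\'.
const lastSegment = (dir) => dir.split(/[/\\]/).pop();

console.log(lastSegment('/repo/force-app/main/default/webapplications/my-app')); // my-app
console.log(lastSegment('C:\\repo\\webapplications\\my-app'));                   // my-app
```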
@@ -225,36 +394,115 @@ function main() {
     run('Prepare data (unique fields)', 'node', [prepareScript, '--data-dir', DATA_DIR], {
       cwd: ROOT,
     });
-    //
+    // Normalize Lease__c Tenant refs to 1–15 so all refs resolve (Tenant__c.json has 15 records)
+    const leasePath = resolve(DATA_DIR, 'Lease__c.json');
+    if (existsSync(leasePath)) {
+      let leaseContent = readFileSync(leasePath, 'utf8');
+      leaseContent = leaseContent.replace(/@TenantRef(\d+)/g, (_m, n) => {
+        const k = ((parseInt(n, 10) - 1) % 15) + 1;
+        return `@TenantRef${k}`;
+      });
+      writeFileSync(leasePath, leaseContent);
+    }
+
+    // Delete existing records so every run inserts the full dataset without duplicate conflicts.
+    // Reverse plan order ensures children are removed before parents (FK safety).
+    console.log('\n--- Clean existing data for fresh import ---');
+    const planEntries = JSON.parse(readFileSync(DATA_PLAN, 'utf8'));
+    const sobjectsReversed = [...planEntries.map((e) => e.sobject)].reverse();
+    const tmpApex = resolve(ROOT, '.tmp-setup-delete.apex');
+    for (const sobject of sobjectsReversed) {
+      const apexCode = [
+        'try {',
+        `  List<SObject> recs = Database.query('SELECT Id FROM ${sobject} LIMIT 10000');`,
+        '  if (!recs.isEmpty()) {',
+        '    Database.delete(recs, false);',
+        '    Database.emptyRecycleBin(recs);',
+        '  }',
+        '} catch (Exception e) {',
+        '  // non-deletable records (e.g. Contact linked to Case) are skipped via allOrNone=false',
+        '}',
+      ].join('\n');
+      writeFileSync(tmpApex, apexCode);
+      spawnSync('sf', ['apex', 'run', '--target-org', targetOrg, '--file', tmpApex], {
+        cwd: ROOT,
+        stdio: 'pipe',
+        shell: true,
+        timeout: 60000,
+      });
+      console.log(`  ${sobject}: cleaned`);
+    }
+    if (existsSync(tmpApex)) unlinkSync(tmpApex);
+
+    // Import via Anonymous Apex with Database.DMLOptions.duplicateRuleHeader.allowSave = true.
+    // This bypasses both duplicate-rule blocks AND matching-service timeouts that the REST
+    // API headers (Sforce-Duplicate-Rule-Action) cannot override.
     console.log('\n--- Data import tree ---');
-    const
-
-
-
-    )
-
-
-
-
-    const
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+    const refMap = new Map();
+    const APEX_CHAR_LIMIT = 25000;
+    const APEX_MAX_BATCH = 200;
+
+    for (const entry of planEntries) {
+      for (const file of entry.files) {
+        const data = JSON.parse(readFileSync(resolve(DATA_DIR, file), 'utf8'));
+        const records = data.records || [];
+
+        for (const rec of records) {
+          for (const key of Object.keys(rec)) {
+            if (key === 'attributes') continue;
+            const val = rec[key];
+            if (typeof val === 'string' && val.startsWith('@')) {
+              const actual = refMap.get(val.slice(1));
+              if (actual) {
+                rec[key] = actual;
+              } else if (refMap.size > 0) {
+                console.warn(`  Warning: unresolved ref ${val} in ${file}`);
+              }
+            }
+          }
+        }
+
+        let imported = 0;
+        const sampleRec = records[0] || {};
+        const fieldsPerRec = Object.keys(sampleRec).filter((k) => k !== 'attributes').length;
+        const estCharsPerRec = 40 + fieldsPerRec * 55;
+        const batchSize = Math.min(APEX_MAX_BATCH, Math.max(5, Math.floor(APEX_CHAR_LIMIT / estCharsPerRec)));
+        for (let i = 0; i < records.length; i += batchSize) {
+          const batch = records.slice(i, i + batchSize);
+          const refIds = batch.map((r) => r.attributes?.referenceId || `_idx${i}`);
+          const apex = buildApexInsert(entry.sobject, batch, refIds);
+          writeFileSync(tmpApex, apex);
+          const apexResult = spawnSync(
+            'sf',
+            ['apex', 'run', '--target-org', targetOrg, '--file', tmpApex],
+            { cwd: ROOT, stdio: 'pipe', shell: true, timeout: 120000 }
+          );
+          const apexOut = apexResult.stdout?.toString() || '';
+          const apexErr = apexResult.stderr?.toString() || '';
+          if (apexResult.status !== 0 && !apexOut.includes('Compiled successfully')) {
+            console.error(`  ${entry.sobject}: apex execution failed`);
+            process.stderr.write(apexErr || apexOut);
+            process.exit(1);
+          }
+          const okMatches = [...apexOut.matchAll(/\|DEBUG\|REF:([^:\n]+):(\w+)/g)];
+          const errMatches = [...apexOut.matchAll(/\|DEBUG\|ERR:([^:\n]+):([^\n]+)/g)];
+          if (errMatches.length) {
+            for (const m of errMatches.slice(0, 5)) {
+              console.error(`  ${m[1]}: ${m[2].trim()}`);
+            }
+            if (errMatches.length > 5) console.error(`  ... and ${errMatches.length - 5} more`);
+            console.error(`\nSetup failed at step: Data import tree (${entry.sobject})`);
+            process.exit(1);
+          }
+          if (entry.saveRefs) {
+            for (const m of okMatches) refMap.set(m[1], m[2]);
+          }
+          imported += okMatches.length;
+        }
+        console.log(`  ${entry.sobject}: imported ${imported} records`);
       }
     }
+    if (existsSync(tmpApex)) unlinkSync(tmpApex);
   }
 
   if (!skipGraphql || !skipWebappBuild) {
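The import loop above sizes its Anonymous Apex batches by estimating ~40 characters of scaffolding per record plus ~55 characters per field, clamped to between 5 and 200 records under a 25,000-character budget. A worked example of that arithmetic, using the constants from the diff:

```javascript
// Batch-size estimate from the import loop: generated Apex must stay
// under APEX_CHAR_LIMIT, and Apex DML caps batches well above 200 anyway.
const APEX_CHAR_LIMIT = 25000;
const APEX_MAX_BATCH = 200;

function batchSizeFor(fieldsPerRec) {
  const estCharsPerRec = 40 + fieldsPerRec * 55;
  return Math.min(APEX_MAX_BATCH, Math.max(5, Math.floor(APEX_CHAR_LIMIT / estCharsPerRec)));
}

console.log(batchSizeFor(2));   // 166  (25000 / 150, under the 200 cap)
console.log(batchSizeFor(10));  // 42   (25000 / 590)
console.log(batchSizeFor(100)); // 5    (clamped to the minimum)
```

So narrow objects import in few large batches while wide objects fall back to many small ones, keeping each `sf apex run` invocation's script comfortably sized.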
@@ -279,4 +527,7 @@
   }
 }
 
-main()
+main().catch((err) => {
+  console.error(err);
+  process.exit(1);
+});
package/package.json
CHANGED

@@ -1,6 +1,6 @@
 {
   "name": "@salesforce/webapp-template-feature-micro-frontend",
-  "version": "1.
+  "version": "1.93.0",
   "description": "Micro Frontend generation feature for Web Applications",
   "license": "SEE LICENSE IN LICENSE.txt",
   "author": "",
@@ -18,8 +18,8 @@
     "clean": "rm -rf dist"
   },
   "devDependencies": {
-    "@salesforce/micro-frontends-experimental": "^1.
-    "@salesforce/webapp-experimental": "^1.
+    "@salesforce/micro-frontends-experimental": "^1.93.0",
+    "@salesforce/webapp-experimental": "^1.93.0",
     "@types/react": "^19.2.7",
     "@types/react-dom": "^19.2.3",
     "react-dom": "^19.2.1",
@@ -1,65 +0,0 @@
----
-description: No complex node -e one-liners — use temp .js files or sed/awk instead
-paths:
-  - "force-app/main/default/webapplications/**/*"
----
-
-# A4D Enforcement: No `node -e` One-Liners
-
-This project **forbids** using `node -e` (or `node -p`, `node --eval`) one-liners for any operation—file manipulation, string replacement, reading/writing configs, or shell automation.
-
-## Why This Exists
-
-Complex `node -e '...'` one-liners **silently break in Zsh** (the default macOS shell) because of:
-
-- **`!` (history expansion):** Zsh interprets `!` inside double quotes as history expansion, causing `event not found` errors or silent corruption.
-- **Backtick interpolation:** Template literals (`` ` ``) are interpreted as command substitution by the shell before Node.js ever sees them.
-- **Nested quoting:** Multi-line JS crammed into a single shell string requires fragile quote escaping that differs between Bash and Zsh.
-
-These failures are **silent and intermittent**—the command may appear to succeed while producing corrupt output.
-
-## What To Do Instead
-
-### For multi-line file edits or transforms
-
-1. **Write a temporary `.js` script**, run it, then delete it:
-
-   ```bash
-   cat > /tmp/_transform.js << 'SCRIPT'
-   const fs = require('fs');
-   const data = JSON.parse(fs.readFileSync('package.json', 'utf8'));
-   data.name = 'my-app';
-   fs.writeFileSync('package.json', JSON.stringify(data, null, 2) + '\n');
-   SCRIPT
-   node /tmp/_transform.js && rm /tmp/_transform.js
-   ```
-
-2. **Use `sed` or `awk`** for simple, single-pattern replacements with careful escaping:
-
-   ```bash
-   sed -i '' 's/old-value/new-value/g' path/to/file
-   ```
-
-3. **Use `jq`** for JSON edits:
-
-   ```bash
-   jq '.name = "my-app"' package.json > tmp.json && mv tmp.json package.json
-   ```
-
-4. **Use the IDE/agent file-editing tools** (replace_in_file, write_to_file, StrReplace, Write) whenever available—these bypass the shell entirely.
-
-### For simple, truly one-line expressions
-
-A trivial `node -e "console.log(1+1)"` with no special characters is acceptable, but **if the expression contains any of these, use a temp file instead:**
-- Template literals (backticks)
-- `!` characters
-- Nested quotes
-- Multi-line strings
-- `fs` read/write operations
-
-## Violation Handling
-
-- If any prior step used a complex `node -e` one-liner, **revert and redo** using one of the approved methods above.
-- If a shell command fails with `event not found`, `unexpected token`, or produces garbled output, check for a `node -e` violation first.
-
-**Cross-reference:** This rule is also summarized in **webapp.md** (MUST FOLLOW #1). Both apply.