@projectdochelp/s3te 3.1.2 → 3.1.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -59,7 +59,9 @@ The website bucket contains the finished result that visitors actually receive t
 <details>
 <summary>What happens when I deploy?</summary>
 
-`s3te deploy` validates the project, packages the AWS runtime, creates or updates one persistent CloudFormation environment stack, creates one temporary CloudFormation deploy stack for packaging artifacts, synchronizes your current source files into the code bucket, and removes the temporary stack again after the real deploy run.
+`s3te deploy` loads the validated project configuration, packages the AWS runtime, creates or updates one persistent CloudFormation environment stack, creates one temporary CloudFormation deploy stack for packaging artifacts, synchronizes your current source files into the code bucket, and removes the temporary stack again after the real deploy run.
+
+That source sync is not limited to Lambda code. It includes your `.html`, `.part`, CSS, JavaScript, images and other project files so the running AWS stack can react to source changes inside the code bucket.
 
 The persistent environment stack contains the long-lived AWS resources such as buckets, Lambda functions, DynamoDB tables, CloudFront distributions and the runtime manifest parameter. The temporary deploy stack exists only so CloudFormation can consume the packaged Lambda artifacts cleanly.
 
@@ -171,6 +173,9 @@ The default scaffold creates:
 mywebsite/
   package.json
   s3te.config.json
+  .github/
+    workflows/
+      s3te-sync.yml
   app/
     part/
       head.part
@@ -187,6 +192,8 @@ mywebsite/
     extensions.json
 ```
 
+The generated `.github/workflows/s3te-sync.yml` is the default CI path for GitHub-based source publishing into the S3TE code bucket. It is scaffolded once and then left alone on later `s3te init` runs unless you use `--force`.
+
 </details>
 
 <details>
@@ -238,13 +245,15 @@ npx s3te deploy --env dev
 
 `deploy` creates or updates the persistent environment stack, uses a temporary deploy stack for packaged Lambda artifacts, synchronizes the source project into the code bucket, and removes the temporary stack again when the deploy finishes.
 
+After the first successful deploy, use `s3te sync --env dev` for regular template, partial, asset and source updates when the infrastructure itself did not change.
+
 If you left `route53HostedZoneId` out of the config, the last DNS step stays manual: point your domain at the created CloudFront distribution after deploy.
 
 </details>
 
 ## Usage
 
-Once the project is installed, your everyday loop is deliberately small: edit templates, validate, render locally, then deploy.
+Once the project is installed, your everyday loop splits into two paths: deploy when infrastructure changes, sync when only project sources changed.
 
 ### Daily Workflow
 
@@ -255,15 +264,24 @@ Once the project is installed, your everyday loop is deliberately small: edit te
 2. If you use content-driven tags without Webiny, edit `offline/content/en.json` or `offline/content/items.json`.
 3. Validate and render locally.
 4. Run your tests.
-5. Deploy when the result looks right.
+5. Use `deploy` for the first installation or after infrastructure/config/runtime changes.
+6. Use `sync` for day-to-day source publishing into the code bucket.
 
 ```bash
 npx s3te validate
 npx s3te render --env dev
 npx s3te test
+npx s3te sync --env dev
+```
+
+Use a full deploy only when needed:
+
+```bash
 npx s3te deploy --env dev
 ```
 
+Once Webiny is installed and the stack is deployed with Webiny enabled, CMS content changes are picked up in AWS through the DynamoDB stream integration. Those content changes do not require another `sync` or `deploy`.
+
 </details>
 
 ### CLI Commands
@@ -278,6 +296,7 @@ npx s3te deploy --env dev
 | `s3te render --env <name>` | Renders locally into `offline/S3TELocal/preview/<env>/...`. |
 | `s3te test` | Runs the project tests from `offline/tests/`. |
 | `s3te package --env <name>` | Builds the AWS deployment artifacts without deploying them yet. |
+| `s3te sync --env <name>` | Uploads current project sources into the configured code buckets. |
 | `s3te doctor --env <name>` | Checks local machine and AWS access before deploy. |
 | `s3te deploy --env <name>` | Deploys or updates the AWS environment and syncs source files. |
 | `s3te migrate` | Updates older project configs and can retrofit Webiny into an existing S3TE project. |
@@ -394,7 +413,100 @@ npx s3te doctor --env prod
 npx s3te deploy --env prod
 ```
 
-That deploy updates the existing environment stack and adds the Webiny mirror resources to it. You do not need a fresh S3TE installation.
+That deploy updates the existing environment stack and adds the Webiny mirror resources to it. You do not need a fresh S3TE installation. After that, Webiny content changes flow through the deployed AWS resources automatically; only template or asset changes still need `s3te sync --env <name>`.
+
+</details>
+
+<details>
+<summary>GitHub Actions source publishing</summary>
+
+If your team works through GitHub instead of running `s3te sync` locally, the scaffold already includes `.github/workflows/s3te-sync.yml`.
+
+That workflow is meant for source publishing only:
+
+- it validates the project
+- it uploads `app/...` and `part/...` into the S3TE code bucket
+- the resulting S3 events trigger the deployed Lambda pipeline in AWS
+
+Use a full `deploy` only when the infrastructure, environment config, or runtime package changes.
+
+Before the workflow can run, do this once:
+
+1. Run the first real `npx s3te deploy --env <name>` so the code bucket already exists.
+2. In AWS IAM, create an access key for a CI user that may sync only the S3TE code bucket for that environment.
+3. In GitHub open `Settings -> Secrets and variables -> Actions -> Secrets`.
+4. Add these repository secrets:
+   - `AWS_ACCESS_KEY_ID`
+   - `AWS_SECRET_ACCESS_KEY`
+5. Open `.github/workflows/s3te-sync.yml` and adjust:
+   - the branch under `on.push.branches`
+   - `aws-region`
+   - `npx s3te sync --env dev` to your target environment such as `prod` or `test`
+
+Minimal IAM policy example for one code bucket:
+
+```json
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Effect": "Allow",
+      "Action": ["s3:ListBucket"],
+      "Resource": ["arn:aws:s3:::dev-website-code-mywebsite"]
+    },
+    {
+      "Effect": "Allow",
+      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
+      "Resource": ["arn:aws:s3:::dev-website-code-mywebsite/*"]
+    }
+  ]
+}
+```
+
+For non-production environments or additional variants, use the derived code bucket names from your config, for example `test-website-code-mywebsite` or `app-code-mywebsite`.
+
+The scaffolded workflow looks like this:
+
+```yaml
+name: S3TE Sync
+on:
+  workflow_dispatch:
+  push:
+    branches: ["main"]
+    paths:
+      - "app/**"
+      - "package.json"
+      - "package-lock.json"
+      - ".github/workflows/s3te-sync.yml"
+
+jobs:
+  sync:
+    runs-on: ubuntu-latest
+    permissions:
+      contents: read
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-node@v4
+        with:
+          node-version: 22
+          cache: npm
+      - name: Install dependencies
+        shell: bash
+        run: |
+          if [ -f package-lock.json ]; then
+            npm ci
+          else
+            npm install
+          fi
+      - name: Configure AWS credentials
+        uses: aws-actions/configure-aws-credentials@v4
+        with:
+          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
+          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+          aws-region: eu-central-1
+      - run: npx s3te validate
+      - run: npx s3te sync --env dev
+```
 
 </details>
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@projectdochelp/s3te",
-  "version": "3.1.2",
+  "version": "3.1.3",
   "description": "CLI, render core, AWS adapter, and testkit for S3TemplateEngine projects",
   "repository": {
     "type": "git",
@@ -10,6 +10,7 @@ import { ensureAwsCliAvailable, ensureAwsCredentials, runAwsCli } from "./aws-cl
 import { resolveRequestedFeatures } from "./features.mjs";
 import { buildAwsRuntimeManifest, extractStackOutputsMap } from "./manifest.mjs";
 import { packageAwsProject } from "./package.mjs";
+import { syncPreparedSources } from "./sync.mjs";
 import { buildTemporaryDeployStackTemplate } from "./template.mjs";
 
 function normalizeRelative(projectDir, targetPath) {
@@ -365,20 +366,15 @@ export async function deployAwsProject({
     stdio
   });
 
-  const syncedCodeBuckets = [];
-  if (!noSync) {
-    for (const [variantName, variantConfig] of Object.entries(runtimeConfig.variants)) {
-      const syncDir = path.join(projectDir, packaged.manifest.syncDirectories[variantName]);
-      await runAwsCli(["s3", "sync", syncDir, `s3://${variantConfig.codeBucket}`], {
-        region: runtimeConfig.awsRegion,
+  const syncedCodeBuckets = noSync
+    ? []
+    : (await syncPreparedSources({
+        projectDir,
+        runtimeConfig,
+        syncDirectories: packaged.manifest.syncDirectories,
         profile,
-        cwd: projectDir,
-        stdio,
-        errorCode: "ADAPTER_ERROR"
-      });
-      syncedCodeBuckets.push(variantConfig.codeBucket);
-    }
-  }
+        stdio
+      })).syncedCodeBuckets;
 
   const distributions = [];
   for (const [variantName, variantConfig] of Object.entries(runtimeConfig.variants)) {
@@ -5,3 +5,4 @@ export { ensureAwsCliAvailable, ensureAwsCredentials, runAwsCli } from "./aws-cl
 export { getConfiguredFeatures, resolveRequestedFeatures } from "./features.mjs";
 export { packageAwsProject } from "./package.mjs";
 export { deployAwsProject } from "./deploy.mjs";
+export { stageProjectSources, syncPreparedSources, syncAwsProject } from "./sync.mjs";
@@ -0,0 +1,155 @@
+import fs from "node:fs/promises";
+import path from "node:path";
+
+import { buildEnvironmentRuntimeConfig } from "../../core/src/index.mjs";
+import { ensureAwsCliAvailable, ensureAwsCredentials, runAwsCli } from "./aws-cli.mjs";
+
+async function ensureDirectory(targetDir) {
+  await fs.mkdir(targetDir, { recursive: true });
+}
+
+async function removeDirectory(targetDir) {
+  await fs.rm(targetDir, { recursive: true, force: true });
+}
+
+async function listFiles(rootDir, currentDir = rootDir) {
+  const entries = await fs.readdir(currentDir, { withFileTypes: true });
+  const files = [];
+
+  for (const entry of entries) {
+    const fullPath = path.join(currentDir, entry.name);
+    if (entry.isDirectory()) {
+      files.push(...await listFiles(rootDir, fullPath));
+      continue;
+    }
+
+    if (entry.isFile()) {
+      files.push(path.relative(rootDir, fullPath).replace(/\\/g, "/"));
+    }
+  }
+
+  return files.sort();
+}
+
+async function copyDirectory(sourceDir, targetDir) {
+  const files = await listFiles(sourceDir);
+  for (const relativePath of files) {
+    const sourcePath = path.join(sourceDir, relativePath);
+    const targetPath = path.join(targetDir, relativePath);
+    await ensureDirectory(path.dirname(targetPath));
+    await fs.copyFile(sourcePath, targetPath);
+  }
+}
+
+function normalizeRelative(projectDir, targetPath) {
+  return path.relative(projectDir, targetPath).replace(/\\/g, "/");
+}
+
+async function stageVariantSources(projectDir, runtimeConfig, variantName, syncRoot) {
+  const variantConfig = runtimeConfig.variants[variantName];
+  const variantRoot = path.join(syncRoot, variantName);
+  await removeDirectory(variantRoot);
+  await ensureDirectory(variantRoot);
+
+  await copyDirectory(path.join(projectDir, variantConfig.partDir), path.join(variantRoot, "part"));
+  await copyDirectory(path.join(projectDir, variantConfig.sourceDir), path.join(variantRoot, variantName));
+
+  return variantRoot;
+}
+
+export async function stageProjectSources({
+  projectDir,
+  config,
+  environment,
+  outDir
+}) {
+  const runtimeConfig = buildEnvironmentRuntimeConfig(config, environment);
+  const syncRoot = outDir
+    ? path.join(projectDir, outDir)
+    : path.join(projectDir, "offline", "IAAS", "sync", environment);
+
+  await ensureDirectory(syncRoot);
+
+  const syncDirectories = {};
+  for (const variantName of Object.keys(runtimeConfig.variants)) {
+    const variantRoot = await stageVariantSources(projectDir, runtimeConfig, variantName, syncRoot);
+    syncDirectories[variantName] = normalizeRelative(projectDir, variantRoot);
+  }
+
+  return {
+    runtimeConfig,
+    syncRoot: normalizeRelative(projectDir, syncRoot),
+    syncDirectories
+  };
+}
+
+export async function syncPreparedSources({
+  projectDir,
+  runtimeConfig,
+  syncDirectories,
+  profile,
+  stdio = "pipe",
+  ensureAwsCliAvailableFn = ensureAwsCliAvailable,
+  ensureAwsCredentialsFn = ensureAwsCredentials,
+  runAwsCliFn = runAwsCli
+}) {
+  await ensureAwsCliAvailableFn({ cwd: projectDir });
+  await ensureAwsCredentialsFn({
+    region: runtimeConfig.awsRegion,
+    profile,
+    cwd: projectDir
+  });
+
+  const syncedCodeBuckets = [];
+  for (const [variantName, variantConfig] of Object.entries(runtimeConfig.variants)) {
+    const syncDir = path.join(projectDir, syncDirectories[variantName]);
+    await runAwsCliFn(["s3", "sync", syncDir, `s3://${variantConfig.codeBucket}`, "--delete"], {
+      region: runtimeConfig.awsRegion,
+      profile,
+      cwd: projectDir,
+      stdio,
+      errorCode: "ADAPTER_ERROR"
+    });
+    syncedCodeBuckets.push(variantConfig.codeBucket);
+  }
+
+  return {
+    syncedCodeBuckets
+  };
+}
+
+export async function syncAwsProject({
+  projectDir,
+  config,
+  environment,
+  outDir,
+  profile,
+  stdio = "pipe",
+  ensureAwsCliAvailableFn = ensureAwsCliAvailable,
+  ensureAwsCredentialsFn = ensureAwsCredentials,
+  runAwsCliFn = runAwsCli
+}) {
+  const prepared = await stageProjectSources({
+    projectDir,
+    config,
+    environment,
+    outDir
+  });
+
+  const synced = await syncPreparedSources({
+    projectDir,
+    runtimeConfig: prepared.runtimeConfig,
+    syncDirectories: prepared.syncDirectories,
+    profile,
+    stdio,
+    ensureAwsCliAvailableFn,
+    ensureAwsCredentialsFn,
+    runAwsCliFn
+  });
+
+  return {
+    syncRoot: prepared.syncRoot,
+    syncDirectories: prepared.syncDirectories,
+    syncedCodeBuckets: synced.syncedCodeBuckets
+  };
+}
@@ -11,6 +11,7 @@ import {
   renderProject,
   runProjectTests,
   scaffoldProject,
+  syncProject,
   validateProject
 } from "../src/project.mjs";
 
@@ -82,6 +83,7 @@ function printHelp() {
     " render\n" +
     " test\n" +
     " package\n" +
+    " sync\n" +
     " deploy\n" +
     " doctor\n" +
     " migrate\n"
@@ -281,6 +283,37 @@ async function main() {
     return;
   }
 
+  if (command === "sync") {
+    const loaded = await loadConfigForCommand(cwd, options.config);
+    if (!loaded.ok) {
+      if (wantsJson) {
+        printJson("sync", false, loaded.warnings, loaded.errors, startedAt);
+      }
+      process.exitCode = 2;
+      return;
+    }
+    if (!options.env) {
+      process.stderr.write("sync requires --env <name>\n");
+      process.exitCode = 1;
+      return;
+    }
+
+    const report = await syncProject(cwd, loaded.config, {
+      environment: asArray(options.env)[0],
+      outDir: options["out-dir"],
+      profile: options.profile,
+      stdio: wantsJson ? "pipe" : "inherit"
+    });
+
+    if (wantsJson) {
+      printJson("sync", true, [], [], startedAt, report);
+      return;
+    }
+
+    process.stdout.write(`Synced project sources to ${report.syncedCodeBuckets.length} code bucket(s)\n`);
+    return;
+  }
+
   if (command === "deploy") {
     const loaded = await loadConfigForCommand(cwd, options.config);
     if (!loaded.ok) {
@@ -15,7 +15,8 @@ import {
   deployAwsProject,
   ensureAwsCliAvailable,
   ensureAwsCredentials,
-  packageAwsProject
+  packageAwsProject,
+  syncAwsProject
 } from "../../aws-adapter/src/index.mjs";
 
 import {
@@ -233,6 +234,52 @@ function schemaTemplate() {
   };
 }
 
+function githubSyncWorkflowTemplate() {
+  return `name: S3TE Sync
+
+on:
+  workflow_dispatch:
+  push:
+    branches:
+      - main
+    paths:
+      - "app/**"
+      - "package.json"
+      - "package-lock.json"
+      - ".github/workflows/s3te-sync.yml"
+
+jobs:
+  sync:
+    runs-on: ubuntu-latest
+    permissions:
+      contents: read
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-node@v4
+        with:
+          node-version: 22
+          cache: npm
+      - name: Install dependencies
+        shell: bash
+        run: |
+          if [ -f package-lock.json ]; then
+            npm ci
+          else
+            npm install
+          fi
+      - name: Configure AWS credentials
+        uses: aws-actions/configure-aws-credentials@v4
+        with:
+          aws-access-key-id: \${{ secrets.AWS_ACCESS_KEY_ID }}
+          aws-secret-access-key: \${{ secrets.AWS_SECRET_ACCESS_KEY }}
+          aws-region: eu-central-1
+      - name: Validate project
+        run: npx s3te validate
+      - name: Sync project sources to the S3TE code bucket
+        run: npx s3te sync --env dev
+`;
+}
+
 async function fileExists(targetPath) {
   try {
     await fs.stat(targetPath);
@@ -482,6 +529,7 @@ export async function scaffoldProject(projectDir, options = {}) {
   await ensureDirectory(path.join(projectDir, "offline", "tests"));
   await ensureDirectory(path.join(projectDir, "offline", "content"));
   await ensureDirectory(path.join(projectDir, "offline", "schemas"));
+  await ensureDirectory(path.join(projectDir, ".github", "workflows"));
   await ensureDirectory(path.join(projectDir, ".vscode"));
 
   const projectPackageJson = {
@@ -491,6 +539,7 @@ export async function scaffoldProject(projectDir, options = {}) {
     scripts: {
       validate: "s3te validate",
       render: "s3te render --env dev",
+      sync: "s3te sync --env dev",
       test: "s3te test"
     }
   };
@@ -530,6 +579,7 @@ export async function scaffoldProject(projectDir, options = {}) {
   await writeProjectPackageJson(path.join(projectDir, "package.json"), projectPackageJson, scaffoldOptions, force);
   await writeProjectConfigJson(path.join(projectDir, "s3te.config.json"), config, scaffoldOptions, force);
   await writeProjectFile(path.join(projectDir, "offline", "schemas", "s3te.config.schema.json"), JSON.stringify(schemaTemplate(), null, 2) + "\n", force, true);
+  await writeProjectFile(path.join(projectDir, ".github", "workflows", "s3te-sync.yml"), githubSyncWorkflowTemplate(), force);
   await writeProjectFile(path.join(projectDir, "app", "part", "head.part"), "<meta charset='utf-8'>\n<title>My S3TE Site</title>\n", force);
   await writeProjectFile(path.join(projectDir, "app", variant, "index.html"), "<!doctype html>\n<html lang=\"<lang>2</lang>\">\n <head>\n <part>head.part</part>\n </head>\n <body>\n <h1>Hello from S3TemplateEngine</h1>\n </body>\n</html>\n", force);
   await writeProjectFile(path.join(projectDir, "offline", "content", `${language}.json`), "[]\n", force);
@@ -674,6 +724,17 @@ export async function deployProject(projectDir, config, options = {}) {
   });
 }
 
+export async function syncProject(projectDir, config, options = {}) {
+  return syncAwsProject({
+    projectDir,
+    config,
+    environment: options.environment,
+    outDir: options.outDir,
+    profile: options.profile,
+    stdio: options.stdio ?? "pipe"
+  });
+}
+
 export async function doctorProject(projectDir, configPath, options = {}) {
   const checks = [];
   const majorVersion = Number(process.versions.node.split(".")[0]);
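One detail worth noting in the `githubSyncWorkflowTemplate` hunk above is the `\${{ ... }}` escaping. Inside a JavaScript template literal an unescaped `${` would start interpolation, so the backslash is what keeps the GitHub Actions secret expressions intact in the generated workflow file. A minimal sketch of the same mechanism:

```javascript
// The backslash prevents template-literal interpolation, so the literal text
// "${{ secrets.AWS_ACCESS_KEY_ID }}" lands verbatim in the generated YAML.
const line = `aws-access-key-id: \${{ secrets.AWS_ACCESS_KEY_ID }}`;
console.log(line);
// prints: aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
```

Without the escape, Node would try to evaluate `{ secrets.AWS_ACCESS_KEY_ID }` at scaffold time instead of deferring the expression to GitHub Actions.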