@projectdochelp/s3te 3.1.2 → 3.1.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md
CHANGED
|
@@ -59,7 +59,9 @@ The website bucket contains the finished result that visitors actually receive t
|
|
|
59
59
|
<details>
|
|
60
60
|
<summary>What happens when I deploy?</summary>
|
|
61
61
|
|
|
62
|
-
`s3te deploy`
|
|
62
|
+
`s3te deploy` loads the validated project configuration, packages the AWS runtime, creates or updates one persistent CloudFormation environment stack, creates one temporary CloudFormation deploy stack for the packaged artifacts, synchronizes your current source files into the code bucket, and removes the temporary stack once the deploy has finished.
|
|
63
|
+
|
|
64
|
+
That source sync is not limited to Lambda code. It includes your `.html`, `.part`, CSS, JavaScript, images, and other project files, so the running AWS stack can react to source changes inside the code bucket.
|
|
63
65
|
|
|
64
66
|
The persistent environment stack contains the long-lived AWS resources such as buckets, Lambda functions, DynamoDB tables, CloudFront distributions and the runtime manifest parameter. The temporary deploy stack exists only so CloudFormation can consume the packaged Lambda artifacts cleanly.
|
|
65
67
|
|
|
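The deploy lifecycle described above can be summarized as an ordered plan. This is a minimal illustrative sketch of the documented step order, not the real S3TE internals; the function and step names are ours:

```javascript
// Illustrative only: the ordered steps `s3te deploy` performs, as described
// in this README. Step names are ours, not S3TE identifiers.
function planDeploySteps(environment) {
  return [
    `load validated project configuration for ${environment}`,
    "package the AWS runtime",
    `create or update the persistent environment stack (${environment})`,
    "create the temporary deploy stack for packaged artifacts",
    "sync project sources into the code bucket",
    "delete the temporary deploy stack"
  ];
}
```

The key property is that only the middle step (the environment stack) persists; the temporary stack exists purely for artifact packaging.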
@@ -171,6 +173,9 @@ The default scaffold creates:
|
|
|
171
173
|
mywebsite/
|
|
172
174
|
package.json
|
|
173
175
|
s3te.config.json
|
|
176
|
+
.github/
|
|
177
|
+
workflows/
|
|
178
|
+
s3te-sync.yml
|
|
174
179
|
app/
|
|
175
180
|
part/
|
|
176
181
|
head.part
|
|
@@ -187,6 +192,8 @@ mywebsite/
|
|
|
187
192
|
extensions.json
|
|
188
193
|
```
|
|
189
194
|
|
|
195
|
+
The generated `.github/workflows/s3te-sync.yml` is the default CI path for GitHub-based source publishing into the S3TE code bucket. It is scaffolded once and then left alone on later `s3te init` runs unless you use `--force`.
|
|
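The "scaffolded once, then left alone unless `--force`" behavior can be sketched with an in-memory store. This is our own simplified model of that rule, not the real `writeProjectFile` implementation:

```javascript
// Simplified model of scaffold-once semantics: an existing file is only
// overwritten when force is true. Uses a Map instead of the filesystem
// so the behavior is easy to see; the real helper may differ.
function writeScaffoldFile(files, targetPath, content, force = false) {
  if (!force && files.has(targetPath)) {
    return false; // later `s3te init` runs leave the file alone
  }
  files.set(targetPath, content);
  return true;
}
```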
196
|
+
|
|
190
197
|
</details>
|
|
191
198
|
|
|
192
199
|
<details>
|
|
@@ -238,13 +245,15 @@ npx s3te deploy --env dev
|
|
|
238
245
|
|
|
239
246
|
`deploy` creates or updates the persistent environment stack, uses a temporary deploy stack for packaged Lambda artifacts, synchronizes the source project into the code bucket, and removes the temporary stack again when the deploy finishes.
|
|
240
247
|
|
|
248
|
+
After the first successful deploy, use `s3te sync --env dev` for routine template, partial, asset, and source updates whenever the infrastructure itself has not changed.
|
|
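The deploy-versus-sync rule of thumb can be written as a tiny decision helper. The change categories and mapping here are ours, derived from the guidance in this README:

```javascript
// Illustrative rule of thumb: which command a given kind of change calls
// for. The categories are ours, based on the README's guidance that
// infrastructure/config/runtime changes need `deploy` and everything
// else only needs `sync`.
function commandFor(changeKind) {
  const needsDeploy = new Set(["infrastructure", "config", "runtime"]);
  return needsDeploy.has(changeKind) ? "s3te deploy" : "s3te sync";
}
```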
249
|
+
|
|
241
250
|
If you left `route53HostedZoneId` out of the config, the last DNS step stays manual: point your domain at the created CloudFront distribution after deploy.
|
|
242
251
|
|
|
243
252
|
</details>
|
|
244
253
|
|
|
245
254
|
## Usage
|
|
246
255
|
|
|
247
|
-
Once the project is installed, your everyday loop
|
|
256
|
+
Once the project is installed, your everyday loop splits into two paths: `deploy` when the infrastructure changes, `sync` when only project sources change.
|
|
248
257
|
|
|
249
258
|
### Daily Workflow
|
|
250
259
|
|
|
@@ -255,15 +264,24 @@ Once the project is installed, your everyday loop is deliberately small: edit te
|
|
|
255
264
|
2. If you use content-driven tags without Webiny, edit `offline/content/en.json` or `offline/content/items.json`.
|
|
256
265
|
3. Validate and render locally.
|
|
257
266
|
4. Run your tests.
|
|
258
|
-
5.
|
|
267
|
+
5. Use `deploy` for the first installation or after infrastructure/config/runtime changes.
|
|
268
|
+
6. Use `sync` for day-to-day source publishing into the code bucket.
|
|
259
269
|
|
|
260
270
|
```bash
|
|
261
271
|
npx s3te validate
|
|
262
272
|
npx s3te render --env dev
|
|
263
273
|
npx s3te test
|
|
274
|
+
npx s3te sync --env dev
|
|
275
|
+
```
|
|
276
|
+
|
|
277
|
+
Use a full deploy only when needed:
|
|
278
|
+
|
|
279
|
+
```bash
|
|
264
280
|
npx s3te deploy --env dev
|
|
265
281
|
```
|
|
266
282
|
|
|
283
|
+
Once Webiny is installed and the stack is deployed with Webiny enabled, CMS content changes are picked up in AWS through the DynamoDB stream integration. Those content changes do not require another `sync` or `deploy`.
|
|
284
|
+
|
|
267
285
|
</details>
|
|
268
286
|
|
|
269
287
|
### CLI Commands
|
|
@@ -278,6 +296,7 @@ npx s3te deploy --env dev
|
|
|
278
296
|
| `s3te render --env <name>` | Renders locally into `offline/S3TELocal/preview/<env>/...`. |
|
|
279
297
|
| `s3te test` | Runs the project tests from `offline/tests/`. |
|
|
280
298
|
| `s3te package --env <name>` | Builds the AWS deployment artifacts without deploying them yet. |
|
|
299
|
+
| `s3te sync --env <name>` | Uploads current project sources into the configured code buckets. |
|
|
281
300
|
| `s3te doctor --env <name>` | Checks local machine and AWS access before deploy. |
|
|
282
301
|
| `s3te deploy --env <name>` | Deploys or updates the AWS environment and syncs source files. |
|
|
283
302
|
| `s3te migrate` | Updates older project configs and can retrofit Webiny into an existing S3TE project. |
|
|
@@ -394,7 +413,116 @@ npx s3te doctor --env prod
|
|
|
394
413
|
npx s3te deploy --env prod
|
|
395
414
|
```
|
|
396
415
|
|
|
397
|
-
That deploy updates the existing environment stack and adds the Webiny mirror resources to it. You do not need a fresh S3TE installation.
|
|
416
|
+
That deploy updates the existing environment stack and adds the Webiny mirror resources to it. You do not need a fresh S3TE installation. After that, Webiny content changes flow through the deployed AWS resources automatically; only template or asset changes still need `s3te sync --env <name>`.
|
|
417
|
+
|
|
418
|
+
</details>
|
|
419
|
+
|
|
420
|
+
<details>
|
|
421
|
+
<summary>GitHub Actions source publishing</summary>
|
|
422
|
+
|
|
423
|
+
If your team works through GitHub instead of running `s3te sync` locally, the scaffold already includes `.github/workflows/s3te-sync.yml`.
|
|
424
|
+
|
|
425
|
+
That workflow is meant for source publishing only:
|
|
426
|
+
|
|
427
|
+
- it validates the project
|
|
428
|
+
- it uploads `app/...` and `part/...` into the S3TE code bucket
|
|
429
|
+
- the resulting S3 events trigger the deployed Lambda pipeline in AWS
|
|
430
|
+
|
|
431
|
+
Use a full `deploy` only when the infrastructure, environment config, or runtime package changes.
|
|
432
|
+
|
|
433
|
+
GitHub preparation checklist:
|
|
434
|
+
|
|
435
|
+
1. Push the project to GitHub together with `.github/workflows/s3te-sync.yml`.
|
|
436
|
+
2. Make sure GitHub Actions is enabled for the repository or organization.
|
|
437
|
+
3. Run the first real `npx s3te deploy --env <name>` so the code bucket already exists.
|
|
438
|
+
4. In AWS IAM, create an access key for a CI user that is allowed to sync only the S3TE code bucket for that environment.
|
|
439
|
+
5. In GitHub, open `Settings -> Secrets and variables -> Actions -> Secrets`.
|
|
440
|
+
6. Add these repository secrets:
|
|
441
|
+
- `AWS_ACCESS_KEY_ID`
|
|
442
|
+
- `AWS_SECRET_ACCESS_KEY`
|
|
443
|
+
7. Open `.github/workflows/s3te-sync.yml` and adjust:
|
|
444
|
+
- the branch under `on.push.branches`
|
|
445
|
+
- `aws-region`
|
|
446
|
+
- `npx s3te sync --env dev` to your target environment such as `prod` or `test`
|
|
447
|
+
|
|
448
|
+
No GitHub variables are required by the scaffolded workflow. The code bucket name is resolved by S3TE from `s3te.config.json`, so you do not have to store bucket names in GitHub.
|
|
449
|
+
|
|
450
|
+
For projects with multiple environments such as `test` and `prod`, the simplest setup is usually one workflow file per target environment, for example:
|
|
451
|
+
|
|
452
|
+
- `.github/workflows/s3te-sync-test.yml` with `npx s3te sync --env test`
|
|
453
|
+
- `.github/workflows/s3te-sync-prod.yml` with `npx s3te sync --env prod`
|
|
454
|
+
|
|
455
|
+
First verification in GitHub:
|
|
456
|
+
|
|
457
|
+
1. Open the `Actions` tab in the repository.
|
|
458
|
+
2. Select `S3TE Sync`.
|
|
459
|
+
3. Start it once manually with `Run workflow`.
|
|
460
|
+
4. Check that the run reaches the `Configure AWS credentials`, `Validate project`, and `Sync project sources to the S3TE code bucket` steps without error.
|
|
461
|
+
|
|
462
|
+
Minimal IAM policy example for one code bucket:
|
|
463
|
+
|
|
464
|
+
```json
|
|
465
|
+
{
|
|
466
|
+
"Version": "2012-10-17",
|
|
467
|
+
"Statement": [
|
|
468
|
+
{
|
|
469
|
+
"Effect": "Allow",
|
|
470
|
+
"Action": ["s3:ListBucket"],
|
|
471
|
+
"Resource": ["arn:aws:s3:::dev-website-code-mywebsite"]
|
|
472
|
+
},
|
|
473
|
+
{
|
|
474
|
+
"Effect": "Allow",
|
|
475
|
+
"Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
|
|
476
|
+
"Resource": ["arn:aws:s3:::dev-website-code-mywebsite/*"]
|
|
477
|
+
}
|
|
478
|
+
]
|
|
479
|
+
}
|
|
480
|
+
```
|
|
481
|
+
|
|
482
|
+
For non-production environments or additional variants, use the derived code bucket names from your config, for example `test-website-code-mywebsite` or `app-code-mywebsite`.
|
|
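A small helper can generate the policy above for a derived bucket name. Note the hedge: the `<env>-<variant>-code-<project>` pattern is inferred from the examples in this README, while the authoritative bucket name always comes from `s3te.config.json`:

```javascript
// Sketch: build the minimal IAM policy shown above for a derived code
// bucket. The naming pattern is inferred from this README's examples
// (e.g. "test-website-code-mywebsite"); the real name is resolved by
// S3TE from s3te.config.json.
function codeBucketPolicy(env, project, variant = "website") {
  const bucket = `${env}-${variant}-code-${project}`;
  return {
    Version: "2012-10-17",
    Statement: [
      {
        Effect: "Allow",
        Action: ["s3:ListBucket"],
        Resource: [`arn:aws:s3:::${bucket}`]
      },
      {
        Effect: "Allow",
        Action: ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
        Resource: [`arn:aws:s3:::${bucket}/*`]
      }
    ]
  };
}
```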
483
|
+
|
|
484
|
+
The scaffolded workflow looks like this:
|
|
485
|
+
|
|
486
|
+
```yaml
|
|
487
|
+
name: S3TE Sync
|
|
488
|
+
on:
|
|
489
|
+
workflow_dispatch:
|
|
490
|
+
push:
|
|
491
|
+
branches: ["main"]
|
|
492
|
+
paths:
|
|
493
|
+
- "app/**"
|
|
494
|
+
- "package.json"
|
|
495
|
+
- "package-lock.json"
|
|
496
|
+
- ".github/workflows/s3te-sync.yml"
|
|
497
|
+
|
|
498
|
+
jobs:
|
|
499
|
+
sync:
|
|
500
|
+
runs-on: ubuntu-latest
|
|
501
|
+
permissions:
|
|
502
|
+
contents: read
|
|
503
|
+
steps:
|
|
504
|
+
- uses: actions/checkout@v4
|
|
505
|
+
- uses: actions/setup-node@v4
|
|
506
|
+
with:
|
|
507
|
+
node-version: 22
|
|
508
|
+
cache: npm
|
|
509
|
+
- name: Install dependencies
|
|
510
|
+
shell: bash
|
|
511
|
+
run: |
|
|
512
|
+
if [ -f package-lock.json ]; then
|
|
513
|
+
npm ci
|
|
514
|
+
else
|
|
515
|
+
npm install
|
|
516
|
+
fi
|
|
517
|
+
- name: Configure AWS credentials
|
|
518
|
+
uses: aws-actions/configure-aws-credentials@v4
|
|
519
|
+
with:
|
|
520
|
+
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
|
|
521
|
+
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
|
|
522
|
+
aws-region: eu-central-1
|
|
523
|
+
- run: npx s3te validate
|
|
524
|
+
- run: npx s3te sync --env dev
|
|
525
|
+
```
|
|
398
526
|
|
|
399
527
|
</details>
|
|
400
528
|
|
package/package.json
CHANGED
|
@@ -10,6 +10,7 @@ import { ensureAwsCliAvailable, ensureAwsCredentials, runAwsCli } from "./aws-cl
|
|
|
10
10
|
import { resolveRequestedFeatures } from "./features.mjs";
|
|
11
11
|
import { buildAwsRuntimeManifest, extractStackOutputsMap } from "./manifest.mjs";
|
|
12
12
|
import { packageAwsProject } from "./package.mjs";
|
|
13
|
+
import { syncPreparedSources } from "./sync.mjs";
|
|
13
14
|
import { buildTemporaryDeployStackTemplate } from "./template.mjs";
|
|
14
15
|
|
|
15
16
|
function normalizeRelative(projectDir, targetPath) {
|
|
@@ -365,20 +366,15 @@ export async function deployAwsProject({
|
|
|
365
366
|
stdio
|
|
366
367
|
});
|
|
367
368
|
|
|
368
|
-
const syncedCodeBuckets =
|
|
369
|
-
|
|
370
|
-
|
|
371
|
-
|
|
372
|
-
|
|
373
|
-
|
|
369
|
+
const syncedCodeBuckets = noSync
|
|
370
|
+
? []
|
|
371
|
+
: (await syncPreparedSources({
|
|
372
|
+
projectDir,
|
|
373
|
+
runtimeConfig,
|
|
374
|
+
syncDirectories: packaged.manifest.syncDirectories,
|
|
374
375
|
profile,
|
|
375
|
-
|
|
376
|
-
|
|
377
|
-
errorCode: "ADAPTER_ERROR"
|
|
378
|
-
});
|
|
379
|
-
syncedCodeBuckets.push(variantConfig.codeBucket);
|
|
380
|
-
}
|
|
381
|
-
}
|
|
376
|
+
stdio
|
|
377
|
+
})).syncedCodeBuckets;
|
|
382
378
|
|
|
383
379
|
const distributions = [];
|
|
384
380
|
for (const [variantName, variantConfig] of Object.entries(runtimeConfig.variants)) {
|
|
@@ -5,3 +5,4 @@ export { ensureAwsCliAvailable, ensureAwsCredentials, runAwsCli } from "./aws-cl
|
|
|
5
5
|
export { getConfiguredFeatures, resolveRequestedFeatures } from "./features.mjs";
|
|
6
6
|
export { packageAwsProject } from "./package.mjs";
|
|
7
7
|
export { deployAwsProject } from "./deploy.mjs";
|
|
8
|
+
export { stageProjectSources, syncPreparedSources, syncAwsProject } from "./sync.mjs";
|
|
@@ -0,0 +1,155 @@
|
|
|
1
|
+
import fs from "node:fs/promises";
|
|
2
|
+
import path from "node:path";
|
|
3
|
+
|
|
4
|
+
import { buildEnvironmentRuntimeConfig } from "../../core/src/index.mjs";
|
|
5
|
+
import { ensureAwsCliAvailable, ensureAwsCredentials, runAwsCli } from "./aws-cli.mjs";
|
|
6
|
+
|
|
7
|
+
async function ensureDirectory(targetDir) {
|
|
8
|
+
await fs.mkdir(targetDir, { recursive: true });
|
|
9
|
+
}
|
|
10
|
+
|
|
11
|
+
async function removeDirectory(targetDir) {
|
|
12
|
+
await fs.rm(targetDir, { recursive: true, force: true });
|
|
13
|
+
}
|
|
14
|
+
|
|
15
|
+
async function listFiles(rootDir, currentDir = rootDir) {
|
|
16
|
+
const entries = await fs.readdir(currentDir, { withFileTypes: true });
|
|
17
|
+
const files = [];
|
|
18
|
+
|
|
19
|
+
for (const entry of entries) {
|
|
20
|
+
const fullPath = path.join(currentDir, entry.name);
|
|
21
|
+
if (entry.isDirectory()) {
|
|
22
|
+
files.push(...await listFiles(rootDir, fullPath));
|
|
23
|
+
continue;
|
|
24
|
+
}
|
|
25
|
+
|
|
26
|
+
if (entry.isFile()) {
|
|
27
|
+
files.push(path.relative(rootDir, fullPath).replace(/\\/g, "/"));
|
|
28
|
+
}
|
|
29
|
+
}
|
|
30
|
+
|
|
31
|
+
return files.sort();
|
|
32
|
+
}
|
|
33
|
+
|
|
34
|
+
async function copyDirectory(sourceDir, targetDir) {
|
|
35
|
+
const files = await listFiles(sourceDir);
|
|
36
|
+
for (const relativePath of files) {
|
|
37
|
+
const sourcePath = path.join(sourceDir, relativePath);
|
|
38
|
+
const targetPath = path.join(targetDir, relativePath);
|
|
39
|
+
await ensureDirectory(path.dirname(targetPath));
|
|
40
|
+
await fs.copyFile(sourcePath, targetPath);
|
|
41
|
+
}
|
|
42
|
+
}
|
|
43
|
+
|
|
44
|
+
function normalizeRelative(projectDir, targetPath) {
|
|
45
|
+
return path.relative(projectDir, targetPath).replace(/\\/g, "/");
|
|
46
|
+
}
|
|
47
|
+
|
|
48
|
+
async function stageVariantSources(projectDir, runtimeConfig, variantName, syncRoot) {
|
|
49
|
+
const variantConfig = runtimeConfig.variants[variantName];
|
|
50
|
+
const variantRoot = path.join(syncRoot, variantName);
|
|
51
|
+
await removeDirectory(variantRoot);
|
|
52
|
+
await ensureDirectory(variantRoot);
|
|
53
|
+
|
|
54
|
+
await copyDirectory(path.join(projectDir, variantConfig.partDir), path.join(variantRoot, "part"));
|
|
55
|
+
await copyDirectory(path.join(projectDir, variantConfig.sourceDir), path.join(variantRoot, variantName));
|
|
56
|
+
|
|
57
|
+
return variantRoot;
|
|
58
|
+
}
|
|
59
|
+
|
|
60
|
+
export async function stageProjectSources({
|
|
61
|
+
projectDir,
|
|
62
|
+
config,
|
|
63
|
+
environment,
|
|
64
|
+
outDir
|
|
65
|
+
}) {
|
|
66
|
+
const runtimeConfig = buildEnvironmentRuntimeConfig(config, environment);
|
|
67
|
+
const syncRoot = outDir
|
|
68
|
+
? path.join(projectDir, outDir)
|
|
69
|
+
: path.join(projectDir, "offline", "IAAS", "sync", environment);
|
|
70
|
+
|
|
71
|
+
await ensureDirectory(syncRoot);
|
|
72
|
+
|
|
73
|
+
const syncDirectories = {};
|
|
74
|
+
for (const variantName of Object.keys(runtimeConfig.variants)) {
|
|
75
|
+
const variantRoot = await stageVariantSources(projectDir, runtimeConfig, variantName, syncRoot);
|
|
76
|
+
syncDirectories[variantName] = normalizeRelative(projectDir, variantRoot);
|
|
77
|
+
}
|
|
78
|
+
|
|
79
|
+
return {
|
|
80
|
+
runtimeConfig,
|
|
81
|
+
syncRoot: normalizeRelative(projectDir, syncRoot),
|
|
82
|
+
syncDirectories
|
|
83
|
+
};
|
|
84
|
+
}
|
|
85
|
+
|
|
86
|
+
export async function syncPreparedSources({
|
|
87
|
+
projectDir,
|
|
88
|
+
runtimeConfig,
|
|
89
|
+
syncDirectories,
|
|
90
|
+
profile,
|
|
91
|
+
stdio = "pipe",
|
|
92
|
+
ensureAwsCliAvailableFn = ensureAwsCliAvailable,
|
|
93
|
+
ensureAwsCredentialsFn = ensureAwsCredentials,
|
|
94
|
+
runAwsCliFn = runAwsCli
|
|
95
|
+
}) {
|
|
96
|
+
await ensureAwsCliAvailableFn({ cwd: projectDir });
|
|
97
|
+
await ensureAwsCredentialsFn({
|
|
98
|
+
region: runtimeConfig.awsRegion,
|
|
99
|
+
profile,
|
|
100
|
+
cwd: projectDir
|
|
101
|
+
});
|
|
102
|
+
|
|
103
|
+
const syncedCodeBuckets = [];
|
|
104
|
+
for (const [variantName, variantConfig] of Object.entries(runtimeConfig.variants)) {
|
|
105
|
+
const syncDir = path.join(projectDir, syncDirectories[variantName]);
|
|
106
|
+
await runAwsCliFn(["s3", "sync", syncDir, `s3://${variantConfig.codeBucket}`, "--delete"], {
|
|
107
|
+
region: runtimeConfig.awsRegion,
|
|
108
|
+
profile,
|
|
109
|
+
cwd: projectDir,
|
|
110
|
+
stdio,
|
|
111
|
+
errorCode: "ADAPTER_ERROR"
|
|
112
|
+
});
|
|
113
|
+
syncedCodeBuckets.push(variantConfig.codeBucket);
|
|
114
|
+
}
|
|
115
|
+
|
|
116
|
+
return {
|
|
117
|
+
syncedCodeBuckets
|
|
118
|
+
};
|
|
119
|
+
}
|
|
120
|
+
|
|
121
|
+
export async function syncAwsProject({
|
|
122
|
+
projectDir,
|
|
123
|
+
config,
|
|
124
|
+
environment,
|
|
125
|
+
outDir,
|
|
126
|
+
profile,
|
|
127
|
+
stdio = "pipe",
|
|
128
|
+
ensureAwsCliAvailableFn = ensureAwsCliAvailable,
|
|
129
|
+
ensureAwsCredentialsFn = ensureAwsCredentials,
|
|
130
|
+
runAwsCliFn = runAwsCli
|
|
131
|
+
}) {
|
|
132
|
+
const prepared = await stageProjectSources({
|
|
133
|
+
projectDir,
|
|
134
|
+
config,
|
|
135
|
+
environment,
|
|
136
|
+
outDir
|
|
137
|
+
});
|
|
138
|
+
|
|
139
|
+
const synced = await syncPreparedSources({
|
|
140
|
+
projectDir,
|
|
141
|
+
runtimeConfig: prepared.runtimeConfig,
|
|
142
|
+
syncDirectories: prepared.syncDirectories,
|
|
143
|
+
profile,
|
|
144
|
+
stdio,
|
|
145
|
+
ensureAwsCliAvailableFn,
|
|
146
|
+
ensureAwsCredentialsFn,
|
|
147
|
+
runAwsCliFn
|
|
148
|
+
});
|
|
149
|
+
|
|
150
|
+
return {
|
|
151
|
+
syncRoot: prepared.syncRoot,
|
|
152
|
+
syncDirectories: prepared.syncDirectories,
|
|
153
|
+
syncedCodeBuckets: synced.syncedCodeBuckets
|
|
154
|
+
};
|
|
155
|
+
}
|
|
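The `...Fn` parameters above make the sync path testable without touching AWS: the CLI runner is injected, so a fake recorder can stand in for `runAwsCli`. A self-contained mini version of that injection pattern (our own simplification, synchronous for brevity, not the real module):

```javascript
// Mini illustration of the dependency-injection pattern used by
// syncPreparedSources: the AWS CLI runner is passed in, so tests can
// substitute a recorder instead of shelling out to `aws s3 sync`.
function syncBuckets(variants, runAwsCli) {
  const synced = [];
  for (const [name, cfg] of Object.entries(variants)) {
    runAwsCli(["s3", "sync", `sync/${name}`, `s3://${cfg.codeBucket}`, "--delete"]);
    synced.push(cfg.codeBucket);
  }
  return synced;
}
```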
@@ -11,6 +11,7 @@ import {
|
|
|
11
11
|
renderProject,
|
|
12
12
|
runProjectTests,
|
|
13
13
|
scaffoldProject,
|
|
14
|
+
syncProject,
|
|
14
15
|
validateProject
|
|
15
16
|
} from "../src/project.mjs";
|
|
16
17
|
|
|
@@ -82,6 +83,7 @@ function printHelp() {
|
|
|
82
83
|
" render\n" +
|
|
83
84
|
" test\n" +
|
|
84
85
|
" package\n" +
|
|
86
|
+
" sync\n" +
|
|
85
87
|
" deploy\n" +
|
|
86
88
|
" doctor\n" +
|
|
87
89
|
" migrate\n"
|
|
@@ -281,6 +283,37 @@ async function main() {
|
|
|
281
283
|
return;
|
|
282
284
|
}
|
|
283
285
|
|
|
286
|
+
if (command === "sync") {
|
|
287
|
+
const loaded = await loadConfigForCommand(cwd, options.config);
|
|
288
|
+
if (!loaded.ok) {
|
|
289
|
+
if (wantsJson) {
|
|
290
|
+
printJson("sync", false, loaded.warnings, loaded.errors, startedAt);
|
|
291
|
+
}
|
|
292
|
+
process.exitCode = 2;
|
|
293
|
+
return;
|
|
294
|
+
}
|
|
295
|
+
if (!options.env) {
|
|
296
|
+
process.stderr.write("sync requires --env <name>\n");
|
|
297
|
+
process.exitCode = 1;
|
|
298
|
+
return;
|
|
299
|
+
}
|
|
300
|
+
|
|
301
|
+
const report = await syncProject(cwd, loaded.config, {
|
|
302
|
+
environment: asArray(options.env)[0],
|
|
303
|
+
outDir: options["out-dir"],
|
|
304
|
+
profile: options.profile,
|
|
305
|
+
stdio: wantsJson ? "pipe" : "inherit"
|
|
306
|
+
});
|
|
307
|
+
|
|
308
|
+
if (wantsJson) {
|
|
309
|
+
printJson("sync", true, [], [], startedAt, report);
|
|
310
|
+
return;
|
|
311
|
+
}
|
|
312
|
+
|
|
313
|
+
process.stdout.write(`Synced project sources to ${report.syncedCodeBuckets.length} code bucket(s)\n`);
|
|
314
|
+
return;
|
|
315
|
+
}
|
|
316
|
+
|
|
284
317
|
if (command === "deploy") {
|
|
285
318
|
const loaded = await loadConfigForCommand(cwd, options.config);
|
|
286
319
|
if (!loaded.ok) {
|
|
@@ -15,7 +15,8 @@ import {
|
|
|
15
15
|
deployAwsProject,
|
|
16
16
|
ensureAwsCliAvailable,
|
|
17
17
|
ensureAwsCredentials,
|
|
18
|
-
packageAwsProject
|
|
18
|
+
packageAwsProject,
|
|
19
|
+
syncAwsProject
|
|
19
20
|
} from "../../aws-adapter/src/index.mjs";
|
|
20
21
|
|
|
21
22
|
import {
|
|
@@ -31,6 +32,21 @@ function normalizePath(value) {
|
|
|
31
32
|
return String(value).replace(/\\/g, "/");
|
|
32
33
|
}
|
|
33
34
|
|
|
35
|
+
function unknownEnvironmentMessage(config, environmentName) {
|
|
36
|
+
const knownEnvironments = Object.keys(config?.environments ?? {});
|
|
37
|
+
return `Unknown environment ${environmentName}. Known environments: ${knownEnvironments.length > 0 ? knownEnvironments.join(", ") : "(none)"}.`;
|
|
38
|
+
}
|
|
39
|
+
|
|
40
|
+
function assertKnownEnvironment(config, environmentName) {
|
|
41
|
+
if (!environmentName) {
|
|
42
|
+
return;
|
|
43
|
+
}
|
|
44
|
+
|
|
45
|
+
if (!config?.environments?.[environmentName]) {
|
|
46
|
+
throw new S3teError("CONFIG_CONFLICT_ERROR", unknownEnvironmentMessage(config, environmentName));
|
|
47
|
+
}
|
|
48
|
+
}
|
|
49
|
+
|
|
34
50
|
function normalizeBaseUrl(value) {
|
|
35
51
|
const trimmed = String(value).trim();
|
|
36
52
|
if (!trimmed) {
|
|
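The environment guard added in this hunk fails fast with an actionable message. A self-contained sketch of the same check, with a plain `Error` standing in for `S3teError("CONFIG_CONFLICT_ERROR", ...)`:

```javascript
// Self-contained sketch of the environment guard above. A plain Error
// stands in for S3teError; the message format mirrors
// unknownEnvironmentMessage from this diff.
function assertKnownEnv(config, environmentName) {
  if (!environmentName) {
    return; // commands without --env skip the check
  }
  if (!config?.environments?.[environmentName]) {
    const known = Object.keys(config?.environments ?? {});
    throw new Error(
      `Unknown environment ${environmentName}. Known environments: ${known.length > 0 ? known.join(", ") : "(none)"}.`
    );
  }
}
```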
@@ -233,6 +249,56 @@ function schemaTemplate() {
|
|
|
233
249
|
};
|
|
234
250
|
}
|
|
235
251
|
|
|
252
|
+
function githubSyncWorkflowTemplate() {
|
|
253
|
+
return `# Before first use:
|
|
254
|
+
# 1. Run "npx s3te deploy --env dev" once so the S3TE code bucket already exists.
|
|
255
|
+
# 2. Add GitHub Actions secrets AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
|
|
256
|
+
# 3. Adjust branch, aws-region, and the target environment below.
|
|
257
|
+
name: S3TE Sync
|
|
258
|
+
|
|
259
|
+
on:
|
|
260
|
+
workflow_dispatch:
|
|
261
|
+
push:
|
|
262
|
+
branches:
|
|
263
|
+
- main
|
|
264
|
+
paths:
|
|
265
|
+
- "app/**"
|
|
266
|
+
- "package.json"
|
|
267
|
+
- "package-lock.json"
|
|
268
|
+
- ".github/workflows/s3te-sync.yml"
|
|
269
|
+
|
|
270
|
+
jobs:
|
|
271
|
+
sync:
|
|
272
|
+
runs-on: ubuntu-latest
|
|
273
|
+
permissions:
|
|
274
|
+
contents: read
|
|
275
|
+
steps:
|
|
276
|
+
- uses: actions/checkout@v4
|
|
277
|
+
- uses: actions/setup-node@v4
|
|
278
|
+
with:
|
|
279
|
+
node-version: 22
|
|
280
|
+
cache: npm
|
|
281
|
+
- name: Install dependencies
|
|
282
|
+
shell: bash
|
|
283
|
+
run: |
|
|
284
|
+
if [ -f package-lock.json ]; then
|
|
285
|
+
npm ci
|
|
286
|
+
else
|
|
287
|
+
npm install
|
|
288
|
+
fi
|
|
289
|
+
- name: Configure AWS credentials
|
|
290
|
+
uses: aws-actions/configure-aws-credentials@v4
|
|
291
|
+
with:
|
|
292
|
+
aws-access-key-id: \${{ secrets.AWS_ACCESS_KEY_ID }}
|
|
293
|
+
aws-secret-access-key: \${{ secrets.AWS_SECRET_ACCESS_KEY }}
|
|
294
|
+
aws-region: eu-central-1
|
|
295
|
+
- name: Validate project
|
|
296
|
+
run: npx s3te validate
|
|
297
|
+
- name: Sync project sources to the S3TE code bucket
|
|
298
|
+
run: npx s3te sync --env dev
|
|
299
|
+
`;
|
|
300
|
+
}
|
|
301
|
+
|
|
236
302
|
async function fileExists(targetPath) {
|
|
237
303
|
try {
|
|
238
304
|
await fs.stat(targetPath);
|
|
@@ -412,6 +478,18 @@ export async function loadResolvedConfig(projectDir, configPath) {
|
|
|
412
478
|
}
|
|
413
479
|
|
|
414
480
|
export async function validateProject(projectDir, config, options = {}) {
|
|
481
|
+
if (options.environment && !config?.environments?.[options.environment]) {
|
|
482
|
+
return {
|
|
483
|
+
ok: false,
|
|
484
|
+
errors: [{
|
|
485
|
+
code: "CONFIG_CONFLICT_ERROR",
|
|
486
|
+
message: unknownEnvironmentMessage(config, options.environment)
|
|
487
|
+
}],
|
|
488
|
+
warnings: [],
|
|
489
|
+
checkedTemplates: []
|
|
490
|
+
};
|
|
491
|
+
}
|
|
492
|
+
|
|
415
493
|
const templateRepository = new FileSystemTemplateRepository(projectDir, config);
|
|
416
494
|
const contentRepository = await loadLocalContent(projectDir, config);
|
|
417
495
|
const warnings = [];
|
|
@@ -482,6 +560,7 @@ export async function scaffoldProject(projectDir, options = {}) {
|
|
|
482
560
|
await ensureDirectory(path.join(projectDir, "offline", "tests"));
|
|
483
561
|
await ensureDirectory(path.join(projectDir, "offline", "content"));
|
|
484
562
|
await ensureDirectory(path.join(projectDir, "offline", "schemas"));
|
|
563
|
+
await ensureDirectory(path.join(projectDir, ".github", "workflows"));
|
|
485
564
|
await ensureDirectory(path.join(projectDir, ".vscode"));
|
|
486
565
|
|
|
487
566
|
const projectPackageJson = {
|
|
@@ -491,6 +570,7 @@ export async function scaffoldProject(projectDir, options = {}) {
|
|
|
491
570
|
scripts: {
|
|
492
571
|
validate: "s3te validate",
|
|
493
572
|
render: "s3te render --env dev",
|
|
573
|
+
sync: "s3te sync --env dev",
|
|
494
574
|
test: "s3te test"
|
|
495
575
|
}
|
|
496
576
|
};
|
|
@@ -530,6 +610,7 @@ export async function scaffoldProject(projectDir, options = {}) {
|
|
|
530
610
|
await writeProjectPackageJson(path.join(projectDir, "package.json"), projectPackageJson, scaffoldOptions, force);
|
|
531
611
|
await writeProjectConfigJson(path.join(projectDir, "s3te.config.json"), config, scaffoldOptions, force);
|
|
532
612
|
await writeProjectFile(path.join(projectDir, "offline", "schemas", "s3te.config.schema.json"), JSON.stringify(schemaTemplate(), null, 2) + "\n", force, true);
|
|
613
|
+
await writeProjectFile(path.join(projectDir, ".github", "workflows", "s3te-sync.yml"), githubSyncWorkflowTemplate(), force);
|
|
533
614
|
await writeProjectFile(path.join(projectDir, "app", "part", "head.part"), "<meta charset='utf-8'>\n<title>My S3TE Site</title>\n", force);
|
|
534
615
|
await writeProjectFile(path.join(projectDir, "app", variant, "index.html"), "<!doctype html>\n<html lang=\"<lang>2</lang>\">\n <head>\n <part>head.part</part>\n </head>\n <body>\n <h1>Hello from S3TemplateEngine</h1>\n </body>\n</html>\n", force);
|
|
535
616
|
await writeProjectFile(path.join(projectDir, "offline", "content", `${language}.json`), "[]\n", force);
|
|
@@ -545,6 +626,7 @@ export async function scaffoldProject(projectDir, options = {}) {
|
|
|
545
626
|
}
|
|
546
627
|
|
|
547
628
|
export async function renderProject(projectDir, config, options = {}) {
|
|
629
|
+
assertKnownEnvironment(config, options.environment);
|
|
548
630
|
const templateRepository = new FileSystemTemplateRepository(projectDir, config);
|
|
549
631
|
const contentRepository = await loadLocalContent(projectDir, config);
|
|
550
632
|
const outputRoot = path.join(projectDir, options.outputDir ?? config.rendering.outputDir);
|
|
@@ -650,6 +732,7 @@ export async function runProjectTests(projectDir) {
|
|
|
650
732
|
}
|
|
651
733
|
|
|
652
734
|
export async function packageProject(projectDir, config, options = {}) {
|
|
735
|
+
assertKnownEnvironment(config, options.environment);
|
|
653
736
|
return packageAwsProject({
|
|
654
737
|
projectDir,
|
|
655
738
|
config,
|
|
@@ -661,6 +744,7 @@ export async function packageProject(projectDir, config, options = {}) {
|
|
|
661
744
|
}
|
|
662
745
|
|
|
663
746
|
export async function deployProject(projectDir, config, options = {}) {
|
|
747
|
+
assertKnownEnvironment(config, options.environment);
|
|
664
748
|
return deployAwsProject({
|
|
665
749
|
projectDir,
|
|
666
750
|
config,
|
|
@@ -674,6 +758,18 @@ export async function deployProject(projectDir, config, options = {}) {
|
|
|
674
758
|
});
|
|
675
759
|
}
|
|
676
760
|
|
|
761
|
+
export async function syncProject(projectDir, config, options = {}) {
|
|
762
|
+
assertKnownEnvironment(config, options.environment);
|
|
763
|
+
return syncAwsProject({
|
|
764
|
+
projectDir,
|
|
765
|
+
config,
|
|
766
|
+
environment: options.environment,
|
|
767
|
+
outDir: options.outDir,
|
|
768
|
+
profile: options.profile,
|
|
769
|
+
stdio: options.stdio ?? "pipe"
|
|
770
|
+
});
|
|
771
|
+
}
|
|
772
|
+
|
|
677
773
|
export async function doctorProject(projectDir, configPath, options = {}) {
|
|
678
774
|
const checks = [];
|
|
679
775
|
const majorVersion = Number(process.versions.node.split(".")[0]);
|
|
@@ -706,6 +802,15 @@ export async function doctorProject(projectDir, configPath, options = {}) {
|
|
|
706
802
|
}
|
|
707
803
|
|
|
708
804
|
if (options.environment && options.config) {
|
|
805
|
+
if (!options.config.environments?.[options.environment]) {
|
|
806
|
+
checks.push({
|
|
807
|
+
name: "environment",
|
|
808
|
+
ok: false,
|
|
809
|
+
message: unknownEnvironmentMessage(options.config, options.environment)
|
|
810
|
+
});
|
|
811
|
+
return checks;
|
|
812
|
+
}
|
|
813
|
+
|
|
709
814
|
try {
|
|
710
815
|
await ensureAwsCredentials({
|
|
711
816
|
region: options.config.environments[options.environment].awsRegion,
|