@i-santos/create-package-starter 1.0.0 → 1.2.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +75 -14
- package/lib/run.js +639 -32
- package/package.json +1 -1
- package/template/.changeset/README.md +2 -2
- package/template/.changeset/config.json +1 -1
- package/template/.github/CODEOWNERS +2 -0
- package/template/.github/PULL_REQUEST_TEMPLATE.md +15 -0
- package/template/.github/workflows/ci.yml +29 -0
- package/template/.github/workflows/release.yml +1 -4
- package/template/CONTRIBUTING.md +28 -0
- package/template/README.md +22 -7
package/README.md
CHANGED
@@ -1,33 +1,94 @@
 # @i-santos/create-package-starter
 
-Scaffold
+Scaffold and standardize npm packages with a Changesets-first release workflow.
 
 ## Install / Run
 
 ```bash
 npx @i-santos/create-package-starter --name hello-package
-npx @i-santos/create-package-starter --name @i-santos/swarm
+npx @i-santos/create-package-starter --name @i-santos/swarm --default-branch main
+npx @i-santos/create-package-starter init --dir ./existing-package
+npx @i-santos/create-package-starter setup-github --repo i-santos/firestack --dry-run
 ```
 
-##
+## Commands
+
+Create new package:
 
 - `--name <name>` (required, supports `pkg` and `@scope/pkg`)
 - `--out <directory>` (default: current directory)
+- `--default-branch <branch>` (default: `main`)
 
-
+Bootstrap existing package:
 
-
+- `init`
+- `--dir <directory>` (default: current directory)
+- `--force` (overwrite managed files/script keys/dependency versions)
+- `--cleanup-legacy-release` (remove `release:beta*`, `release:stable*`, `release:promote*`, `release:rollback*`, `release:dist-tags`)
+- `--scope <scope>` (optional placeholder helper for docs/templates)
+- `--default-branch <branch>` (default: `main`)
 
-
-
-- `
--
+Configure GitHub repository settings:
+
+- `setup-github`
+- `--repo <owner/repo>` (optional; inferred from `remote.origin.url` when omitted)
+- `--default-branch <branch>` (default: `main`)
+- `--ruleset <path>` (optional JSON override)
+- `--dry-run` (prints intended operations only)
+
+## Managed Standards
+
+The generated and managed baseline includes:
+
+- `package.json` scripts: `check`, `changeset`, `version-packages`, `release`
+- `@changesets/cli` in `devDependencies`
 - `.changeset/config.json`
+- `.changeset/README.md`
+- `.github/workflows/ci.yml`
+- `.github/workflows/release.yml`
+- `.github/PULL_REQUEST_TEMPLATE.md`
+- `.github/CODEOWNERS`
+- `CONTRIBUTING.md`
+- `README.md`
+- `.gitignore`
+
+## Init Behavior
+
+- Default mode is safe-merge: existing managed files and keys are preserved.
+- `--force` overwrites managed files and managed script/dependency keys.
+- Existing custom `check` script is preserved unless `--force`.
+- Existing `@changesets/cli` version is preserved unless `--force`.
+- Lowercase `.github/pull_request_template.md` is recognized as an existing equivalent template.
+
+## Output Summary Contract
 
-
+All commands print a deterministic summary with:
 
-
+- files created
+- files overwritten
+- files skipped
+- scripts updated/skipped/removed
+- dependencies updated/skipped
+- warnings
+
+## setup-github Behavior
+
+`setup-github` applies repository defaults via `gh` API:
+
+- default branch
+- delete branch on merge
+- auto-merge enabled
+- squash-only merge policy
+- create/update branch ruleset with required PR, 1 approval, stale review dismissal, resolved conversations, and deletion/force-push protection
+
+If `gh` is missing or unauthenticated, command exits non-zero with actionable guidance.
+
+## Trusted Publishing Note
+
+If package does not exist on npm yet, first publish may be manual:
+
+```bash
+npm publish --access public
+```
 
-
-- Example: `@i-santos/swarm` creates `./swarm`.
-- Template follows `npm init -y` behavior by default (no `private` field).
+After first publish, configure npm Trusted Publisher using your owner, repository, workflow file (`.github/workflows/release.yml`), and branch (`main` by default).
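The `--name`, `--default-branch`, and scope values flow into the generated files through simple `__KEY__` placeholder substitution (the `renderTemplateString` helper in `package/lib/run.js`, included in this diff). A minimal sketch of that rendering:

```javascript
// Sketch of the generator's template-variable rendering: each `__KEY__`
// token in a template file is replaced globally with its value.
function renderTemplateString(source, variables) {
  let output = source;
  for (const [key, value] of Object.entries(variables)) {
    output = output.replace(new RegExp(`__${key}__`, 'g'), value);
  }
  return output;
}

// Hypothetical template snippet for illustration only.
const rendered = renderTemplateString('branches:\n  - __DEFAULT_BRANCH__\n# __PACKAGE_NAME__', {
  DEFAULT_BRANCH: 'main',
  PACKAGE_NAME: '@i-santos/swarm'
});
console.log(rendered);
```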
package/lib/run.js
CHANGED
|
@@ -1,49 +1,202 @@
|
|
|
1
1
|
const fs = require('fs');
|
|
2
2
|
const path = require('path');
|
|
3
|
+
const { spawnSync } = require('child_process');
|
|
4
|
+
|
|
5
|
+
const CHANGESETS_DEP = '@changesets/cli';
|
|
6
|
+
const CHANGESETS_DEP_VERSION = '^2.29.7';
|
|
7
|
+
const DEFAULT_BASE_BRANCH = 'main';
|
|
8
|
+
const DEFAULT_RULESET_NAME = 'Default main branch protection';
|
|
9
|
+
|
|
10
|
+
const MANAGED_FILE_SPECS = [
|
|
11
|
+
['.changeset/config.json', '.changeset/config.json'],
|
|
12
|
+
['.changeset/README.md', '.changeset/README.md'],
|
|
13
|
+
['.github/workflows/ci.yml', '.github/workflows/ci.yml'],
|
|
14
|
+
['.github/workflows/release.yml', '.github/workflows/release.yml'],
|
|
15
|
+
['.github/PULL_REQUEST_TEMPLATE.md', '.github/PULL_REQUEST_TEMPLATE.md'],
|
|
16
|
+
['.github/CODEOWNERS', '.github/CODEOWNERS'],
|
|
17
|
+
['CONTRIBUTING.md', 'CONTRIBUTING.md'],
|
|
18
|
+
['README.md', 'README.md'],
|
|
19
|
+
['.gitignore', '.gitignore']
|
|
20
|
+
];
|
|
3
21
|
|
|
4
22
|
function usage() {
|
|
5
23
|
return [
|
|
6
|
-
'
|
|
7
|
-
' create-package-starter --name <
|
|
24
|
+
'Usage:',
|
|
25
|
+
' create-package-starter --name <name> [--out <directory>] [--default-branch <branch>]',
|
|
26
|
+
' create-package-starter init [--dir <directory>] [--force] [--cleanup-legacy-release] [--scope <scope>] [--default-branch <branch>]',
|
|
27
|
+
' create-package-starter setup-github [--repo <owner/repo>] [--default-branch <branch>] [--ruleset <path>] [--dry-run]',
|
|
8
28
|
'',
|
|
9
|
-
'
|
|
29
|
+
'Examples:',
|
|
10
30
|
' create-package-starter --name hello-package',
|
|
11
|
-
' create-package-starter --name @i-santos/swarm',
|
|
12
|
-
' create-package-starter --
|
|
31
|
+
' create-package-starter --name @i-santos/swarm --out ./packages',
|
|
32
|
+
' create-package-starter init --dir ./my-package',
|
|
33
|
+
' create-package-starter init --cleanup-legacy-release',
|
|
34
|
+
' create-package-starter setup-github --repo i-santos/firestack --dry-run'
|
|
13
35
|
].join('\n');
|
|
14
36
|
}
|
|
15
37
|
|
|
16
|
-
function
|
|
38
|
+
function parseValueFlag(argv, index, flag) {
|
|
39
|
+
const value = argv[index + 1];
|
|
40
|
+
if (!value || value.startsWith('--')) {
|
|
41
|
+
throw new Error(`Missing value for ${flag}\\n\\n${usage()}`);
|
|
42
|
+
}
|
|
43
|
+
|
|
44
|
+
return value;
|
|
45
|
+
}
|
|
46
|
+
|
|
47
|
+
function parseCreateArgs(argv) {
|
|
17
48
|
const args = {
|
|
18
|
-
out: process.cwd()
|
|
49
|
+
out: process.cwd(),
|
|
50
|
+
defaultBranch: DEFAULT_BASE_BRANCH
|
|
19
51
|
};
|
|
20
52
|
|
|
21
53
|
for (let i = 0; i < argv.length; i += 1) {
|
|
22
54
|
const token = argv[i];
|
|
23
55
|
|
|
24
56
|
if (token === '--name') {
|
|
25
|
-
args.name = argv
|
|
57
|
+
args.name = parseValueFlag(argv, i, '--name');
|
|
26
58
|
i += 1;
|
|
27
59
|
continue;
|
|
28
60
|
}
|
|
29
61
|
|
|
30
62
|
if (token === '--out') {
|
|
31
|
-
args.out = argv
|
|
63
|
+
args.out = parseValueFlag(argv, i, '--out');
|
|
64
|
+
i += 1;
|
|
65
|
+
continue;
|
|
66
|
+
}
|
|
67
|
+
|
|
68
|
+
if (token === '--default-branch') {
|
|
69
|
+
args.defaultBranch = parseValueFlag(argv, i, '--default-branch');
|
|
70
|
+
i += 1;
|
|
71
|
+
continue;
|
|
72
|
+
}
|
|
73
|
+
|
|
74
|
+
if (token === '--help' || token === '-h') {
|
|
75
|
+
args.help = true;
|
|
76
|
+
continue;
|
|
77
|
+
}
|
|
78
|
+
|
|
79
|
+
throw new Error(`Invalid argument: ${token}\\n\\n${usage()}`);
|
|
80
|
+
}
|
|
81
|
+
|
|
82
|
+
return args;
|
|
83
|
+
}
|
|
84
|
+
|
|
85
|
+
function parseInitArgs(argv) {
|
|
86
|
+
const args = {
|
|
87
|
+
dir: process.cwd(),
|
|
88
|
+
force: false,
|
|
89
|
+
cleanupLegacyRelease: false,
|
|
90
|
+
defaultBranch: DEFAULT_BASE_BRANCH,
|
|
91
|
+
scope: ''
|
|
92
|
+
};
|
|
93
|
+
|
|
94
|
+
for (let i = 0; i < argv.length; i += 1) {
|
|
95
|
+
const token = argv[i];
|
|
96
|
+
|
|
97
|
+
if (token === '--dir') {
|
|
98
|
+
args.dir = parseValueFlag(argv, i, '--dir');
|
|
99
|
+
i += 1;
|
|
100
|
+
continue;
|
|
101
|
+
}
|
|
102
|
+
|
|
103
|
+
if (token === '--scope') {
|
|
104
|
+
args.scope = parseValueFlag(argv, i, '--scope');
|
|
105
|
+
i += 1;
|
|
106
|
+
continue;
|
|
107
|
+
}
|
|
108
|
+
|
|
109
|
+
if (token === '--default-branch') {
|
|
110
|
+
args.defaultBranch = parseValueFlag(argv, i, '--default-branch');
|
|
32
111
|
i += 1;
|
|
33
112
|
continue;
|
|
34
113
|
}
|
|
35
114
|
|
|
115
|
+
if (token === '--force') {
|
|
116
|
+
args.force = true;
|
|
117
|
+
continue;
|
|
118
|
+
}
|
|
119
|
+
|
|
120
|
+
if (token === '--cleanup-legacy-release') {
|
|
121
|
+
args.cleanupLegacyRelease = true;
|
|
122
|
+
continue;
|
|
123
|
+
}
|
|
124
|
+
|
|
36
125
|
if (token === '--help' || token === '-h') {
|
|
37
126
|
args.help = true;
|
|
38
127
|
continue;
|
|
39
128
|
}
|
|
40
129
|
|
|
41
|
-
throw new Error(`
|
|
130
|
+
throw new Error(`Invalid argument: ${token}\\n\\n${usage()}`);
|
|
42
131
|
}
|
|
43
132
|
|
|
44
133
|
return args;
|
|
45
134
|
}
|
|
46
135
|
|
|
136
|
+
function parseSetupGithubArgs(argv) {
|
|
137
|
+
const args = {
|
|
138
|
+
defaultBranch: DEFAULT_BASE_BRANCH,
|
|
139
|
+
dryRun: false
|
|
140
|
+
};
|
|
141
|
+
|
|
142
|
+
for (let i = 0; i < argv.length; i += 1) {
|
|
143
|
+
const token = argv[i];
|
|
144
|
+
|
|
145
|
+
if (token === '--repo') {
|
|
146
|
+
args.repo = parseValueFlag(argv, i, '--repo');
|
|
147
|
+
i += 1;
|
|
148
|
+
continue;
|
|
149
|
+
}
|
|
150
|
+
|
|
151
|
+
if (token === '--default-branch') {
|
|
152
|
+
args.defaultBranch = parseValueFlag(argv, i, '--default-branch');
|
|
153
|
+
i += 1;
|
|
154
|
+
continue;
|
|
155
|
+
}
|
|
156
|
+
|
|
157
|
+
if (token === '--ruleset') {
|
|
158
|
+
args.ruleset = parseValueFlag(argv, i, '--ruleset');
|
|
159
|
+
i += 1;
|
|
160
|
+
continue;
|
|
161
|
+
}
|
|
162
|
+
|
|
163
|
+
if (token === '--dry-run') {
|
|
164
|
+
args.dryRun = true;
|
|
165
|
+
continue;
|
|
166
|
+
}
|
|
167
|
+
|
|
168
|
+
if (token === '--help' || token === '-h') {
|
|
169
|
+
args.help = true;
|
|
170
|
+
continue;
|
|
171
|
+
}
|
|
172
|
+
|
|
173
|
+
throw new Error(`Invalid argument: ${token}\\n\\n${usage()}`);
|
|
174
|
+
}
|
|
175
|
+
|
|
176
|
+
return args;
|
|
177
|
+
}
|
|
178
|
+
|
|
179
|
+
function parseArgs(argv) {
|
|
180
|
+
if (argv[0] === 'init') {
|
|
181
|
+
return {
|
|
182
|
+
mode: 'init',
|
|
183
|
+
args: parseInitArgs(argv.slice(1))
|
|
184
|
+
};
|
|
185
|
+
}
|
|
186
|
+
|
|
187
|
+
if (argv[0] === 'setup-github') {
|
|
188
|
+
return {
|
|
189
|
+
mode: 'setup-github',
|
|
190
|
+
args: parseSetupGithubArgs(argv.slice(1))
|
|
191
|
+
};
|
|
192
|
+
}
|
|
193
|
+
|
|
194
|
+
return {
|
|
195
|
+
mode: 'create',
|
|
196
|
+
args: parseCreateArgs(argv)
|
|
197
|
+
};
|
|
198
|
+
}
|
|
199
|
+
|
|
47
200
|
function validateName(name) {
|
|
48
201
|
if (typeof name !== 'string') {
|
|
49
202
|
return false;
|
|
@@ -59,67 +212,521 @@ function packageDirFromName(packageName) {
|
|
|
59
212
|
return parts[parts.length - 1];
|
|
60
213
|
}
|
|
61
214
|
|
|
62
|
-
function
|
|
215
|
+
function deriveScope(argsScope, packageName) {
|
|
216
|
+
if (argsScope) {
|
|
217
|
+
return argsScope;
|
|
218
|
+
}
|
|
219
|
+
|
|
220
|
+
if (typeof packageName === 'string' && packageName.startsWith('@')) {
|
|
221
|
+
const first = packageName.split('/')[0];
|
|
222
|
+
return first.slice(1);
|
|
223
|
+
}
|
|
224
|
+
|
|
225
|
+
return 'team';
|
|
226
|
+
}
|
|
227
|
+
|
|
228
|
+
function renderTemplateString(source, variables) {
|
|
229
|
+
let output = source;
|
|
230
|
+
|
|
231
|
+
for (const [key, value] of Object.entries(variables)) {
|
|
232
|
+
output = output.replace(new RegExp(`__${key}__`, 'g'), value);
|
|
233
|
+
}
|
|
234
|
+
|
|
235
|
+
return output;
|
|
236
|
+
}
|
|
237
|
+
|
|
238
|
+
function copyDirRecursive(sourceDir, targetDir, variables, relativeBase = '') {
|
|
63
239
|
fs.mkdirSync(targetDir, { recursive: true });
|
|
64
240
|
const entries = fs.readdirSync(sourceDir, { withFileTypes: true });
|
|
241
|
+
const createdFiles = [];
|
|
65
242
|
|
|
66
243
|
for (const entry of entries) {
|
|
67
244
|
const srcPath = path.join(sourceDir, entry.name);
|
|
68
245
|
const destPath = path.join(targetDir, entry.name);
|
|
246
|
+
const relativePath = path.posix.join(relativeBase, entry.name);
|
|
69
247
|
|
|
70
248
|
if (entry.isDirectory()) {
|
|
71
|
-
copyDirRecursive(srcPath, destPath);
|
|
249
|
+
createdFiles.push(...copyDirRecursive(srcPath, destPath, variables, relativePath));
|
|
72
250
|
continue;
|
|
73
251
|
}
|
|
74
252
|
|
|
75
|
-
fs.
|
|
253
|
+
const source = fs.readFileSync(srcPath, 'utf8');
|
|
254
|
+
const rendered = renderTemplateString(source, variables);
|
|
255
|
+
fs.writeFileSync(destPath, rendered);
|
|
256
|
+
createdFiles.push(relativePath);
|
|
76
257
|
}
|
|
258
|
+
|
|
259
|
+
return createdFiles;
|
|
77
260
|
}
|
|
78
261
|
|
|
79
|
-
function
|
|
80
|
-
|
|
81
|
-
const output = source.replace(/__PACKAGE_NAME__/g, variables.packageName);
|
|
262
|
+
function readJsonFile(filePath) {
|
|
263
|
+
let raw;
|
|
82
264
|
|
|
83
|
-
|
|
265
|
+
try {
|
|
266
|
+
raw = fs.readFileSync(filePath, 'utf8');
|
|
267
|
+
} catch (error) {
|
|
268
|
+
throw new Error(`Failed to read ${filePath}: ${error.message}`);
|
|
269
|
+
}
|
|
270
|
+
|
|
271
|
+
try {
|
|
272
|
+
return JSON.parse(raw);
|
|
273
|
+
} catch (error) {
|
|
274
|
+
throw new Error(`Invalid JSON in ${filePath}: ${error.message}`);
|
|
275
|
+
}
|
|
84
276
|
}
|
|
85
277
|
|
|
86
|
-
|
|
87
|
-
|
|
278
|
+
function writeJsonFile(filePath, value) {
|
|
279
|
+
fs.writeFileSync(filePath, `${JSON.stringify(value, null, 2)}\n`);
|
|
280
|
+
}
|
|
88
281
|
|
|
89
|
-
|
|
90
|
-
|
|
91
|
-
|
|
282
|
+
function createSummary() {
|
|
283
|
+
return {
|
|
284
|
+
createdFiles: [],
|
|
285
|
+
overwrittenFiles: [],
|
|
286
|
+
skippedFiles: [],
|
|
287
|
+
updatedScriptKeys: [],
|
|
288
|
+
skippedScriptKeys: [],
|
|
289
|
+
removedScriptKeys: [],
|
|
290
|
+
updatedDependencyKeys: [],
|
|
291
|
+
skippedDependencyKeys: [],
|
|
292
|
+
warnings: []
|
|
293
|
+
};
|
|
294
|
+
}
|
|
295
|
+
|
|
296
|
+
function printSummary(title, summary) {
|
|
297
|
+
const list = (values) => (values.length ? values.join(', ') : 'none');
|
|
298
|
+
|
|
299
|
+
console.log(title);
|
|
300
|
+
console.log(`files created: ${list(summary.createdFiles)}`);
|
|
301
|
+
console.log(`files overwritten: ${list(summary.overwrittenFiles)}`);
|
|
302
|
+
console.log(`files skipped: ${list(summary.skippedFiles)}`);
|
|
303
|
+
console.log(`scripts updated: ${list(summary.updatedScriptKeys)}`);
|
|
304
|
+
console.log(`scripts skipped: ${list(summary.skippedScriptKeys)}`);
|
|
305
|
+
console.log(`scripts removed: ${list(summary.removedScriptKeys)}`);
|
|
306
|
+
console.log(`dependencies updated: ${list(summary.updatedDependencyKeys)}`);
|
|
307
|
+
console.log(`dependencies skipped: ${list(summary.skippedDependencyKeys)}`);
|
|
308
|
+
console.log(`warnings: ${list(summary.warnings)}`);
|
|
309
|
+
}
|
|
310
|
+
|
|
311
|
+
function ensureFileFromTemplate(targetPath, templatePath, options) {
|
|
312
|
+
const exists = fs.existsSync(targetPath);
|
|
313
|
+
|
|
314
|
+
if (exists && !options.force) {
|
|
315
|
+
return 'skipped';
|
|
316
|
+
}
|
|
317
|
+
|
|
318
|
+
const source = fs.readFileSync(templatePath, 'utf8');
|
|
319
|
+
const rendered = renderTemplateString(source, options.variables);
|
|
320
|
+
|
|
321
|
+
fs.mkdirSync(path.dirname(targetPath), { recursive: true });
|
|
322
|
+
fs.writeFileSync(targetPath, rendered);
|
|
323
|
+
|
|
324
|
+
if (exists) {
|
|
325
|
+
return 'overwritten';
|
|
326
|
+
}
|
|
327
|
+
|
|
328
|
+
return 'created';
|
|
329
|
+
}
|
|
330
|
+
|
|
331
|
+
function detectEquivalentManagedFile(packageDir, targetRelativePath) {
|
|
332
|
+
if (targetRelativePath !== '.github/PULL_REQUEST_TEMPLATE.md') {
|
|
333
|
+
return targetRelativePath;
|
|
334
|
+
}
|
|
335
|
+
|
|
336
|
+
const canonicalPath = path.join(packageDir, targetRelativePath);
|
|
337
|
+
if (fs.existsSync(canonicalPath)) {
|
|
338
|
+
return targetRelativePath;
|
|
339
|
+
}
|
|
340
|
+
|
|
341
|
+
const legacyLowercase = '.github/pull_request_template.md';
|
|
342
|
+
if (fs.existsSync(path.join(packageDir, legacyLowercase))) {
|
|
343
|
+
return legacyLowercase;
|
|
344
|
+
}
|
|
345
|
+
|
|
346
|
+
return targetRelativePath;
|
|
347
|
+
}
|
|
348
|
+
|
|
349
|
+
function updateManagedFiles(packageDir, templateDir, options, summary) {
|
|
350
|
+
for (const [targetRelativePath, templateRelativePath] of MANAGED_FILE_SPECS) {
|
|
351
|
+
const effectiveTargetRelative = detectEquivalentManagedFile(packageDir, targetRelativePath);
|
|
352
|
+
const targetPath = path.join(packageDir, effectiveTargetRelative);
|
|
353
|
+
const templatePath = path.join(templateDir, templateRelativePath);
|
|
354
|
+
|
|
355
|
+
if (!fs.existsSync(templatePath)) {
|
|
356
|
+
throw new Error(`Template not found: ${templatePath}`);
|
|
357
|
+
}
|
|
358
|
+
|
|
359
|
+
const result = ensureFileFromTemplate(targetPath, templatePath, {
|
|
360
|
+
force: options.force,
|
|
361
|
+
variables: options.variables
|
|
362
|
+
});
|
|
363
|
+
|
|
364
|
+
if (result === 'created') {
|
|
365
|
+
summary.createdFiles.push(targetRelativePath);
|
|
366
|
+
} else if (result === 'overwritten') {
|
|
367
|
+
summary.overwrittenFiles.push(targetRelativePath);
|
|
368
|
+
} else {
|
|
369
|
+
summary.skippedFiles.push(targetRelativePath);
|
|
370
|
+
}
|
|
371
|
+
}
|
|
372
|
+
}
|
|
373
|
+
|
|
374
|
+
function removeLegacyReleaseScripts(packageJson, summary) {
|
|
375
|
+
const keys = Object.keys(packageJson.scripts || {});
|
|
376
|
+
|
|
377
|
+
for (const key of keys) {
|
|
378
|
+
const isLegacy = key === 'release:dist-tags'
|
|
379
|
+
|| key.startsWith('release:beta')
|
|
380
|
+
|| key.startsWith('release:stable')
|
|
381
|
+
|| key.startsWith('release:promote')
|
|
382
|
+
|| key.startsWith('release:rollback');
|
|
383
|
+
|
|
384
|
+
if (!isLegacy) {
|
|
385
|
+
continue;
|
|
386
|
+
}
|
|
387
|
+
|
|
388
|
+
delete packageJson.scripts[key];
|
|
389
|
+
summary.removedScriptKeys.push(key);
|
|
390
|
+
}
|
|
391
|
+
}
|
|
392
|
+
|
|
393
|
+
function configureExistingPackage(packageDir, templateDir, options) {
|
|
394
|
+
if (!fs.existsSync(packageDir)) {
|
|
395
|
+
throw new Error(`Directory not found: ${packageDir}`);
|
|
396
|
+
}
|
|
397
|
+
|
|
398
|
+
const packageJsonPath = path.join(packageDir, 'package.json');
|
|
399
|
+
if (!fs.existsSync(packageJsonPath)) {
|
|
400
|
+
throw new Error(`package.json not found in ${packageDir}`);
|
|
401
|
+
}
|
|
402
|
+
|
|
403
|
+
const packageJson = readJsonFile(packageJsonPath);
|
|
404
|
+
packageJson.scripts = packageJson.scripts || {};
|
|
405
|
+
packageJson.devDependencies = packageJson.devDependencies || {};
|
|
406
|
+
|
|
407
|
+
const summary = createSummary();
|
|
408
|
+
|
|
409
|
+
const desiredScripts = {
|
|
410
|
+
check: 'npm run test',
|
|
411
|
+
changeset: 'changeset',
|
|
412
|
+
'version-packages': 'changeset version',
|
|
413
|
+
release: 'npm run check && changeset publish'
|
|
414
|
+
};
|
|
415
|
+
|
|
416
|
+
let packageJsonChanged = false;
|
|
417
|
+
|
|
418
|
+
for (const [key, value] of Object.entries(desiredScripts)) {
|
|
419
|
+
const exists = Object.prototype.hasOwnProperty.call(packageJson.scripts, key);
|
|
420
|
+
|
|
421
|
+
if (key === 'check') {
|
|
422
|
+
if (!exists) {
|
|
423
|
+
packageJson.scripts[key] = value;
|
|
424
|
+
packageJsonChanged = true;
|
|
425
|
+
summary.updatedScriptKeys.push(key);
|
|
426
|
+
} else if (options.force && packageJson.scripts[key] !== value) {
|
|
427
|
+
packageJson.scripts[key] = value;
|
|
428
|
+
packageJsonChanged = true;
|
|
429
|
+
summary.updatedScriptKeys.push(key);
|
|
430
|
+
} else {
|
|
431
|
+
summary.skippedScriptKeys.push(key);
|
|
432
|
+
}
|
|
433
|
+
continue;
|
|
434
|
+
}
|
|
435
|
+
|
|
436
|
+
if (!exists || options.force) {
|
|
437
|
+
if (!exists || packageJson.scripts[key] !== value) {
|
|
438
|
+
packageJson.scripts[key] = value;
|
|
439
|
+
packageJsonChanged = true;
|
|
440
|
+
}
|
|
441
|
+
summary.updatedScriptKeys.push(key);
|
|
442
|
+
continue;
|
|
443
|
+
}
|
|
444
|
+
|
|
445
|
+
summary.skippedScriptKeys.push(key);
|
|
446
|
+
}
|
|
447
|
+
|
|
448
|
+
const depExists = Object.prototype.hasOwnProperty.call(packageJson.devDependencies, CHANGESETS_DEP);
|
|
449
|
+
|
|
450
|
+
if (!depExists || options.force) {
|
|
451
|
+
if (!depExists || packageJson.devDependencies[CHANGESETS_DEP] !== CHANGESETS_DEP_VERSION) {
|
|
452
|
+
packageJson.devDependencies[CHANGESETS_DEP] = CHANGESETS_DEP_VERSION;
|
|
453
|
+
packageJsonChanged = true;
|
|
454
|
+
}
|
|
455
|
+
summary.updatedDependencyKeys.push(CHANGESETS_DEP);
|
|
456
|
+
} else {
|
|
457
|
+
summary.skippedDependencyKeys.push(CHANGESETS_DEP);
|
|
458
|
+
}
|
|
459
|
+
|
|
460
|
+
if (options.cleanupLegacyRelease) {
|
|
461
|
+
const before = summary.removedScriptKeys.length;
|
|
462
|
+
removeLegacyReleaseScripts(packageJson, summary);
|
|
463
|
+
if (summary.removedScriptKeys.length > before) {
|
|
464
|
+
packageJsonChanged = true;
|
|
465
|
+
}
|
|
466
|
+
}
|
|
467
|
+
|
|
468
|
+
const packageName = packageJson.name || packageDirFromName(path.basename(packageDir));
|
|
469
|
+
|
|
470
|
+
updateManagedFiles(packageDir, templateDir, {
|
|
471
|
+
force: options.force,
|
|
472
|
+
variables: {
|
|
473
|
+
PACKAGE_NAME: packageName,
|
|
474
|
+
DEFAULT_BRANCH: options.defaultBranch,
|
|
475
|
+
SCOPE: deriveScope(options.scope, packageName)
|
|
476
|
+
}
|
|
477
|
+
}, summary);
|
|
478
|
+
|
|
479
|
+
if (packageJsonChanged) {
|
|
480
|
+
writeJsonFile(packageJsonPath, packageJson);
|
|
92
481
|
}
|
|
93
482
|
|
|
483
|
+
return summary;
|
|
484
|
+
}
|
|
485
|
+
|
|
486
|
+
function createNewPackage(args) {
|
|
94
487
|
if (!validateName(args.name)) {
|
|
95
|
-
throw new Error('
|
|
488
|
+
throw new Error('Provide a valid package name with --name (for example: hello-package or @i-santos/swarm).');
|
|
96
489
|
}
|
|
97
490
|
|
|
98
491
|
const packageRoot = path.resolve(__dirname, '..');
|
|
99
492
|
const templateDir = path.join(packageRoot, 'template');
|
|
100
493
|
|
|
101
494
|
if (!fs.existsSync(templateDir)) {
|
|
102
|
-
throw new Error(`
|
|
495
|
+
throw new Error(`Template not found in ${templateDir}`);
|
|
103
496
|
}
|
|
104
497
|
|
|
105
498
|
const outputDir = path.resolve(args.out);
|
|
106
499
|
const targetDir = path.join(outputDir, packageDirFromName(args.name));
|
|
107
500
|
|
|
108
501
|
if (fs.existsSync(targetDir)) {
|
|
109
|
-
throw new Error(`
|
|
502
|
+
throw new Error(`Directory already exists: ${targetDir}`);
|
|
110
503
|
}
|
|
111
504
|
|
|
112
|
-
|
|
505
|
+
const summary = createSummary();
|
|
113
506
|
|
|
114
|
-
|
|
115
|
-
|
|
507
|
+
const createdFiles = copyDirRecursive(templateDir, targetDir, {
|
|
508
|
+
PACKAGE_NAME: args.name,
|
|
509
|
+
DEFAULT_BRANCH: args.defaultBranch,
|
|
510
|
+
SCOPE: deriveScope('', args.name)
|
|
116
511
|
});
|
|
117
512
|
|
|
118
|
-
|
|
119
|
-
|
|
513
|
+
summary.createdFiles.push(...createdFiles);
|
|
514
|
+
|
|
515
|
+
summary.updatedScriptKeys.push('check', 'changeset', 'version-packages', 'release');
|
|
516
|
+
summary.updatedDependencyKeys.push(CHANGESETS_DEP);
|
|
517
|
+
|
|
518
|
+
printSummary(`Package created in ${targetDir}`, summary);
|
|
519
|
+
}
|
|
520
|
+
|
|
521
|
+
function initExistingPackage(args) {
|
|
522
|
+
const packageRoot = path.resolve(__dirname, '..');
|
|
523
|
+
const templateDir = path.join(packageRoot, 'template');
|
|
524
|
+
const targetDir = path.resolve(args.dir);
|
|
525
|
+
|
|
526
|
+
const summary = configureExistingPackage(targetDir, templateDir, args);
|
|
527
|
+
printSummary(`Project initialized in ${targetDir}`, summary);
|
|
528
|
+
}
|
|
529
|
+
|
|
530
|
+
function execCommand(command, args, options = {}) {
|
|
531
|
+
return spawnSync(command, args, {
|
|
532
|
+
encoding: 'utf8',
|
|
533
|
+
...options
|
|
120
534
|
});
|
|
535
|
+
}
|
|
536
|
+
|
|
537
|
+
function parseRepoFromRemote(remoteUrl) {
|
|
538
|
+
const trimmed = remoteUrl.trim();
|
|
539
|
+
const httpsMatch = trimmed.match(/github\.com[/:]([^/]+\/[^/.]+)(?:\.git)?$/);
|
|
540
|
+
|
|
541
|
+
if (httpsMatch) {
|
|
542
|
+
return httpsMatch[1];
|
|
543
|
+
}
|
|
544
|
+
|
|
545
|
+
return '';
|
|
546
|
+
}
|
|
547
|
+
|
|
548
|
+
function resolveRepo(args, deps) {
|
|
549
|
+
if (args.repo) {
|
|
550
|
+
return args.repo;
|
|
551
|
+
}
|
|
552
|
+
|
|
553
|
+
const remote = deps.exec('git', ['config', '--get', 'remote.origin.url']);
|
|
554
|
+
if (remote.status !== 0 || !remote.stdout.trim()) {
|
|
555
|
+
throw new Error('Could not infer repository. Use --repo <owner/repo>.');
|
|
556
|
+
}
|
|
557
|
+
|
|
558
|
+
const repo = parseRepoFromRemote(remote.stdout);
|
|
559
|
+
if (!repo) {
|
|
560
|
+
throw new Error('Could not parse GitHub repository from remote.origin.url. Use --repo <owner/repo>.');
|
|
561
|
+
}
|
|
562
|
+
|
|
563
|
+
return repo;
|
|
564
|
+
}
|
|
565
|
+
|
|
566
|
+
function createBaseRulesetPayload(defaultBranch) {
|
|
567
|
+
return {
|
|
568
|
+
name: DEFAULT_RULESET_NAME,
|
|
569
|
+
target: 'branch',
|
|
570
|
+
enforcement: 'active',
|
|
571
|
+
conditions: {
|
|
572
|
+
ref_name: {
|
|
573
|
+
include: [`refs/heads/${defaultBranch}`],
|
|
574
|
+
exclude: []
|
|
575
|
+
}
|
|
576
|
+
},
|
|
577
|
+
bypass_actors: [],
|
|
578
|
+
rules: [
|
|
579
|
+
{ type: 'deletion' },
|
|
580
|
+
{ type: 'non_fast_forward' },
|
|
581
|
+
{
|
|
582
|
+
type: 'pull_request',
|
|
583
|
+
parameters: {
|
|
584
|
+
required_approving_review_count: 1,
|
|
585
|
+
dismiss_stale_reviews_on_push: true,
|
|
586
|
+
require_code_owner_review: false,
|
|
587
|
+
require_last_push_approval: false,
|
|
588
|
+
required_review_thread_resolution: true
|
|
589
|
+
}
|
|
590
|
+
}
|
|
591
|
+
]
|
|
592
|
+
};
|
|
593
|
+
}
|
|
594
|
+
|
|
595
|
+
function createRulesetPayload(args) {
|
|
596
|
+
if (!args.ruleset) {
|
|
597
|
+
return createBaseRulesetPayload(args.defaultBranch);
|
|
598
|
+
}
|
|
599
|
+
|
|
600
|
+
const rulesetPath = path.resolve(args.ruleset);
|
|
601
|
+
if (!fs.existsSync(rulesetPath)) {
|
|
602
|
+
throw new Error(`Ruleset file not found: ${rulesetPath}`);
|
|
603
|
+
}
|
|
604
|
+
|
|
605
|
+
return readJsonFile(rulesetPath);
|
|
606
|
+
}
|
|
607
|
+
|
|
608
|
+
function ghApi(deps, method, endpoint, payload) {
|
|
609
|
+
const args = ['api', '--method', method, endpoint];
|
|
610
|
+
|
|
611
|
+
if (payload !== undefined) {
|
|
612
|
+
args.push('--input', '-');
|
|
613
|
+
}
|
|
614
|
+
|
|
615
|
+
return deps.exec('gh', args, {
|
|
616
|
+
input: payload !== undefined ? `${JSON.stringify(payload)}\n` : undefined
|
|
617
|
+
});
|
|
618
|
+
}
|
|
619
|
+
|
|
620
|
+
function ensureGhAvailable(deps) {
|
|
621
|
+
const version = deps.exec('gh', ['--version']);
|
|
622
|
+
if (version.status !== 0) {
|
|
623
|
+
throw new Error('GitHub CLI (gh) is required. Install it from https://cli.github.com/ and rerun.');
|
|
624
|
+
}
|
|
625
|
+
|
|
626
|
+
const auth = deps.exec('gh', ['auth', 'status']);
|
|
627
|
+
if (auth.status !== 0) {
|
|
628
|
+
throw new Error('GitHub CLI is not authenticated. Run "gh auth login" and rerun.');
|
|
629
|
+
}
|
|
630
|
+
}
|
|
631
|
+
|
|
632
|
+
function parseJsonOutput(output, fallbackError) {
|
|
633
|
+
try {
|
|
634
|
+
return JSON.parse(output);
|
|
635
|
+
} catch (error) {
|
|
636
|
+
throw new Error(fallbackError);
|
|
637
|
+
}
|
|
638
|
+
}
|
|
639
|
+
|
|
640
|
+
function upsertRuleset(deps, repo, rulesetPayload) {
|
|
641
|
+
const listResult = ghApi(deps, 'GET', `/repos/${repo}/rulesets`);
|
|
642
|
+
if (listResult.status !== 0) {
|
|
643
|
+
throw new Error(`Failed to list rulesets: ${listResult.stderr || listResult.stdout}`.trim());
|
|
644
|
+
}
|
|
645
|
+
|
|
646
|
+
const rulesets = parseJsonOutput(listResult.stdout || '[]', 'Failed to parse rulesets response from GitHub API.');
|
|
647
|
+
const existing = rulesets.find((ruleset) => ruleset.name === rulesetPayload.name);
|
|
648
|
+
|
|
649
|
+
if (!existing) {
|
|
650
|
+
const createResult = ghApi(deps, 'POST', `/repos/${repo}/rulesets`, rulesetPayload);
|
|
651
|
+
if (createResult.status !== 0) {
|
|
652
|
+
throw new Error(`Failed to create ruleset: ${createResult.stderr || createResult.stdout}`.trim());
|
|
653
|
+
}
|
|
654
|
+
|
|
655
|
+
return 'created';
|
|
656
|
+
}
|
|
657
|
+
|
|
658
|
+
const updateResult = ghApi(deps, 'PUT', `/repos/${repo}/rulesets/${existing.id}`, rulesetPayload);
|
|
659
|
+
if (updateResult.status !== 0) {
|
|
660
|
+
+    throw new Error(`Failed to update ruleset: ${updateResult.stderr || updateResult.stdout}`.trim());
+  }
+
+  return 'updated';
+}
+
+function setupGithub(args, dependencies = {}) {
+  const deps = {
+    exec: dependencies.exec || execCommand
+  };
+
+  ensureGhAvailable(deps);
+
+  const repo = resolveRepo(args, deps);
+  const rulesetPayload = createRulesetPayload(args);
+  const summary = createSummary();
+
+  summary.updatedScriptKeys.push('repository.default_branch', 'repository.delete_branch_on_merge', 'repository.allow_auto_merge', 'repository.merge_policy');
+
+  if (args.dryRun) {
+    summary.warnings.push(`dry-run: would update repository settings for ${repo}`);
+    summary.warnings.push(`dry-run: would upsert ruleset "${rulesetPayload.name}" for refs/heads/${args.defaultBranch}`);
+    printSummary(`GitHub settings dry-run for ${repo}`, summary);
+    return;
+  }
+
+  const repoPayload = {
+    default_branch: args.defaultBranch,
+    delete_branch_on_merge: true,
+    allow_auto_merge: true,
+    allow_squash_merge: true,
+    allow_merge_commit: false,
+    allow_rebase_merge: false
+  };
+
+  const patchRepo = ghApi(deps, 'PATCH', `/repos/${repo}`, repoPayload);
+  if (patchRepo.status !== 0) {
+    throw new Error(`Failed to update repository settings: ${patchRepo.stderr || patchRepo.stdout}`.trim());
+  }
+
+  const upsertResult = upsertRuleset(deps, repo, rulesetPayload);
+  summary.overwrittenFiles.push(`github-ruleset:${upsertResult}`);
+
+  printSummary(`GitHub settings applied to ${repo}`, summary);
+}
+
+async function run(argv, dependencies = {}) {
+  const parsed = parseArgs(argv);
+
+  if (parsed.args.help) {
+    console.log(usage());
+    return;
+  }
+
+  if (parsed.mode === 'init') {
+    initExistingPackage(parsed.args);
+    return;
+  }
+
+  if (parsed.mode === 'setup-github') {
+    setupGithub(parsed.args, dependencies);
+    return;
+  }
 
-
+  createNewPackage(parsed.args);
 }
 
-module.exports = {
+module.exports = {
+  run,
+  parseRepoFromRemote,
+  createBaseRulesetPayload,
+  setupGithub
+};
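The `run` dispatcher shown above routes the `init` and `setup-github` subcommands before falling through to package creation. A self-contained sketch of the same dispatch pattern (hypothetical helper names and return values, not the package's internals):

```javascript
// Minimal subcommand dispatch in the style of run():
// the first token selects a mode; anything else falls through to "create".
function parseMode(argv) {
  const [first] = argv;
  if (first === 'init' || first === 'setup-github') {
    return { mode: first, rest: argv.slice(1) };
  }
  return { mode: 'create', rest: argv };
}

function dispatch(argv) {
  const { mode, rest } = parseMode(argv);
  switch (mode) {
    case 'init': {
      // --dir defaults to the current directory, mirroring the CLI docs.
      const i = rest.indexOf('--dir');
      return `init in ${i === -1 ? '.' : rest[i + 1]}`;
    }
    case 'setup-github':
      return 'setup-github';
    default:
      return 'create';
  }
}
```

The real `run` additionally handles `--help` and accepts injectable dependencies for testing, as the diff above shows.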
package/template/.changeset/README.md
CHANGED
@@ -1,4 +1,4 @@
 # Changesets
 
--
--
+- Add a changeset in each PR with release-impacting changes: `npm run changeset`.
+- The release workflow opens/updates the versioning PR automatically.
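For reference, `npm run changeset` records each entry as a small markdown file under `.changeset/`; a typical generated file looks like this (hypothetical package name and summary):

```markdown
---
"@i-santos/example-package": patch
---

Fix trailing-slash handling in the path normalizer.
```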
package/template/.github/workflows/ci.yml
ADDED
@@ -0,0 +1,29 @@
+name: CI
+
+on:
+  pull_request:
+  push:
+    branches:
+      - __DEFAULT_BRANCH__
+
+jobs:
+  check:
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        node-version: [18, 20]
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v4
+
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: ${{ matrix.node-version }}
+          cache: npm
+
+      - name: Install
+        run: npm ci
+
+      - name: Check
+        run: npm run check
package/template/.github/workflows/release.yml
CHANGED
@@ -3,7 +3,7 @@ name: Release
 on:
   push:
     branches:
-      -
+      - __DEFAULT_BRANCH__
 
 permissions:
   contents: write
@@ -26,8 +26,6 @@ jobs:
           cache: npm
           registry-url: https://registry.npmjs.org
 
-      - run: npm install -g npm@latest
-
       - name: Install
         run: npm ci
 
@@ -42,5 +40,4 @@ jobs:
           title: "chore: release packages"
           commit: "chore: release packages"
         env:
-          NODE_AUTH_TOKEN: ""
           GITHUB_TOKEN: ${{ secrets.CHANGESETS_GH_TOKEN || secrets.GITHUB_TOKEN }}
package/template/CONTRIBUTING.md
ADDED
@@ -0,0 +1,28 @@
+# Contributing
+
+## Local setup
+
+1. Install dependencies: `npm ci`
+2. Run checks: `npm run check`
+
+## Release process
+
+1. Add a changeset in each release-impacting PR: `npm run changeset`.
+2. Merge PRs into `__DEFAULT_BRANCH__`.
+3. `.github/workflows/release.yml` opens/updates `chore: release packages`.
+4. Merge the release PR to publish.
+
+## Trusted Publishing
+
+If the package does not exist on npm yet, the first publish can be manual:
+
+```bash
+npm publish --access public
+```
+
+After first publish, configure npm Trusted Publisher with:
+
+- owner
+- repository
+- workflow file (`.github/workflows/release.yml`)
+- branch (`__DEFAULT_BRANCH__`)
package/template/README.md
CHANGED
@@ -1,17 +1,32 @@
 # __PACKAGE_NAME__
 
-
+Package created by `@i-santos/create-package-starter`.
 
-##
+## Scripts
 
 - `npm run check`
 - `npm run changeset`
 - `npm run version-packages`
 - `npm run release`
 
-##
+## Release flow
 
-1.
-2.
-3.
-4.
+1. Add a changeset in your PR: `npm run changeset`.
+2. Merge into `__DEFAULT_BRANCH__`.
+3. `.github/workflows/release.yml` creates or updates `chore: release packages`.
+4. Merge the release PR to publish.
+
+## Trusted Publishing
+
+If this package does not exist on npm yet, first publish can be manual:
+
+```bash
+npm publish --access public
+```
+
+After first publish, configure npm Trusted Publisher:
+
+- owner
+- repository
+- workflow file (`.github/workflows/release.yml`)
+- branch (`__DEFAULT_BRANCH__`)
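The `__PACKAGE_NAME__` and `__DEFAULT_BRANCH__` tokens that appear throughout the templates above are substituted by the scaffolder at creation time. A minimal sketch of that substitution (an assumption about the approach, not the package's actual code):

```javascript
// Sketch: placeholder substitution over template file contents.
// Token names come from the template files; the implementation is assumed.
function renderTemplate(contents, values) {
  return contents
    .replace(/__PACKAGE_NAME__/g, values.packageName)
    .replace(/__DEFAULT_BRANCH__/g, values.defaultBranch);
}

// Example: rendering a README heading and a workflow branch filter.
const rendered = renderTemplate('# __PACKAGE_NAME__\nbranches:\n  - __DEFAULT_BRANCH__\n', {
  packageName: '@i-santos/swarm',
  defaultBranch: 'main'
});
```

A global regex replace like this is enough because the tokens are chosen so they never collide with legitimate file content.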