@i-santos/create-package-starter 1.1.0 → 1.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,51 +1,94 @@
  # @i-santos/create-package-starter

- Scaffold new npm packages with a standardized Changesets release workflow.
+ Scaffold and standardize npm packages with a Changesets-first release workflow.

  ## Install / Run

  ```bash
  npx @i-santos/create-package-starter --name hello-package
- npx @i-santos/create-package-starter --name @i-santos/swarm
+ npx @i-santos/create-package-starter --name @i-santos/swarm --default-branch main
  npx @i-santos/create-package-starter init --dir ./existing-package
+ npx @i-santos/create-package-starter setup-github --repo i-santos/firestack --dry-run
  ```

- ## Options
+ ## Commands

  Create new package:

  - `--name <name>` (required, supports `pkg` and `@scope/pkg`)
  - `--out <directory>` (default: current directory)
+ - `--default-branch <branch>` (default: `main`)

  Bootstrap existing package:

  - `init`
  - `--dir <directory>` (default: current directory)
- - `--force` (overwrite managed files/scripts/dependency keys)
+ - `--force` (overwrite managed files/script keys/dependency versions)
+ - `--cleanup-legacy-release` (remove `release:beta*`, `release:stable*`, `release:promote*`, `release:rollback*`, `release:dist-tags`)
+ - `--scope <scope>` (optional placeholder helper for docs/templates)
+ - `--default-branch <branch>` (default: `main`)

- ## Output
+ Configure GitHub repository settings:

- Generated package includes:
+ - `setup-github`
+ - `--repo <owner/repo>` (optional; inferred from `remote.origin.url` when omitted)
+ - `--default-branch <branch>` (default: `main`)
+ - `--ruleset <path>` (optional JSON override)
+ - `--dry-run` (prints intended operations only)

- - `changeset`
- - `version-packages`
- - `release`
- - `.github/workflows/release.yml`
+ ## Managed Standards
+
+ The generated and managed baseline includes:
+
+ - `package.json` scripts: `check`, `changeset`, `version-packages`, `release`
+ - `@changesets/cli` in `devDependencies`
  - `.changeset/config.json`
+ - `.changeset/README.md`
+ - `.github/workflows/ci.yml`
+ - `.github/workflows/release.yml`
+ - `.github/PULL_REQUEST_TEMPLATE.md`
+ - `.github/CODEOWNERS`
+ - `CONTRIBUTING.md`
+ - `README.md`
+ - `.gitignore`
+
+ ## Init Behavior
+
+ - Default mode is safe-merge: existing managed files and keys are preserved.
+ - `--force` overwrites managed files and managed script/dependency keys.
+ - Existing custom `check` script is preserved unless `--force`.
+ - Existing `@changesets/cli` version is preserved unless `--force`.
+ - Lowercase `.github/pull_request_template.md` is recognized as an existing equivalent template.
+
+ ## Output Summary Contract

- plus a minimal README, CHANGELOG, `.gitignore`, and check script.
+ All commands print a deterministic summary with:

- ## Existing Project Bootstrap
+ - files created
+ - files overwritten
+ - files skipped
+ - scripts updated/skipped/removed
+ - dependencies updated/skipped
+ - warnings

- `init` configures an existing npm package directory in-place:
+ ## setup-github Behavior

- - ensures scripts `changeset`, `version-packages`, `release`
- - ensures `@changesets/cli` in `devDependencies`
- - creates (or preserves) `.changeset/config.json`, `.changeset/README.md`, and `.github/workflows/release.yml`
- - default mode is safe-merge; use `--force` to overwrite managed files/keys
+ `setup-github` applies repository defaults via `gh` API:

- ## Notes
+ - default branch
+ - delete branch on merge
+ - auto-merge enabled
+ - squash-only merge policy
+ - create/update branch ruleset with required PR, 1 approval, stale review dismissal, resolved conversations, and deletion/force-push protection
+
+ If `gh` is missing or unauthenticated, command exits non-zero with actionable guidance.
+
+ ## Trusted Publishing Note
+
+ If package does not exist on npm yet, first publish may be manual:
+
+ ```bash
+ npm publish --access public
+ ```

- - For scoped names, folder uses the short package name.
- - Example: `@i-santos/swarm` creates `./swarm`.
- - Template follows `npm init -y` behavior by default (no `private` field).
+ After first publish, configure npm Trusted Publisher using your owner, repository, workflow file (`.github/workflows/release.yml`), and branch (`main` by default).
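
For reviewers: the new templates lean on `__KEY__` placeholders such as `__DEFAULT_BRANCH__` and `__SCOPE__`. A minimal standalone sketch of the substitution, mirroring the `renderTemplateString` helper this release adds to `lib/run.js`:

```javascript
// Sketch of the template placeholder substitution (same semantics as the
// renderTemplateString helper added in lib/run.js in this release).
function renderTemplateString(source, variables) {
  let output = source;
  for (const [key, value] of Object.entries(variables)) {
    // Replace every occurrence of __KEY__ with its configured value.
    output = output.replace(new RegExp(`__${key}__`, 'g'), value);
  }
  return output;
}

const rendered = renderTemplateString('"baseBranch": "__DEFAULT_BRANCH__"', {
  DEFAULT_BRANCH: 'main'
});
console.log(rendered); // prints "baseBranch": "main"
```

Placeholders are replaced globally, so a key may appear any number of times in a template file.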
package/lib/run.js CHANGED
@@ -1,36 +1,72 @@
  const fs = require('fs');
  const path = require('path');
+ const { spawnSync } = require('child_process');
+
+ const CHANGESETS_DEP = '@changesets/cli';
+ const CHANGESETS_DEP_VERSION = '^2.29.7';
+ const DEFAULT_BASE_BRANCH = 'main';
+ const DEFAULT_RULESET_NAME = 'Default main branch protection';
+
+ const MANAGED_FILE_SPECS = [
+   ['.changeset/config.json', '.changeset/config.json'],
+   ['.changeset/README.md', '.changeset/README.md'],
+   ['.github/workflows/ci.yml', '.github/workflows/ci.yml'],
+   ['.github/workflows/release.yml', '.github/workflows/release.yml'],
+   ['.github/PULL_REQUEST_TEMPLATE.md', '.github/PULL_REQUEST_TEMPLATE.md'],
+   ['.github/CODEOWNERS', '.github/CODEOWNERS'],
+   ['CONTRIBUTING.md', 'CONTRIBUTING.md'],
+   ['README.md', 'README.md'],
+   ['.gitignore', '.gitignore']
+ ];

  function usage() {
    return [
-     'Uso:',
-     '  create-package-starter --name <nome> [--out <diretorio>]',
-     '  create-package-starter init [--dir <diretorio>] [--force]',
+     'Usage:',
+     '  create-package-starter --name <name> [--out <directory>] [--default-branch <branch>]',
+     '  create-package-starter init [--dir <directory>] [--force] [--cleanup-legacy-release] [--scope <scope>] [--default-branch <branch>]',
+     '  create-package-starter setup-github [--repo <owner/repo>] [--default-branch <branch>] [--ruleset <path>] [--dry-run]',
      '',
-     'Exemplo:',
+     'Examples:',
      '  create-package-starter --name hello-package',
      '  create-package-starter --name @i-santos/swarm --out ./packages',
-     '  create-package-starter init --dir ./meu-pacote',
-     '  create-package-starter init --force'
+     '  create-package-starter init --dir ./my-package',
+     '  create-package-starter init --cleanup-legacy-release',
+     '  create-package-starter setup-github --repo i-santos/firestack --dry-run'
    ].join('\n');
  }

+ function parseValueFlag(argv, index, flag) {
+   const value = argv[index + 1];
+   if (!value || value.startsWith('--')) {
+     throw new Error(`Missing value for ${flag}\n\n${usage()}`);
+   }
+
+   return value;
+ }
+
  function parseCreateArgs(argv) {
    const args = {
-     out: process.cwd()
+     out: process.cwd(),
+     defaultBranch: DEFAULT_BASE_BRANCH
    };

    for (let i = 0; i < argv.length; i += 1) {
      const token = argv[i];

      if (token === '--name') {
-       args.name = argv[i + 1];
+       args.name = parseValueFlag(argv, i, '--name');
        i += 1;
        continue;
      }

      if (token === '--out') {
-       args.out = argv[i + 1];
+       args.out = parseValueFlag(argv, i, '--out');
+       i += 1;
+       continue;
+     }
+
+     if (token === '--default-branch') {
+       args.defaultBranch = parseValueFlag(argv, i, '--default-branch');
        i += 1;
        continue;
      }
@@ -40,7 +76,7 @@ function parseCreateArgs(argv) {
      continue;
    }

-     throw new Error(`Argumento inválido: ${token}\n\n${usage()}`);
+     throw new Error(`Invalid argument: ${token}\n\n${usage()}`);
  }

  return args;
@@ -49,14 +85,29 @@ function parseCreateArgs(argv) {
  function parseInitArgs(argv) {
    const args = {
      dir: process.cwd(),
-     force: false
+     force: false,
+     cleanupLegacyRelease: false,
+     defaultBranch: DEFAULT_BASE_BRANCH,
+     scope: ''
    };

    for (let i = 0; i < argv.length; i += 1) {
      const token = argv[i];

      if (token === '--dir') {
-       args.dir = argv[i + 1];
+       args.dir = parseValueFlag(argv, i, '--dir');
+       i += 1;
+       continue;
+     }
+
+     if (token === '--scope') {
+       args.scope = parseValueFlag(argv, i, '--scope');
+       i += 1;
+       continue;
+     }
+
+     if (token === '--default-branch') {
+       args.defaultBranch = parseValueFlag(argv, i, '--default-branch');
        i += 1;
        continue;
      }
@@ -66,12 +117,60 @@ function parseInitArgs(argv) {
      continue;
    }

+     if (token === '--cleanup-legacy-release') {
+       args.cleanupLegacyRelease = true;
+       continue;
+     }
+
      if (token === '--help' || token === '-h') {
        args.help = true;
        continue;
      }

-     throw new Error(`Argumento inválido: ${token}\n\n${usage()}`);
+     throw new Error(`Invalid argument: ${token}\n\n${usage()}`);
+   }
+
+   return args;
+ }
+
+ function parseSetupGithubArgs(argv) {
+   const args = {
+     defaultBranch: DEFAULT_BASE_BRANCH,
+     dryRun: false
+   };
+
+   for (let i = 0; i < argv.length; i += 1) {
+     const token = argv[i];
+
+     if (token === '--repo') {
+       args.repo = parseValueFlag(argv, i, '--repo');
+       i += 1;
+       continue;
+     }
+
+     if (token === '--default-branch') {
+       args.defaultBranch = parseValueFlag(argv, i, '--default-branch');
+       i += 1;
+       continue;
+     }
+
+     if (token === '--ruleset') {
+       args.ruleset = parseValueFlag(argv, i, '--ruleset');
+       i += 1;
+       continue;
+     }
+
+     if (token === '--dry-run') {
+       args.dryRun = true;
+       continue;
+     }
+
+     if (token === '--help' || token === '-h') {
+       args.help = true;
+       continue;
+     }
+
+     throw new Error(`Invalid argument: ${token}\n\n${usage()}`);
  }

  return args;
@@ -85,6 +184,13 @@ function parseArgs(argv) {
      };
    }

+   if (argv[0] === 'setup-github') {
+     return {
+       mode: 'setup-github',
+       args: parseSetupGithubArgs(argv.slice(1))
+     };
+   }
+
    return {
      mode: 'create',
      args: parseCreateArgs(argv)
@@ -106,28 +212,51 @@ function packageDirFromName(packageName) {
    return parts[parts.length - 1];
  }

- function copyDirRecursive(sourceDir, targetDir) {
+ function deriveScope(argsScope, packageName) {
+   if (argsScope) {
+     return argsScope;
+   }
+
+   if (typeof packageName === 'string' && packageName.startsWith('@')) {
+     const first = packageName.split('/')[0];
+     return first.slice(1);
+   }
+
+   return 'team';
+ }
+
+ function renderTemplateString(source, variables) {
+   let output = source;
+
+   for (const [key, value] of Object.entries(variables)) {
+     output = output.replace(new RegExp(`__${key}__`, 'g'), value);
+   }
+
+   return output;
+ }
+
+ function copyDirRecursive(sourceDir, targetDir, variables, relativeBase = '') {
    fs.mkdirSync(targetDir, { recursive: true });
    const entries = fs.readdirSync(sourceDir, { withFileTypes: true });
+   const createdFiles = [];

    for (const entry of entries) {
      const srcPath = path.join(sourceDir, entry.name);
      const destPath = path.join(targetDir, entry.name);
+     const relativePath = path.posix.join(relativeBase, entry.name);

      if (entry.isDirectory()) {
-       copyDirRecursive(srcPath, destPath);
+       createdFiles.push(...copyDirRecursive(srcPath, destPath, variables, relativePath));
        continue;
      }

-     fs.copyFileSync(srcPath, destPath);
+     const source = fs.readFileSync(srcPath, 'utf8');
+     const rendered = renderTemplateString(source, variables);
+     fs.writeFileSync(destPath, rendered);
+     createdFiles.push(relativePath);
    }
- }

- function renderTemplateFile(filePath, variables) {
-   const source = fs.readFileSync(filePath, 'utf8');
-   const output = source.replace(/__PACKAGE_NAME__/g, variables.packageName);
-
-   fs.writeFileSync(filePath, output);
+   return createdFiles;
  }

  function readJsonFile(filePath) {
@@ -136,13 +265,13 @@ function readJsonFile(filePath) {
    try {
      raw = fs.readFileSync(filePath, 'utf8');
    } catch (error) {
-     throw new Error(`Erro ao ler ${filePath}: ${error.message}`);
+     throw new Error(`Failed to read ${filePath}: ${error.message}`);
    }

    try {
      return JSON.parse(raw);
    } catch (error) {
-     throw new Error(`Erro ao parsear JSON em ${filePath}: ${error.message}`);
+     throw new Error(`Invalid JSON in ${filePath}: ${error.message}`);
    }
  }

@@ -150,19 +279,47 @@ function writeJsonFile(filePath, value) {
    fs.writeFileSync(filePath, `${JSON.stringify(value, null, 2)}\n`);
  }

- function ensureFileFromTemplate(targetPath, templatePath, options) {
-   if (!fs.existsSync(templatePath)) {
-     throw new Error(`Erro: template não encontrado em ${templatePath}`);
-   }
+ function createSummary() {
+   return {
+     createdFiles: [],
+     overwrittenFiles: [],
+     skippedFiles: [],
+     updatedScriptKeys: [],
+     skippedScriptKeys: [],
+     removedScriptKeys: [],
+     updatedDependencyKeys: [],
+     skippedDependencyKeys: [],
+     warnings: []
+   };
+ }
+
+ function printSummary(title, summary) {
+   const list = (values) => (values.length ? values.join(', ') : 'none');
+
+   console.log(title);
+   console.log(`files created: ${list(summary.createdFiles)}`);
+   console.log(`files overwritten: ${list(summary.overwrittenFiles)}`);
+   console.log(`files skipped: ${list(summary.skippedFiles)}`);
+   console.log(`scripts updated: ${list(summary.updatedScriptKeys)}`);
+   console.log(`scripts skipped: ${list(summary.skippedScriptKeys)}`);
+   console.log(`scripts removed: ${list(summary.removedScriptKeys)}`);
+   console.log(`dependencies updated: ${list(summary.updatedDependencyKeys)}`);
+   console.log(`dependencies skipped: ${list(summary.skippedDependencyKeys)}`);
+   console.log(`warnings: ${list(summary.warnings)}`);
+ }

+ function ensureFileFromTemplate(targetPath, templatePath, options) {
    const exists = fs.existsSync(targetPath);

    if (exists && !options.force) {
      return 'skipped';
    }

+   const source = fs.readFileSync(templatePath, 'utf8');
+   const rendered = renderTemplateString(source, options.variables);
+
    fs.mkdirSync(path.dirname(targetPath), { recursive: true });
-   fs.copyFileSync(templatePath, targetPath);
+   fs.writeFileSync(targetPath, rendered);

    if (exists) {
      return 'overwritten';
@@ -171,42 +328,112 @@ function ensureFileFromTemplate(targetPath, templatePath, options) {
    return 'created';
  }

- function configureExistingPackage(packageDir, templateDir, force) {
+ function detectEquivalentManagedFile(packageDir, targetRelativePath) {
+   if (targetRelativePath !== '.github/PULL_REQUEST_TEMPLATE.md') {
+     return targetRelativePath;
+   }
+
+   const canonicalPath = path.join(packageDir, targetRelativePath);
+   if (fs.existsSync(canonicalPath)) {
+     return targetRelativePath;
+   }
+
+   const legacyLowercase = '.github/pull_request_template.md';
+   if (fs.existsSync(path.join(packageDir, legacyLowercase))) {
+     return legacyLowercase;
+   }
+
+   return targetRelativePath;
+ }
+
+ function updateManagedFiles(packageDir, templateDir, options, summary) {
+   for (const [targetRelativePath, templateRelativePath] of MANAGED_FILE_SPECS) {
+     const effectiveTargetRelative = detectEquivalentManagedFile(packageDir, targetRelativePath);
+     const targetPath = path.join(packageDir, effectiveTargetRelative);
+     const templatePath = path.join(templateDir, templateRelativePath);
+
+     if (!fs.existsSync(templatePath)) {
+       throw new Error(`Template not found: ${templatePath}`);
+     }
+
+     const result = ensureFileFromTemplate(targetPath, templatePath, {
+       force: options.force,
+       variables: options.variables
+     });
+
+     if (result === 'created') {
+       summary.createdFiles.push(targetRelativePath);
+     } else if (result === 'overwritten') {
+       summary.overwrittenFiles.push(targetRelativePath);
+     } else {
+       summary.skippedFiles.push(targetRelativePath);
+     }
+   }
+ }
+
+ function removeLegacyReleaseScripts(packageJson, summary) {
+   const keys = Object.keys(packageJson.scripts || {});
+
+   for (const key of keys) {
+     const isLegacy = key === 'release:dist-tags'
+       || key.startsWith('release:beta')
+       || key.startsWith('release:stable')
+       || key.startsWith('release:promote')
+       || key.startsWith('release:rollback');
+
+     if (!isLegacy) {
+       continue;
+     }
+
+     delete packageJson.scripts[key];
+     summary.removedScriptKeys.push(key);
+   }
+ }
+
+ function configureExistingPackage(packageDir, templateDir, options) {
    if (!fs.existsSync(packageDir)) {
-     throw new Error(`Erro: diretório não encontrado: ${packageDir}`);
+     throw new Error(`Directory not found: ${packageDir}`);
    }

    const packageJsonPath = path.join(packageDir, 'package.json');
    if (!fs.existsSync(packageJsonPath)) {
-     throw new Error(`Erro: package.json não encontrado em ${packageDir}.`);
+     throw new Error(`package.json not found in ${packageDir}`);
    }

    const packageJson = readJsonFile(packageJsonPath);
    packageJson.scripts = packageJson.scripts || {};
    packageJson.devDependencies = packageJson.devDependencies || {};

+   const summary = createSummary();
+
    const desiredScripts = {
+     check: 'npm run test',
      changeset: 'changeset',
      'version-packages': 'changeset version',
      release: 'npm run check && changeset publish'
    };

-   const summary = {
-     createdFiles: [],
-     overwrittenFiles: [],
-     skippedFiles: [],
-     updatedScriptKeys: [],
-     skippedScriptKeys: [],
-     updatedDependencyKeys: [],
-     skippedDependencyKeys: []
-   };
-
    let packageJsonChanged = false;

    for (const [key, value] of Object.entries(desiredScripts)) {
      const exists = Object.prototype.hasOwnProperty.call(packageJson.scripts, key);

-     if (!exists || force) {
+     if (key === 'check') {
+       if (!exists) {
+         packageJson.scripts[key] = value;
+         packageJsonChanged = true;
+         summary.updatedScriptKeys.push(key);
+       } else if (options.force && packageJson.scripts[key] !== value) {
+         packageJson.scripts[key] = value;
+         packageJsonChanged = true;
+         summary.updatedScriptKeys.push(key);
+       } else {
+         summary.skippedScriptKeys.push(key);
+       }
+       continue;
+     }
+
+     if (!exists || options.force) {
        if (!exists || packageJson.scripts[key] !== value) {
          packageJson.scripts[key] = value;
          packageJsonChanged = true;
@@ -218,88 +445,77 @@ function configureExistingPackage(packageDir, templateDir, force) {
      summary.skippedScriptKeys.push(key);
    }

-   const dependencyKey = '@changesets/cli';
-   const dependencyValue = '^2.29.7';
-   const depExists = Object.prototype.hasOwnProperty.call(packageJson.devDependencies, dependencyKey);
+   const depExists = Object.prototype.hasOwnProperty.call(packageJson.devDependencies, CHANGESETS_DEP);

-   if (!depExists || force) {
-     if (!depExists || packageJson.devDependencies[dependencyKey] !== dependencyValue) {
-       packageJson.devDependencies[dependencyKey] = dependencyValue;
+   if (!depExists || options.force) {
+     if (!depExists || packageJson.devDependencies[CHANGESETS_DEP] !== CHANGESETS_DEP_VERSION) {
+       packageJson.devDependencies[CHANGESETS_DEP] = CHANGESETS_DEP_VERSION;
        packageJsonChanged = true;
      }
-     summary.updatedDependencyKeys.push(dependencyKey);
+     summary.updatedDependencyKeys.push(CHANGESETS_DEP);
    } else {
-     summary.skippedDependencyKeys.push(dependencyKey);
+     summary.skippedDependencyKeys.push(CHANGESETS_DEP);
    }

-   if (packageJsonChanged) {
-     writeJsonFile(packageJsonPath, packageJson);
+   if (options.cleanupLegacyRelease) {
+     const before = summary.removedScriptKeys.length;
+     removeLegacyReleaseScripts(packageJson, summary);
+     if (summary.removedScriptKeys.length > before) {
+       packageJsonChanged = true;
+     }
    }

-   const fileSpecs = [
-     ['.changeset/config.json', '.changeset/config.json'],
-     ['.changeset/README.md', '.changeset/README.md'],
-     ['.github/workflows/release.yml', '.github/workflows/release.yml']
-   ];
+   const packageName = packageJson.name || packageDirFromName(path.basename(packageDir));

-   for (const [targetRelativePath, templateRelativePath] of fileSpecs) {
-     const targetPath = path.join(packageDir, targetRelativePath);
-     const templatePath = path.join(templateDir, templateRelativePath);
-     const result = ensureFileFromTemplate(targetPath, templatePath, { force });
-
-     if (result === 'created') {
-       summary.createdFiles.push(targetRelativePath);
-     } else if (result === 'overwritten') {
-       summary.overwrittenFiles.push(targetRelativePath);
-     } else {
-       summary.skippedFiles.push(targetRelativePath);
+   updateManagedFiles(packageDir, templateDir, {
+     force: options.force,
+     variables: {
+       PACKAGE_NAME: packageName,
+       DEFAULT_BRANCH: options.defaultBranch,
+       SCOPE: deriveScope(options.scope, packageName)
      }
-   }
+   }, summary);

-   if (!packageJson.scripts.check) {
-     console.warn('Aviso: script "check" não encontrado. O script "release" executa "npm run check".');
+   if (packageJsonChanged) {
+     writeJsonFile(packageJsonPath, packageJson);
    }

-   console.log(`Projeto inicializado em ${packageDir}`);
-   console.log(`Arquivos criados: ${summary.createdFiles.length ? summary.createdFiles.join(', ') : 'nenhum'}`);
-   console.log(`Arquivos sobrescritos: ${summary.overwrittenFiles.length ? summary.overwrittenFiles.join(', ') : 'nenhum'}`);
-   console.log(`Arquivos ignorados: ${summary.skippedFiles.length ? summary.skippedFiles.join(', ') : 'nenhum'}`);
-   console.log(`Scripts atualizados: ${summary.updatedScriptKeys.length ? summary.updatedScriptKeys.join(', ') : 'nenhum'}`);
-   console.log(`Scripts preservados: ${summary.skippedScriptKeys.length ? summary.skippedScriptKeys.join(', ') : 'nenhum'}`);
-   console.log(`Dependências atualizadas: ${summary.updatedDependencyKeys.length ? summary.updatedDependencyKeys.join(', ') : 'nenhum'}`);
-   console.log(`Dependências preservadas: ${summary.skippedDependencyKeys.length ? summary.skippedDependencyKeys.join(', ') : 'nenhum'}`);
+   return summary;
  }

  function createNewPackage(args) {
    if (!validateName(args.name)) {
-     throw new Error('Erro: informe um nome válido com --name (ex: hello-package ou @i-santos/swarm).');
+     throw new Error('Provide a valid package name with --name (for example: hello-package or @i-santos/swarm).');
    }

    const packageRoot = path.resolve(__dirname, '..');
    const templateDir = path.join(packageRoot, 'template');

    if (!fs.existsSync(templateDir)) {
-     throw new Error(`Erro: template não encontrado em ${templateDir}`);
+     throw new Error(`Template not found in ${templateDir}`);
    }

    const outputDir = path.resolve(args.out);
    const targetDir = path.join(outputDir, packageDirFromName(args.name));

    if (fs.existsSync(targetDir)) {
-     throw new Error(`Erro: diretório já existe: ${targetDir}`);
+     throw new Error(`Directory already exists: ${targetDir}`);
    }

-   copyDirRecursive(templateDir, targetDir);
+   const summary = createSummary();

-   renderTemplateFile(path.join(targetDir, 'package.json'), {
-     packageName: args.name
+   const createdFiles = copyDirRecursive(templateDir, targetDir, {
+     PACKAGE_NAME: args.name,
+     DEFAULT_BRANCH: args.defaultBranch,
+     SCOPE: deriveScope('', args.name)
    });

-   renderTemplateFile(path.join(targetDir, 'README.md'), {
-     packageName: args.name
-   });
+   summary.createdFiles.push(...createdFiles);
+
+   summary.updatedScriptKeys.push('check', 'changeset', 'version-packages', 'release');
+   summary.updatedDependencyKeys.push(CHANGESETS_DEP);

-   console.log(`Pacote criado em ${targetDir}`);
+   printSummary(`Package created in ${targetDir}`, summary);
  }

  function initExistingPackage(args) {
@@ -307,10 +523,187 @@ function initExistingPackage(args) {
    const templateDir = path.join(packageRoot, 'template');
    const targetDir = path.resolve(args.dir);

-   configureExistingPackage(targetDir, templateDir, args.force);
+   const summary = configureExistingPackage(targetDir, templateDir, args);
+   printSummary(`Project initialized in ${targetDir}`, summary);
+ }
+
+ function execCommand(command, args, options = {}) {
+   return spawnSync(command, args, {
+     encoding: 'utf8',
+     ...options
+   });
+ }
+
+ function parseRepoFromRemote(remoteUrl) {
+   const trimmed = remoteUrl.trim();
+   const httpsMatch = trimmed.match(/github\.com[/:]([^/]+\/[^/.]+)(?:\.git)?$/);
+
+   if (httpsMatch) {
+     return httpsMatch[1];
+   }
+
+   return '';
+ }
+
+ function resolveRepo(args, deps) {
+   if (args.repo) {
+     return args.repo;
+   }
+
+   const remote = deps.exec('git', ['config', '--get', 'remote.origin.url']);
+   if (remote.status !== 0 || !remote.stdout.trim()) {
+     throw new Error('Could not infer repository. Use --repo <owner/repo>.');
+   }
+
+   const repo = parseRepoFromRemote(remote.stdout);
+   if (!repo) {
+     throw new Error('Could not parse GitHub repository from remote.origin.url. Use --repo <owner/repo>.');
+   }
+
+   return repo;
+ }
+
+ function createBaseRulesetPayload(defaultBranch) {
+   return {
+     name: DEFAULT_RULESET_NAME,
+     target: 'branch',
+     enforcement: 'active',
+     conditions: {
+       ref_name: {
+         include: [`refs/heads/${defaultBranch}`],
+         exclude: []
+       }
+     },
+     bypass_actors: [],
+     rules: [
+       { type: 'deletion' },
+       { type: 'non_fast_forward' },
+       {
+         type: 'pull_request',
+         parameters: {
+           required_approving_review_count: 1,
+           dismiss_stale_reviews_on_push: true,
+           require_code_owner_review: false,
+           require_last_push_approval: false,
+           required_review_thread_resolution: true
+         }
+       }
+     ]
+   };
+ }
+
+ function createRulesetPayload(args) {
+   if (!args.ruleset) {
+     return createBaseRulesetPayload(args.defaultBranch);
+   }
+
+   const rulesetPath = path.resolve(args.ruleset);
+   if (!fs.existsSync(rulesetPath)) {
+     throw new Error(`Ruleset file not found: ${rulesetPath}`);
+   }
+
+   return readJsonFile(rulesetPath);
+ }
+
+ function ghApi(deps, method, endpoint, payload) {
+   const args = ['api', '--method', method, endpoint];
+
+   if (payload !== undefined) {
+     args.push('--input', '-');
+   }
+
+   return deps.exec('gh', args, {
+     input: payload !== undefined ? `${JSON.stringify(payload)}\n` : undefined
+   });
+ }
+
+ function ensureGhAvailable(deps) {
+   const version = deps.exec('gh', ['--version']);
+   if (version.status !== 0) {
+     throw new Error('GitHub CLI (gh) is required. Install it from https://cli.github.com/ and rerun.');
+   }
+
+   const auth = deps.exec('gh', ['auth', 'status']);
+   if (auth.status !== 0) {
+     throw new Error('GitHub CLI is not authenticated. Run "gh auth login" and rerun.');
+   }
  }

- async function run(argv) {
+ function parseJsonOutput(output, fallbackError) {
+   try {
+     return JSON.parse(output);
+   } catch (error) {
+     throw new Error(fallbackError);
+   }
+ }
+
+ function upsertRuleset(deps, repo, rulesetPayload) {
+   const listResult = ghApi(deps, 'GET', `/repos/${repo}/rulesets`);
+   if (listResult.status !== 0) {
+     throw new Error(`Failed to list rulesets: ${listResult.stderr || listResult.stdout}`.trim());
+   }
+
+   const rulesets = parseJsonOutput(listResult.stdout || '[]', 'Failed to parse rulesets response from GitHub API.');
+   const existing = rulesets.find((ruleset) => ruleset.name === rulesetPayload.name);
+
+   if (!existing) {
+     const createResult = ghApi(deps, 'POST', `/repos/${repo}/rulesets`, rulesetPayload);
+     if (createResult.status !== 0) {
+       throw new Error(`Failed to create ruleset: ${createResult.stderr || createResult.stdout}`.trim());
+     }
+
+     return 'created';
+   }
+
+   const updateResult = ghApi(deps, 'PUT', `/repos/${repo}/rulesets/${existing.id}`, rulesetPayload);
+   if (updateResult.status !== 0) {
+     throw new Error(`Failed to update ruleset: ${updateResult.stderr || updateResult.stdout}`.trim());
+   }
+
+   return 'updated';
+ }
+
+ function setupGithub(args, dependencies = {}) {
+   const deps = {
+     exec: dependencies.exec || execCommand
+   };
+
+   ensureGhAvailable(deps);
+
+   const repo = resolveRepo(args, deps);
+   const rulesetPayload = createRulesetPayload(args);
+   const summary = createSummary();
+
+   summary.updatedScriptKeys.push('repository.default_branch', 'repository.delete_branch_on_merge', 'repository.allow_auto_merge', 'repository.merge_policy');
+
+   if (args.dryRun) {
+     summary.warnings.push(`dry-run: would update repository settings for ${repo}`);
+     summary.warnings.push(`dry-run: would upsert ruleset "${rulesetPayload.name}" for refs/heads/${args.defaultBranch}`);
+     printSummary(`GitHub settings dry-run for ${repo}`, summary);
+     return;
+   }
+
+   const repoPayload = {
+     default_branch: args.defaultBranch,
+     delete_branch_on_merge: true,
+     allow_auto_merge: true,
+     allow_squash_merge: true,
+     allow_merge_commit: false,
+     allow_rebase_merge: false
+   };
+
+   const patchRepo = ghApi(deps, 'PATCH', `/repos/${repo}`, repoPayload);
+   if (patchRepo.status !== 0) {
+     throw new Error(`Failed to update repository settings: ${patchRepo.stderr || patchRepo.stdout}`.trim());
+   }
+
+   const upsertResult = upsertRuleset(deps, repo, rulesetPayload);
+   summary.overwrittenFiles.push(`github-ruleset:${upsertResult}`);
+
+   printSummary(`GitHub settings applied to ${repo}`, summary);
+ }
+
+ async function run(argv, dependencies = {}) {
    const parsed = parseArgs(argv);

    if (parsed.args.help) {
@@ -323,7 +716,17 @@ async function run(argv) {
323
716
  return;
324
717
  }
325
718
 
719
+ if (parsed.mode === 'setup-github') {
720
+ setupGithub(parsed.args, dependencies);
721
+ return;
722
+ }
723
+
326
724
  createNewPackage(parsed.args);
327
725
  }
328
726
 
329
- module.exports = { run };
727
+ module.exports = {
728
+ run,
729
+ parseRepoFromRemote,
730
+ createBaseRulesetPayload,
731
+ setupGithub
732
+ };
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@i-santos/create-package-starter",
-   "version": "1.1.0",
+   "version": "1.2.0",
    "description": "Scaffold new npm packages with a standardized Changesets release workflow",
    "license": "MIT",
    "author": "Igor Santos",
@@ -1,4 +1,4 @@
  # Changesets

- - Crie um changeset em cada PR com mudança de release: `npm run changeset`.
- - O workflow de release cria/atualiza o PR de versão automaticamente.
+ - Add a changeset in each PR with release-impacting changes: `npm run changeset`.
+ - The release workflow opens/updates the versioning PR automatically.
@@ -5,7 +5,7 @@
    "fixed": [],
    "linked": [],
    "access": "public",
-   "baseBranch": "main",
+   "baseBranch": "__DEFAULT_BRANCH__",
    "updateInternalDependencies": "patch",
    "ignore": []
  }
@@ -0,0 +1,2 @@
+ # Keep CODEOWNERS simple by default and customize as needed.
+ * @__SCOPE__
@@ -0,0 +1,15 @@
+ ## Summary
+
+ -
+
+ ## Validation
+
+ - [ ] `npm run check`
+ - [ ] `npm run changeset` (if release-impacting)
+
+ ## Release Notes
+
+ - [ ] No release impact
+ - [ ] Patch
+ - [ ] Minor
+ - [ ] Major
@@ -0,0 +1,29 @@
+ name: CI
+
+ on:
+   pull_request:
+   push:
+     branches:
+       - __DEFAULT_BRANCH__
+
+ jobs:
+   check:
+     runs-on: ubuntu-latest
+     strategy:
+       matrix:
+         node-version: [18, 20]
+     steps:
+       - name: Checkout
+         uses: actions/checkout@v4
+
+       - name: Setup Node.js
+         uses: actions/setup-node@v4
+         with:
+           node-version: ${{ matrix.node-version }}
+           cache: npm
+
+       - name: Install
+         run: npm ci
+
+       - name: Check
+         run: npm run check
@@ -3,7 +3,7 @@ name: Release
  on:
    push:
      branches:
-       - main
+       - __DEFAULT_BRANCH__

  permissions:
    contents: write
@@ -26,8 +26,6 @@ jobs:
            cache: npm
            registry-url: https://registry.npmjs.org

-       - run: npm install -g npm@latest
-
        - name: Install
          run: npm ci

@@ -42,5 +40,4 @@ jobs:
            title: "chore: release packages"
            commit: "chore: release packages"
          env:
-           NODE_AUTH_TOKEN: ""
            GITHUB_TOKEN: ${{ secrets.CHANGESETS_GH_TOKEN || secrets.GITHUB_TOKEN }}
@@ -0,0 +1,28 @@
+ # Contributing
+
+ ## Local setup
+
+ 1. Install dependencies: `npm ci`
+ 2. Run checks: `npm run check`
+
+ ## Release process
+
+ 1. Add a changeset in each release-impacting PR: `npm run changeset`.
+ 2. Merge PRs into `__DEFAULT_BRANCH__`.
+ 3. `.github/workflows/release.yml` opens/updates the `chore: release packages` PR.
+ 4. Merge the release PR to publish.
+
+ ## Trusted Publishing
+
+ If the package does not exist on npm yet, the first publish can be manual:
+
+ ```bash
+ npm publish --access public
+ ```
+
+ After the first publish, configure the npm Trusted Publisher with:
+
+ - owner
+ - repository
+ - workflow file (`.github/workflows/release.yml`)
+ - branch (`__DEFAULT_BRANCH__`)
@@ -1,17 +1,32 @@
  # __PACKAGE_NAME__

- Pacote criado pelo `@i-santos/create-package-starter`.
+ Package created by `@i-santos/create-package-starter`.

- ## Comandos
+ ## Scripts

  - `npm run check`
  - `npm run changeset`
  - `npm run version-packages`
  - `npm run release`

- ## Fluxo de release
+ ## Release flow

- 1. Crie um changeset na PR: `npm run changeset`.
- 2. Faça merge na `main`.
- 3. O workflow `.github/workflows/release.yml` cria/atualiza a PR de release.
- 4. Ao merge da PR de release, o publish é executado no npm.
+ 1. Add a changeset in your PR: `npm run changeset`.
+ 2. Merge into `__DEFAULT_BRANCH__`.
+ 3. `.github/workflows/release.yml` creates or updates the `chore: release packages` PR.
+ 4. Merge the release PR to publish.
+
+ ## Trusted Publishing
+
+ If this package does not exist on npm yet, the first publish can be manual:
+
+ ```bash
+ npm publish --access public
+ ```
+
+ After the first publish, configure the npm Trusted Publisher:
+
+ - owner
+ - repository
+ - workflow file (`.github/workflows/release.yml`)
+ - branch (`__DEFAULT_BRANCH__`)