zx-bulk-release 3.1.7 → 3.1.9

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,3 +1,13 @@
+ ## [3.1.9](https://github.com/semrel-extra/zx-bulk-release/compare/v3.1.8...v3.1.9) (2026-04-13)
+
+ ### Fixes & improvements
+ * perf: replace cosmiconfig with own config reader ([f980590](https://github.com/semrel-extra/zx-bulk-release/commit/f980590edbf9aa9311110207e24849c6373c0feb))
+
+ ## [3.1.8](https://github.com/semrel-extra/zx-bulk-release/compare/v3.1.7...v3.1.8) (2026-04-13)
+
+ ### Fixes & improvements
+ * perf: minor internal code imprs ([f3f0aae](https://github.com/semrel-extra/zx-bulk-release/commit/f3f0aae2e7dc4045a6573a8e270acdf6a3d983d8))
+
  ## [3.1.7](https://github.com/semrel-extra/zx-bulk-release/compare/v3.1.6...v3.1.7) (2026-04-13)

  ### Fixes & improvements
package/README.md CHANGED
@@ -46,7 +46,7 @@ GH_TOKEN=ghtoken GH_USER=username NPM_TOKEN=npmtoken npx zx-bulk-release [opts]
  |------------------------------|---------------------------------------------------------------------------------------------------|------------------|
  | `--receive` | Consume rebuild signal, analyze, preflight. Writes `zbr-context.json`. Run before deps install. | |
  | `--pack [dir]` | Pack only: build, test, and write delivery tars to `dir`. No credentials needed. | `parcels` |
- | `--verify [dir]` | Verify untrusted parcels against context, copy valid ones to `parcels/`. Use `--context` for path. | `parcels` |
+ | `--verify [in:out]` | Verify untrusted parcels against context, copy valid ones to output dir. Use `--context` for path. | `parcels:parcels` |
  | `--context <path>` | Path to trusted `zbr-context.json` (used with `--verify`). | `zbr-context.json` |
  | `--deliver [dir]` | Deliver only: read tars from `dir` and run delivery channels. No source code needed. | `parcels` |
  | `--ignore` | Packages to ignore: `a, b` | |
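The new `in:out` form of `--verify` can be read as one argument split on a colon, with both sides falling back to the documented `parcels` default. A hypothetical sketch of such a parser (not the package's actual CLI code; it assumes the paths themselves contain no `:`):

```javascript
// Hypothetical parser for the `--verify [in:out]` argument form.
// Both sides default to 'parcels' when omitted, matching the
// documented `parcels:parcels` default.
const parseVerifyArg = (arg = '') => {
  const [input = '', output = ''] = arg.split(':', 2)
  return {
    input: input || 'parcels',
    output: output || 'parcels',
  }
}
```

For example, `parseVerifyArg('unverified/:verified/')` yields `{input: 'unverified/', output: 'verified/'}`, while a bare `--verify` keeps both directories at `parcels`.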
@@ -181,7 +181,7 @@ jobs:
  path: parcels-unverified/

  # verify — validate untrusted parcels against trusted context
- - run: npx zx-bulk-release --verify parcels-unverified/
+ - run: npx zx-bulk-release --verify parcels-unverified/:parcels/

  # deliver — only verified parcels
  - run: npx zx-bulk-release --deliver
@@ -212,8 +212,8 @@ await run({
  ```

  ## Config
- ### cosmiconfig
- Any [cosmiconfig](https://github.com/davidtheclark/cosmiconfig) compliant format: `.releaserc`, `.release.json`, `.release.yaml`, etc in the package root or in the repo root dir.
+ ### Config files
+ [cosmiconfig](https://github.com/davidtheclark/cosmiconfig)-compatible lookup: `.releaserc`, `.release.json`, `.release.yaml`, `.releaserc.js`, `release.config.js`, or a `release` key in `package.json`. Searched from the package root up to the repo root.
  ```json
  {
    "buildCmd": "yarn && yarn build",
@@ -381,7 +381,7 @@ Each step has a uniform signature `(pkg, ctx)`:
  - **`preflight`** — checks tag availability on remote, re-resolves version on conflict, skips duplicates.
  - **`build`** — runs `buildCmd` (with dep traversal and optional npm artifact fetch).
  - **`test`** — runs `testCmd`.
- - **`pack`** — stages delivery artifacts into self-describing tar containers (`npm pack`, docs copy, assets, release notes). Each tar is named `parcel.{sha7}.{channel}.{tag}.{hash6}.tar` and contains a `manifest.json` with channel name, delivery instructions, and template credentials (`${{ENV_VAR}}`). A directive meta-parcel is also generated, listing all parcels and their delivery steps. After this step, everything the courier needs is outside the project dir.
+ - **`pack`** — stages delivery artifacts into self-describing tar containers (`npm pack`, docs copy, assets, release notes). Each tar is named `parcel.{sha7}.{channel}.{name}.{version}.{hash6}.tar` and contains a `manifest.json` with channel name, delivery instructions, and template credentials (`${{ENV_VAR}}`). A directive meta-parcel is also generated, listing all parcels and their delivery steps. After this step, everything the courier needs is outside the project dir.
  - **`publish`** — hands tars to courier's `deliver()`, runs `cmd` channel separately. Tag push is handled by the `git-tag` channel.
  - **`clean`** — restores `package.json` files and unsets git user config.

@@ -397,13 +397,13 @@ Set `config.releaseRules` to override the default rules preset:
  ### Tar containers
  Each delivery artifact is a self-describing tar archive:
  ```
- parcel.{sha7}.{channel}.{tag}.{hash6}.tar
+ parcel.{sha7}.{channel}.{name}.{version}.{hash6}.tar
  manifest.json — channel name, delivery params, template credentials
  package.tgz — (npm channel) npm tarball
  assets/ — (gh-release channel) release assets
  docs/ — (gh-pages channel) documentation files
  ```
- The `sha7` prefix groups all parcels of one commit. The `hash6` suffix is a content hash for deduplication — two builds of the same commit producing identical content yield the same filename (last-writer-wins), while different content gets a different hash.
+ The `sha7` prefix groups all parcels of one commit. `name` is the sanitized package name (`@scope/pkg` → `scope-pkg`). The `hash6` suffix is a content hash for deduplication — two builds of the same commit producing identical content yield the same filename (last-writer-wins), while different content gets a different hash.

  A **directive** meta-parcel (`parcel.{sha7}.directive.{ts}.tar`) is generated alongside regular parcels. It contains the complete delivery map: package queue, per-package channel steps, and an authoritative list of parcel filenames. The directive enables coordinated delivery — see [DELIVER_SPEC.md](./DELIVER_SPEC.md).

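The renamed filename fields can be sketched in a few lines; the following helper is hypothetical (it mirrors the README's description of the scheme, not the package's actual code), with `hash6` modeled as the first 6 hex characters of a sha1 over the parcel bytes:

```javascript
import crypto from 'node:crypto'

// Hypothetical sketch of the parcel filename scheme described above.
// `@scope/pkg` → `scope-pkg`; hash6 is the first 6 hex chars of a sha1
// over the parcel contents (a buffer stands in for the tar bytes here).
const sanitizeName = (name) => name.replace(/^@/, '').replace(/\//g, '-')

const hash6 = (buf) =>
  crypto.createHash('sha1').update(buf).digest('hex').slice(0, 6)

const parcelName = ({sha7, channel, name, version, content}) =>
  `parcel.${sha7}.${channel}.${sanitizeName(name)}.${version}.${hash6(content)}.tar`
```

Identical content produces an identical filename (hence last-writer-wins on collisions), while any content change flips the `hash6` suffix.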
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
    "name": "zx-bulk-release",
    "alias": "bulk-release",
-   "version": "3.1.7",
+   "version": "3.1.9",
    "description": "zx-based alternative for multi-semantic-release",
    "type": "module",
    "exports": {
@@ -19,25 +19,24 @@
    ],
    "scripts": {
      "test": "npm run test:unit",
-     "test:unit": "uvu ./src/test -i fixtures -i utils -i integration",
+     "test:unit": "vitest run --dir src/test/js --exclude '**/integration*' --exclude '**/fixtures/**' --exclude '**/utils/**'",
      "test:it": "node ./src/test/js/integration.test.js",
-     "test:cov": "c8 sh -c 'npx uvu ./src/test -i fixtures -i utils -i integration && node ./src/test/js/integration.test.js' && c8 report -r lcov",
+     "test:cov": "vitest run --coverage --dir src/test/js --exclude '**/fixtures/**' --exclude '**/utils/**'",
      "docs": "mkdir -p docs && cp ./README.md ./docs/README.md",
      "publish:beta": "npm publish --tag beta --no-git-tag-version",
      "build": "esbuild src/main/js/index.js --platform=node --outdir=target --bundle --format=esm --external:typescript"
    },
    "dependencies": {
      "@semrel-extra/topo": "^1.14.1",
-     "cosmiconfig": "^9.0.1",
      "queuefy": "^1.2.1",
      "tar-stream": "^3.1.8",
      "zx-extra": "4.0.20"
    },
    "devDependencies": {
-     "c8": "^11.0.0",
+     "@vitest/coverage-v8": "^4.1.4",
      "esbuild": "^0.28.0",
-     "uvu": "^0.5.6",
-     "verdaccio": "6.5.0"
+     "verdaccio": "6.5.0",
+     "vitest": "^4.1.4"
    },
    "publishConfig": {
      "access": "public"
@@ -47,10 +46,5 @@
      "url": "git+https://github.com/semrel-extra/zx-bulk-release.git"
    },
    "author": "Anton Golub <antongolub@antongolub.com>",
-   "license": "MIT",
-   "c8": {
-     "exclude": [
-       "src/test/**"
-     ]
-   }
+   "license": "MIT"
  }
@@ -1,4 +1,4 @@
- import { cosmiconfig } from 'cosmiconfig'
+ import { fs, path, YAML } from 'zx'
  import { asArray, camelize, memoizeBy } from './util.js'
  import { GH_URL, resolveGhApiUrl } from './post/api/gh.js'
  import { DEFAULT_GIT_COMMITTER_NAME, DEFAULT_GIT_COMMITTER_EMAIL } from './post/api/git.js'
@@ -29,12 +29,41 @@ export const defaultConfig = {
  export const getPkgConfig = async (cwd, env) =>
    normalizePkgConfig((await Promise.all(asArray(cwd).map(readPkgConfig))).find(Boolean) || defaultConfig, env)

- export const readPkgConfig = memoizeBy(async (cwd) => cosmiconfig(CONFIG_NAME, {
-   searchPlaces: CONFIG_FILES,
-   searchStrategy: 'global', // https://github.com/cosmiconfig/cosmiconfig/releases/tag/v9.0.0
- })
-   .search(cwd)
-   .then(r => r?.config))
+ export const cosmiconfig = (name, {searchPlaces}) => {
+   const load = async (filePath) => {
+     if (filePath.endsWith('.js') || filePath.endsWith('.cjs'))
+       return (await import(filePath)).default
+     const raw = await fs.readFile(filePath, 'utf8')
+     if (filePath.endsWith('.yaml') || filePath.endsWith('.yml'))
+       return YAML.parse(raw)
+     try { return JSON.parse(raw) } catch { return YAML.parse(raw) }
+   }
+
+   const search = async (cwd) => {
+     let dir = path.resolve(cwd)
+     while (true) {
+       for (const file of searchPlaces) {
+         const filePath = path.resolve(dir, file)
+         if (!await fs.pathExists(filePath)) continue
+         if (file === 'package.json') {
+           const pkg = await fs.readJson(filePath)
+           if (pkg[name]) return pkg[name]
+           continue
+         }
+         return load(filePath)
+       }
+       const parent = path.dirname(dir)
+       if (parent === dir) return
+       dir = parent
+     }
+   }
+
+   return {search}
+ }
+
+ export const readPkgConfig = memoizeBy(async (cwd) =>
+   cosmiconfig(CONFIG_NAME, {searchPlaces: CONFIG_FILES}).search(cwd)
+ )

  export const normalizePkgConfig = (config, env) => {
    const envConfig = parseEnv(env)
@@ -1,10 +1,10 @@
- export const DEFAULT_GIT_COMMITTER_NAME = 'Semrel Extra Bot'
- export const DEFAULT_GIT_COMMITTER_EMAIL = 'semrel-extra-bot@hotmail.com'
-
  import {$, fs, path, tempy, copy} from 'zx-extra'
  import {log} from '../log.js'
  import {attempt2, attempt3, memoizeBy} from '../../util.js'

+ export const DEFAULT_GIT_COMMITTER_NAME = 'Semrel Extra Bot'
+ export const DEFAULT_GIT_COMMITTER_EMAIL = 'semrel-extra-bot@hotmail.com'
+
  export const fetchRepo = memoizeBy(async ({cwd: _cwd, branch, origin: _origin, basicAuth}) => {
    if (!_origin && !_cwd) throw new Error('fetchRepo requires either origin or cwd')
    const origin = _origin || (await getRepo(_cwd, {basicAuth})).repoAuthedUrl
@@ -3,8 +3,9 @@ import _fs from 'node:fs/promises'
  import _path from 'node:path'
  import tar from 'tar-stream'
  import {Readable} from 'node:stream'
- import {log} from '../log.js'
  import {$, semver, fs, INI, fetch, tempy} from 'zx-extra'
+
+ import {log} from '../log.js'
  import {attempt2, memoizeBy} from '../../util.js'

  const FETCH_TIMEOUT_MS = 15_000
@@ -1,8 +1,7 @@
  import tar from 'tar-stream'
  import crypto from 'node:crypto'
- import {fs, path} from 'zx-extra'
  import {pipeline} from 'node:stream/promises'
- import {createWriteStream, createReadStream} from 'node:fs'
+ import {fs, path} from 'zx-extra'

  export const packTar = async (tarPath, manifest, files = []) => {
    await fs.ensureDir(path.dirname(tarPath))
@@ -20,13 +19,13 @@ export const packTar = async (tarPath, manifest, files = []) => {
    }

    pack.finalize()
-   await pipeline(pack, createWriteStream(tarPath))
+   await pipeline(pack, fs.createWriteStream(tarPath))
    return tarPath
  }

  export const hashFile = async (filePath) => {
    const hash = crypto.createHash('sha1')
-   await pipeline(createReadStream(filePath), hash)
+   await pipeline(fs.createReadStream(filePath), hash)
    return hash.digest('hex').slice(0, 6)
  }

@@ -68,7 +67,7 @@ export const unpackTar = async (tarPath, destDir) => {
    extract.on('error', reject)
  })

- await pipeline(createReadStream(tarPath), extract)
+ await pipeline(fs.createReadStream(tarPath), extract)
  await done
  return {manifest, dir: destDir}
  }
@@ -19,7 +19,7 @@ function proc(stdout = '', code = 0) {
    return p
  }

- export function createMock(responses = []) {
+ export function createSpawnMock(responses = []) {
    const calls = []
    const spawn = (cmd, args) => {
      const command = args?.[args.length - 1] || cmd