@mr-aftab-ahmad-khan/drift-check 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +10 -0
- package/LICENSE +15 -0
- package/README.md +293 -0
- package/dist/chunk-Q6ZBG4TH.js +274 -0
- package/dist/chunk-Q6ZBG4TH.js.map +1 -0
- package/dist/cli.cjs +360 -0
- package/dist/cli.cjs.map +1 -0
- package/dist/cli.d.cts +1 -0
- package/dist/cli.d.ts +1 -0
- package/dist/cli.js +87 -0
- package/dist/cli.js.map +1 -0
- package/dist/index.cjs +323 -0
- package/dist/index.cjs.map +1 -0
- package/dist/index.d.cts +77 -0
- package/dist/index.d.ts +77 -0
- package/dist/index.js +33 -0
- package/dist/index.js.map +1 -0
- package/package.json +71 -0
package/CHANGELOG.md
ADDED
@@ -0,0 +1,10 @@
# Changelog

## [0.1.0] — 2026-05-15

### Added

- `scan`, `fix`, `report`, and `ignore` commands.
- Auto-detects pnpm, yarn, and npm workspace layouts.
- Severity classification (patch / minor / major) with `--dry-run` plan output.
- `.driftignore` support and JSON output for CI consumption.
package/LICENSE
ADDED
@@ -0,0 +1,15 @@
MIT License

Copyright (c) 2026 drift-check contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND.
package/README.md
ADDED
@@ -0,0 +1,293 @@
# drift-check

[](https://www.npmjs.com/package/@mr-aftab-ahmad-khan/drift-check)
[](./LICENSE)
[](https://www.typescriptlang.org/)

**In a 10-package monorepo, React will be at 3 different versions within 6 months.** `drift-check` catches version drift across **pnpm**, **npm**, and **yarn** workspaces in CI before it becomes a Saturday-night debugging session. It auto-detects the workspace format, classifies drift as patch/minor/major, fixes it in place, and integrates with GitHub Actions, GitLab CI, Turborepo, and husky.

---

## Installation

```bash
npm install -g @mr-aftab-ahmad-khan/drift-check
pnpm add -g @mr-aftab-ahmad-khan/drift-check
yarn global add @mr-aftab-ahmad-khan/drift-check

# Or one-off:
npx @mr-aftab-ahmad-khan/drift-check
```

---

## Quick Start

```bash
$ npx @mr-aftab-ahmad-khan/drift-check
drift-check — 2 drifted package(s) across 3 workspaces

  MINOR react
    - ^18.2.0 (apps/web / dependencies)
    - ^18.3.1 (packages/ui / dependencies)
    - ^18.3.1 (packages/forms / dependencies)

  MAJOR zod
    - ^3.22.4 (apps/web / dependencies)
    - ^4.0.0 (packages/ui / dependencies)
```

---

## Core Usage Examples

### 1. Scan and read the report

```bash
drift-check
```

### 2. Auto-fix to the highest currently used

```bash
drift-check fix
# Then run your package manager's install:
# pnpm install / npm install / yarn install
```

### 3. Dry-run

```bash
drift-check fix --dry-run
```

### 4. Ignore an intentional mismatch

```bash
drift-check ignore @types/node
```

### 5. Fix one package across the monorepo

```bash
drift-check fix --pkg react
```

### 6. JSON output

```bash
drift-check report --format json | jq '.drifted[] | select(.severity == "major")'
```

---

## CI/CD Integration Examples

### GitHub Actions

```yaml
name: drift
on: [pull_request]
jobs:
  drift:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: 20 }
      - run: npx @mr-aftab-ahmad-khan/drift-check --fail-on minor
```

### GitLab CI

```yaml
drift:
  stage: test
  image: node:20
  script:
    - npx @mr-aftab-ahmad-khan/drift-check --fail-on minor
```

### Turborepo

```json
// turbo.json
{
  "pipeline": {
    "drift": { "outputs": [] }
  }
}
```

```json
// package.json
{ "scripts": { "drift": "drift-check --fail-on minor" } }
```

### Pre-push hook (husky)

```bash
# .husky/pre-push
#!/bin/sh
npx @mr-aftab-ahmad-khan/drift-check --fail-on major
```

---

## Configuration Reference

`.driftrc.json`:

```json
{
  "failOn": "minor",
  "ignore": ["@types/node"],
  "workspaceGlobs": ["packages/*", "apps/*", "tools/*"]
}
```

| Option           | Type       | Default                   | Description                           |
| ---------------- | ---------- | ------------------------- | ------------------------------------- |
| `failOn`         | `Severity` | `undefined`               | Exit 1 when any drift ≥ this severity |
| `ignore`         | `string[]` | `[]`                      | Package names to skip                 |
| `workspaceGlobs` | `string[]` | `["packages/*","apps/*"]` | Extra glob patterns to scan           |
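The `failOn` threshold behaves like a rank comparison over the drifted severities. A minimal sketch of the documented exit-code rule, using hypothetical names (`RANK`, `shouldFail`) that are not part of the package's API:

```typescript
// Hypothetical sketch of the documented failOn / --fail-on rule:
// exit non-zero when any drifted package is at or above the threshold.
const RANK = { patch: 0, minor: 1, major: 2 } as const;
type Sev = keyof typeof RANK;

function shouldFail(driftedSeverities: Sev[], failOn?: Sev): boolean {
  if (!failOn) return false; // no threshold configured: always exit 0
  return driftedSeverities.some((s) => RANK[s] >= RANK[failOn]);
}

console.log(shouldFail(["patch", "minor"], "minor")); // true  -> exit 1
console.log(shouldFail(["patch"], "minor"));          // false -> exit 0
```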

---

## `.driftignore` Format

Plain text, one package name per line, with `#` comments:

```
# Allow intentional mismatch — vue 3 in legacy app
vue

# Type packages drift independently of runtime versions
@types/node
```
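The parsing rule above is simple enough to sketch. This is an illustrative stand-alone function mirroring the documented format (one name per line, blank lines and `#` comments ignored), not the package's exported API:

```typescript
// Hypothetical sketch: parse a .driftignore-style text into package names.
function parseDriftIgnore(text: string): string[] {
  return text
    .split(/\r?\n/)                 // tolerate CRLF and LF line endings
    .map((line) => line.trim())
    .filter((line) => line.length > 0 && !line.startsWith("#"));
}

const sample = [
  "# Allow intentional mismatch",
  "vue",
  "",
  "@types/node",
].join("\n");

console.log(parseDriftIgnore(sample)); // -> ["vue", "@types/node"]
```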

---

## TypeScript Types

```ts
import { scan, planFix, applyFix, type DriftReport, type Severity } from "@mr-aftab-ahmad-khan/drift-check";

const report: DriftReport = scan({ cwd: process.cwd() });
for (const d of report.drifted) {
  console.log(d.name, d.severity, d.versions.length, "versions");
}
```
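The patch/minor/major classification the report assigns can be pictured as a comparison of the first semver triple found in each range. A self-contained sketch of that idea (hypothetical `classify` helper, not the package's exported API):

```typescript
// Hypothetical sketch of patch/minor/major classification between two ranges:
// extract the first x.y.z in each range and report which component differs.
type Severity = "none" | "patch" | "minor" | "major";

function classify(a: string, b: string): Severity {
  const re = /(\d+)\.(\d+)\.(\d+)/;
  const ma = re.exec(a);
  const mb = re.exec(b);
  if (!ma || !mb) return a === b ? "none" : "patch"; // unparseable: fall back to string equality
  if (ma[1] !== mb[1]) return "major";
  if (ma[2] !== mb[2]) return "minor";
  if (ma[3] !== mb[3]) return "patch";
  return a === b ? "none" : "patch"; // same triple, different range sigil (e.g. ~ vs ^)
}

console.log(classify("^18.2.0", "^18.3.1")); // "minor"
console.log(classify("^3.22.4", "^4.0.0"));  // "major"
```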

---

## CLI Reference

```
drift-check scan [options]        (default command)
  --format <fmt>         pretty | json (default: pretty)
  --fail-on <severity>   patch | minor | major

drift-check fix [options]
  --target <target>      version string or 'latest' (default: 'latest')
  --pkg <name>           limit to a single package
  --dry-run              preview only

drift-check ignore <name>
  Append a package name to .driftignore.

drift-check report [options]
  Alias for `scan --format json`.
```

Sample output:

```bash
$ drift-check fix --dry-run
react: ^18.2.0 → ^18.3.1  /repo/apps/web/package.json
drift-check: dry-run, no files modified
```

---

## Real-World Recipe — Turborepo Monorepo with 15 Packages

```bash
# Initial scan
$ npx @mr-aftab-ahmad-khan/drift-check
drift-check — 3 drifted package(s) across 15 workspaces
  MINOR react (^18.2.0 vs ^18.3.1 vs ^18.3.1)
  PATCH typescript (5.4.0 vs 5.4.2)
  MINOR zod (^3.22.4 vs ^3.23.0)
```

```json
// .driftrc.json
{
  "failOn": "minor",
  "ignore": ["@internal/legacy"]
}
```

```yaml
# .github/workflows/drift.yml
name: drift
on: [pull_request]
jobs:
  drift:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: 20 }
      - run: npx @mr-aftab-ahmad-khan/drift-check --fail-on minor
```

```bash
# Developer locally
$ npx @mr-aftab-ahmad-khan/drift-check fix
react: ^18.2.0 → ^18.3.1      apps/web/package.json
typescript: 5.4.0 → 5.4.2     packages/ui/package.json
zod: ^3.22.4 → ^3.23.0        apps/web/package.json
drift-check: updated 3 file(s). Run your package manager's install.
$ pnpm install
$ git commit -am "chore: align drifting deps"
```

```bash
# Slack notification on scheduled run
$ drift-check report --format json > drift.json
$ curl -X POST -H 'content-type: application/json' \
    -d "$(jq -c '{text: "Weekly drift: " + (.drifted | length | tostring) + " packages"}' drift.json)" \
    "$SLACK_WEBHOOK_URL"
```

---

## How Drift Happens

1. **Lazy package adds.** Someone runs `npm install foo` in a single package; whatever version npm picks today is locked there.
2. **Bulk updates without a refresh.** `pnpm update --recursive` skips ranges that already satisfy a dependency, so older ranges stay.
3. **Independent dependabot updates.** Dependabot bumps `apps/web` to `react@18.3.1` but never touches `packages/ui`.

`drift-check`'s CI integration prevents (1) by failing PRs that introduce a new mismatch; the `fix` command resolves (2) and (3) in one commit.

---

## Comparison Table

| Feature                  | Manual review | syncpack | **drift-check** |
| ------------------------ | :-----------: | :------: | :-------------: |
| Auto workspace detection |      ❌       |    ✅    |       ✅        |
| Severity levels          |      ❌       |    ❌    |       ✅        |
| Auto-fix                 |      ❌       |    ✅    |       ✅        |
| CI integration           |      ❌       |    ✅    |       ✅        |
| Dry-run                  |      ❌       |    ⚠️    |       ✅        |
| `.driftignore`           |      ❌       |    ⚠️    |       ✅        |
| JSON output              |      ❌       |    ✅    |       ✅        |

---

## License

MIT
package/dist/chunk-Q6ZBG4TH.js
ADDED
@@ -0,0 +1,274 @@
// src/workspace.ts
import { existsSync, readFileSync, readdirSync, statSync } from "fs";
import { dirname, join, resolve } from "path";
import YAML from "yaml";
function readJson(file) {
  try {
    return JSON.parse(readFileSync(file, "utf8"));
  } catch {
    return void 0;
  }
}
function detectWorkspaces(root, extraGlobs) {
  root = resolve(root);
  const candidates = /* @__PURE__ */ new Set();
  const pnpm = join(root, "pnpm-workspace.yaml");
  if (existsSync(pnpm)) {
    try {
      const doc = YAML.parse(readFileSync(pnpm, "utf8"));
      for (const g of doc.packages ?? []) expandGlob(root, g, candidates);
    } catch {
    }
  }
  const rootPkg = readJson(join(root, "package.json"));
  if (rootPkg) {
    const ws = rootPkg.workspaces;
    if (Array.isArray(ws)) {
      for (const g of ws) if (typeof g === "string") expandGlob(root, g, candidates);
    } else if (ws && typeof ws === "object" && Array.isArray(ws.packages)) {
      for (const g of ws.packages) expandGlob(root, g, candidates);
    }
  }
  for (const g of extraGlobs ?? ["packages/*", "apps/*"]) expandGlob(root, g, candidates);
  if (candidates.size === 0) {
    candidates.add(join(root, "package.json"));
  }
  return [...candidates];
}
function expandGlob(root, glob, out) {
  if (!glob.includes("*")) {
    const pj = join(root, glob, "package.json");
    if (existsSync(pj)) out.add(pj);
    return;
  }
  const segments = glob.split("/");
  expand(root, segments, 0, out);
}
function expand(dir, segments, idx, out) {
  if (idx === segments.length) {
    const pj = join(dir, "package.json");
    if (existsSync(pj)) out.add(pj);
    return;
  }
  const seg = segments[idx];
  if (seg === "*") {
    if (!existsSync(dir)) return;
    for (const entry of readdirSync(dir)) {
      const sub = join(dir, entry);
      try {
        if (!statSync(sub).isDirectory()) continue;
      } catch {
        continue;
      }
      expand(sub, segments, idx + 1, out);
    }
  } else if (seg === "**") {
    walk(dir, (d) => expand(d, segments, idx + 1, out));
  } else {
    expand(join(dir, seg), segments, idx + 1, out);
  }
}
function walk(dir, visit) {
  if (!existsSync(dir)) return;
  visit(dir);
  for (const entry of readdirSync(dir)) {
    if (entry === "node_modules" || entry.startsWith(".")) continue;
    const sub = join(dir, entry);
    try {
      if (statSync(sub).isDirectory()) walk(sub, visit);
    } catch {
    }
  }
}
function loadWorkspaces(root, extraGlobs) {
  return detectWorkspaces(root, extraGlobs).map((file) => {
    const pj = readJson(file);
    if (!pj) return void 0;
    return {
      name: pj.name ?? dirname(file),
      version: pj.version ?? "0.0.0",
      packageJsonPath: file,
      dependencies: pj.dependencies ?? {},
      devDependencies: pj.devDependencies ?? {},
      peerDependencies: pj.peerDependencies ?? {},
      optionalDependencies: pj.optionalDependencies ?? {}
    };
  }).filter((p) => Boolean(p));
}
function readDriftIgnore(root) {
  const file = join(root, ".driftignore");
  if (!existsSync(file)) return [];
  return readFileSync(file, "utf8").split(/\r?\n/).map((s) => s.trim()).filter((s) => s && !s.startsWith("#"));
}

// src/semver.ts
var SEMVER_RE = /(\d+)\.(\d+)\.(\d+)(?:-([0-9A-Za-z-.]+))?/;
function extractVersion(range) {
  const m = SEMVER_RE.exec(range);
  if (!m) return void 0;
  return { major: Number(m[1]), minor: Number(m[2]), patch: Number(m[3]) };
}
function severity(a, b) {
  const va = extractVersion(a);
  const vb = extractVersion(b);
  if (!va || !vb) return a === b ? "none" : "patch";
  if (va.major !== vb.major) return "major";
  if (va.minor !== vb.minor) return "minor";
  if (va.patch !== vb.patch) return "patch";
  if (a !== b) return "patch";
  return "none";
}
function highestRange(ranges) {
  let best = ranges[0];
  let bestVer = extractVersion(best);
  for (const r of ranges) {
    const v = extractVersion(r);
    if (!v) continue;
    if (!bestVer || compare(v, bestVer) > 0) {
      best = r;
      bestVer = v;
    }
  }
  return best;
}
function compare(a, b) {
  if (a.major !== b.major) return a.major - b.major;
  if (a.minor !== b.minor) return a.minor - b.minor;
  return a.patch - b.patch;
}
var SEVERITY_RANK = { patch: 0, minor: 1, major: 2 };
function maxSeverity(values) {
  let max = "patch";
  for (const v of values) if (SEVERITY_RANK[v] > SEVERITY_RANK[max]) max = v;
  return max;
}

// src/scan.ts
import { readFileSync as readFileSync2, writeFileSync } from "fs";
var FIELDS = ["dependencies", "devDependencies", "peerDependencies", "optionalDependencies"];
function scan(options = {}) {
  const cwd = options.cwd ?? process.cwd();
  const workspaces = loadWorkspaces(cwd, options.workspaceGlobs);
  const ignore = /* @__PURE__ */ new Set([...options.ignore ?? [], ...readDriftIgnore(cwd)]);
  const byDep = /* @__PURE__ */ new Map();
  for (const ws of workspaces) {
    for (const field of FIELDS) {
      for (const [name, version] of Object.entries(ws[field])) {
        if (ignore.has(name)) continue;
        const list = byDep.get(name) ?? [];
        list.push({ workspace: ws.name, version, field, path: ws.packageJsonPath });
        byDep.set(name, list);
      }
    }
  }
  const drifted = [];
  for (const [name, occurrences] of byDep) {
    const unique = [...new Set(occurrences.map((o) => o.version))];
    if (unique.length <= 1) continue;
    const severities = [];
    for (let i = 0; i < unique.length; i++) {
      for (let j = i + 1; j < unique.length; j++) {
        const s = severity(unique[i], unique[j]);
        if (s !== "none") severities.push(s);
      }
    }
    drifted.push({
      name,
      severity: maxSeverity(severities),
      versions: occurrences
    });
  }
  drifted.sort((a, b) => a.name.localeCompare(b.name));
  return { workspaces, drifted, generatedAt: (/* @__PURE__ */ new Date()).toISOString() };
}
function planFix(report, options = {}) {
  const changes = [];
  for (const d of report.drifted) {
    if (options.only && d.name !== options.only) continue;
    const target = options.target && options.target !== "latest" ? options.target : highestRange(d.versions.map((v) => v.version));
    for (const v of d.versions) {
      if (v.version !== target) {
        changes.push({ path: v.path, field: v.field, name: d.name, from: v.version, to: target });
      }
    }
  }
  return { changes };
}
function applyFix(plan) {
  const grouped = /* @__PURE__ */ new Map();
  for (const c of plan.changes) {
    const arr = grouped.get(c.path) ?? [];
    arr.push(c);
    grouped.set(c.path, arr);
  }
  for (const [file, changes] of grouped) {
    const src = readFileSync2(file, "utf8");
    let next = src;
    for (const c of changes) {
      const re = new RegExp(`("${escape(c.name)}"\\s*:\\s*")${escape(c.from)}(")`);
      next = next.replace(re, (_m, p1, p2) => `${p1}${c.to}${p2}`);
    }
    writeFileSync(file, next, "utf8");
  }
  return { filesChanged: grouped.size };
}
function escape(s) {
  return s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

// src/format.ts
import pc from "picocolors";
var sevColor = {
  patch: pc.gray,
  minor: pc.yellow,
  major: pc.red
};
function formatPretty(report) {
  if (report.drifted.length === 0) {
    return pc.green(`drift-check: no drift across ${report.workspaces.length} workspaces`);
  }
  const lines = [];
  lines.push(pc.bold(`drift-check \u2014 ${report.drifted.length} drifted package(s) across ${report.workspaces.length} workspaces`));
  for (const d of report.drifted) {
    lines.push("");
    lines.push(`  ${sevColor[d.severity](d.severity.toUpperCase())} ${pc.bold(d.name)}`);
    for (const v of d.versions) {
      lines.push(`    - ${v.version} ${pc.gray(`(${v.workspace} / ${v.field})`)}`);
    }
  }
  return lines.join("\n");
}
function formatJson(report) {
  return JSON.stringify(report, null, 2);
}

// src/config.ts
import { existsSync as existsSync2, readFileSync as readFileSync3 } from "fs";
import { join as join2 } from "path";
function loadConfig(cwd) {
  const f = join2(cwd, ".driftrc.json");
  if (!existsSync2(f)) return {};
  try {
    return JSON.parse(readFileSync3(f, "utf8"));
  } catch {
    return {};
  }
}

export {
  detectWorkspaces,
  loadWorkspaces,
  readDriftIgnore,
  extractVersion,
  severity,
  highestRange,
  SEVERITY_RANK,
  maxSeverity,
  scan,
  planFix,
  applyFix,
  formatPretty,
  formatJson,
  loadConfig
};
//# sourceMappingURL=chunk-Q6ZBG4TH.js.map
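`applyFix` above rewrites each range with a targeted regex over the raw `package.json` text rather than reserializing the JSON, which preserves the file's formatting and key order. A self-contained illustration of that technique (hypothetical `rewriteRange` helper, not part of the package's exports):

```typescript
// Escape regex metacharacters so package names and ranges match literally.
function escapeRe(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

// Replace one dependency's range inside raw package.json text.
// Group 1 captures `"name": "` (with flexible whitespace), group 2 the closing quote,
// so only the range between them is swapped out.
function rewriteRange(src: string, name: string, from: string, to: string): string {
  const re = new RegExp(`("${escapeRe(name)}"\\s*:\\s*")${escapeRe(from)}(")`);
  return src.replace(re, (_m, p1: string, p2: string) => `${p1}${to}${p2}`);
}

const before = `{ "dependencies": { "react": "^18.2.0" } }`;
console.log(rewriteRange(before, "react", "^18.2.0", "^18.3.1"));
// { "dependencies": { "react": "^18.3.1" } }
```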
|
package/dist/chunk-Q6ZBG4TH.js.map
ADDED
@@ -0,0 +1 @@
(single-line minified source map for chunk-Q6ZBG4TH.js; the embedded sources and mappings are omitted here, and the blob is truncated in the original listing)
AK,KAAK;AAAA,IAAG,QAAQ;AAAA,IAAa;AAAA,EAChF;AACF;AAEO,SAAS,eAAe,MAAc,YAA2C;AACtF,SAAO,iBAAiB,MAAM,UAAU,EACrC,IAAI,CAAC,SAAS;AACb,UAAM,KAAK,SAAS,IAAI;AACxB,QAAI,CAAC,GAAI,QAAO;AAChB,WAAO;AAAA,MACL,MAAO,GAAG,QAAmB,QAAQ,IAAI;AAAA,MACzC,SAAU,GAAG,WAAsB;AAAA,MACnC,iBAAiB;AAAA,MACjB,cAAe,GAAG,gBAA2C,CAAC;AAAA,MAC9D,iBAAkB,GAAG,mBAA8C,CAAC;AAAA,MACpE,kBAAmB,GAAG,oBAA+C,CAAC;AAAA,MACtE,sBAAuB,GAAG,wBAAmD,CAAC;AAAA,IAChF;AAAA,EACF,CAAC,EACA,OAAO,CAAC,MAA6B,QAAQ,CAAC,CAAC;AACpD;AAEO,SAAS,gBAAgB,MAAwB;AACtD,QAAM,OAAO,KAAK,MAAM,cAAc;AACtC,MAAI,CAAC,WAAW,IAAI,EAAG,QAAO,CAAC;AAC/B,SAAO,aAAa,MAAM,MAAM,EAC7B,MAAM,OAAO,EACb,IAAI,CAAC,MAAM,EAAE,KAAK,CAAC,EACnB,OAAO,CAAC,MAAM,KAAK,CAAC,EAAE,WAAW,GAAG,CAAC;AAC1C;;;AC3GA,IAAM,YAAY;AAEX,SAAS,eAAe,OAA4E;AACzG,QAAM,IAAI,UAAU,KAAK,KAAK;AAC9B,MAAI,CAAC,EAAG,QAAO;AACf,SAAO,EAAE,OAAO,OAAO,EAAE,CAAC,CAAC,GAAG,OAAO,OAAO,EAAE,CAAC,CAAC,GAAG,OAAO,OAAO,EAAE,CAAC,CAAC,EAAE;AACzE;AAEO,SAAS,SAAS,GAAW,GAA8B;AAChE,QAAM,KAAK,eAAe,CAAC;AAC3B,QAAM,KAAK,eAAe,CAAC;AAC3B,MAAI,CAAC,MAAM,CAAC,GAAI,QAAO,MAAM,IAAI,SAAS;AAC1C,MAAI,GAAG,UAAU,GAAG,MAAO,QAAO;AAClC,MAAI,GAAG,UAAU,GAAG,MAAO,QAAO;AAClC,MAAI,GAAG,UAAU,GAAG,MAAO,QAAO;AAClC,MAAI,MAAM,EAAG,QAAO;AACpB,SAAO;AACT;AAEO,SAAS,aAAa,QAA0B;AACrD,MAAI,OAAO,OAAO,CAAC;AACnB,MAAI,UAAU,eAAe,IAAI;AACjC,aAAW,KAAK,QAAQ;AACtB,UAAM,IAAI,eAAe,CAAC;AAC1B,QAAI,CAAC,EAAG;AACR,QAAI,CAAC,WAAW,QAAQ,GAAG,OAAO,IAAI,GAAG;AACvC,aAAO;AACP,gBAAU;AAAA,IACZ;AAAA,EACF;AACA,SAAO;AACT;AAEA,SAAS,QAAQ,GAAoD,GAA4D;AAC/H,MAAI,EAAE,UAAU,EAAE,MAAO,QAAO,EAAE,QAAQ,EAAE;AAC5C,MAAI,EAAE,UAAU,EAAE,MAAO,QAAO,EAAE,QAAQ,EAAE;AAC5C,SAAO,EAAE,QAAQ,EAAE;AACrB;AAEO,IAAM,gBAA0C,EAAE,OAAO,GAAG,OAAO,GAAG,OAAO,EAAE;AAE/E,SAAS,YAAY,QAA8B;AACxD,MAAI,MAAgB;AACpB,aAAW,KAAK,OAAQ,KAAI,cAAc,CAAC,IAAI,cAAc,GAAG,EAAG,OAAM;AACzE,SAAO;AACT;;;ACsBA,SAAS,gBAAAA,eAAc,qBAAqB;AAjE5C,IAAM,SAAqB,CAAC,gBAAgB,mBAAmB,oBAAoB,sBAAsB;AAElG,SAAS,KAAK,UAAuB,CAAC,GAAgB;AAC3D,QAAM,MAAM,QAAQ,OAAO,QAAQ,IAAI;AACvC,QAAM,aAAa,eAAe,KAAK,QAAQ,cAAc;AAC7D,QAAM,SAAS,oBAAI,IAAI,CAAC,GAAI,QAAQ,UAAU,CAAC,
GAAI,GAAG,gBAAgB,GAAG,CAAC,CAAC;AAE3E,QAAM,QAAQ,oBAAI,IAA2C;AAC7D,aAAW,MAAM,YAAY;AAC3B,eAAW,SAAS,QAAQ;AAC1B,iBAAW,CAAC,MAAM,OAAO,KAAK,OAAO,QAAQ,GAAG,KAAK,CAAC,GAAG;AACvD,YAAI,OAAO,IAAI,IAAI,EAAG;AACtB,cAAM,OAAO,MAAM,IAAI,IAAI,KAAK,CAAC;AACjC,aAAK,KAAK,EAAE,WAAW,GAAG,MAAM,SAAS,OAAO,MAAM,GAAG,gBAAgB,CAAC;AAC1E,cAAM,IAAI,MAAM,IAAI;AAAA,MACtB;AAAA,IACF;AAAA,EACF;AAEA,QAAM,UAA+B,CAAC;AACtC,aAAW,CAAC,MAAM,WAAW,KAAK,OAAO;AACvC,UAAM,SAAS,CAAC,GAAG,IAAI,IAAI,YAAY,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;AAC7D,QAAI,OAAO,UAAU,EAAG;AACxB,UAAM,aAAyB,CAAC;AAChC,aAAS,IAAI,GAAG,IAAI,OAAO,QAAQ,KAAK;AACtC,eAAS,IAAI,IAAI,GAAG,IAAI,OAAO,QAAQ,KAAK;AAC1C,cAAM,IAAI,SAAS,OAAO,CAAC,GAAI,OAAO,CAAC,CAAE;AACzC,YAAI,MAAM,OAAQ,YAAW,KAAK,CAAC;AAAA,MACrC;AAAA,IACF;AACA,YAAQ,KAAK;AAAA,MACX;AAAA,MACA,UAAU,YAAY,UAAU;AAAA,MAChC,UAAU;AAAA,IACZ,CAAC;AAAA,EACH;AAEA,UAAQ,KAAK,CAAC,GAAG,MAAM,EAAE,KAAK,cAAc,EAAE,IAAI,CAAC;AACnD,SAAO,EAAE,YAAY,SAAS,cAAa,oBAAI,KAAK,GAAE,YAAY,EAAE;AACtE;AAYO,SAAS,QAAQ,QAAqB,UAAyD,CAAC,GAAY;AACjH,QAAM,UAA8B,CAAC;AACrC,aAAW,KAAK,OAAO,SAAS;AAC9B,QAAI,QAAQ,QAAQ,EAAE,SAAS,QAAQ,KAAM;AAC7C,UAAM,SAAS,QAAQ,UAAU,QAAQ,WAAW,WAAW,QAAQ,SAAS,aAAa,EAAE,SAAS,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC;AAC7H,eAAW,KAAK,EAAE,UAAU;AAC1B,UAAI,EAAE,YAAY,QAAQ;AACxB,gBAAQ,KAAK,EAAE,MAAM,EAAE,MAAM,OAAO,EAAE,OAAO,MAAM,EAAE,MAAM,MAAM,EAAE,SAAS,IAAI,OAAO,CAAC;AAAA,MAC1F;AAAA,IACF;AAAA,EACF;AACA,SAAO,EAAE,QAAQ;AACnB;AAIO,SAAS,SAAS,MAAyC;AAChE,QAAM,UAAU,oBAAI,IAAgC;AACpD,aAAW,KAAK,KAAK,SAAS;AAC5B,UAAM,MAAM,QAAQ,IAAI,EAAE,IAAI,KAAK,CAAC;AACpC,QAAI,KAAK,CAAC;AACV,YAAQ,IAAI,EAAE,MAAM,GAAG;AAAA,EACzB;AACA,aAAW,CAAC,MAAM,OAAO,KAAK,SAAS;AACrC,UAAM,MAAMA,cAAa,MAAM,MAAM;AACrC,QAAI,OAAO;AACX,eAAW,KAAK,SAAS;AACvB,YAAM,KAAK,IAAI,OAAO,KAAK,OAAO,EAAE,IAAI,CAAC,eAAe,OAAO,EAAE,IAAI,CAAC,KAAK;AAC3E,aAAO,KAAK,QAAQ,IAAI,CAAC,IAAI,IAAY,OAAe,GAAG,EAAE,GAAG,EAAE,EAAE,GAAG,EAAE,EAAE;AAAA,IAC7E;AACA,kBAAc,MAAM,MAAM,MAAM;AAAA,EAClC;AACA,SAAO,EAAE,cAAc,QAAQ,KAAK;AACtC;AAEA,SAAS,OAAO,GAAmB;AACjC,SAAO,EAAE,QAAQ,uBAAuB,MAAM;AAChD;;;AC5FA,OAAO,QAAQ;AAGf,I
AAM,WAAoD;AAAA,EACxD,OAAO,GAAG;AAAA,EACV,OAAO,GAAG;AAAA,EACV,OAAO,GAAG;AACZ;AAEO,SAAS,aAAa,QAA6B;AACxD,MAAI,OAAO,QAAQ,WAAW,GAAG;AAC/B,WAAO,GAAG,MAAM,gCAAgC,OAAO,WAAW,MAAM,aAAa;AAAA,EACvF;AACA,QAAM,QAAkB,CAAC;AACzB,QAAM,KAAK,GAAG,KAAK,sBAAiB,OAAO,QAAQ,MAAM,8BAA8B,OAAO,WAAW,MAAM,aAAa,CAAC;AAC7H,aAAW,KAAK,OAAO,SAAS;AAC9B,UAAM,KAAK,EAAE;AACb,UAAM,KAAK,KAAK,SAAS,EAAE,QAAQ,EAAE,EAAE,SAAS,YAAY,CAAC,CAAC,KAAK,GAAG,KAAK,EAAE,IAAI,CAAC,EAAE;AACpF,eAAW,KAAK,EAAE,UAAU;AAC1B,YAAM,KAAK,SAAS,EAAE,OAAO,KAAK,GAAG,KAAK,IAAI,EAAE,SAAS,MAAM,EAAE,KAAK,GAAG,CAAC,EAAE;AAAA,IAC9E;AAAA,EACF;AACA,SAAO,MAAM,KAAK,IAAI;AACxB;AAEO,SAAS,WAAW,QAA6B;AACtD,SAAO,KAAK,UAAU,QAAQ,MAAM,CAAC;AACvC;;;AC3BA,SAAS,cAAAC,aAAY,gBAAAC,qBAAoB;AACzC,SAAS,QAAAC,aAAY;AASd,SAAS,WAAW,KAA0B;AACnD,QAAM,IAAIA,MAAK,KAAK,eAAe;AACnC,MAAI,CAACF,YAAW,CAAC,EAAG,QAAO,CAAC;AAC5B,MAAI;AACF,WAAO,KAAK,MAAMC,cAAa,GAAG,MAAM,CAAC;AAAA,EAC3C,QAAQ;AACN,WAAO,CAAC;AAAA,EACV;AACF;","names":["readFileSync","existsSync","readFileSync","join"]}
|
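The embedded sources in this source map include the package's semver severity classifier (version parsing plus a major/minor/patch comparison). A minimal sketch of that comparison, reconstructed from the visible sources — function and type names here are illustrative, not the package's public API — might look like:

```typescript
type Severity = "none" | "patch" | "minor" | "major";

// Strip a leading range operator (^, ~, >=, …) and read the numeric parts.
function parse(v: string): { major: number; minor: number; patch: number } | null {
  const m = /^[\^~>=<\s]*(\d+)\.(\d+)\.(\d+)/.exec(v);
  if (!m) return null;
  return { major: Number(m[1]), minor: Number(m[2]), patch: Number(m[3]) };
}

// Classify the drift between two version ranges: the first differing
// component determines the severity; unparseable inputs are treated as
// "major" unless the strings are identical.
function severity(a: string, b: string): Severity {
  const pa = parse(a);
  const pb = parse(b);
  if (!pa || !pb) return a === b ? "none" : "major";
  if (pa.major !== pb.major) return "major";
  if (pa.minor !== pb.minor) return "minor";
  if (pa.patch !== pb.patch) return "patch";
  return "none";
}
```

This matches the patch/minor/major classification advertised in the CHANGELOG; the actual package may handle prerelease tags or non-semver ranges differently.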