rot-detector 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +154 -0
- package/dist/index.cjs +831 -0
- package/dist/index.d.cts +1 -0
- package/package.json +56 -0
package/README.md
ADDED
@@ -0,0 +1,154 @@
# 🧟 Dependency Rot Detector

[npm](https://www.npmjs.com/package/rot-detector)
[License: MIT](https://opensource.org/licenses/MIT)

> **Find abandoned dependencies before they become security nightmares.**

A CLI tool that scans your `package.json` (NPM) or `requirements.txt` (Python) to detect **software rot** - dependencies that are abandoned, poorly maintained, or that pose supply chain risks.

## 🤔 The Problem

`npm audit` and Snyk tell you about known CVEs. They **DON'T** tell you:

- 📅 A library hasn't been updated in **4 years**
- 👤 A package has only **1 maintainer** (bus factor risk)
- ⚖️ A dependency uses a **deprecated license**

This is "Software Rot" - a security bomb waiting to explode. 💣

## 🚀 Quick Start

```bash
# Install globally
npm install -g rot-detector

# Scan your project
rot-detector scan .

# Or use npx (no install)
npx rot-detector scan ./package.json
```

## 📊 Example Output

```
🧟 Dependency Rot Detector
Scanned: ./package.json

┌────────────────────────┬────────┬────────────────┬─────────────┬───────────────┬────────────┐
│ Package                │ Score  │ Last Update    │ Maintainers │ License       │ Status     │
├────────────────────────┼────────┼────────────────┼─────────────┼───────────────┼────────────┤
│ abandoned-lib          │ 🔴 15  │ 4 years ago    │ 1           │ GPL-2.0       │ Critical   │
│ old-but-ok             │ 🟡 65  │ 8 months ago   │ 2           │ MIT           │ Warning    │
│ react                  │ 🟢 95  │ 2 days ago     │ 15          │ MIT           │ Healthy    │
└────────────────────────┴────────┴────────────────┴─────────────┴───────────────┴────────────┘

Summary: 🟢 1 Healthy | 🟡 1 Warning | 🔴 1 Critical
```

## 📋 Features

| Feature | Description |
|---------|-------------|
| 🔍 **NPM + PyPI Support** | Scans `package.json` and `requirements.txt` |
| 📈 **Health Scoring** | 0-100 score based on freshness, maintainers, license |
| 🎨 **Beautiful CLI Output** | Color-coded risk indicators |
| 📊 **JSON Export** | `--json` flag for CI/CD integration |
| ⚡ **GitHub Integration** | Optional enhanced repo analysis |
| 🚨 **Threshold Checks** | Fail builds if score drops below threshold |

## ⚙️ CLI Options

```bash
rot-detector scan [path] [options]

Options:
  --json                 Output results as JSON
  --threshold <score>    Fail if any dependency scores below threshold
  --github-token <tok>   GitHub token for enhanced repo analysis
  --no-github            Skip GitHub analysis (faster)
  --dev                  Include devDependencies
  -v, --verbose          Verbose output
```

## 🏆 Health Score Breakdown

Each dependency is scored 0-100 based on:

| Factor | Weight | Scoring |
|--------|--------|---------|
| **Freshness** | 40% | < 6 months = 100, > 3 years = 5 |
| **Maintainers** | 30% | 5+ = 100, 1 = 40, 0 = 10 |
| **License** | 30% | OSI approved = 100, Unknown = 60 |
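
The overall number is a weighted, rounded average of the three factor scores. A minimal sketch of that combination step, mirroring the `WEIGHTS` constant in `dist/index.cjs` (the factor values plugged in below are purely illustrative):

```js
// Weights as defined in dist/index.cjs (src/analyzer/health-scorer.ts).
const WEIGHTS = { freshness: 0.4, maintainers: 0.3, license: 0.3 };

function combineScores(freshness, maintainers, license) {
  return Math.round(
    freshness * WEIGHTS.freshness +
    maintainers * WEIGHTS.maintainers +
    license * WEIGHTS.license
  );
}

// e.g. updated last week (100), a single maintainer (40), MIT license (100):
console.log(combineScores(100, 40, 100)); // 82 -> 🟢 Healthy
```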

### Risk Levels
- 🟢 **Healthy** (80-100): Well maintained, safe to use
- 🟡 **Warning** (50-79): Review recommended
- 🔴 **Critical** (0-49): Replace immediately!

## 🔧 CI/CD Integration

### GitHub Actions

```yaml
name: Dependency Health Check
on: [push, pull_request]

jobs:
  rot-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '18'

      - name: Check for dependency rot
        run: npx rot-detector scan --threshold 50
```
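
When the built-in `--threshold` check isn't flexible enough, the `--json` report can drive a custom gate. A minimal sketch of a hypothetical helper (not shipped with the package) that parses the report in Node and fails the job on any critical dependency; it assumes you capture the tool's stdout, which is where the JSON report is printed:

```js
// check-rot.js - hypothetical helper, not part of rot-detector itself.
// The report shape (file, summary.total/healthy/warning/critical/failed)
// matches what `rot-detector scan --json` prints (see dist/index.cjs).
const { execSync } = require("node:child_process");

const stdout = execSync("npx rot-detector scan --json --no-github", {
  encoding: "utf-8",
});
const report = JSON.parse(stdout);

console.log(`Scanned ${report.summary.total} packages from ${report.file}`);
if (report.summary.critical > 0) {
  console.error(`${report.summary.critical} critical dependencies - failing the build`);
  process.exit(1);
}
```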

### Pre-commit Hook

```bash
# .husky/pre-commit
npx rot-detector scan --threshold 60
```

## 🛠️ Development

```bash
# Clone the repo
git clone https://github.com/notsointresting/rot-detector.git
cd rot-detector

# Install dependencies
npm install

# Run in development mode
npm run dev -- scan ./sample/package.json

# Build for production
npm run build

# Run tests
npm test
```

## 🤝 Contributing

Contributions are welcome! Feel free to:

1. 🐛 Report bugs
2. 💡 Suggest features
3. 🔀 Submit pull requests

## 📄 License

MIT © [notsointresting](https://github.com/notsointresting)

---

<p align="center">
Made with 🧟 by developers who got burned by abandoned dependencies
</p>
package/dist/index.cjs
ADDED
@@ -0,0 +1,831 @@
#!/usr/bin/env node
"use strict";
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));

// src/cli/index.ts
var import_commander = require("commander");
var import_ora = __toESM(require("ora"), 1);
var import_chalk2 = __toESM(require("chalk"), 1);
var path4 = __toESM(require("path"), 1);
var fs3 = __toESM(require("fs"), 1);

// src/parsers/package-json.ts
var fs = __toESM(require("fs"), 1);
var path = __toESM(require("path"), 1);
function parsePackageJson(filePath) {
  const absolutePath = path.resolve(filePath);
  if (!fs.existsSync(absolutePath)) {
    throw new Error(`File not found: ${absolutePath}`);
  }
  const content = fs.readFileSync(absolutePath, "utf-8");
  let packageJson;
  try {
    packageJson = JSON.parse(content);
  } catch (e) {
    throw new Error(`Invalid JSON in ${filePath}: ${e.message}`);
  }
  const dependencies = [];
  if (packageJson.dependencies) {
    for (const [name, version] of Object.entries(packageJson.dependencies)) {
      dependencies.push({
        name,
        version: cleanVersion(version),
        type: "direct",
        source: "npm",
        isDev: false
      });
    }
  }
  if (packageJson.devDependencies) {
    for (const [name, version] of Object.entries(packageJson.devDependencies)) {
      dependencies.push({
        name,
        version: cleanVersion(version),
        type: "direct",
        source: "npm",
        isDev: true
      });
    }
  }
  return dependencies;
}
function cleanVersion(version) {
  return version.replace(/^[\^~>=<]+/, "").trim();
}

// src/parsers/requirements-txt.ts
var fs2 = __toESM(require("fs"), 1);
var path2 = __toESM(require("path"), 1);
function parseRequirementsTxt(filePath) {
  const absolutePath = path2.resolve(filePath);
  if (!fs2.existsSync(absolutePath)) {
    throw new Error(`File not found: ${absolutePath}`);
  }
  const content = fs2.readFileSync(absolutePath, "utf-8");
  const lines = content.split("\n");
  const dependencies = [];
  for (const rawLine of lines) {
    const line = rawLine.trim();
    if (!line || line.startsWith("#")) {
      continue;
    }
    if (line.startsWith("-")) {
      continue;
    }
    if (line.includes("://") || line.startsWith("git+")) {
      continue;
    }
    const parsed = parseRequirementLine(line);
    if (parsed) {
      dependencies.push({
        name: parsed.name,
        version: parsed.version,
        type: "direct",
        source: "pypi",
        isDev: false
        // requirements.txt doesn't distinguish
      });
    }
  }
  return dependencies;
}
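// Extracts a package name and (if pinned) a version from a requirements.txt line,
// after stripping inline "#" comments, extras ("pkg[extra]") and environment
// markers after ";"; unpinned requirements fall back to version "*".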
function parseRequirementLine(line) {
  const commentIndex = line.indexOf("#");
  if (commentIndex !== -1) {
    line = line.substring(0, commentIndex).trim();
  }
  if (!line) return null;
  const markerIndex = line.indexOf(";");
  if (markerIndex !== -1) {
    line = line.substring(0, markerIndex).trim();
  }
  const match = line.match(/^([a-zA-Z0-9_-]+)(?:\[[^\]]+\])?\s*(.*)$/);
  if (!match) {
    return null;
  }
  const name = match[1];
  let version = "*";
  if (match[2]) {
    const versionMatch = match[2].match(/[=~<>!]+\s*([0-9][0-9a-zA-Z.*-]*)/);
    if (versionMatch) {
      version = versionMatch[1];
    }
  }
  return { name, version };
}

// src/parsers/index.ts
var path3 = __toESM(require("path"), 1);
function detectFileType(filePath) {
  const basename3 = path3.basename(filePath).toLowerCase();
  if (basename3 === "package.json") {
    return "npm";
  }
  if (basename3 === "requirements.txt" || basename3.endsWith(".txt")) {
    return "pypi";
  }
  return "unknown";
}
function parseDependencyFile(filePath) {
  const fileType = detectFileType(filePath);
  switch (fileType) {
    case "npm":
      return parsePackageJson(filePath);
    case "pypi":
      return parseRequirementsTxt(filePath);
    default:
      throw new Error(`Unsupported file type: ${path3.basename(filePath)}`);
  }
}

// src/clients/npm-client.ts
var import_axios = __toESM(require("axios"), 1);
var NPM_REGISTRY_URL = "https://registry.npmjs.org";
async function fetchNpmMetadata(packageName) {
  try {
    const response = await import_axios.default.get(
      `${NPM_REGISTRY_URL}/${encodeURIComponent(packageName)}`,
      {
        timeout: 1e4,
        headers: {
          "Accept": "application/json"
        }
      }
    );
    const data = response.data;
    const latestVersion = data["dist-tags"]?.latest || "";
    let lastPublished = null;
    if (data.time) {
      const modifiedTime = data.time.modified || data.time[latestVersion];
      if (modifiedTime) {
        lastPublished = new Date(modifiedTime);
      }
    }
    const maintainers = (data.maintainers || []).map((m) => ({
      name: m.name,
      email: m.email
    }));
    let repositoryUrl = null;
    if (data.repository?.url) {
      repositoryUrl = cleanGitUrl(data.repository.url);
    }
    return {
      name: packageName,
      version: latestVersion,
      lastPublished,
      maintainers,
      license: data.license || null,
      repositoryUrl,
      homepage: data.homepage || null,
      description: data.description || null
    };
  } catch (error) {
    if (import_axios.default.isAxiosError(error)) {
      if (error.response?.status === 404) {
        throw new Error(`Package not found: ${packageName}`);
      }
      throw new Error(`Failed to fetch ${packageName}: ${error.message}`);
    }
    throw error;
  }
}
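// Normalizes repository URLs (git+, git://, ssh://git@, git@github.com: shorthands)
// to a plain https URL with any trailing ".git" removed.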
function cleanGitUrl(url) {
  return url.replace(/^git\+/, "").replace(/^git:\/\//, "https://").replace(/^ssh:\/\/git@/, "https://").replace(/\.git$/, "").replace(/git@github\.com:/, "https://github.com/");
}

// src/clients/pypi-client.ts
var import_axios2 = __toESM(require("axios"), 1);
var PYPI_API_URL = "https://pypi.org/pypi";
async function fetchPyPIMetadata(packageName) {
  try {
    const response = await import_axios2.default.get(
      `${PYPI_API_URL}/${encodeURIComponent(packageName)}/json`,
      {
        timeout: 1e4,
        headers: {
          "Accept": "application/json"
        }
      }
    );
    const data = response.data;
    const info = data.info;
    let lastPublished = null;
    const latestVersion = info.version;
    if (data.releases && data.releases[latestVersion]) {
      const releases = data.releases[latestVersion];
      if (releases.length > 0) {
        lastPublished = new Date(releases[0].upload_time);
      }
    }
    const maintainers = [];
    if (info.maintainer) {
      maintainers.push({
        name: info.maintainer,
        email: info.maintainer_email
      });
    } else if (info.author) {
      maintainers.push({
        name: info.author,
        email: info.author_email
      });
    }
    let repositoryUrl = null;
    if (info.project_urls) {
      repositoryUrl = info.project_urls["Source"] || info.project_urls["Repository"] || info.project_urls["Code"] || info.project_urls["GitHub"] || info.project_urls["Homepage"] || null;
    }
    return {
      name: packageName,
      version: latestVersion,
      lastPublished,
      maintainers,
      license: info.license || null,
      repositoryUrl,
      homepage: info.home_page || null,
      description: info.summary || null
    };
  } catch (error) {
    if (import_axios2.default.isAxiosError(error)) {
      if (error.response?.status === 404) {
        throw new Error(`Package not found: ${packageName}`);
      }
      throw new Error(`Failed to fetch ${packageName}: ${error.message}`);
    }
    throw error;
  }
}

// src/clients/github-client.ts
var import_axios3 = __toESM(require("axios"), 1);
var GITHUB_API_URL = "https://api.github.com";
function parseGitHubUrl(url) {
  const patterns = [
    /github\.com[/:]([^/]+)\/([^/.\s]+)/
  ];
  for (const pattern of patterns) {
    const match = url.match(pattern);
    if (match) {
      return {
        owner: match[1],
        repo: match[2].replace(/\.git$/, "")
      };
    }
  }
  return null;
}
async function fetchGitHubRepoHealth(repositoryUrl, token) {
  const parsed = parseGitHubUrl(repositoryUrl);
  if (!parsed) {
    return null;
  }
  const { owner, repo } = parsed;
  const headers = {
    "Accept": "application/vnd.github.v3+json",
    "User-Agent": "rot-detector-cli"
  };
  if (token) {
    headers["Authorization"] = `Bearer ${token}`;
  }
  try {
    const repoResponse = await import_axios3.default.get(
      `${GITHUB_API_URL}/repos/${owner}/${repo}`,
      { headers, timeout: 1e4 }
    );
    const repoData = repoResponse.data;
    let lastCommitDate = null;
    try {
      const commitsResponse = await import_axios3.default.get(
        `${GITHUB_API_URL}/repos/${owner}/${repo}/commits`,
        { headers, timeout: 1e4, params: { per_page: 1 } }
      );
      if (commitsResponse.data.length > 0) {
        lastCommitDate = new Date(commitsResponse.data[0].commit.committer.date);
      }
    } catch {
    }
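    // Contributor count is read from the Link pagination header: with per_page=1,
    // the page number tagged rel="last" equals the number of contributors; when
    // there is no Link header, the single response page is counted directly.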
    let contributorCount = 0;
    try {
      const contributorsResponse = await import_axios3.default.get(
        `${GITHUB_API_URL}/repos/${owner}/${repo}/contributors`,
        { headers, timeout: 1e4, params: { per_page: 1, anon: false } }
      );
      const linkHeader = contributorsResponse.headers["link"];
      if (linkHeader) {
        const match = linkHeader.match(/page=(\d+)>; rel="last"/);
        if (match) {
          contributorCount = parseInt(match[1], 10);
        }
      } else {
        contributorCount = contributorsResponse.data.length;
      }
    } catch {
    }
    return {
      lastCommitDate,
      contributorCount,
      openIssues: repoData.open_issues_count || 0,
      stars: repoData.stargazers_count || 0,
      isArchived: repoData.archived || false
    };
  } catch (error) {
    if (import_axios3.default.isAxiosError(error)) {
      if (error.response?.status === 403) {
        console.warn("GitHub API rate limit reached. Consider using --github-token");
        return null;
      }
      if (error.response?.status === 404) {
        return null;
      }
    }
    return null;
  }
}

// src/analyzer/health-scorer.ts
var OSI_APPROVED_LICENSES = /* @__PURE__ */ new Set([
  "MIT",
  "Apache-2.0",
  "BSD-2-Clause",
  "BSD-3-Clause",
  "ISC",
  "MPL-2.0",
  "GPL-3.0",
  "GPL-3.0-only",
  "GPL-3.0-or-later",
  "LGPL-3.0",
  "LGPL-3.0-only",
  "LGPL-3.0-or-later",
  "AGPL-3.0",
  "Unlicense",
  "0BSD",
  "CC0-1.0",
  "Zlib",
  "Artistic-2.0",
  "EPL-2.0",
  "EUPL-1.2"
]);
var DEPRECATED_LICENSES = /* @__PURE__ */ new Set([
  "GPL-2.0",
  "LGPL-2.0",
  "LGPL-2.1",
  "BSD-4-Clause",
  "WTFPL"
]);
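// Factor weights for the overall 0-100 health score:
// freshness 40%, maintainer count 30%, license 30%.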
var WEIGHTS = {
  freshness: 0.4,
  maintainers: 0.3,
  license: 0.3
};
function calculateHealthScore(metadata, githubHealth) {
  const freshness = calculateFreshnessScore(metadata, githubHealth);
  const maintainerHealth = calculateMaintainerScore(metadata, githubHealth);
  const licenseHealth = calculateLicenseScore(metadata);
  const overall = Math.round(
    freshness.score * WEIGHTS.freshness + maintainerHealth.score * WEIGHTS.maintainers + licenseHealth.score * WEIGHTS.license
  );
  return {
    overall,
    freshness,
    maintainerHealth,
    licenseHealth
  };
}
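// Freshness tiers by days since the last commit (or last publish when GitHub
// data is unavailable): <180 days = 100, <365 = 75, <730 = 40, <1095 = 20,
// otherwise 5; archived repositories are capped at 10.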
function calculateFreshnessScore(metadata, githubHealth) {
  const lastUpdate = githubHealth?.lastCommitDate || metadata.lastPublished;
  if (!lastUpdate) {
    return {
      score: 50,
      // Unknown, give benefit of doubt
      lastUpdate: null,
      daysSinceUpdate: null,
      status: "unknown"
    };
  }
  const now = /* @__PURE__ */ new Date();
  const daysSinceUpdate = Math.floor(
    (now.getTime() - lastUpdate.getTime()) / (1e3 * 60 * 60 * 24)
  );
  let score;
  let status;
  if (daysSinceUpdate < 180) {
    score = 100;
    status = "active";
  } else if (daysSinceUpdate < 365) {
    score = 75;
    status = "active";
  } else if (daysSinceUpdate < 730) {
    score = 40;
    status = "stale";
  } else if (daysSinceUpdate < 1095) {
    score = 20;
    status = "abandoned";
  } else {
    score = 5;
    status = "abandoned";
  }
  if (githubHealth?.isArchived) {
    score = Math.min(score, 10);
    status = "abandoned";
  }
  return {
    score,
    lastUpdate,
    daysSinceUpdate,
    status
  };
}
function calculateMaintainerScore(metadata, githubHealth) {
  const count = githubHealth?.contributorCount || metadata.maintainers.length;
  let score;
  let status;
  if (count >= 5) {
    score = 100;
    status = "healthy";
  } else if (count >= 3) {
    score = 85;
    status = "healthy";
  } else if (count === 2) {
    score = 70;
    status = "warning";
  } else if (count === 1) {
    score = 40;
    status = "warning";
  } else {
    score = 10;
    status = "critical";
  }
  return {
    score,
    count,
    status
  };
}
function calculateLicenseScore(metadata) {
  const license = metadata.license;
  if (!license) {
    return {
      score: 30,
      license: null,
      status: "unknown"
    };
  }
  const normalizedLicense = license.toUpperCase().replace(/\s+/g, "-");
  for (const deprecated of DEPRECATED_LICENSES) {
    if (normalizedLicense.includes(deprecated.toUpperCase())) {
      return {
        score: 50,
        license,
        status: "deprecated"
      };
    }
  }
  for (const approved of OSI_APPROVED_LICENSES) {
    if (normalizedLicense.includes(approved.toUpperCase())) {
      return {
        score: 100,
        license,
        status: "approved"
      };
    }
  }
  return {
    score: 60,
    license,
    status: "warning"
  };
}
function formatDaysSinceUpdate(days) {
  if (days === null) return "Unknown";
  if (days < 1) return "Today";
  if (days === 1) return "1 day ago";
  if (days < 30) return `${days} days ago`;
  if (days < 60) return "1 month ago";
  if (days < 365) return `${Math.floor(days / 30)} months ago`;
  if (days < 730) return "1 year ago";
  return `${Math.floor(days / 365)} years ago`;
}

// src/cli/reporter.ts
var import_chalk = __toESM(require("chalk"), 1);
var import_cli_table3 = __toESM(require("cli-table3"), 1);

// src/types/index.ts
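// Risk buckets used by the reporter and summary: >= 80 healthy, >= 50 warning,
// anything lower critical; a null score is reported as unknown.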
function getRiskLevel(score) {
  if (score === null) return "unknown";
  if (score >= 80) return "healthy";
  if (score >= 50) return "warning";
  return "critical";
}

// src/cli/reporter.ts
function printReport(result, verbose = false) {
  console.log("\n");
  console.log(import_chalk.default.bold.cyan("\u{1F9DF} Dependency Rot Detector"));
  console.log(import_chalk.default.gray(`Scanned: ${result.file}`));
  console.log(import_chalk.default.gray(`Source: ${result.source.toUpperCase()}`));
  console.log(import_chalk.default.gray(`Time: ${result.scannedAt.toISOString()}`));
  console.log("\n");
  const table = new import_cli_table3.default({
    head: [
      import_chalk.default.white.bold("Package"),
      import_chalk.default.white.bold("Score"),
      import_chalk.default.white.bold("Last Update"),
      import_chalk.default.white.bold("Maintainers"),
      import_chalk.default.white.bold("License"),
      import_chalk.default.white.bold("Status")
    ],
    colWidths: [30, 8, 16, 13, 15, 12],
    style: {
      head: [],
      border: ["gray"]
    }
  });
  const sorted = [...result.dependencies].sort((a, b) => {
    const scoreA = a.health?.overall ?? 999;
    const scoreB = b.health?.overall ?? 999;
    return scoreA - scoreB;
  });
  for (const dep of sorted) {
    const row = formatDependencyRow(dep, verbose);
    table.push(row);
  }
  console.log(table.toString());
  console.log("\n");
  printSummary(result);
}
function formatDependencyRow(dep, verbose) {
  if (dep.error) {
    return [
      import_chalk.default.gray(dep.dependency.name),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.red("Error")
    ];
  }
  if (!dep.health) {
    return [
      import_chalk.default.gray(dep.dependency.name),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.gray("\u2014"),
      import_chalk.default.yellow("Unknown")
    ];
  }
  const score = dep.health.overall;
  const risk = getRiskLevel(score);
  let scoreStr;
  let statusStr;
  let statusEmoji;
  switch (risk) {
    case "healthy":
      scoreStr = import_chalk.default.green.bold(score.toString());
      statusStr = import_chalk.default.green("Healthy");
      statusEmoji = "\u{1F7E2}";
      break;
    case "warning":
      scoreStr = import_chalk.default.yellow.bold(score.toString());
      statusStr = import_chalk.default.yellow("Warning");
      statusEmoji = "\u{1F7E1}";
      break;
    case "critical":
      scoreStr = import_chalk.default.red.bold(score.toString());
      statusStr = import_chalk.default.red("Critical");
      statusEmoji = "\u{1F534}";
      break;
    default:
      scoreStr = import_chalk.default.gray(score.toString());
      statusStr = import_chalk.default.gray("Unknown");
      statusEmoji = "\u26AA";
  }
  const lastUpdate = formatDaysSinceUpdate(dep.health.freshness.daysSinceUpdate);
  let lastUpdateStr;
  if (dep.health.freshness.status === "abandoned") {
    lastUpdateStr = import_chalk.default.red(lastUpdate);
  } else if (dep.health.freshness.status === "stale") {
    lastUpdateStr = import_chalk.default.yellow(lastUpdate);
  } else {
    lastUpdateStr = import_chalk.default.green(lastUpdate);
  }
  const maintainerCount = dep.health.maintainerHealth.count;
  let maintainerStr;
  if (maintainerCount >= 3) {
    maintainerStr = import_chalk.default.green(maintainerCount.toString());
  } else if (maintainerCount >= 2) {
    maintainerStr = import_chalk.default.yellow(maintainerCount.toString());
  } else {
    maintainerStr = import_chalk.default.red(maintainerCount.toString());
  }
  const license = dep.health.licenseHealth.license || "Unknown";
  let licenseStr;
  if (dep.health.licenseHealth.status === "approved") {
    licenseStr = import_chalk.default.green(truncate(license, 13));
  } else if (dep.health.licenseHealth.status === "deprecated") {
    licenseStr = import_chalk.default.red(truncate(license, 13));
  } else {
    licenseStr = import_chalk.default.yellow(truncate(license, 13));
  }
  let pkgName = dep.dependency.name;
  if (dep.dependency.isDev) {
    pkgName = import_chalk.default.gray(`${pkgName} (dev)`);
  }
  return [
    truncate(pkgName, 28),
    `${statusEmoji} ${scoreStr}`,
    lastUpdateStr,
    maintainerStr,
    licenseStr,
    statusStr
  ];
}
function printSummary(result) {
  const { summary } = result;
  console.log(import_chalk.default.bold("Summary"));
  console.log(import_chalk.default.gray("\u2500".repeat(50)));
  const total = import_chalk.default.white.bold(summary.total.toString());
  const healthy = import_chalk.default.green.bold(summary.healthy.toString());
  const warning = import_chalk.default.yellow.bold(summary.warning.toString());
  const critical = import_chalk.default.red.bold(summary.critical.toString());
  const failed = import_chalk.default.gray.bold(summary.failed.toString());
  console.log(` Total packages: ${total}`);
  console.log(` \u{1F7E2} Healthy (80-100): ${healthy}`);
  console.log(` \u{1F7E1} Warning (50-79): ${warning}`);
  console.log(` \u{1F534} Critical (0-49): ${critical}`);
  if (summary.failed > 0) {
    console.log(` \u26AA Failed to check: ${failed}`);
  }
  console.log("\n");
  if (summary.critical > 0) {
    console.log(import_chalk.default.red.bold("\u26A0\uFE0F Action Required!"));
    console.log(import_chalk.default.red("Some dependencies are critical and should be reviewed or replaced."));
  } else if (summary.warning > 0) {
    console.log(import_chalk.default.yellow("\u{1F4CB} Some dependencies could use attention."));
  } else {
    console.log(import_chalk.default.green("\u2705 All dependencies look healthy!"));
  }
  console.log("\n");
}
function printJsonReport(result) {
  const jsonResult = {
    ...result,
    scannedAt: result.scannedAt.toISOString(),
    dependencies: result.dependencies.map((dep) => ({
      ...dep,
      metadata: dep.metadata ? {
        ...dep.metadata,
        lastPublished: dep.metadata.lastPublished?.toISOString() || null
      } : null,
      health: dep.health ? {
        ...dep.health,
        freshness: {
          ...dep.health.freshness,
          lastUpdate: dep.health.freshness.lastUpdate?.toISOString() || null
        }
      } : null
    }))
  };
  console.log(JSON.stringify(jsonResult, null, 2));
}
function truncate(str, maxLen) {
  if (str.length <= maxLen) return str;
  return str.substring(0, maxLen - 1) + "\u2026";
}

// src/cli/index.ts
var program = new import_commander.Command();
program.name("rot-detector").description("\u{1F9DF} Detect dependency rot in your projects").version("1.0.0");
program.command("scan").description("Scan a dependency file for software rot").argument("[path]", "Path to package.json or requirements.txt", ".").option("--json", "Output results as JSON").option("--threshold <score>", "Fail if any dependency scores below this threshold", "0").option("--github-token <token>", "GitHub token for enhanced repo analysis").option("--no-github", "Skip GitHub repository analysis").option("--dev", "Include dev dependencies in analysis").option("-v, --verbose", "Show verbose output").action(async (inputPath, options) => {
  try {
    await runScan(inputPath, options);
  } catch (error) {
    console.error(import_chalk2.default.red(`Error: ${error.message}`));
    process.exit(1);
  }
});
async function runScan(inputPath, options) {
  let filePath = path4.resolve(inputPath);
  if (fs3.existsSync(filePath) && fs3.statSync(filePath).isDirectory()) {
    if (fs3.existsSync(path4.join(filePath, "package.json"))) {
      filePath = path4.join(filePath, "package.json");
    } else if (fs3.existsSync(path4.join(filePath, "requirements.txt"))) {
      filePath = path4.join(filePath, "requirements.txt");
    } else {
      throw new Error("No package.json or requirements.txt found in directory");
    }
  }
  if (!fs3.existsSync(filePath)) {
    throw new Error(`File not found: ${filePath}`);
  }
  const fileType = detectFileType(filePath);
  if (fileType === "unknown") {
    throw new Error(`Unsupported file type: ${path4.basename(filePath)}`);
  }
  const spinner = (0, import_ora.default)("Parsing dependencies...").start();
  let dependencies;
  try {
    dependencies = parseDependencyFile(filePath);
  } catch (error) {
    spinner.fail("Failed to parse dependencies");
    throw error;
  }
  if (!options.dev && fileType === "npm") {
    dependencies = dependencies.filter((d) => !d.isDev);
  }
  spinner.succeed(`Found ${dependencies.length} dependencies`);
  const analyses = [];
  const analyzeSpinner = (0, import_ora.default)("Analyzing dependencies...").start();
  for (let i = 0; i < dependencies.length; i++) {
    const dep = dependencies[i];
    analyzeSpinner.text = `Analyzing ${dep.name} (${i + 1}/${dependencies.length})...`;
    try {
      let metadata;
      if (dep.source === "npm") {
        metadata = await fetchNpmMetadata(dep.name);
      } else if (dep.source === "pypi") {
        metadata = await fetchPyPIMetadata(dep.name);
      } else {
        throw new Error(`Unsupported source: ${dep.source}`);
      }
      let githubHealth = null;
      if (options.github !== false && metadata.repositoryUrl) {
        githubHealth = await fetchGitHubRepoHealth(
          metadata.repositoryUrl,
          options.githubToken
        );
      }
      const health = calculateHealthScore(metadata, githubHealth);
      analyses.push({
        dependency: dep,
        metadata,
        health
      });
    } catch (error) {
      analyses.push({
        dependency: dep,
        metadata: null,
        health: null,
        error: error.message
      });
    }
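    // Brief pause between package lookups so registry and GitHub requests are spaced out.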
    await sleep(100);
  }
  analyzeSpinner.succeed("Analysis complete");
  const summary = {
    total: analyses.length,
    healthy: analyses.filter((a) => getRiskLevel(a.health?.overall ?? null) === "healthy").length,
    warning: analyses.filter((a) => getRiskLevel(a.health?.overall ?? null) === "warning").length,
    critical: analyses.filter((a) => getRiskLevel(a.health?.overall ?? null) === "critical").length,
    failed: analyses.filter((a) => a.error !== void 0).length
  };
  const result = {
    file: filePath,
    source: fileType,
    scannedAt: /* @__PURE__ */ new Date(),
    dependencies: analyses,
    summary
  };
  if (options.json) {
    printJsonReport(result);
  } else {
    printReport(result, options.verbose);
  }
  const threshold = parseInt(options.threshold || "0", 10);
  if (threshold > 0) {
    const belowThreshold = analyses.filter(
      (a) => a.health && a.health.overall < threshold
    );
    if (belowThreshold.length > 0) {
      console.error(
        import_chalk2.default.red(`
\u274C ${belowThreshold.length} dependencies scored below threshold of ${threshold}`)
      );
      process.exit(1);
    }
  }
}
function sleep(ms) {
  return new Promise((resolve4) => setTimeout(resolve4, ms));
}
program.parse();
package/dist/index.d.cts
ADDED
@@ -0,0 +1 @@
#!/usr/bin/env node
package/package.json
ADDED
@@ -0,0 +1,56 @@
{
  "name": "rot-detector",
  "version": "1.0.0",
  "description": "🧟 CLI tool to detect dependency rot in your projects - find abandoned, unmaintained dependencies",
  "main": "dist/index.cjs",
  "bin": {
    "rot-detector": "dist/index.cjs"
  },
  "files": [
    "dist",
    "README.md"
  ],
  "repository": {
    "type": "git",
    "url": "https://github.com/notsointresting/rot-detector"
  },
  "homepage": "https://github.com/notsointresting/rot-detector#readme",
  "bugs": {
    "url": "https://github.com/notsointresting/rot-detector/issues"
  },
  "scripts": {
    "build": "tsup src/cli/index.ts --format cjs --dts --clean",
    "dev": "tsx src/cli/index.ts",
    "test": "vitest run",
    "test:watch": "vitest",
    "lint": "eslint src --ext .ts"
  },
  "keywords": [
    "dependency",
    "security",
    "audit",
    "npm",
    "pypi",
    "supply-chain"
  ],
  "author": "",
  "license": "MIT",
  "devDependencies": {
    "@types/node": "^20.10.0",
    "tsup": "^8.0.1",
    "tsx": "^4.7.0",
    "typescript": "^5.3.3",
    "vitest": "^1.1.0"
  },
  "dependencies": {
    "axios": "^1.6.3",
    "chalk": "^5.3.0",
    "commander": "^11.1.0",
    "cli-table3": "^0.6.3",
    "ora": "^8.0.1"
  },
  "engines": {
    "node": ">=18.0.0"
  },
  "type": "module"
}