@lateos/npm-scan 0.4.0 → 0.5.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +23 -7
- package/api/README.md +32 -0
- package/api/__init__.py +0 -0
- package/api/main.py +43 -0
- package/api/requirements.txt +7 -0
- package/api/routers/__init__.py +0 -0
- package/api/routers/auth.py +35 -0
- package/api/routers/health.py +10 -0
- package/api/routers/scans.py +66 -0
- package/api/routers/webhooks.py +78 -0
- package/backend/db/pg-schema.sql +155 -0
- package/backend/license.js +76 -4
- package/cli/cli.js +26 -8
- package/deploy/helm/npm-scan/Chart.yaml +16 -0
- package/deploy/helm/npm-scan/templates/_helpers.tpl +9 -0
- package/deploy/helm/npm-scan/templates/api.yaml +66 -0
- package/deploy/helm/npm-scan/templates/ingress.yaml +28 -0
- package/deploy/helm/npm-scan/templates/postgresql.yaml +67 -0
- package/deploy/helm/npm-scan/templates/secrets.yaml +19 -0
- package/deploy/helm/npm-scan/templates/worker.yaml +32 -0
- package/deploy/helm/npm-scan/values.yaml +73 -0
- package/package.json +1 -1
package/README.md
CHANGED
@@ -17,11 +17,17 @@ npx @lateos/npm-scan scan lodash
 
 ## Features
 
-- **Static Analysis** — detects malicious lifecycle scripts, obfuscated payloads, credential harvesting, persistence, network exfiltration, dependency confusion, typosquatting, tarball tampering, conditional triggers,
+- **Static Analysis** — detects malicious lifecycle scripts, obfuscated payloads, credential harvesting, persistence, network exfiltration, dependency confusion, typosquatting, tarball tampering, conditional triggers, sandbox evasion, and transitive propagation (ATK-001–011)
 - **SBOM Output** — CycloneDX 1.5 and SPDX 2.3 with findings mapped as vulnerabilities
-- **NIST 800-161 Compliance** — HTML report includes control traceability matrix (SR-2.1 → SR-
+- **NIST 800-161 Compliance** — HTML report includes control traceability matrix (SR-2.1 → SR-11.4)
+- **EU CRA Compliance** — report maps findings to Cyber Resilience Act articles and Annex I requirements (premium)
+- **SIEM Export** — CEF format for Splunk and other SIEM ingestion (premium)
+- **License Key Gating** — premium features locked behind signed license keys
+- **REST API** — FastAPI-based API with webhooks, auth, scan management (premium)
+- **Kubernetes / Helm** — Helm chart for deploying the full pipeline on K8s (premium)
 - **SQLite Storage** — local scan history, zero external dependencies
-- **CLI** — `scan`, `scan-lockfile`, `report --sbom --html --nist`
+- **CLI** — `scan`, `scan-lockfile`, `report --sbom --html --nist --cra --siem`
 - **Dynamic Sandbox** — gVisor-based isolation (premium, documented in `docs/sandbox-threat-model.md`)
 - **GitHub Action** — scans lockfile on PRs
 - **Docker** — multi-arch images via GHCR
@@ -38,17 +44,25 @@ npm-scan report -i <id>                 Show findings for a scan
 npm-scan report -i <id> --sbom          Generate CycloneDX SBOM
 npm-scan report -i <id> --sbom spdx     Generate SPDX SBOM
 npm-scan report -i <id> --html          Generate HTML report (with NIST table)
+npm-scan report -i <id> --nist          Print NIST 800-161 compliance table
+npm-scan report -i <id> --cra           Print EU CRA compliance table
+npm-scan report -i <id> --siem cef      Generate SIEM CEF output (premium)
 npm-scan report --html                  Generate HTML report for all scans
+npm-scan report --nist                  Print NIST compliance for all scans
+npm-scan report --cra                   Print EU CRA compliance for all scans (premium)
+npm-scan report --siem cef              Generate SIEM for all scans (premium)
 ```
 
 ## Architecture
 
 ```
 cli/       Commander.js CLI entrypoint
-backend/   Detectors, fetch, SQLite db, SBOM, report
+backend/   Detectors, fetch, SQLite db, SBOM, report, license, SIEM, CRA
+api/       FastAPI REST API + webhooks (premium)
 docker/    Multi-arch Docker images + compose
+deploy/    Kubernetes Helm chart (premium)
 docs/      Project plan, attack taxonomy (ATK), sandbox threat model
-tests/     Corpus: 5 clean +
+tests/     Corpus: 5 clean + 33 malicious packages
 ```
 
 ## Detectors (ATK Taxonomy)
 
@@ -65,6 +79,8 @@ tests/     Corpus: 5 clean + 30 malicious packages
 | ATK-008 | Tarball tampering (published ≠ source) | high |
 | ATK-009 | Conditional/dormant triggers (CI, time) | high |
 | ATK-010 | Sandbox evasion / anti-analysis | medium |
+| ATK-011 | Transitive propagation (worm) | high |
 
 See `docs/attack-taxonomy.md` for full NIST 800-161 mappings, evasion surfaces, and PoC examples.
 
@@ -73,8 +89,8 @@ See `docs/attack-taxonomy.md` for full NIST 800-161 mappings, evasion surfaces,
 ```bash
 npm install
 npm run dev      # CLI stub
-npm run test     # Unit tests (
-npm run corpus   # False-positive corpus test (
+npm run test     # Unit tests (14)
+npm run corpus   # False-positive corpus test (33 malicious, 5 clean)
 ```
 
 ## License
package/api/README.md
ADDED
@@ -0,0 +1,32 @@
# npm-scan REST API

FastAPI-based REST API for hosted/team tier. Requires premium or enterprise license.

## Quick Start

```bash
pip install -r api/requirements.txt
python -m api.main
```

## Endpoints

| Method | Path | Description |
|--------|------|-------------|
| POST | /api/v1/scan | Submit a package for scanning |
| GET | /api/v1/scans | List recent scans |
| GET | /api/v1/scans/{id} | Get scan details |
| GET | /api/v1/scans/{id}/findings | Get findings for a scan |
| GET | /api/v1/scans/{id}/report | Generate report |
| POST | /api/v1/webhooks | Register a webhook |
| GET | /api/v1/webhooks | List webhooks |
| DELETE | /api/v1/webhooks/{id} | Delete a webhook |
| POST | /api/v1/auth/login | Login |
| POST | /api/v1/auth/register | Register |
| GET | /api/v1/health | Health check |

## Authentication

All endpoints except `/api/v1/health`, `/api/v1/auth/login`, and `/api/v1/auth/register` require an API key or session token.

Pass as header: `Authorization: Bearer <api_key>`
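A minimal Python sketch of submitting a scan through this API. The base URL and `my-api-key` value are assumptions for a local dev deployment; the request body mirrors the `POST /api/v1/scan` shape and the Bearer header described above.

```python
import json

API_BASE = "http://localhost:8000/api/v1"  # assumption: local dev deployment

def scan_request(package_name: str, api_key: str, version: str = "latest"):
    """Build the pieces of a POST /api/v1/scan call for any HTTP client."""
    url = f"{API_BASE}/scan"
    headers = {
        "Authorization": f"Bearer {api_key}",  # required on all non-public endpoints
        "Content-Type": "application/json",
    }
    body = json.dumps({"package_name": package_name, "version": version})
    return url, headers, body

# Example: submit lodash@latest for scanning (send with requests/httpx/curl).
url, headers, body = scan_request("lodash", "my-api-key")
```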
package/api/__init__.py
ADDED
File without changes
package/api/main.py
ADDED
@@ -0,0 +1,43 @@
"""
npm-scan REST API — FastAPI application.
Requires premium/enterprise license key for all endpoints.
"""

from fastapi import FastAPI, Depends, HTTPException, status
from fastapi.middleware.cors import CORSMiddleware
import os

from .routers import scans, webhooks, auth, health

app = FastAPI(
    title="npm-scan API",
    version=os.environ.get("npm_package_version", "0.5.0"),
    description="npm supply chain security scanner — REST API",
)

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

app.include_router(health.router, prefix="/api/v1", tags=["health"])
app.include_router(auth.router, prefix="/api/v1/auth", tags=["auth"])
app.include_router(scans.router, prefix="/api/v1", tags=["scans"])
app.include_router(webhooks.router, prefix="/api/v1", tags=["webhooks"])


def main():
    import uvicorn
    uvicorn.run(
        "api.main:app",
        host=os.environ.get("API_HOST", "0.0.0.0"),
        port=int(os.environ.get("API_PORT", "8000")),
        reload=os.environ.get("API_RELOAD", "false").lower() == "true",
    )


if __name__ == "__main__":
    main()
package/api/routers/__init__.py
ADDED
File without changes

package/api/routers/auth.py
ADDED
@@ -0,0 +1,35 @@
"""Authentication endpoints — login, register, API key management."""

from fastapi import APIRouter, HTTPException, Depends
from pydantic import BaseModel, EmailStr
from typing import Optional
import os

router = APIRouter()


class RegisterRequest(BaseModel):
    email: EmailStr
    name: str
    password: str
    team_name: Optional[str] = None


class LoginRequest(BaseModel):
    email: EmailStr
    password: str


class TokenResponse(BaseModel):
    access_token: str
    token_type: str = "bearer"


@router.post("/register", response_model=TokenResponse)
async def register(req: RegisterRequest):
    raise HTTPException(status_code=501, detail="Registration requires PostgreSQL backend — not yet connected")


@router.post("/login", response_model=TokenResponse)
async def login(req: LoginRequest):
    raise HTTPException(status_code=501, detail="Login requires PostgreSQL backend — not yet connected")

package/api/routers/scans.py
ADDED
@@ -0,0 +1,66 @@
"""Scan endpoints — submit, list, retrieve scans and findings."""

from fastapi import APIRouter, HTTPException, Query
from pydantic import BaseModel
from typing import Optional, List
from datetime import datetime

router = APIRouter()


class ScanRequest(BaseModel):
    package_name: str
    version: Optional[str] = "latest"


class Finding(BaseModel):
    atk_id: str
    severity: str
    title: Optional[str] = None
    description: Optional[str] = None
    evidence: Optional[str] = None


class Scan(BaseModel):
    id: str
    package_name: str
    version: str
    status: str
    scanned_at: datetime
    findings: List[Finding] = []


SCANS_DB: list[Scan] = []


@router.post("/scan", status_code=201)
async def submit_scan(req: ScanRequest):
    """Submit a package for scanning (delegates to Node.js CLI)."""
    raise HTTPException(
        status_code=501,
        detail="Scan execution requires async worker — use `npm-scan scan <package>` via CLI"
    )


@router.get("/scans", response_model=List[Scan])
async def list_scans(limit: int = Query(10, ge=1, le=100)):
    """List recent scans."""
    return SCANS_DB[-limit:][::-1]


@router.get("/scans/{scan_id}")
async def get_scan(scan_id: str):
    """Get scan details by ID."""
    for scan in SCANS_DB:
        if scan.id == scan_id:
            return scan
    raise HTTPException(status_code=404, detail=f"Scan {scan_id} not found")


@router.get("/scans/{scan_id}/findings")
async def get_findings(scan_id: str):
    """Get findings for a specific scan."""
    for scan in SCANS_DB:
        if scan.id == scan_id:
            return scan.findings
    raise HTTPException(status_code=404, detail=f"Scan {scan_id} not found")

package/api/routers/webhooks.py
ADDED
@@ -0,0 +1,78 @@
"""Webhook endpoints — register, list, delete webhooks."""

from fastapi import APIRouter, HTTPException
from pydantic import BaseModel, HttpUrl
from typing import List, Optional
from datetime import datetime
import hashlib
import hmac
import json
import os

router = APIRouter()


class WebhookCreate(BaseModel):
    url: HttpUrl
    events: List[str] = ["scan.completed", "finding.critical"]


class Webhook(BaseModel):
    id: str
    url: str
    events: List[str]
    active: bool = True
    secret: Optional[str] = None
    created_at: datetime


HOOKS_DB: list[Webhook] = []


@router.post("/webhooks", status_code=201)
async def create_webhook(hook: WebhookCreate):
    """Register a new webhook endpoint."""
    wh = Webhook(
        id=hash(str(hook.url) + str(datetime.now())),
        url=str(hook.url),
        events=list(set(hook.events)),
        secret=os.urandom(16).hex(),
        created_at=datetime.now(),
    )
    HOOKS_DB.append(wh)
    return wh


@router.get("/webhooks")
async def list_webhooks():
    """List all registered webhooks."""
    return HOOKS_DB


@router.delete("/webhooks/{hook_id}")
async def delete_webhook(hook_id: str):
    """Delete a webhook by ID."""
    for i, wh in enumerate(HOOKS_DB):
        if wh.id == hook_id:
            HOOKS_DB.pop(i)
            return {"deleted": hook_id}
    raise HTTPException(status_code=404, detail="Webhook not found")


async def dispatch_webhooks(event: str, payload: dict):
    """Dispatch an event to all subscribed webhooks (called by worker)."""
    import httpx

    for wh in HOOKS_DB:
        if not wh.active or event not in wh.events:
            continue
        body = json.dumps({"event": event, "payload": payload, "timestamp": datetime.now().isoformat()})
        sig = hmac.new(wh.secret.encode(), body.encode(), hashlib.sha256).hexdigest()
        async with httpx.AsyncClient() as client:
            try:
                await client.post(wh.url, content=body, headers={
                    "Content-Type": "application/json",
                    "X-Webhook-Signature": sig,
                })
            except Exception:
                pass  # log failure in production
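The dispatcher above signs each delivery with HMAC-SHA256 over the raw JSON body and sends the hex digest in `X-Webhook-Signature`. A receiver can verify a delivery by recomputing the same digest with the secret returned at registration time; a minimal sketch (the example secret and body are illustrative):

```python
import hashlib
import hmac

def verify_webhook(secret: str, body: bytes, signature: str) -> bool:
    """Recompute HMAC-SHA256 over the raw request body and compare in constant time."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Example delivery: `secret` is the value returned when the webhook was registered.
secret = "ab" * 16
body = b'{"event": "scan.completed", "payload": {}}'
sig = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
print(verify_webhook(secret, body, sig))  # → True
```

Verify against the raw bytes of the request body, not a re-serialized copy, since any whitespace or key-order change breaks the digest.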
package/backend/db/pg-schema.sql
ADDED
@@ -0,0 +1,155 @@
-- PostgreSQL schema for hosted/team tier (premium)
-- Extends the SQLite schema with teams, users, RBAC, audit logs, webhooks

-- Extensions
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
CREATE EXTENSION IF NOT EXISTS "pgcrypto";

-- Teams / Organizations
CREATE TABLE IF NOT EXISTS teams (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  name TEXT NOT NULL,
  slug TEXT UNIQUE NOT NULL,
  license_edition TEXT NOT NULL DEFAULT 'community',
  license_key TEXT,
  license_expires_at TIMESTAMPTZ,
  max_seats INTEGER NOT NULL DEFAULT 5,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Users
CREATE TABLE IF NOT EXISTS users (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  email TEXT UNIQUE NOT NULL,
  name TEXT NOT NULL,
  password_hash TEXT NOT NULL,
  team_id UUID REFERENCES teams(id) ON DELETE CASCADE,
  role TEXT NOT NULL CHECK (role IN ('admin', 'editor', 'viewer')) DEFAULT 'viewer',
  last_login_at TIMESTAMPTZ,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Scans (extends SQLite scans with team ownership)
CREATE TABLE IF NOT EXISTS scans (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  team_id UUID REFERENCES teams(id) ON DELETE CASCADE,
  user_id UUID REFERENCES users(id) ON DELETE SET NULL,
  package_name TEXT NOT NULL,
  version TEXT,
  status TEXT NOT NULL DEFAULT 'pending'
    CHECK (status IN ('pending', 'fetching', 'analyzing', 'completed', 'failed')),
  sbom_json JSONB,
  findings_summary JSONB,
  duration_ms INTEGER,
  scanned_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Findings
CREATE TABLE IF NOT EXISTS findings (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  scan_id UUID NOT NULL REFERENCES scans(id) ON DELETE CASCADE,
  atk_id TEXT NOT NULL,
  severity TEXT NOT NULL CHECK (severity IN ('info', 'low', 'medium', 'high', 'critical')),
  title TEXT,
  description TEXT,
  evidence TEXT,
  mitigation TEXT,
  file_path TEXT,
  line_number INTEGER
);

-- Indexes
CREATE INDEX IF NOT EXISTS idx_scans_team ON scans(team_id);
CREATE INDEX IF NOT EXISTS idx_scans_package ON scans(package_name);
CREATE INDEX IF NOT EXISTS idx_scans_status ON scans(status);
CREATE INDEX IF NOT EXISTS idx_scans_created ON scans(scanned_at DESC);
CREATE INDEX IF NOT EXISTS idx_findings_scan ON findings(scan_id);
CREATE INDEX IF NOT EXISTS idx_findings_atk ON findings(atk_id);
CREATE INDEX IF NOT EXISTS idx_findings_severity ON findings(severity);
CREATE INDEX IF NOT EXISTS idx_users_team ON users(team_id);

-- Audit log
CREATE TABLE IF NOT EXISTS audit_log (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  team_id UUID NOT NULL REFERENCES teams(id) ON DELETE CASCADE,
  user_id UUID REFERENCES users(id) ON DELETE SET NULL,
  action TEXT NOT NULL,
  resource_type TEXT NOT NULL,
  resource_id TEXT,
  details JSONB,
  ip_address INET,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_audit_team ON audit_log(team_id, created_at DESC);

-- Webhooks
CREATE TABLE IF NOT EXISTS webhooks (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  team_id UUID NOT NULL REFERENCES teams(id) ON DELETE CASCADE,
  url TEXT NOT NULL,
  secret TEXT NOT NULL DEFAULT encode(gen_random_bytes(32), 'hex'),
  events TEXT[] NOT NULL DEFAULT '{}',
  active BOOLEAN NOT NULL DEFAULT true,
  last_triggered_at TIMESTAMPTZ,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_webhooks_team ON webhooks(team_id);

-- API keys
CREATE TABLE IF NOT EXISTS api_keys (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  team_id UUID NOT NULL REFERENCES teams(id) ON DELETE CASCADE,
  user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
  name TEXT NOT NULL,
  key_hash TEXT NOT NULL,
  scopes TEXT[] NOT NULL DEFAULT '{}',
  last_used_at TIMESTAMPTZ,
  expires_at TIMESTAMPTZ,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_api_keys_team ON api_keys(team_id);

-- Session tokens
CREATE TABLE IF NOT EXISTS sessions (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
  token_hash TEXT NOT NULL,
  expires_at TIMESTAMPTZ NOT NULL,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_sessions_user ON sessions(user_id);
CREATE INDEX IF NOT EXISTS idx_sessions_expires ON sessions(expires_at);

-- Materialized view: package risk aggregation
CREATE MATERIALIZED VIEW IF NOT EXISTS package_risk AS
SELECT
  s.package_name,
  s.version,
  COUNT(DISTINCT f.id) AS finding_count,
  COUNT(DISTINCT f.id) FILTER (WHERE f.severity IN ('high', 'critical')) AS high_crit_count,
  ARRAY_AGG(DISTINCT f.atk_id) AS atk_ids,
  MAX(s.scanned_at) AS last_scanned
FROM scans s
JOIN findings f ON f.scan_id = s.id
WHERE s.status = 'completed'
GROUP BY s.package_name, s.version;

CREATE UNIQUE INDEX IF NOT EXISTS idx_package_risk_pkg ON package_risk(package_name, version);

-- Function: touch updated_at
CREATE OR REPLACE FUNCTION touch_updated_at()
RETURNS TRIGGER AS $$
BEGIN
  NEW.updated_at = NOW();
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_teams_updated_at
  BEFORE UPDATE ON teams
  FOR EACH ROW EXECUTE FUNCTION touch_updated_at();
package/backend/license.js
CHANGED
@@ -1,13 +1,85 @@
+import { createHmac, timingSafeEqual } from 'crypto';
+
+const HMAC_KEY = process.env.NPM_SCAN_LICENSE_SECRET || 'npm-scan-default-dev-key';
+
+const FEATURE_TIERS = {
+  community: [],
+  premium: ['sandbox', 'siem', 'cra', 'nist-pdf', 'rest-api', 'webhooks', 'helm'],
+  enterprise: ['sandbox', 'siem', 'cra', 'nist-pdf', 'rest-api', 'webhooks', 'helm', 'sso', 'audit-logs', 'pg-backend', 'kubernetes'],
+};
+
+const ALL_FEATURES = Object.values(FEATURE_TIERS).flat();
+const ALLOWED_UNLOCKED = ['sbom', 'nist-html', 'html-report', 'sqlite'];
+
+function generateSignature(payload) {
+  return createHmac('sha256', HMAC_KEY).update(JSON.stringify(payload)).digest('hex');
+}
+
+export function generateKey(edition, options = {}) {
+  const payload = {
+    edition,
+    issued: new Date().toISOString(),
+    exp: options.expiresAt || null,
+    seats: options.seats || 1,
+    org: options.org || null,
+  };
+  const sig = generateSignature(payload);
+  const encoded = Buffer.from(JSON.stringify(payload)).toString('base64url');
+  return `npm-scan-${edition}-${encoded}.${sig}`;
+}
+
 export function validateLicense(key, feature = '*') {
-  if (!key
-  throw new Error(
+  if (!key) {
+    throw new Error('No license key provided');
+  }
+
+  if (feature === 'scan' || ALLOWED_UNLOCKED.includes(feature)) {
+    return { edition: 'community', features: ALL_FEATURES };
+  }
+
+  const parts = key.split('-');
+  if (parts.length < 4 || !key.includes('.')) {
+    throw new Error('Invalid license key format');
+  }
+
+  const edition = parts[2];
+  const encodedPayload = parts.slice(3).join('-').split('.')[0];
+  const sig = key.split('.')[1];
+
+  let payload;
+  try {
+    payload = JSON.parse(Buffer.from(encodedPayload, 'base64url').toString('utf8'));
+  } catch {
+    throw new Error('Invalid license key payload');
   }
-
+
+  const expectedSig = generateSignature(payload);
+  const sigBuf = Buffer.from(sig, 'hex');
+  const expectedBuf = Buffer.from(expectedSig, 'hex');
+  if (sigBuf.length !== expectedBuf.length || !timingSafeEqual(sigBuf, expectedBuf)) {
+    throw new Error('Invalid license key signature');
+  }
+
+  if (payload.exp && new Date(payload.exp) < new Date()) {
+    throw new Error('License key expired');
+  }
+
+  const allowed = FEATURE_TIERS[edition];
+  if (!allowed) {
+    throw new Error(`Unknown license edition: ${edition}`);
+  }
+
+  if (feature !== '*' && !allowed.includes(feature) && !ALLOWED_UNLOCKED.includes(feature)) {
+    throw new Error(`Feature "${feature}" requires ${edition === 'community' ? 'premium' : 'enterprise'} license`);
+  }
+
+  return { edition, features: allowed, ...payload };
 }
 
 export function isFeatureEnabled(feature, licenseKey = process.env.NPM_SCAN_LICENSE_KEY) {
   try {
-
+    validateLicense(licenseKey, feature);
+    return true;
   } catch {
     return false;
   }
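The key format here is `npm-scan-<edition>-<base64url(payload)>.<hmac-sha256-hex>`. A Python sketch of the same generate/verify round trip (the payload is trimmed to a few fields for illustration, and it reuses the module's dev fallback secret, which real deployments override via `NPM_SCAN_LICENSE_SECRET`):

```python
import base64
import hashlib
import hmac
import json

# Assumption: the dev fallback secret from license.js; override in real deployments.
HMAC_KEY = b"npm-scan-default-dev-key"

def _b64url(data: bytes) -> str:
    # Node's 'base64url' is unpadded urlsafe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def generate_key(edition: str, exp=None, seats: int = 1) -> str:
    # Payload trimmed for illustration; the JS version also records issued/org.
    raw = json.dumps({"edition": edition, "exp": exp, "seats": seats},
                     separators=(",", ":")).encode()
    sig = hmac.new(HMAC_KEY, raw, hashlib.sha256).hexdigest()
    return f"npm-scan-{edition}-{_b64url(raw)}.{sig}"

def check_key(key: str) -> dict:
    prefix, _, sig = key.partition(".")
    encoded = prefix.split("-", 3)[3]  # edition sits at parts[2], payload after it
    raw = base64.urlsafe_b64decode(encoded + "=" * (-len(encoded) % 4))
    expected = hmac.new(HMAC_KEY, raw, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("Invalid license key signature")
    return json.loads(raw)

key = generate_key("premium")
print(check_key(key)["edition"])  # → premium
```

Because the signature covers the JSON payload, any edit to the edition, seats, or expiry invalidates the key.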
package/cli/cli.js
CHANGED
@@ -1,11 +1,21 @@
 #!/usr/bin/env node
 
 import { Command } from 'commander';
+import { isFeatureEnabled, generateKey } from '../backend/license.js';
+
+function requirePremium(feature, licenseKey) {
+  if (!isFeatureEnabled(feature, licenseKey)) {
+    console.error(`Error: "${feature}" requires a premium license key.`);
+    console.error(`  Pass --license-key <key> or set NPM_SCAN_LICENSE_KEY env var.`);
+    console.error(`  Generate a dev key: require('@lateos/npm-scan/backend/license').generateKey('premium')`);
+    process.exit(1);
+  }
+}
 
 const program = new Command()
   .name('npm-scan')
   .description('npm supply chain security scanner')
-  .version('0.
+  .version('0.5.0');
 
 program
   .command('scan')
@@ -47,13 +57,15 @@ program
   .description('Generate report')
   .option('-i, --id <id>', 'Scan ID')
   .option('--sbom [format]', 'SBOM format (json/xml/spdx)')
-  .option('--html', 'HTML report')
+  .option('--html', 'HTML report')
   .option('--nist', 'NIST 800-161 compliance report')
   .option('--cra', 'EU CRA compliance report')
   .option('--siem <format>', 'SIEM format (cef)')
   .option('-l, --license-key <key>', 'Premium license')
   .action(async (options) => {
+    const licenseKey = options.licenseKey || process.env.NPM_SCAN_LICENSE_KEY;
     const { getRecentScans, getFindings, getScan } = await import('../backend/db.js');
+
     if (options.id) {
       const findings = getFindings(options.id);
       const scanInfo = getScan(options.id);
@@ -61,16 +73,19 @@ program
       const pkgVer = scanInfo?.version || 'unknown';
       const pkg = { name: pkgName, version: pkgVer };
       const scan = findings.length ? { package_name: pkgName, version: pkgVer, findings } : null;
-
-
-
-      console.log(sbom);
-      } else if (options.siem) {
+
+      if (options.siem) {
+        requirePremium('siem', licenseKey);
         const { generateSIEM } = await import('../backend/siem/index.js');
         console.log(generateSIEM(scan ? [scan] : [], options.siem));
       } else if (options.cra) {
+        requirePremium('cra', licenseKey);
         const { generateCRA } = await import('../backend/cra.js');
         console.log(generateCRA(scan ? [scan] : []));
+      } else if (options.sbom) {
+        const { generateSBOM } = await import('../backend/sbom.js');
+        const sbom = generateSBOM(pkg, findings, options.sbom === true ? 'json' : options.sbom);
+        console.log(sbom);
       } else if (options.html || options.nist) {
         const { generateHTML } = await import('../backend/report.js');
         const html = generateHTML(scan ? [scan] : []);
@@ -81,10 +96,13 @@ program
     } else {
       const scans = getRecentScans();
       const scansWithFindings = scans.map(s => ({ ...s, findings: getFindings(s.id) }));
-
+
+      if (options.siem) {
+        requirePremium('siem', licenseKey);
         const { generateSIEM } = await import('../backend/siem/index.js');
         console.log(generateSIEM(scansWithFindings, options.siem));
       } else if (options.cra) {
+        requirePremium('cra', licenseKey);
         const { generateCRA } = await import('../backend/cra.js');
         console.log(generateCRA(scansWithFindings));
       } else if (options.html || options.nist) {

package/deploy/helm/npm-scan/Chart.yaml
ADDED
@@ -0,0 +1,16 @@
apiVersion: v2
name: npm-scan
description: npm supply chain security scanner — Helm chart for Kubernetes deployment
type: application
version: 0.5.0
appVersion: "0.5.0"
keywords:
  - npm
  - security
  - supply-chain
  - scanner
sources:
  - https://github.com/YOUR_GITHUB_USERNAME/npm-scan
maintainers:
  - name: Lateos
    email: hello@lateos.ai

package/deploy/helm/npm-scan/templates/_helpers.tpl
ADDED
@@ -0,0 +1,9 @@
{{- define "npm-scan.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }}
{{- end }}

{{- define "npm-scan.labels" -}}
helm.sh/chart: {{ .Chart.Name }}-{{ .Chart.Version }}
app.kubernetes.io/instance: {{ .Release.Name }}
app.kubernetes.io/managed-by: {{ .Release.Service }}
{{- end }}

package/deploy/helm/npm-scan/templates/api.yaml
ADDED
@@ -0,0 +1,66 @@
apiVersion: apps/v1
kind: Deployment
metadata:
  name: {{ include "npm-scan.name" . }}-api
  labels:
    app: {{ include "npm-scan.name" . }}-api
    {{- include "npm-scan.labels" . | nindent 4 }}
spec:
  replicas: {{ .Values.api.replicas }}
  selector:
    matchLabels:
      app: {{ include "npm-scan.name" . }}-api
  template:
    metadata:
      labels:
        app: {{ include "npm-scan.name" . }}-api
    spec:
      containers:
        - name: api
          image: "{{ .Values.image.repository }}:{{ .Values.image.tag }}"
          imagePullPolicy: {{ .Values.image.pullPolicy }}
          command: ["python", "-m", "api.main"]
          ports:
            - containerPort: {{ .Values.api.port }}
          env:
            - name: API_PORT
              value: "{{ .Values.api.port }}"
            - name: API_HOST
              value: "{{ .Values.api.host }}"
            - name: NPM_SCAN_LICENSE_KEY
              valueFrom:
                secretKeyRef:
                  name: {{ include "npm-scan.name" . }}-license
                  key: key
                  optional: true
            {{- if .Values.postgresql.enabled }}
            - name: PG_HOST
              value: "{{ .Values.postgresql.host }}"
            - name: PG_PORT
              value: "{{ .Values.postgresql.port }}"
            - name: PG_DATABASE
              value: "{{ .Values.postgresql.database }}"
            - name: PG_USERNAME
              value: "{{ .Values.postgresql.username }}"
            - name: PG_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: {{ .Values.postgresql.existingSecret | default (printf "%s-pg" (include "npm-scan.name" .)) }}
                  key: password
                  optional: true
            {{- end }}
          resources: {{- toYaml .Values.api.resources | nindent 12 }}
---
apiVersion: v1
kind: Service
metadata:
  name: {{ include "npm-scan.name" . }}-api
  labels:
    app: {{ include "npm-scan.name" . }}-api
spec:
  type: {{ .Values.service.type }}
  ports:
    - port: {{ .Values.service.port }}
      targetPort: {{ .Values.api.port }}
  selector:
    app: {{ include "npm-scan.name" . }}-api

package/deploy/helm/npm-scan/templates/ingress.yaml
ADDED
package/deploy/helm/npm-scan/templates/ingress.yaml
ADDED
@@ -0,0 +1,28 @@
+{{- if .Values.ingress.enabled -}}
+apiVersion: networking.k8s.io/v1
+kind: Ingress
+metadata:
+  name: {{ include "npm-scan.name" . }}
+  labels: {{- include "npm-scan.labels" . | nindent 4 }}
+  {{- with .Values.ingress.annotations }}
+  annotations: {{- toYaml . | nindent 4 }}
+  {{- end }}
+spec:
+  {{- with .Values.ingress.className }}
+  ingressClassName: {{ . }}
+  {{- end }}
+  rules:
+    - host: {{ .Values.ingress.host | quote }}
+      http:
+        paths:
+          - path: /
+            pathType: Prefix
+            backend:
+              service:
+                name: {{ include "npm-scan.name" . }}-api
+                port:
+                  number: {{ .Values.service.port }}
+  {{- with .Values.ingress.tls }}
+  tls: {{- toYaml . | nindent 4 }}
+  {{- end }}
+{{- end }}
package/deploy/helm/npm-scan/templates/postgresql.yaml
ADDED
@@ -0,0 +1,67 @@
+{{- if .Values.postgresql.enabled }}
+apiVersion: apps/v1
+kind: Deployment
+metadata:
+  name: {{ include "npm-scan.name" . }}-postgresql
+  labels:
+    app: {{ include "npm-scan.name" . }}-postgresql
+spec:
+  replicas: 1
+  selector:
+    matchLabels:
+      app: {{ include "npm-scan.name" . }}-postgresql
+  template:
+    metadata:
+      labels:
+        app: {{ include "npm-scan.name" . }}-postgresql
+    spec:
+      containers:
+        - name: postgresql
+          image: postgres:16-alpine
+          ports:
+            - containerPort: 5432
+          env:
+            - name: POSTGRES_DB
+              value: "{{ .Values.postgresql.database }}"
+            - name: POSTGRES_USER
+              value: "{{ .Values.postgresql.username }}"
+            - name: POSTGRES_PASSWORD
+              valueFrom:
+                secretKeyRef:
+                  name: {{ include "npm-scan.name" . }}-pg
+                  key: password
+          {{- if .Values.persistence.enabled }}
+          volumeMounts:
+            - name: data
+              mountPath: /var/lib/postgresql/data
+      volumes:
+        - name: data
+          persistentVolumeClaim:
+            claimName: {{ include "npm-scan.name" . }}-pg
+      {{- end }}
+---
+apiVersion: v1
+kind: Service
+metadata:
+  name: {{ include "npm-scan.name" . }}-postgresql
+spec:
+  ports:
+    - port: 5432
+  selector:
+    app: {{ include "npm-scan.name" . }}-postgresql
+---
+{{- if .Values.persistence.enabled }}
+apiVersion: v1
+kind: PersistentVolumeClaim
+metadata:
+  name: {{ include "npm-scan.name" . }}-pg
+spec:
+  accessModes: [ReadWriteOnce]
+  resources:
+    requests:
+      storage: {{ .Values.persistence.size }}
+  {{- with .Values.persistence.storageClass }}
+  storageClassName: {{ . }}
+  {{- end }}
+{{- end }}
+{{- end }}
package/deploy/helm/npm-scan/templates/secrets.yaml
ADDED
@@ -0,0 +1,19 @@
+apiVersion: v1
+kind: Secret
+metadata:
+  name: {{ include "npm-scan.name" . }}-license
+  labels: {{- include "npm-scan.labels" . | nindent 4 }}
+type: Opaque
+stringData:
+  key: "{{ .Values.license.key }}"
+---
+{{- if not .Values.postgresql.existingSecret }}
+apiVersion: v1
+kind: Secret
+metadata:
+  name: {{ include "npm-scan.name" . }}-pg
+  labels: {{- include "npm-scan.labels" . | nindent 4 }}
+type: Opaque
+stringData:
+  password: "{{ .Values.postgresql.password }}"
+{{- end }}
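When `postgresql.existingSecret` is set, the chart skips rendering its own `-pg` secret and the API Deployment reads the `password` key from the named secret instead. A minimal sketch of such a user-managed secret (the name `my-pg-creds` and the password value are illustrative, not part of the chart):

```yaml
# Hypothetical pre-created secret, referenced via
#   --set postgresql.existingSecret=my-pg-creds
# The chart's secretKeyRef expects the password under the "password" key.
apiVersion: v1
kind: Secret
metadata:
  name: my-pg-creds
type: Opaque
stringData:
  password: change-me
```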
package/deploy/helm/npm-scan/templates/worker.yaml
ADDED
@@ -0,0 +1,32 @@
+{{- if .Values.worker.enabled }}
+apiVersion: apps/v1
+kind: Deployment
+metadata:
+  name: {{ include "npm-scan.name" . }}-worker
+  labels:
+    app: {{ include "npm-scan.name" . }}-worker
+    {{- include "npm-scan.labels" . | nindent 4 }}
+spec:
+  replicas: {{ .Values.worker.replicas }}
+  selector:
+    matchLabels:
+      app: {{ include "npm-scan.name" . }}-worker
+  template:
+    metadata:
+      labels:
+        app: {{ include "npm-scan.name" . }}-worker
+    spec:
+      containers:
+        - name: worker
+          image: "{{ .Values.image.repository }}:{{ .Values.image.tag }}"
+          imagePullPolicy: {{ .Values.image.pullPolicy }}
+          command: ["node", "cli/cli.js"]
+          env:
+            - name: NPM_SCAN_LICENSE_KEY
+              valueFrom:
+                secretKeyRef:
+                  name: {{ include "npm-scan.name" . }}-license
+                  key: key
+                  optional: true
+          resources: {{- toYaml .Values.worker.resources | nindent 12 }}
+{{- end }}
package/deploy/helm/npm-scan/values.yaml
ADDED
@@ -0,0 +1,73 @@
+# Helm values for npm-scan
+# Override per environment: helm install -f values-prod.yaml
+
+image:
+  repository: ghcr.io/lateos/npm-scan
+  tag: latest
+  pullPolicy: Always
+
+replicaCount: 1
+
+license:
+  # --license-key or NPM_SCAN_LICENSE_KEY env var
+  key: ""
+  secret: ""
+
+postgresql:
+  enabled: true
+  host: ""
+  port: 5432
+  database: npm_scan
+  username: npm_scan
+  password: ""
+  existingSecret: ""
+
+api:
+  enabled: true
+  port: 8000
+  host: 0.0.0.0
+  replicas: 1
+  corsOrigins: ["*"]
+  resources:
+    requests:
+      cpu: 100m
+      memory: 128Mi
+    limits:
+      cpu: 500m
+      memory: 512Mi
+
+worker:
+  enabled: true
+  replicas: 2
+  resources:
+    requests:
+      cpu: 200m
+      memory: 256Mi
+    limits:
+      cpu: 1
+      memory: 1Gi
+
+ingress:
+  enabled: false
+  className: ""
+  annotations: {}
+  host: npm-scan.example.com
+  tls: []
+
+service:
+  type: ClusterIP
+  port: 80
+
+persistence:
+  enabled: true
+  size: 10Gi
+  storageClass: ""
+
+nodeSelector: {}
+tolerations: []
+affinity: {}
+
+redis:
+  enabled: false
+  host: ""
+  port: 6379
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@lateos/npm-scan",
-  "version": "0.4.0",
+  "version": "0.5.0",
   "description": "Powerful npm supply chain security scanner - detects malicious packages (Shai-Hulud style), behavioral analysis, SBOM, and compliance reporting.",
   "main": "backend/index.js",
   "bin": {