wolverine-ai 1.1.0 → 1.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -10,24 +10,29 @@ Built on patterns from [claw-code](https://github.com/instructkr/claw-code) —
 
  ## Quick Start
 
+ ### Install from npm
+
  ```bash
- # Install from npm
  npm i wolverine-ai
+ cp node_modules/wolverine-ai/.env.example .env.local
+ # Edit .env.local — add your OPENAI_API_KEY
+ npx wolverine server/index.js
+ ```
+
+ ### Or clone from GitHub
 
- # Or clone from GitHub
+ ```bash
  git clone https://github.com/bobbyswhip/Wolverine.git
  cd Wolverine
  npm install
-
- # Configure
  cp .env.example .env.local
- # Edit .env.local — add your OPENAI_API_KEY and generate an ADMIN_KEY
-
- # Run
+ # Edit .env.local — add your OPENAI_API_KEY
  npm start
- # or: npx wolverine server/index.js
  ```
 
+ [![npm](https://img.shields.io/npm/v/wolverine-ai)](https://www.npmjs.com/package/wolverine-ai)
+ [![GitHub](https://img.shields.io/github/stars/bobbyswhip/Wolverine)](https://github.com/bobbyswhip/Wolverine)
+
  Dashboard opens at `http://localhost:PORT+1`. Server runs on `PORT`.
 
  ### Try a Demo
@@ -68,7 +73,7 @@ wolverine/
  │ │ ├── ai-client.js ← OpenAI client (Chat + Responses API)
  │ │ ├── models.js ← 10-model configuration system
  │ │ ├── verifier.js ← Fix verification (syntax + boot probe)
- │ │ ├── error-parser.js ← Stack trace parsing
+ │ │ ├── error-parser.js ← Stack trace parsing + error classification
  │ │ ├── patcher.js ← File patching with sandbox
  │ │ ├── health-monitor.js← PM2-style health checks
  │ │ ├── config.js ← Config loader (settings.json + env)
@@ -135,26 +140,38 @@ wolverine/
 
  ```
  Server crashes
- → Error parsed (file, line, message)
+ → Error parsed (file, line, message, errorType)
+ → Error classified: missing_module | missing_file | permission | port_conflict | syntax | runtime
  → Secrets redacted from error output
  → Prompt injection scan (AUDIT_MODEL)
  → Human-required check (expired keys, service down → notify, don't waste tokens)
  → Rate limit check (error loop → exponential backoff)
 
+ Operational Fix (zero AI tokens):
+ → "Cannot find module 'cors'" → npm install cors (instant, free)
+ → ENOENT on config file → create missing file with defaults
+ → EACCES/EPERM → chmod 755
+ → If operational fix works → done. No AI needed.
+
  Goal Loop (iterate until fixed or exhausted):
  Iteration 1: Fast path (CODING_MODEL, single file, ~1-2k tokens)
- Apply patch → Verify (syntax check + boot probe) → Pass? Done.
+ AI returns code changes AND/OR shell commands (npm install, mkdir, etc.)
+ → Execute commands first, apply patches second
+ → Verify (syntax check + boot probe) → Pass? Done.
  Iteration 2: Single agent (REASONING_MODEL, multi-file, 10 tools)
- Explores codebase → Fix → Verify → Pass? Done.
+ Agent has error pattern → fix strategy table
+ → Uses bash_exec for npm install, chmod, config creation
+ → Uses edit_file for code fixes
+ → Verify → Pass? Done.
  Iteration 3: Sub-agents (explore → plan → fix)
  → Explorer finds relevant files (read-only)
- → Planner proposes fix strategy (read-only)
- → Fixer executes the plan (write access)
+ → Planner considers operational vs code fixes
+ → Fixer has bash_exec + file tools (can npm install AND edit code)
  → Deep research (RESEARCH_MODEL) feeds into context
  → Each failure feeds into the next attempt
 
  After fix:
- → Record to repair history (error, resolution, tokens, cost)
+ → Record to repair history (error, resolution, tokens, cost, mode)
  → Store in brain for future reference
  → Promote backup to stable after 30min uptime
  ```
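
The escalation order above (free operational fix first, then increasingly expensive AI iterations, each failure feeding the next attempt) can be sketched as follows. This is an illustrative sketch, not wolverine's actual API; `selfHeal` and the strategy shape are hypothetical.

```javascript
// Hypothetical sketch of the escalation loop: try the cheapest strategy
// first; on failure, record the attempt so the next strategy sees it.
async function selfHeal(error, strategies) {
  const context = { error, attempts: [] };
  for (const strategy of strategies) {
    const result = await strategy(context);
    if (result.fixed) return result;   // verified fix → done
    context.attempts.push(result);     // failure feeds the next attempt
  }
  return { fixed: false, attempts: context.attempts };
}
```

A caller would pass strategies in cost order, e.g. `[tryOperationalFix, fastPath, singleAgent, subAgents]` (names assumed for illustration).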
@@ -194,7 +211,7 @@ For complex repairs, wolverine spawns specialized sub-agents that run in sequenc
  |-------|--------|-------|------|
  | `explore` | Read-only | REASONING | Investigate codebase, find relevant files |
  | `plan` | Read-only | REASONING | Analyze problem, propose fix strategy |
- | `fix` | Read+write | CODING | Execute targeted fix from plan |
+ | `fix` | Read+write+shell | CODING | Execute targeted fix — code edits AND npm install/chmod |
  | `verify` | Read-only | REASONING | Check if fix actually works |
  | `research` | Read-only | RESEARCH | Search brain + web for solutions |
  | `security` | Read-only | AUDIT | Audit code for vulnerabilities |
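
The sequential pipeline implied by this table can be sketched like so; stage names match the table, but `runPipeline` and the `runAgent` callback are hypothetical, not the package's real orchestrator.

```javascript
// Illustrative sketch: each sub-agent stage runs in sequence with its own
// access level, and each stage's report becomes context for the next.
const PIPELINE = [
  { name: "explore", access: "read-only" },
  { name: "plan", access: "read-only" },
  { name: "fix", access: "read+write+shell" },
  { name: "verify", access: "read-only" },
];

async function runPipeline(problem, runAgent) {
  // runAgent(stage, context) is caller-supplied and returns the stage's report
  let context = problem;
  for (const stage of PIPELINE) {
    context = await runAgent(stage, context);
  }
  return context;
}
```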
@@ -219,8 +236,7 @@ Real-time web UI at `http://localhost:PORT+1`:
  | **Performance** | Endpoint response times, request rates, error rates |
  | **Command** | Admin chat interface — ask questions or build features |
  | **Analytics** | Memory/CPU charts, route health, per-route response times + trends |
- | **Command** | Admin chat interface — ask questions or build features |
- | **Backups** | Full server/ snapshot history with status badges |
+ | **Backups** | Full backup management: rollback/hot-load buttons, undo, rollback log, admin IP allowlist |
  | **Brain** | Vector store stats (23 seed docs), namespace counts, function map |
  | **Repairs** | Error/resolution audit trail: error, fix, tokens, cost, duration |
  | **Tools** | Agent tool harness listing (10 built-in + MCP) |
@@ -236,7 +252,7 @@ Three routes (AI-classified per command):
  | **TOOLS** | TOOL_MODEL | call_endpoint, read_file, search_brain | Live data, file contents |
  | **AGENT** | CODING_MODEL | Full 10-tool harness | Build features, fix code |
 
- Secured with `WOLVERINE_ADMIN_KEY` + localhost-only IP check.
+ Secured with `WOLVERINE_ADMIN_KEY` + IP allowlist (localhost + `WOLVERINE_ADMIN_IPS`).
 
  ---
 
@@ -268,7 +284,7 @@ Reasoning models (`o-series`, `gpt-5-nano`) automatically get 4x token limits to
  | **Injection Detector** | Regex layer + AI audit (AUDIT_MODEL) on every error before repair |
  | **Sandbox** | All file operations locked to project directory, symlink escape detection |
  | **Protected Paths** | Agent blocked from modifying wolverine internals (`src/`, `bin/`, etc.) |
- | **Admin Auth** | Dashboard command interface requires key + localhost IP, timing-safe comparison, lockout after 10 failures |
+ | **Admin Auth** | Dashboard requires key + IP allowlist. Localhost always allowed. Remote IPs via `WOLVERINE_ADMIN_IPS` env var or `POST /api/admin/add-ip` at runtime. Timing-safe comparison, lockout after 10 failures |
  | **Rate Limiter** | Sliding window, min gap, hourly budget, exponential backoff on error loops |
  | **MCP Security** | Per-server tool allowlists, arg sanitization, result injection scanning |
  | **SQL Skill** | `sqlGuard()` middleware blocks 15 injection pattern families on all endpoints |
@@ -404,14 +420,29 @@ All demos use the `server/` directory pattern. Each demo:
 
  ## Backup System
 
- Full `server/` directory snapshots:
+ Full `server/` directory snapshots with lifecycle management:
 
- - Created before every repair attempt and every smart edit
+ - Created before every repair attempt and every smart edit (with reason string)
+ - Created on graceful shutdown (`createShutdownBackup()`)
  - Includes all files: `.js`, `.json`, `.sql`, `.db`, `.yaml`, configs
  - **Status lifecycle**: UNSTABLE → VERIFIED (fix passed) → STABLE (30min+ uptime)
- - **Retention**: unstable pruned after 7 days, stable keeps 1/day after 7 days
+ - **Retention**: unstable/verified pruned after 7 days, stable keeps 1/day after 7 days
  - Atomic writes prevent corruption on kill
 
+ **Rollback & Recovery:**
+
+ | Action | What it does |
+ |--------|-------------|
+ | **Rollback** | Restore any backup — creates a pre-rollback safety backup first, restarts server |
+ | **Undo Rollback** | Restore the pre-rollback state if the rollback made things worse |
+ | **Hot-load** | Load any backup as the current server state from the dashboard |
+ | **Rollback Log** | Full audit trail: timestamp, action, target backup, success/failure |
+
+ **Dashboard endpoints** (admin auth required):
+ - `POST /api/backups/:id/rollback` — rollback to specific backup
+ - `POST /api/backups/:id/hotload` — hot-load backup as current state
+ - `POST /api/backups/undo` — undo the last rollback
+
  ---
 
  ## Skills
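
The UNSTABLE → VERIFIED → STABLE lifecycle above can be sketched as a pure transition function. The 30-minute threshold comes from the README; the field names (`fixPassed`, `verifiedAt`) are hypothetical simplifications of the real manifest entries.

```javascript
// Sketch of the backup status lifecycle: a fix that passes verification
// promotes unstable → verified; 30+ minutes of uptime promotes → stable.
const STABILITY_MS = 30 * 60 * 1000;

function nextStatus(entry, now = Date.now()) {
  if (entry.status === "unstable" && entry.fixPassed) return "verified";
  if (entry.status === "verified" && now - entry.verifiedAt >= STABILITY_MS) return "stable";
  return entry.status; // no transition due
}
```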
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "wolverine-ai",
- "version": "1.1.0",
+ "version": "1.3.0",
  "description": "Self-healing Node.js server framework powered by AI. Catches crashes, diagnoses errors, generates fixes, verifies, and restarts — automatically.",
  "main": "src/index.js",
  "bin": {
@@ -262,11 +262,27 @@ Use these tools systematically:
  5. You can edit ANY file type: .js, .json, .sql, .yaml, .env, .dockerfile, .sh, etc.
  6. Prefer edit_file for small targeted fixes, write_file for major changes
  7. Use grep_code to find all usages before renaming something
- 8. Use bash_exec to run tests or check dependencies
+ 8. Use bash_exec to run tests, install packages, or check dependencies
+
+ CRITICAL — Not every crash is a code bug. Choose the right fix:
+
+ | Error Pattern | Root Cause | Correct Fix |
+ |---|---|---|
+ | Cannot find module 'X' | Missing npm package | bash_exec: npm install X |
+ | Cannot find module './X' | Wrong import path | edit_file: fix the require/import path |
+ | ENOENT: no such file | Missing config/data file | write_file: create the missing file |
+ | EACCES/EPERM | Permission denied | bash_exec: chmod or fix ownership |
+ | EADDRINUSE | Port conflict | bash_exec: kill process on port, or edit config |
+ | SyntaxError | Bad code | edit_file: fix the syntax |
+ | TypeError/ReferenceError | Logic bug | edit_file: fix the code |
+ | MODULE_NOT_FOUND + node_modules | Corrupted install | bash_exec: rm -rf node_modules && npm install |
+
+ ALWAYS check package.json before editing imports. If a module isn't a local file, use bash_exec to install it.
 
  Rules:
  - Read files before modifying them
  - Make minimal, targeted changes
+ - Use bash_exec for operational fixes (npm install, chmod, config creation)
  - When done, call the "done" tool with a summary
 
  Project root: ${this.cwd}
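
The error-pattern table in that prompt amounts to a table-driven tool choice. A minimal sketch, assuming a hypothetical `pickTool` helper (order matters: local-path imports must be tested before the generic missing-module pattern):

```javascript
// Map a crash message to the first tool an agent should reach for.
const FIX_TABLE = [
  { pattern: /Cannot find module '\.{1,2}\//, tool: "edit_file" },  // wrong local import path
  { pattern: /Cannot find module '/, tool: "bash_exec" },           // missing npm package
  { pattern: /ENOENT/, tool: "write_file" },                        // missing config/data file
  { pattern: /EACCES|EPERM/, tool: "bash_exec" },                   // chmod / fix ownership
  { pattern: /EADDRINUSE/, tool: "bash_exec" },                     // free the port
  { pattern: /SyntaxError|TypeError|ReferenceError/, tool: "edit_file" },
];

function pickTool(errorMessage) {
  const row = FIX_TABLE.find(r => r.pattern.test(errorMessage));
  return row ? row.tool : "edit_file"; // default: treat as a code bug
}
```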
@@ -1,6 +1,7 @@
  const chalk = require("chalk");
  const { aiCall } = require("../core/ai-client");
  const { getModel } = require("../core/models");
+ const { redact } = require("../security/secret-redactor");
 
  /**
  * Research Agent — deep research + learning from experience.
@@ -18,7 +19,6 @@ class ResearchAgent {
  constructor(options = {}) {
  this.brain = options.brain;
  this.logger = options.logger;
- this.redactor = options.redactor;
  }
 
  /**
@@ -50,8 +50,8 @@ class ResearchAgent {
  async recordAttempt({ errorMessage, filePath, fix, success, explanation }) {
  if (!this.brain || !this.brain._initialized) return;
 
- const safeError = this.redactor ? this.redactor.redact(errorMessage) : errorMessage;
- const safeExplanation = this.redactor ? this.redactor.redact(explanation || fix || "") : (explanation || fix || "");
+ const safeError = redact(errorMessage);
+ const safeExplanation = redact(explanation || fix || "");
 
  const namespace = success ? "fixes" : "errors";
  const prefix = success ? "FIXED" : "FAILED";
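
The `redact()` imported above is not shown in this diff; a hypothetical sketch of what such a module-level redactor might do (the real secret-redactor's patterns are unknown, these two are illustrative):

```javascript
// Strip secret-shaped substrings from error text before it is stored.
function redact(text) {
  return String(text)
    .replace(/sk-[A-Za-z0-9_-]{10,}/g, "[REDACTED]")           // OpenAI-style keys
    .replace(/(api[_-]?key\s*[=:]\s*)\S+/gi, "$1[REDACTED]");  // key=value pairs
}
```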
@@ -66,7 +66,7 @@ class ResearchAgent {
  * Stores findings in brain for future reference.
  */
  async research(errorMessage, context) {
- const safeError = this.redactor ? this.redactor.redact(errorMessage) : errorMessage;
+ const safeError = redact(errorMessage);
 
  console.log(chalk.magenta(` 🔬 Deep research (${getModel("research")})...`));
 
@@ -25,7 +25,7 @@ const { getModel } = require("../core/models");
  const AGENT_TOOL_SETS = {
  explore: ["read_file", "glob_files", "grep_code", "git_log", "git_diff", "done"],
  plan: ["read_file", "glob_files", "grep_code", "search_brain", "done"],
- fix: ["read_file", "write_file", "edit_file", "glob_files", "grep_code", "done"],
+ fix: ["read_file", "write_file", "edit_file", "glob_files", "grep_code", "bash_exec", "done"],
  verify: ["read_file", "glob_files", "grep_code", "bash_exec", "done"],
  research: ["read_file", "grep_code", "web_fetch", "search_brain", "done"],
  security: ["read_file", "glob_files", "grep_code", "done"],
@@ -46,8 +46,8 @@ const AGENT_CONFIGS = {
  // System prompts per agent type
  const AGENT_PROMPTS = {
  explore: "You are an Explorer agent. Your job is to investigate the codebase and find files relevant to the problem. Read files, search for patterns, check git history. Report what you found — do NOT make changes.",
- plan: "You are a Planner agent. Your job is to analyze the problem and propose a fix strategy. Read the relevant files, understand the root cause, and describe step-by-step what needs to change. Do NOT make changes.",
- fix: "You are a Fixer agent. You receive a specific fix plan. Execute it precisely — edit only the files mentioned, make only the changes described. Use edit_file for surgical changes.",
+ plan: "You are a Planner agent. Your job is to analyze the problem and propose a fix strategy. Read the relevant files, understand the root cause, and describe step-by-step what needs to change. Consider: is this a code bug (edit files) or an operational issue (npm install, create missing config, fix permissions)? Check package.json for dependencies. Do NOT make changes.",
+ fix: "You are a Fixer agent. You receive a specific fix plan. Execute it precisely. Use edit_file for code fixes, bash_exec for operational fixes (npm install, chmod, mkdir, config creation). Not every error is a code bug — missing modules need npm install, missing files need creation, permission errors need chmod. Check package.json before editing imports.",
  verify: "You are a Verifier agent. Check if a fix actually works. Read the modified files, look for issues, run tests if available. Report whether the fix is correct.",
  research: "You are a Research agent. Search the brain for past fixes to similar errors, and search the web for solutions. Report your findings.",
  security: "You are a Security agent. Audit the code for vulnerabilities: SQL injection, XSS, path traversal, hardcoded secrets, missing input validation. Report all findings.",
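
How an allowlist like `AGENT_TOOL_SETS` gates a shared tool harness can be sketched in a few lines; `filterHarness` is a hypothetical helper, not wolverine's actual implementation.

```javascript
// Keep only the tools named in the agent's allowlist, so e.g. an
// "explore" agent never even sees write_file or bash_exec.
function filterHarness(harness, allowedNames) {
  const allow = new Set(allowedNames);
  return Object.fromEntries(
    Object.entries(harness).filter(([name]) => allow.has(name))
  );
}
```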
@@ -1,38 +1,24 @@
  const fs = require("fs");
  const path = require("path");
  const chalk = require("chalk");
+ const { redact } = require("../security/secret-redactor");
 
  /**
- * Smart Backup Manager — manages versioned backups with stability tracking.
+ * Backup Manager — full server/ directory snapshots with lifecycle management.
  *
- * Backup lifecycle:
- * 1. UNSTABLE: Created when a fix is applied, before verification
- * 2. VERIFIED: The fix ran without immediately crashing (same error)
- * 3. STABLE: The server ran successfully for STABILITY_THRESHOLD without crashing
+ * Lifecycle: UNSTABLE → VERIFIED → STABLE
+ * Every backup is a complete copy of server/ (code, configs, databases).
+ * Admins can rollback, undo rollbacks, and hot-load any backup state.
  *
- * Retention policy:
- * - Unstable backups: deleted after 7 days
- * - Verified backups: kept for 7 days, then pruned unless promoted to stable
- * - Stable backups: after 7 days, keep only 1 per day (most recent each day)
- *
- * Storage layout:
- * .wolverine/
- * backups/
- * manifest.json — tracks all backups with metadata
- * <timestamp>/ — one directory per backup event
- * <filename>.bak — the original file content
+ * Retention: unstable/verified pruned after 7 days.
+ * Stable backups older than 7 days → keep 1 per day (most recent).
  */
 
  const WOLVERINE_DIR = ".wolverine";
  const BACKUPS_DIR = path.join(WOLVERINE_DIR, "backups");
  const MANIFEST_FILE = path.join(BACKUPS_DIR, "manifest.json");
-
- // Stability threshold: how long the server must run without the same crash
- // to consider a fix "stable" (default: 30 minutes)
  const STABILITY_THRESHOLD_MS = 30 * 60 * 1000;
-
- // Retention: unstable/verified backups older than this are pruned
- const RETENTION_UNSTABLE_MS = 7 * 24 * 60 * 60 * 1000; // 7 days
+ const RETENTION_MS = 7 * 24 * 60 * 60 * 1000;
 
  class BackupManager {
  constructor(projectRoot) {
@@ -44,30 +30,25 @@ class BackupManager {
  }
 
  /**
- * Create a backup of specific files or the entire server/ directory.
- * Returns a backupId that can be used to rollback or promote.
- *
- * @param {string[]|null} filePaths — specific files, or null to backup entire server/
+ * Create a full server/ backup.
+ * @param {string} reason — why this backup was created
+ * @returns {string} backupId
  */
- createBackup(filePaths) {
+ createBackup(reason = "manual") {
  const backupId = Date.now().toString(36) + "-" + Math.random().toString(36).slice(2, 6);
  const timestamp = Date.now();
  const backupDir = path.join(this.backupsDir, backupId);
  fs.mkdirSync(backupDir, { recursive: true });
 
- // If no specific files, backup the entire server/ directory
- if (!filePaths || filePaths.length === 0) {
- filePaths = this._collectServerFiles();
- }
-
+ const filePaths = this._collectServerFiles();
  const files = [];
+
  for (const filePath of filePaths) {
  const absPath = path.isAbsolute(filePath) ? filePath : path.resolve(this.projectRoot, filePath);
  if (!fs.existsSync(absPath)) continue;
- // Skip large files (>10MB) and binary blobs
  try {
  const stat = fs.statSync(absPath);
- if (stat.size > 10 * 1024 * 1024) continue;
+ if (stat.size > 10 * 1024 * 1024) continue; // skip >10MB
  } catch { continue; }
 
  const relativePath = path.relative(this.projectRoot, absPath);
@@ -85,7 +66,9 @@ class BackupManager {
  id: backupId,
  timestamp,
  status: "unstable",
+ reason: redact(reason),
  files,
+ fileCount: files.length,
  errorSignature: null,
  promotedAt: null,
  verifiedAt: null,
@@ -94,22 +77,29 @@ class BackupManager {
  this.manifest.backups.push(entry);
  this._saveManifest();
 
+ console.log(chalk.gray(` 💾 Backup ${backupId} (${files.length} files) — ${reason}`));
  return backupId;
  }
 
  /**
- * Rollback to a specific backup.
+ * Rollback to a specific backup. Creates a pre-rollback backup first.
+ * @returns {{ success, preRollbackId }}
  */
  rollbackTo(backupId) {
  const entry = this.manifest.backups.find(b => b.id === backupId);
  if (!entry) {
  console.log(chalk.red(`Backup ${backupId} not found.`));
- return false;
+ return { success: false };
  }
 
+ // Create a pre-rollback backup so admins can undo
+ const preRollbackId = this.createBackup(`pre-rollback (before restoring ${backupId})`);
+
  let allRestored = true;
  for (const file of entry.files) {
  if (fs.existsSync(file.backup)) {
+ // Ensure parent dir exists
+ fs.mkdirSync(path.dirname(file.original), { recursive: true });
  fs.copyFileSync(file.backup, file.original);
  console.log(chalk.yellow(` ↩️ Restored: ${file.relative}`));
  } else {
@@ -118,22 +108,64 @@ class BackupManager {
  }
  }
 
- return allRestored;
+ // Log the rollback
+ if (!this.manifest.rollbackLog) this.manifest.rollbackLog = [];
+ this.manifest.rollbackLog.push({
+ timestamp: Date.now(),
+ restoredBackupId: backupId,
+ preRollbackBackupId: preRollbackId,
+ success: allRestored,
+ });
+ this._saveManifest();
+
+ return { success: allRestored, preRollbackId };
  }
 
  /**
- * Rollback to the most recent backup (any status).
+ * Rollback the most recent backup.
  */
  rollbackLatest() {
- if (this.manifest.backups.length === 0) return false;
+ if (this.manifest.backups.length === 0) return { success: false };
  const latest = this.manifest.backups[this.manifest.backups.length - 1];
- console.log(chalk.yellow(`\n↩️ Rolling back to backup ${latest.id} (${new Date(latest.timestamp).toISOString()})...`));
+ console.log(chalk.yellow(`\n↩️ Rolling back to ${latest.id} (${new Date(latest.timestamp).toISOString()})...`));
  return this.rollbackTo(latest.id);
  }
 
  /**
- * Mark a backup as verified (fix didn't immediately reproduce the error).
+ * Undo the last rollback — restores the pre-rollback state.
  */
+ undoRollback() {
+ if (!this.manifest.rollbackLog || this.manifest.rollbackLog.length === 0) {
+ console.log(chalk.red("No rollback to undo."));
+ return { success: false };
+ }
+ const lastRollback = this.manifest.rollbackLog[this.manifest.rollbackLog.length - 1];
+ console.log(chalk.yellow(`\n↩️ Undoing rollback — restoring pre-rollback state ${lastRollback.preRollbackBackupId}...`));
+
+ const entry = this.manifest.backups.find(b => b.id === lastRollback.preRollbackBackupId);
+ if (!entry) {
+ console.log(chalk.red("Pre-rollback backup not found."));
+ return { success: false };
+ }
+
+ let allRestored = true;
+ for (const file of entry.files) {
+ if (fs.existsSync(file.backup)) {
+ fs.mkdirSync(path.dirname(file.original), { recursive: true });
+ fs.copyFileSync(file.backup, file.original);
+ } else { allRestored = false; }
+ }
+
+ this.manifest.rollbackLog.push({
+ timestamp: Date.now(),
+ action: "undo",
+ restoredBackupId: lastRollback.preRollbackBackupId,
+ success: allRestored,
+ });
+ this._saveManifest();
+ return { success: allRestored };
+ }
+
  markVerified(backupId) {
  const entry = this.manifest.backups.find(b => b.id === backupId);
  if (entry && entry.status === "unstable") {
@@ -143,9 +175,6 @@ class BackupManager {
  }
  }
 
- /**
- * Mark a backup as stable (server ran for the full stability threshold).
- */
  markStable(backupId) {
  const entry = this.manifest.backups.find(b => b.id === backupId);
  if (entry && (entry.status === "verified" || entry.status === "unstable")) {
@@ -156,163 +185,124 @@ class BackupManager {
  }
  }
 
- /**
- * Set the error signature on a backup (for tracking what error this fix addressed).
- */
  setErrorSignature(backupId, signature) {
  const entry = this.manifest.backups.find(b => b.id === backupId);
- if (entry) {
- entry.errorSignature = signature;
- this._saveManifest();
- }
+ if (entry) { entry.errorSignature = signature; this._saveManifest(); }
  }
 
  /**
- * Run the retention policy — prune old backups.
- *
- * Rules:
- * 1. Unstable/verified backups older than 7 days → delete
- * 2. Stable backups older than 7 days → keep only 1 per day (most recent each day)
- * 3. All stable backups within 7 days → keep
+ * Shutdown backup — called on graceful server shutdown.
+ */
+ createShutdownBackup() {
+ return this.createBackup("server-shutdown");
+ }
+
+ /**
+ * Get all backups for dashboard.
+ */
+ getAll() {
+ return this.manifest.backups;
+ }
+
+ /**
+ * Get rollback log for dashboard.
+ */
+ getRollbackLog() {
+ return this.manifest.rollbackLog || [];
+ }
+
+ /**
+ * Prune old backups per retention policy.
  */
  prune() {
  const now = Date.now();
- const cutoff = now - RETENTION_UNSTABLE_MS;
+ const cutoff = now - RETENTION_MS;
  let pruned = 0;
-
- // Separate backups by status
  const toKeep = [];
  const stableOld = [];
 
  for (const entry of this.manifest.backups) {
  if (entry.status === "stable") {
- if (entry.timestamp < cutoff) {
- stableOld.push(entry);
- } else {
- toKeep.push(entry);
- }
+ if (entry.timestamp < cutoff) stableOld.push(entry);
+ else toKeep.push(entry);
  } else {
- // Unstable or verified
- if (entry.timestamp < cutoff) {
- this._deleteBackupFiles(entry);
- pruned++;
- } else {
- toKeep.push(entry);
- }
+ if (entry.timestamp < cutoff) { this._deleteBackupFiles(entry); pruned++; }
+ else toKeep.push(entry);
  }
  }
 
- // For old stable backups: keep 1 per day
  if (stableOld.length > 0) {
  const byDay = new Map();
  for (const entry of stableOld) {
  const dayKey = new Date(entry.timestamp).toISOString().slice(0, 10);
- if (!byDay.has(dayKey)) {
- byDay.set(dayKey, []);
- }
+ if (!byDay.has(dayKey)) byDay.set(dayKey, []);
  byDay.get(dayKey).push(entry);
  }
-
  for (const [, dayEntries] of byDay) {
- // Sort by timestamp descending, keep the newest per day
  dayEntries.sort((a, b) => b.timestamp - a.timestamp);
- toKeep.push(dayEntries[0]); // keep the most recent
- for (let i = 1; i < dayEntries.length; i++) {
- this._deleteBackupFiles(dayEntries[i]);
- pruned++;
- }
+ toKeep.push(dayEntries[0]);
+ for (let i = 1; i < dayEntries.length; i++) { this._deleteBackupFiles(dayEntries[i]); pruned++; }
  }
  }
 
  this.manifest.backups = toKeep;
  this._saveManifest();
-
- if (pruned > 0) {
- console.log(chalk.gray(` 🧹 Pruned ${pruned} old backup(s).`));
- }
-
+ if (pruned > 0) console.log(chalk.gray(` 🧹 Pruned ${pruned} old backup(s).`));
  return pruned;
  }
 
- /**
- * Get summary stats for logging.
- */
  getStats() {
  const counts = { unstable: 0, verified: 0, stable: 0 };
  for (const entry of this.manifest.backups) {
  counts[entry.status] = (counts[entry.status] || 0) + 1;
  }
- return {
- total: this.manifest.backups.length,
- ...counts,
- };
+ return { total: this.manifest.backups.length, ...counts };
  }
 
  // -- Private --
 
- _ensureDirs() {
- fs.mkdirSync(this.backupsDir, { recursive: true });
- }
+ _ensureDirs() { fs.mkdirSync(this.backupsDir, { recursive: true }); }
 
  _loadManifest() {
  if (fs.existsSync(this.manifestPath)) {
- try {
- return JSON.parse(fs.readFileSync(this.manifestPath, "utf-8"));
- } catch {
- return { version: 1, backups: [] };
- }
+ try { return JSON.parse(fs.readFileSync(this.manifestPath, "utf-8")); }
+ catch { return { version: 1, backups: [], rollbackLog: [] }; }
  }
- return { version: 1, backups: [] };
+ return { version: 1, backups: [], rollbackLog: [] };
  }
 
  _saveManifest() {
- fs.writeFileSync(this.manifestPath, JSON.stringify(this.manifest, null, 2), "utf-8");
+ const tmp = this.manifestPath + ".tmp";
+ fs.writeFileSync(tmp, JSON.stringify(this.manifest, null, 2), "utf-8");
+ fs.renameSync(tmp, this.manifestPath);
  }
 
  _deleteBackupFiles(entry) {
  const backupDir = path.join(this.backupsDir, entry.id);
  if (fs.existsSync(backupDir)) {
- for (const file of fs.readdirSync(backupDir)) {
- fs.unlinkSync(path.join(backupDir, file));
- }
+ for (const file of fs.readdirSync(backupDir)) fs.unlinkSync(path.join(backupDir, file));
  fs.rmdirSync(backupDir);
  }
  }
 
- /**
- * Collect all files in the server/ directory for full backup.
- * Includes: .js, .json, .sql, .db, .sqlite, .yaml, .yml, .env, .html, .css
- * Excludes: node_modules, .git, large binaries
- */
  _collectServerFiles() {
  const serverDir = path.join(this.projectRoot, "server");
  if (!fs.existsSync(serverDir)) return [];
-
  const files = [];
  const SKIP = new Set(["node_modules", ".git", ".wolverine"]);
- const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10MB
 
  const walk = (dir) => {
  let entries;
  try { entries = fs.readdirSync(dir, { withFileTypes: true }); } catch { return; }
-
  for (const entry of entries) {
  if (SKIP.has(entry.name)) continue;
-
  const fullPath = path.join(dir, entry.name);
- if (entry.isDirectory()) {
- walk(fullPath);
- } else {
- try {
- const stat = fs.statSync(fullPath);
- if (stat.size <= MAX_FILE_SIZE) {
- files.push(fullPath);
- }
- } catch {}
+ if (entry.isDirectory()) walk(fullPath);
+ else {
+ try { if (fs.statSync(fullPath).size <= 10 * 1024 * 1024) files.push(fullPath); } catch {}
  }
  }
  };
-
  walk(serverDir);
  return files;
  }
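
The keep-1-per-day retention rule that `prune()` applies to old stable backups can be isolated as a small standalone sketch (entry shape simplified to `{ id, timestamp }`):

```javascript
// Past the retention cutoff, keep only the newest stable backup per UTC day.
function keepOnePerDay(entries) {
  const byDay = new Map();
  for (const e of entries) {
    const day = new Date(e.timestamp).toISOString().slice(0, 10); // "YYYY-MM-DD"
    const best = byDay.get(day);
    if (!best || e.timestamp > best.timestamp) byDay.set(day, e);
  }
  return [...byDay.values()];
}
```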