@kendroger/io-snapshot 1.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.snapdiffrc.json +5 -0
- package/README.md +124 -0
- package/bin/cli.js +251 -0
- package/lib/config.js +42 -0
- package/lib/constants.js +10 -0
- package/lib/daemon.js +167 -0
- package/lib/logger.js +17 -0
- package/lib/paths.js +55 -0
- package/lib/recorder-inline.js +42 -0
- package/lib/recorder.js +38 -0
- package/lib/transformer.js +348 -0
- package/lib/verifier.js +165 -0
- package/package.json +59 -0
package/.snapdiffrc.json
ADDED
package/README.md
ADDED
|
@@ -0,0 +1,124 @@

# 📸 io-snapshot

`io-snapshot` is a behavior-preservation tool designed for zero-regression refactoring. It captures the exact inputs and outputs of your functions during real-world execution and lets you "replay" them later to ensure that your structural changes haven't introduced behavioral drift.

[npm](https://www.npmjs.com/package/@kendroger/io-snapshot)
[MIT License](https://opensource.org/licenses/MIT)

## 🚀 Why io-snapshot?

Unit tests verify what you _expect_ to happen. `io-snapshot` verifies what _actually happens_ in your application. It acts as a safety net for:

- **Major Refactors:** Switching from Promises to `async/await` or restructuring complex logic.
- **Dependency Swaps:** Replacing one library with another while maintaining the same interface.
- **Performance Tuning:** Ensuring that optimizations don't break edge cases or return values.
- **Legacy Code:** Safely refactoring codebases that lack traditional test coverage.

## 📦 Installation

Install globally or as a dev dependency:

```bash
# Global installation
npm install -g @kendroger/io-snapshot

# Local installation
npm install --save-dev @kendroger/io-snapshot
```
## 🛠 Workflow

`io-snapshot` follows a simple 4-step workflow:

### 1. Record

Inject the recorder into your target files and start capturing snapshots while you use your application.

```bash
io-snapshot record ./src/services/*.ts
```

_Wait for the "Recording started" message, then start and interact with your app._
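A function qualifies for recording when it is exported and takes serializable inputs. As a minimal sketch (the module name and logic below are purely illustrative, not part of this package):

```javascript
// services/pricing.js — a hypothetical module; names are illustrative.
// The recorder only wraps functions that are *exported* from the file.
export function applyDiscount(price, percent) {
  // Serializable inputs and output make this an ideal recording target.
  return Math.round(price * (1 - percent / 100) * 100) / 100;
}
```

Internal (non-exported) helpers called by `applyDiscount` are not wrapped; only the exported boundary is captured.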
### 2. Stop & Restore

Once you've captured enough data, stop the recording. This restores your original source code but preserves the snapshots in `.snaps.jsonl`.

```bash
io-snapshot stop
```

### 3. Refactor

Modify your code, optimize your functions, or swap dependencies. As long as the function name and exported interface remain the same, you're good to go.

### 4. Verify

Run the test command to replay the captured inputs against your new code and compare the outputs.

```bash
io-snapshot test ./src/services/*.ts
```
## ⌨️ Command Reference

| Command           | Description                                                      |
| :---------------- | :--------------------------------------------------------------- |
| `record [target]` | Injects recorder, starts daemon, and begins capturing snapshots. |
| `stop`            | Stops the daemon and restores original source code.              |
| `test [target]`   | Replays snapshots against current code and reports any drift.    |
| `clean [target]`  | Restores original files and deletes the snapshot data.           |
| `inject [target]` | Explicitly injects the recorder without starting the daemon.     |
## ⚙️ Configuration

You can configure `io-snapshot` via a `.iosnapshotrc.json` file in your project root.

**File Locations:**

- **`.snaps.jsonl`**: Stores your recorded snapshots directly in your project's root directory.
- **Temporary Session Files**: Files like the daemon's PID (`io-snapshot.pid`) and primary backups are stored in a unique, project-specific directory within your **operating system's temporary folder**. This keeps the project root clean and prevents conflicts between projects.
- **Backup Files**:
  - **Primary Backups**: Created in the OS temporary folder alongside other temporary session files.
  - **Fallback Backups**: Created in a `.io-snapshot-backups/` directory in your project's root, so restores survive OS temp-folder cleanup.

```json
{
  "port": 9444,
  "timeout": 30,
  "exclude": ["node_modules/**", "test/**"]
}
```
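Settings are resolved with a simple fallback chain: a CLI flag wins over the config file, which wins over the built-in default (this mirrors the `opts.port || config.port || 9444` chain in `bin/cli.js`). A minimal sketch:

```javascript
// Resolution order: CLI flag → .iosnapshotrc.json → built-in default.
// Defaults match lib/constants.js.
const DEFAULT_PORT = 9444;
const DEFAULT_TIMEOUT = 30;

function resolveSettings(cliOpts = {}, config = {}) {
  return {
    port: cliOpts.port || config.port || DEFAULT_PORT,
    timeout: cliOpts.timeout || config.timeout || DEFAULT_TIMEOUT,
  };
}
```

So `io-snapshot record -p 9445` overrides a `"port": 9500` in the config file, which in turn overrides the default of 9444.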
## 🤔 Troubleshooting

Here are some common issues and how to solve them in simple terms.

| Problem | Solution |
| :------ | :------- |
| **`"io-snapshot: command not found"`** | The tool wasn't installed correctly. Try running `npm install -g @kendroger/io-snapshot` again. If you installed it locally (`--save-dev`), you'll need to run it through an npm script. |
| **`"io-snapshot is already running!"`** | You have a previous session that wasn't stopped. Run `io-snapshot stop` to end it, and then you can start a new recording. |
| **No snapshots are being recorded.** | 1. Make sure you are running your application _after_ `io-snapshot record` says "Recording started." <br> 2. Check that the functions you want to record are **exported** from their files. <br> 3. Make sure you are interacting with the parts of your app that use those functions. |
| **`"EADDRINUSE: address already in use"`** | The port `io-snapshot` wants to use (default: 9444) is occupied. Either stop the other program or tell `io-snapshot` to use a different port with the `-p` flag: `io-snapshot record -p 9445` |
| **Tests are passing, but I know the logic is different.** | `io-snapshot` checks whether the final _output_ is the same for a given _input_. If your refactor produces the same result (e.g., changing a `for` loop to a `.map()`), `io-snapshot` will correctly report no change in behavior. It only cares about the "what," not the "how." |
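For a local (`--save-dev`) install, the commands can be exposed through npm scripts, for example (the script names below are illustrative, not prescribed by this package):

```json
{
  "scripts": {
    "snap:record": "io-snapshot record ./src/services/*.ts",
    "snap:stop": "io-snapshot stop",
    "snap:test": "io-snapshot test ./src/services/*.ts"
  }
}
```

Then run them as `npm run snap:record`, and so on; npm puts the locally installed binary on the `PATH` for scripts.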
## 🔍 How it Works

1. **Instrumentation:** It uses `ts-morph` and `Babel` to wrap your exported functions with a Proxy.
2. **Backup:** Before injecting, `io-snapshot` creates two backups of your original files:
   - A **primary backup** in a temporary directory managed by your operating system.
   - A **fallback backup** in a `.io-snapshot-backups/` directory within your project's root.
   This ensures that even if OS temporary files are cleared, your original code can still be restored.
3. **Capture:** When the wrapped functions are called, the inputs and outputs are sent to a local background daemon.
4. **Storage:** Snapshots are serialized using `SuperJSON` (to preserve complex types like Dates and RegExps) and stored in a newline-delimited JSON file (`.snaps.jsonl`) in your project root.
5. **Verification:** The test runner imports your modified functions and feeds them the exact arguments from the snapshots, then performs a deep diff on the results. When restoring, it first attempts to use the primary backup, falling back to the local backup if necessary.
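Conceptually, the Proxy wrapping in step 1 looks like this simplified sketch of `lib/recorder.js` (here snapshots are pushed into an array rather than POSTed to the daemon, and the `apply` trap is kept synchronous for clarity):

```javascript
// Simplified recorder: the Proxy forwards the call untouched and
// records { fnName, args, result } on the side.
const snapshots = [];

function record(fn, fnName) {
  return new Proxy(fn, {
    apply(target, thisArg, args) {
      const result = Reflect.apply(target, thisArg, args);
      snapshots.push({ fnName, args, result, at: new Date().toISOString() });
      return result; // the caller sees the original behavior
    },
  });
}

const add = record((a, b) => a + b, 'add');
```

Because the trap returns `Reflect.apply(...)` unchanged, the wrapped function behaves identically to the original; recording is a pure side effect.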
## ⚠️ Requirements & Limitations

- Functions must be **exported** to be captured.
- Data must be **serializable** (SuperJSON handles many complex types, but extremely complex circular references or native handles can be tricky).
- Currently supports **ES Modules (ESM)** projects.
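The serializability requirement exists because every snapshot must round-trip through a serializer. Plain `JSON.stringify`, for instance, rejects circular structures outright; SuperJSON covers many more types, but circular graphs of native handles remain problematic:

```javascript
// A circular structure cannot be captured with plain JSON.stringify.
const node = { name: 'a' };
node.self = node; // circular reference

let failed = false;
try {
  JSON.stringify(node);
} catch (err) {
  // TypeError: Converting circular structure to JSON
  failed = true;
}
```

If a function's arguments or return value look like this, the snapshot cannot be stored faithfully and verification will not be meaningful.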
## 📄 License

MIT © [kendroger](https://github.com/kendroger)
package/bin/cli.js
ADDED
|
@@ -0,0 +1,251 @@
#!/usr/bin/env node
import { program } from 'commander';
import { spawn } from 'child_process';
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import http from 'http';
import { SNAPSHOT_FILE } from '../lib/constants.js';
import { loadConfig, validatePort, validateTimeout, validateFilePattern } from '../lib/config.js';
import { log } from '../lib/logger.js';
import { getPidFilePath, ensureTempSessionDir } from '../lib/paths.js';

const __dirname = path.dirname(fileURLToPath(import.meta.url));

function savePid(pid) {
  const pidPath = getPidFilePath();
  ensureTempSessionDir();
  fs.writeFileSync(pidPath, String(pid));
}

function getSavedPid() {
  const pidPath = getPidFilePath();
  if (fs.existsSync(pidPath)) {
    return parseInt(fs.readFileSync(pidPath, 'utf8'), 10);
  }
  return null;
}

function removePid() {
  const pidPath = getPidFilePath();
  if (fs.existsSync(pidPath)) {
    fs.unlinkSync(pidPath);
  }
}

async function daemonRequest(endpoint, port, method = 'POST') {
  return new Promise((resolve, reject) => {
    const req = http.request(`http://localhost:${port}${endpoint}`, { method }, (res) => {
      let data = '';
      res.on('data', chunk => data += chunk);
      res.on('end', () => {
        try {
          resolve(JSON.parse(data));
        } catch {
          resolve(data);
        }
      });
    });
    req.on('error', reject);
    req.end();
  });
}

program
  .name('io-snapshot')
  .description('Capture and compare function behavior snapshots for zero-regression refactoring');

program
  .command('inject [target]')
  .description('Inject recorder into target files (for explicit use)')
  .option('-f, --force', 'Force re-inject even if already injected')
  .action(async (target, opts) => {
    try {
      if (target) validateFilePattern(target);
    } catch (error) {
      log.error(error.message);
      process.exit(1);
    }

    const { injectRecorder } = await import('../lib/transformer.js');
    await injectRecorder(target, opts.force);
  });

program
  .command('record [target]')
  .description('Inject, start daemon, and begin recording snapshots')
  .option('-p, --port <port>', 'Port to run the daemon on')
  .option('-t, --timeout <minutes>', 'Auto-shutdown after N minutes of inactivity')
  .option('-f, --force', 'Force re-inject even if already injected')
  .action(async (target, opts) => {
    try {
      if (opts.port) validatePort(opts.port);
      if (opts.timeout) validateTimeout(opts.timeout);
      if (target) validateFilePattern(target);
    } catch (error) {
      log.error(error.message);
      process.exit(1);
    }

    const config = loadConfig();
    const port = opts.port || config.port || 9444;
    const timeout = opts.timeout || config.timeout || 30;

    log.divider('IMPORTANT');
    log.info('Run this command FIRST,');
    log.info('  THEN start your app!');
    log.divider();

    log.workflow([
      'io-snapshot record → Inject recorder + start background daemon',
      'npm run dev → Start your app (daemon MUST be running)',
      'Interact with app → Use your app to capture real-world data',
      'io-snapshot stop → Stop recording and restore your original code',
      'Modify your code → Perform your refactoring or changes',
      'io-snapshot test → Verify new code against captured snapshots'
    ]);

    const snapshotPath = path.resolve(process.cwd(), SNAPSHOT_FILE);
    fs.writeFileSync(snapshotPath, '');
    log.info('Cleared previous snapshot file for a fresh session.');

    log.info('Checking for existing session...');
    const savedPid = getSavedPid();
    if (savedPid) {
      try {
        process.kill(savedPid, 0);
        log.divider();
        log.warn('io-snapshot is already running!');
        log.warn(`A daemon is active (PID: ${savedPid})`);
        log.warn('Please run "io-snapshot stop" first before starting a new session.');
        log.divider();
        process.exit(1);
      } catch {
        removePid();
      }
    }

    log.step(1, 'Injecting recorder into files...');
    const { injectRecorder } = await import('../lib/transformer.js');
    await injectRecorder(target, opts.force);

    log.step(2, 'Starting daemon...');
    const env = {
      ...process.env,
      IOSNAP_DAEMON_PORT: String(port),
      IOSNAP_DAEMON_CORS: '*'
    };

    const child = spawn(process.execPath, [path.join(__dirname, '../lib/daemon.js')], {
      cwd: process.cwd(),
      stdio: 'ignore',
      env,
      detached: true
    });

    child.unref();
    savePid(child.pid);
    log.success(`Daemon started on port ${port} (PID: ${child.pid})`);

    await new Promise(resolve => setTimeout(resolve, 1000));

    log.step(3, 'Starting recording...');

    try {
      await daemonRequest('/record', port);
      log.success('Recording active.');
      log.info(`Snapshots will be saved to: ${snapshotPath}`);
      log.divider();
      log.success('NOW START YOUR APP (e.g., npm run dev)');
      log.info('Interact with your app to capture snapshots.');
      log.info('Run "io-snapshot stop" when done to restore original code.');
      log.divider();
    } catch (error) {
      log.error(`Failed to start recording: ${error.message}`);
      process.exit(1);
    }
  });

program
  .command('stop')
  .description('Stop recording, stop daemon, and restore original code')
  .option('-p, --port <port>', 'Port where the daemon is running')
  .action(async (opts) => {
    try {
      if (opts.port) validatePort(opts.port);
    } catch (error) {
      log.error(error.message);
      process.exit(1);
    }

    const config = loadConfig();
    const port = opts.port || config.port || 9444;

    log.step(1, 'Stopping recording...');
    try {
      await daemonRequest('/stop', port);
      log.success('Recording stopped.');
    } catch (error) {
      log.warn(`Daemon not responding: ${error.message}.`);
    }

    log.step(2, 'Stopping daemon...');
    const savedPid = getSavedPid();
    if (savedPid) {
      try {
        process.kill(savedPid, 'SIGTERM');
        log.success(`Daemon (PID: ${savedPid}) stopped.`);
        removePid();
      } catch (error) {
        log.warn('Daemon not running, cleaning up PID file.');
        removePid();
      }
    }

    log.step(3, 'Restoring original code...');
    const { restore } = await import('../lib/transformer.js');
    await restore(null, true);

    const snapshotPath = path.resolve(process.cwd(), SNAPSHOT_FILE);
    if (fs.existsSync(snapshotPath)) {
      const stats = fs.statSync(snapshotPath);
      log.success(`Snapshots preserved: ${snapshotPath} (${stats.size} bytes)`);
    }

    log.divider();
    log.success('Original code restored. Snapshots preserved for testing.');
    log.info('Run "io-snapshot test" to verify your code changes.');
    log.divider();
  });

program
  .command('test [target]')
  .description('Replay snapshots against current code to verify behavior')
  .action(async (target) => {
    try {
      if (target) validateFilePattern(target);
    } catch (error) {
      log.error(error.message);
      process.exit(1);
    }

    const { verifyDir } = await import('../lib/verifier.js');
    await verifyDir(target);
  });

program
  .command('clean [target]')
  .description('Restore original files and delete snapshots')
  .action(async (target) => {
    try {
      if (target) validateFilePattern(target);
    } catch (error) {
      log.error(error.message);
      process.exit(1);
    }

    const { restore } = await import('../lib/transformer.js');
    await restore(target);
  });

program.parse();
package/lib/config.js
ADDED
|
@@ -0,0 +1,42 @@
import fs from 'fs';
import path from 'path';
import { CONFIG_FILE } from './constants.js';

export function loadConfig() {
  const configPath = path.resolve(process.cwd(), CONFIG_FILE);

  if (!fs.existsSync(configPath)) {
    return {};
  }

  try {
    const content = fs.readFileSync(configPath, 'utf8');
    return JSON.parse(content);
  } catch (error) {
    console.warn(`[io-snapshot] Warning: Failed to parse config file: ${error.message}`);
    return {};
  }
}

export function validatePort(port) {
  const portNum = parseInt(port, 10);
  if (isNaN(portNum) || !Number.isInteger(portNum) || portNum < 1 || portNum > 65535) {
    throw new Error(`Invalid port: ${port}. Must be an integer between 1 and 65535.`);
  }
  return portNum;
}

export function validateTimeout(timeout) {
  const timeoutNum = parseInt(timeout, 10);
  if (isNaN(timeoutNum) || !Number.isInteger(timeoutNum) || timeoutNum < 1) {
    throw new Error(`Invalid timeout: ${timeout}. Must be a positive integer.`);
  }
  return timeoutNum;
}

export function validateFilePattern(pattern) {
  if (!pattern || typeof pattern !== 'string' || pattern.trim() === '') {
    throw new Error('File pattern must be a non-empty string.');
  }
  return pattern.trim();
}
package/lib/constants.js
ADDED
|
@@ -0,0 +1,10 @@
export const DEFAULT_PORT = 9444;
export const DEFAULT_TIMEOUT = 30;
export const SNAPSHOT_FILE = '.snaps.jsonl';
export const CONFIG_FILE = '.iosnapshotrc.json';
export const BACKUP_EXT = '.snap.bak';

// Path-related constants
export const TEMP_SESSION_DIR_PREFIX = 'io-snapshot-session-';
export const PID_FILE_NAME = 'io-snapshot.pid';
export const LOCAL_BACKUP_DIR = '.io-snapshot-backups';
package/lib/daemon.js
ADDED
|
@@ -0,0 +1,167 @@
import http from 'http';
import fs from 'fs';
import path from 'path';
import { DEFAULT_PORT, DEFAULT_TIMEOUT, SNAPSHOT_FILE } from './constants.js';
import { loadConfig } from './config.js';
import { log } from './logger.js';
import { getPidFilePath, ensureTempSessionDir } from './paths.js';

let isRecording = false;
let lastTelemetryTime = Date.now();
let server = null;
let timeoutInterval = null;
let corsOrigin = '*';

function setCorsHeaders(res) {
  if (corsOrigin === '*') {
    res.setHeader('Access-Control-Allow-Origin', '*');
  } else if (corsOrigin) {
    res.setHeader('Access-Control-Allow-Origin', corsOrigin);
  }
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
}

function sendResponse(res, statusCode, data) {
  setCorsHeaders(res);
  res.writeHead(statusCode, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify(data));
}

function savePid(pid) {
  ensureTempSessionDir();
  fs.writeFileSync(getPidFilePath(), String(pid));
}

function removePid() {
  const pidPath = getPidFilePath();
  if (fs.existsSync(pidPath)) {
    fs.unlinkSync(pidPath);
  }
}

function resetTimeout() {
  lastTelemetryTime = Date.now();
}

function startTimeoutWatcher(timeoutMinutes) {
  if (timeoutInterval) clearInterval(timeoutInterval);

  timeoutInterval = setInterval(() => {
    const elapsed = (Date.now() - lastTelemetryTime) / 1000 / 60;
    if (elapsed >= timeoutMinutes && server) {
      log.error(`Daemon auto-shutting down after ${timeoutMinutes} minutes of inactivity.`);
      stopServer();
    }
  }, 30000);
}

function stopServer() {
  if (server) {
    server.close();
    server = null;
  }
  if (timeoutInterval) {
    clearInterval(timeoutInterval);
    timeoutInterval = null;
  }
  removePid();
  process.exit(0);
}

async function handleTelemetry(req, res) {
  let body = '';
  for await (const chunk of req) {
    body += chunk;
  }

  if (!isRecording) {
    sendResponse(res, 200, { status: 'ignored', reason: 'not_recording' });
    return;
  }

  resetTimeout();

  try {
    fs.appendFileSync(path.resolve(process.cwd(), SNAPSHOT_FILE), body + '\n');
    sendResponse(res, 200, { status: 'captured' });
  } catch (error) {
    sendResponse(res, 500, { error: error.message });
  }
}

function startDaemon(port, timeoutMinutes, corsOptions) {
  if (corsOptions?.origin) {
    corsOrigin = corsOptions.origin;
  }

  server = http.createServer(async (req, res) => {
    const url = new URL(req.url, `http://localhost:${port}`);

    if (req.method === 'OPTIONS') {
      setCorsHeaders(res);
      res.writeHead(204);
      res.end();
      return;
    }

    if (req.method === 'GET' && url.pathname === '/status') {
      sendResponse(res, 200, {
        isRecording,
        timeout: timeoutMinutes,
        uptime: process.uptime(),
        corsOrigin
      });
      return;
    }

    if (req.method === 'POST' && url.pathname === '/record') {
      isRecording = true;
      resetTimeout();
      log.success('Recording started.');
      sendResponse(res, 200, { isRecording: true });
      return;
    }

    if (req.method === 'POST' && url.pathname === '/stop') {
      isRecording = false;
      log.success('Recording stopped.');
      sendResponse(res, 200, { isRecording: false });
      return;
    }

    if (req.method === 'POST' && url.pathname === '/telemetry') {
      await handleTelemetry(req, res);
      return;
    }

    sendResponse(res, 404, { error: 'Not found' });
  });

  server.on('error', (error) => {
    if (error.code === 'EADDRINUSE') {
      log.error(`Port ${port} is already in use.`);
      process.exit(1);
    }
    throw error;
  });

  server.listen(port, () => {
    log.info(`Daemon running on http://localhost:${port}`);
    savePid(process.pid);
    startTimeoutWatcher(timeoutMinutes);
  });
}

export function runDaemon(port, timeout, corsOptions) {
  const config = loadConfig();
  const actualPort = port || config.port || DEFAULT_PORT;
  const actualTimeout = timeout || config.timeout || DEFAULT_TIMEOUT;
  // The CLI passes the CORS origin via IOSNAP_DAEMON_CORS (see bin/cli.js).
  const corsEnv = corsOptions?.origin || process.env.IOSNAP_DAEMON_CORS || config.cors?.origin || '*';
  startDaemon(actualPort, actualTimeout, { origin: corsEnv });
}

if (import.meta.url === `file://${process.argv[1]}`) {
  const config = loadConfig();
  // The CLI passes the chosen port via IOSNAP_DAEMON_PORT (see bin/cli.js).
  const envPort = process.env.IOSNAP_DAEMON_PORT
    ? parseInt(process.env.IOSNAP_DAEMON_PORT, 10)
    : null;
  runDaemon(envPort || config.port || DEFAULT_PORT, config.timeout || DEFAULT_TIMEOUT);
}
package/lib/logger.js
ADDED
|
@@ -0,0 +1,17 @@
const PREFIX = '[io-snapshot]';

export const log = {
  info: (msg) => console.log(`${PREFIX} ${msg}`),
  success: (msg) => console.log(`\x1b[32m${PREFIX} SUCCESS: ${msg}\x1b[0m`),
  warn: (msg) => console.warn(`\x1b[33m${PREFIX} WARNING: ${msg}\x1b[0m`),
  error: (msg) => console.error(`\x1b[31m${PREFIX} ERROR: ${msg}\x1b[0m`),
  step: (num, msg) => console.log(`\x1b[34m${PREFIX} [Step ${num}] ${msg}\x1b[0m`),
  divider: (label = '') => {
    const line = '━'.repeat(20);
    console.log(`\x1b[90m${line}${label ? ` ${label} ` : ''}${line}\x1b[0m`);
  },
  workflow: (steps) => {
    console.log(`\x1b[36m${PREFIX} Workflow:\x1b[0m`);
    steps.forEach((step, i) => console.log(`  ${i + 1}. ${step}`));
  }
};
package/lib/paths.js
ADDED
|
@@ -0,0 +1,55 @@
import fs from 'fs';
import os from 'os';
import path from 'path';
import crypto from 'crypto';
import {
  TEMP_SESSION_DIR_PREFIX,
  PID_FILE_NAME,
  LOCAL_BACKUP_DIR,
  BACKUP_EXT
} from './constants.js';

let _tempSessionDirPath = null;

function getProjectHash() {
  return crypto.createHash('md5').update(process.cwd()).digest('hex');
}

export function getTempSessionDirPath() {
  if (_tempSessionDirPath) {
    return _tempSessionDirPath;
  }
  _tempSessionDirPath = path.join(os.tmpdir(), TEMP_SESSION_DIR_PREFIX + getProjectHash());
  return _tempSessionDirPath;
}

export function ensureTempSessionDir() {
  const dirPath = getTempSessionDirPath();
  if (!fs.existsSync(dirPath)) {
    fs.mkdirSync(dirPath, { recursive: true });
  }
}

export function getPidFilePath() {
  return path.join(getTempSessionDirPath(), PID_FILE_NAME);
}

// Gets the path for the primary backup in the OS temp directory
export function getPrimaryBackupPath(filePath) {
  const absoluteFilePath = path.resolve(process.cwd(), filePath);
  const relativePath = path.relative(process.cwd(), absoluteFilePath);
  const backupPath = path.join(getTempSessionDirPath(), 'backup', relativePath) + BACKUP_EXT;
  // Ensure the subdirectory structure exists
  fs.mkdirSync(path.dirname(backupPath), { recursive: true });
  return backupPath;
}

// Gets the path for the fallback backup in the project's root directory
export function getLocalBackupPath(filePath) {
  const absoluteFilePath = path.resolve(process.cwd(), filePath);
  const relativePath = path.relative(process.cwd(), absoluteFilePath);
  const backupPath = path.join(process.cwd(), LOCAL_BACKUP_DIR, relativePath) + BACKUP_EXT;
  // Ensure the subdirectory structure exists
  fs.mkdirSync(path.dirname(backupPath), { recursive: true });
  return backupPath;
}
package/lib/recorder-inline.js
ADDED
|
@@ -0,0 +1,42 @@
export function getInlineRecorderCode() {
  const port = process.env.IOSNAP_DAEMON_PORT || 9444;

  return `
(function() {
  var port = ${port};
  var _snap_SuperJSON = null;
  function _snap_getSuperJSON() {
    if (_snap_SuperJSON) return _snap_SuperJSON;
    _snap_SuperJSON = { stringify: JSON.stringify, parse: JSON.parse };
    return _snap_SuperJSON;
  }
  function _snap_record(fn, fnName) {
    return new Proxy(fn, {
      async apply(target, thisArg, args) {
        var result = await Reflect.apply(target, thisArg, args);
        var snapshot = {
          fnName: fnName,
          args: args,
          result: result,
          at: new Date().toISOString()
        };
        var SJ = _snap_getSuperJSON();
        fetch('http://localhost:' + port + '/telemetry', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: SJ.stringify(snapshot)
        }).catch(function(err) {
          console.warn('[io-snapshot] Telemetry failed:', err.message);
        });
        return result;
      }
    });
  }
  window._snap_record = _snap_record;
})();
`.trim();
}
package/lib/recorder.js
ADDED
|
@@ -0,0 +1,38 @@
import SuperJSON from 'superjson';

function getPort() {
  if (typeof process !== 'undefined' && process.env.IOSNAP_DAEMON_PORT) {
    return parseInt(process.env.IOSNAP_DAEMON_PORT, 10);
  }
  if (typeof window !== 'undefined') {
    return window.IOSNAP_DAEMON_PORT || 9444;
  }
  return 9444;
}

export function record(fn, fnName) {
  return new Proxy(fn, {
    async apply(target, thisArg, args) {
      const result = await Reflect.apply(target, thisArg, args);

      const snapshot = {
        fnName,
        args,
        result,
        at: new Date().toISOString()
      };

      const port = getPort();

      fetch(`http://localhost:${port}/telemetry`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: SuperJSON.stringify(snapshot)
      }).catch((error) => {
        console.warn(`[io-snapshot] Failed to send telemetry: ${error.message}`);
      });

      return result;
    }
  });
}
|
package/lib/transformer.js
ADDED
@@ -0,0 +1,348 @@

```js
import { Project } from 'ts-morph';
import * as babelParser from '@babel/parser';
import * as babelTraverse from '@babel/traverse';
import * as babelTypes from '@babel/types';
import * as babelGenerator from '@babel/generator';
import fs from 'fs';
import path from 'path';
import { glob } from 'tinyglobby';
import { BACKUP_EXT, SNAPSHOT_FILE, LOCAL_BACKUP_DIR } from './constants.js';
import { getTempSessionDirPath, getPrimaryBackupPath, getLocalBackupPath } from './paths.js';
import { getInlineRecorderCode } from './recorder-inline.js';
import { log } from './logger.js';

const { parse } = babelParser;
// CJS/ESM interop: the default export may be nested one level deep.
const traverse = babelTraverse.default?.default || babelTraverse.default;
const t = babelTypes;
const generate = babelGenerator.default?.default || babelGenerator.default || babelGenerator;

function getProject() {
  return new Project({
    useInMemoryFileSystem: false,
  });
}

function injectTSFile(filePath) {
  const project = getProject();
  const sourceFile = project.addSourceFileAtPath(filePath);
  let modified = false;

  log.info(`Injecting recorder into ${filePath}`);

  const text = sourceFile.getText();
  const hasRecorder = text.includes('window._snap_record') || text.includes('globalThis._snap_record');

  if (!hasRecorder) {
    sourceFile.insertText(0, getInlineRecorderCode());
    modified = true;
  }

  const exports = sourceFile.getExportedDeclarations();

  if (exports.size === 0 && !modified) {
    return false;
  }

  let functionsWrapped = false;
  exports.forEach((declarations, name) => {
    declarations.forEach(decl => {
      const kind = decl.getKindName();

      if (kind === 'FunctionDeclaration') {
        const originalName = decl.getName();
        decl.rename(`_snap_${originalName}`);
        decl.setIsExported(false);

        sourceFile.addVariableStatement({
          isExported: true,
          declarationKind: 'const',
          declarations: [{
            name: originalName,
            initializer: `_snap_record(_snap_${originalName}, '${originalName}')`
          }]
        });
        functionsWrapped = true;
      }

      if (kind === 'ArrowFunction' || kind === 'FunctionExpression') {
        const variableStmt = decl.getFirstAncestor(a => a.getKindName() === 'VariableStatement');
        if (variableStmt) {
          variableStmt.getDeclarations().forEach(varDecl => {
            const varName = varDecl.getName();
            const init = varDecl.getInitializer();
            if (init && (init.getKindName() === 'ArrowFunction' || init.getKindName() === 'FunctionExpression')) {
              const renamed = `_snap_${varName}`;
              // Keep the original function under the renamed binding and
              // export a wrapped alias under the original name, so the
              // wrapper does not reference itself.
              varDecl.setName(renamed);
              sourceFile.addVariableStatement({
                isExported: true,
                declarationKind: 'const',
                declarations: [{
                  name: varName,
                  initializer: `_snap_record(${renamed}, '${varName}')`
                }]
              });
              functionsWrapped = true;
            }
          });
        }
      }
    });
  });

  if (functionsWrapped) {
    modified = true;
  }

  if (modified) {
    sourceFile.saveSync();
  }

  return modified;
}

function injectJSFile(filePath) {
  const code = fs.readFileSync(filePath, 'utf8');

  log.info(`Injecting recorder into ${filePath}`);

  let ast;
  try {
    ast = parse(code, {
      sourceType: 'module',
      plugins: ['jsx']
    });
  } catch (error) {
    log.error(`Parser error in ${filePath}: ${error.message}`);
    return false;
  }

  let modified = false;

  // Detect a previously injected recorder: an IIFE whose body assigns
  // a `_snap_record` property (e.g. `window._snap_record = ...`).
  const hasRecorder = ast.program.body.some(
    node => (
      t.isExpressionStatement(node) &&
      node.expression.type === 'CallExpression' &&
      node.expression.callee?.type === 'FunctionExpression' &&
      node.expression.callee.body?.body?.some?.(stmt =>
        stmt.type === 'ExpressionStatement' &&
        stmt.expression.type === 'AssignmentExpression' &&
        stmt.expression.left.property?.name === '_snap_record'
      )
    )
  );

  if (!hasRecorder) {
    const inlineCode = getInlineRecorderCode();
    const recorderAst = parse(inlineCode, { sourceType: 'script' });
    ast.program.body.unshift(...recorderAst.program.body);
    modified = true;
  }

  const newNodes = [];
  let functionsWrapped = false;

  traverse(ast, {
    FunctionDeclaration(path) {
      if (path.node.id && path.node.id.name && !path.node.id.name.startsWith('_snap_')) {
        const { name } = path.node.id;
        const newName = `_snap_${name}`;

        const newFunction = t.functionDeclaration(
          t.identifier(newName),
          path.node.params,
          path.node.body,
          path.node.generator,
          path.node.async
        );

        const wrapper = t.variableDeclaration('const', [
          t.variableDeclarator(
            t.identifier(name),
            t.callExpression(t.identifier('_snap_record'), [
              t.identifier(newName),
              t.stringLiteral(name)
            ])
          )
        ]);

        if (path.parent.type === 'ExportNamedDeclaration' || path.parent.type === 'ExportDefaultDeclaration') {
          const exportParent = path.findParent(p => p.isExportDeclaration());
          if (exportParent) {
            exportParent.replaceWith(newFunction);
            path.skip();
            newNodes.push(t.exportNamedDeclaration(wrapper));
            functionsWrapped = true;
          }
        } else {
          path.replaceWith(newFunction);
          path.skip();
          newNodes.push(wrapper);
          functionsWrapped = true;
        }
      }
    },
    VariableDeclaration(path) {
      path.get('declarations').forEach(declaratorPath => {
        const declarator = declaratorPath.node;
        if (t.isIdentifier(declarator.id) &&
            !declarator.id.name.startsWith('_snap_') &&
            (t.isArrowFunctionExpression(declarator.init) || t.isFunctionExpression(declarator.init))) {

          const { name } = declarator.id;
          const newName = `_snap_${name}`;

          const originalFunction = declarator.init;

          const newDeclarator = t.variableDeclarator(t.identifier(newName), originalFunction);

          const wrapper = t.variableDeclaration('const', [
            t.variableDeclarator(
              t.identifier(name),
              t.callExpression(t.identifier('_snap_record'), [
                t.identifier(newName),
                t.stringLiteral(name)
              ])
            )
          ]);

          if (path.parent.type === 'ExportNamedDeclaration') {
            const exportParent = path.findParent(p => p.isExportDeclaration());
            if (exportParent) {
              exportParent.replaceWith(t.variableDeclaration(path.node.kind, [newDeclarator]));
              path.skip();
              newNodes.push(t.exportNamedDeclaration(wrapper));
              functionsWrapped = true;
            }
          } else {
            path.replaceWith(t.variableDeclaration(path.node.kind, [newDeclarator]));
            path.skip();
            newNodes.push(wrapper);
            functionsWrapped = true;
          }
        }
      });
    }
  });

  if (functionsWrapped) {
    modified = true;
  }

  if (!modified) {
    return false;
  }

  ast.program.body.push(...newNodes);

  const output = generate(ast, {}, code);
  fs.writeFileSync(filePath, output.code);
  return true;
}

function restoreFile(filePath) {
  const primaryBackupPath = getPrimaryBackupPath(filePath);
  const localBackupPath = getLocalBackupPath(filePath);

  if (fs.existsSync(primaryBackupPath)) {
    fs.copyFileSync(primaryBackupPath, filePath);
  } else if (fs.existsSync(localBackupPath)) {
    log.warn(`Primary backup not found for ${filePath}. Restoring from local fallback.`);
    fs.copyFileSync(localBackupPath, filePath);
  } else {
    log.error(`No backup found for ${filePath}.`);
    return false;
  }
  return true;
}

export async function injectRecorder(targetPattern, force = false) {
  const defaultPattern = '**/*.{ts,tsx,js,jsx}';
  const pattern = targetPattern || defaultPattern;

  const files = await glob(pattern, {
    ignore: ['node_modules/**', '**/node_modules/**', `${LOCAL_BACKUP_DIR}/**`]
  });

  if (files.length === 0) {
    log.warn(`No files found matching: ${pattern}`);
    return;
  }

  const sourceFiles = files.filter(f => !f.endsWith(BACKUP_EXT));

  log.info(`Processing ${sourceFiles.length} files...`);

  for (const file of sourceFiles) {
    const primaryBackupPath = getPrimaryBackupPath(file);

    if (fs.existsSync(primaryBackupPath) && !force) {
      log.warn(`Skipping ${file} - already injected. Use --force to re-inject.`);
      continue;
    }

    if (fs.existsSync(primaryBackupPath) && force) {
      fs.copyFileSync(primaryBackupPath, file);
    }

    const localBackupPath = getLocalBackupPath(file);
    fs.copyFileSync(file, primaryBackupPath);
    fs.copyFileSync(file, localBackupPath);

    try {
      let injected = false;
      if (file.endsWith('.ts') || file.endsWith('.tsx')) {
        injected = injectTSFile(file);
      } else {
        injected = injectJSFile(file);
      }

      if (injected) {
        log.success(`Injected into ${file}`);
      } else {
        log.info(`No functions to inject in ${file}, skipping.`);
        // Restore the original file since no changes were made
        restoreFile(file);
      }
    } catch (error) {
      log.error(`Failed to inject ${file}: ${error.message}`);
      restoreFile(file);
    }
  }
}

export async function restore(targetPattern, keepSnapshots = false) {
  let filesToRestore = [];

  if (targetPattern) {
    const files = await glob(targetPattern);
    filesToRestore = files.filter(f => !f.endsWith(BACKUP_EXT));
  } else {
    const backupDir = path.join(process.cwd(), LOCAL_BACKUP_DIR);
    if (fs.existsSync(backupDir)) {
      const bakFiles = await glob(`${backupDir}/**/*${BACKUP_EXT}`);
      filesToRestore = bakFiles.map(f => {
        const relativePath = path.relative(backupDir, f);
        return relativePath.slice(0, -BACKUP_EXT.length);
      });
    }
  }

  for (const file of filesToRestore) {
    if (restoreFile(file)) {
      log.success(`Restored ${file}`);
    }
  }

  const snapshotPath = path.resolve(process.cwd(), SNAPSHOT_FILE);
  if (!keepSnapshots && fs.existsSync(snapshotPath)) {
    fs.unlinkSync(snapshotPath);
    log.success(`Removed ${SNAPSHOT_FILE}`);
  }

  const tempSessionDirPath = getTempSessionDirPath();
  if (fs.existsSync(tempSessionDirPath)) {
    fs.rmSync(tempSessionDirPath, { recursive: true, force: true });
  }

  const localBackupDirPath = path.join(process.cwd(), LOCAL_BACKUP_DIR);
  if (fs.existsSync(localBackupDirPath)) {
    fs.rmSync(localBackupDirPath, { recursive: true, force: true });
    log.success(`Removed local backup directory: ${LOCAL_BACKUP_DIR}`);
  }
}
```
package/lib/verifier.js
ADDED
@@ -0,0 +1,165 @@

```js
import SuperJSON from 'superjson';
import fs from 'fs';
import path from 'path';
import { pathToFileURL } from 'url';
import diff from 'microdiff';
import { glob } from 'tinyglobby';
import { SNAPSHOT_FILE } from './constants.js';
import { loadConfig } from './config.js';
import { log } from './logger.js';

// Each snapshot line is tried as SuperJSON first (preserves Dates, Maps,
// etc.), then as plain JSON as a fallback.
function parseSnapshotLine(line) {
  try {
    const parsed = SuperJSON.parse(line);
    if (parsed?.fnName) return parsed;
  } catch (error) {
    log.warn(`Failed to parse snapshot line: ${error.message}`);
  }
  try {
    const parsed = JSON.parse(line);
    if (parsed?.fnName) return parsed;
  } catch (error) {
    log.warn(`Failed to parse snapshot line as JSON: ${error.message}`);
  }
  return null;
}

function readSnapshots() {
  const fileContent = fs.readFileSync(path.resolve(process.cwd(), SNAPSHOT_FILE), 'utf8');
  return fileContent.split('\n')
    .filter(line => line.trim())
    .map(parseSnapshotLine)
    .filter(s => s?.fnName);
}

export async function verify(newFn, fnName) {
  const config = loadConfig();
  const port = config.port || 9444;

  if (!fs.existsSync(path.resolve(process.cwd(), SNAPSHOT_FILE))) {
    log.error(`No snapshots found! Run 'io-snapshot record' first.`);
    return { passed: false, error: 'no_snapshots' };
  }

  const snapshots = readSnapshots().filter(s => s.fnName === fnName);

  if (snapshots.length === 0) {
    log.info(`No snapshots found for function '${fnName}'.`);
    return { passed: true, skipped: true };
  }

  let allPassed = true;
  let failedCount = 0;

  for (const snap of snapshots) {
    const newResult = await newFn(...snap.args);
    const changes = diff(snap.result, newResult);

    if (changes.length > 0) {
      log.error(`Drift detected in ${fnName}!`);
      console.dir(changes, { depth: null });
      allPassed = false;
      failedCount++;
    } else {
      log.success(`${fnName} passed semantic check.`);
    }
  }

  if (allPassed && snapshots.length > 0) {
    log.success(`All ${snapshots.length} snapshots passed for ${fnName}.`);
  }

  return { passed: allPassed, failed: failedCount, total: snapshots.length };
}

export async function verifyDir(targetPattern) {
  if (!fs.existsSync(path.resolve(process.cwd(), SNAPSHOT_FILE))) {
    log.error(`No snapshots found! Run 'io-snapshot record' first.`);
    process.exit(1);
  }

  const allSnapshots = readSnapshots();
  const fnNames = [...new Set(allSnapshots.map(s => s.fnName))];

  if (fnNames.length === 0) {
    log.warn('No snapshots to verify.');
    return;
  }

  let allFiles = [];

  if (targetPattern) {
    const files = await glob(targetPattern);
    // TypeScript files are searched before JavaScript files.
    const tsFiles = files.filter(f => (f.endsWith('.ts') || f.endsWith('.tsx')) && !f.includes('.snap.bak'));
    const jsFiles = files.filter(f => (f.endsWith('.js') || f.endsWith('.jsx')) && !f.includes('.snap.bak'));
    allFiles = [...tsFiles, ...jsFiles];
  } else {
    const allSourceFiles = await glob('**/*.{ts,tsx,js,jsx}', {
      ignore: ['node_modules/**', '**/node_modules/**']
    });
    allFiles = allSourceFiles.filter(f => !f.includes('.snap.bak'));
  }

  if (allFiles.length === 0) {
    log.warn('No source files found.');
    return;
  }

  log.info(`Verifying ${fnNames.length} functions against ${allSnapshots.length} snapshots...`);

  let totalPassed = true;

  for (const fnName of fnNames) {
    let found = false;

    for (const file of allFiles) {
      try {
        const absolutePath = path.resolve(process.cwd(), file);
        const fileUrl = pathToFileURL(absolutePath).href;
        const mod = await import(fileUrl);
        const fn = mod[fnName];

        if (!fn) {
          continue;
        }

        found = true;
        log.info(`Checking function '${fnName}' in ${file}`);
        const result = await verify(fn, fnName);
        if (result && !result.passed) {
          totalPassed = false;
        }
        break;
      } catch (error) {
        log.warn(`Could not load function from ${file}: ${error.message}`);
      }
    }

    if (!found) {
      log.warn(`Function '${fnName}' not found in any source file.`);
    }
  }

  if (totalPassed) {
    log.divider('SUCCESS');
    log.success('All verifications passed!');
    process.exit(0);
  } else {
    log.divider('FAILURE');
    log.error('Some verifications failed.');
    process.exit(1);
  }
}
```
package/package.json
ADDED
@@ -0,0 +1,59 @@

```json
{
  "name": "@kendroger/io-snapshot",
  "version": "1.1.0",
  "description": "Capture and compare function behavior snapshots for zero-regression refactoring.",
  "keywords": [
    "refactor",
    "testing",
    "semantic",
    "automation",
    "ai",
    "diff",
    "test"
  ],
  "homepage": "https://github.com/kenDRoger/io-snapshot#readme",
  "bugs": {
    "url": "https://github.com/kenDRoger/io-snapshot/issues"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/kenDRoger/io-snapshot.git"
  },
  "license": "MIT",
  "author": "kendroger",
  "type": "module",
  "exports": {
    ".": "./lib/recorder.js",
    "./recorder": "./lib/recorder.js",
    "./verifier": "./lib/verifier.js",
    "./transformer": "./lib/transformer.js",
    "./daemon": "./lib/daemon.js",
    "./constants": "./lib/constants.js",
    "./config": "./lib/config.js",
    "./logger": "./lib/logger.js"
  },
  "main": "./lib/recorder.js",
  "bin": {
    "io-snapshot": "bin/cli.js"
  },
  "directories": {
    "lib": "lib"
  },
  "scripts": {
    "test": "vitest"
  },
  "dependencies": {
    "@babel/generator": "^7.23.0",
    "@babel/parser": "^7.23.0",
    "@babel/traverse": "^7.23.0",
    "@babel/types": "^7.23.0",
    "commander": "^12.0.0",
    "microdiff": "^1.5.0",
    "superjson": "^2.2.1",
    "tinyglobby": "^0.2.0",
    "ts-morph": "^21.0.0"
  },
  "devDependencies": {
    "vitest": "^4.0.18"
  }
}
```