@ophan/cli 0.0.1 → 0.0.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,82 +1,72 @@
  # @ophan/cli

- Command-line interface for analyzing codebases and managing Ophan databases.
+ AI-powered security analysis and documentation for your codebase. Detects vulnerabilities, maps data flows, and auto-documents every function — from the command line.

- ## Usage
+ Part of [Ophan](https://ophan.dev). Works standalone or alongside the [VS Code extension](https://marketplace.visualstudio.com/items?itemName=ophan.ophan).

- ### Repository Analysis
- ```bash
- # Analyze a repo (creates .ophan/index.db)
- npx ophan analyze --path /path/to/repo
+ ## Quick Start

- # Or as dev dependency
- npm install --save-dev ophan
- npx ophan analyze
+ ```bash
+ npx @ophan/cli analyze
  ```

- ### Commands
- - `ophan analyze` — Scan repo, extract functions, analyze with Claude, store by content hash
- - `ophan sync` — Push/pull analysis to/from Supabase (requires auth)
- - `ophan gc` — Garbage collect orphaned analysis entries (manual only, never automatic, 30-day grace period)
+ That's it. Ophan scans your repo, extracts every function, and sends them to Claude for security analysis and documentation. Results are stored locally in `.ophan/index.db`.

- ## How Analysis Works
+ You'll need an [Anthropic API key](https://console.anthropic.com/) set as `ANTHROPIC_API_KEY` in your environment or a `.env` file.

- ### Initial Run
- 1. Discovers all TypeScript/JavaScript source files
- 2. Extracts functions, computes SHA256 content hash for each
- 3. Checks local DB — skips functions with existing analysis for that hash
- 4. Sends new functions to Claude for analysis
- 5. Stores results in `function_analysis` (keyed by content_hash)
- 6. Updates `file_functions` with file path, function name, content_hash, mtime
+ ## What You Get

- ### Incremental Updates
- 1. Checks `file_mtime` in `file_functions` — skips entirely unchanged files (no parsing needed)
- 2. Re-parses changed files, re-hashes functions
- 3. Only analyzes functions with new/unknown hashes
- 4. Updates `file_functions` mappings
+ For every function in your codebase:

- ### Garbage Collection
- `ophan gc` is manual only never runs automatically. This protects against branch switching scenarios (e.g. function deleted on feature branch but still exists on main).
+ - **Security analysis** — SQL injection, XSS, hardcoded secrets, path traversal, unsanitized input
+ - **Data flow tags** — which functions touch user input, PII, credentials, databases, external APIs
+ - **Documentation** — plain-English descriptions, parameter docs, return type docs

- How it works:
- 1. Scans codebase, computes all current hashes
- 2. Compares to stored `function_analysis` entries
- 3. Only deletes entries not seen in grace period (default 30 days, configurable)
- 4. Updates `last_referenced_at` in Supabase for synced entries
- 5. If synced, GC'd entries can be re-downloaded on next sync instead of re-analyzed
+ ## Commands

- `file_functions` is ephemeral — rebuilt on every scan. `function_analysis` is persistent until explicit GC.
+ ```bash
+ npx @ophan/cli analyze # Analyze current directory
+ npx @ophan/cli analyze --path . # Analyze a specific path
+ npx @ophan/cli sync # Sync results to ophan.dev (optional)
+ npx @ophan/cli gc # Clean up old analysis entries
+ ```

- ### Sync
- `ophan sync` pushes/pulls `function_analysis` rows to/from Supabase:
- - Insert-only (content-addressed = immutable, no conflicts)
- - Free users: scoped to `user_id`
- - Team users: scoped to `team_id` (new members pull existing analysis)
+ ### As a dev dependency

- ## Design Principles
+ ```bash
+ npm install --save-dev @ophan/cli
+ npx ophan analyze
+ ```
+
+ Add to your team's repo so everyone gets the CLI on `npm install`. Analysis is cached by content hash — unchanged functions are never re-analyzed.
+
+ ## How It Works

- ### Local-First
- All analysis stored in per-repo `.ophan/index.db` (gitignored). No cloud required for core functionality. Supabase sync is optional.
+ 1. Parses your source files using language-native ASTs (TypeScript compiler API, Python's `ast` module)
+ 2. Extracts every function and computes a SHA256 content hash
+ 3. Skips functions that haven't changed since last analysis
+ 4. Sends new/changed functions to Claude for security and documentation analysis
+ 5. Stores results locally in `.ophan/index.db` (gitignored)

- ### Dev Dependency Model
- Add to `devDependencies` — one engineer installs, whole team gets CLI on `npm install`. Bottom-up adoption path.
+ Supports **TypeScript**, **JavaScript**, and **Python**.

- ### Non-Invasive
- Database in hidden `.ophan/` directory, gitignored. Does not modify source code or project configuration.
+ ## Cloud Sync

- ### Content-Addressed
- Analysis keyed by function content hash, not file path. Branch-agnostic, merge-friendly, deduplication built in.
+ Optionally sync your analysis to [ophan.dev](https://ophan.dev) for a web dashboard, team sharing, and cross-machine access.
+
+ ```bash
+ npx @ophan/cli login # Authenticate with ophan.dev
+ npx @ophan/cli analyze # Auto-pulls from cloud, then analyzes remaining
+ npx @ophan/cli sync # Push new results to cloud
+ ```

- ## Output
+ ## Resources

- Creates `.ophan/index.db` containing:
- - `function_analysis` — content_hash, analysis JSON, model version, timestamp
- - `file_functions` file path, function name, content_hash, file mtime
+ - [Documentation](https://docs.ophan.dev)
+ - [CLI Reference](https://docs.ophan.dev/cli/commands)
+ - [VS Code Extension](https://marketplace.visualstudio.com/items?itemName=ophan.ophan)
+ - [GitHub](https://github.com/nicholasgriffintn/ophan)

- ## CI/CD Integration
+ ## License

- ### GitHub Action (Planned)
- Run `ophan analyze` on PRs:
- - Comment with new function documentation
- - Fail if new security warnings introduced
- - Show data flow changes in PR review
+ MIT
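The caching described under "How It Works" (and in more detail in the 0.0.1 README's "How Analysis Works" section) is content-addressed: a function is only sent to Claude when the SHA256 hash of its source has no stored analysis. Below is a minimal sketch of that lookup, assuming the better-sqlite3 database and the `function_analysis` table keyed by `content_hash` that the README and tests mention; the helper names are hypothetical, not the package's actual API.

```ts
// Illustrative sketch of content-hash caching — not @ophan/cli's actual code.
import { createHash } from "node:crypto";
import Database from "better-sqlite3";

const db = new Database(".ophan/index.db");

// Hash the function's source text (step 2 of "How It Works").
function contentHash(source: string): string {
  return createHash("sha256").update(source).digest("hex");
}

// Skip functions whose hash already has a stored analysis (step 3);
// only new/changed functions are sent to Claude (step 4).
function needsAnalysis(source: string): boolean {
  const row = db
    .prepare("SELECT 1 FROM function_analysis WHERE content_hash = ?")
    .get(contentHash(source));
  return row === undefined;
}
```

Because the key is the content hash rather than the file path, the same analysis is reused across branches and renames, which is why the 0.0.1 README calls the scheme branch-agnostic.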
package/package.json CHANGED
@@ -1,7 +1,9 @@
  {
  "name": "@ophan/cli",
- "version": "0.0.1",
- "files": ["dist"],
+ "version": "0.0.2",
+ "files": [
+ "dist"
+ ],
  "bin": {
  "ophan": "./dist/index.js"
  },
package/dist/sync.test.js DELETED
@@ -1,288 +0,0 @@
- "use strict";
- var __importDefault = (this && this.__importDefault) || function (mod) {
- return (mod && mod.__esModule) ? mod : { "default": mod };
- };
- Object.defineProperty(exports, "__esModule", { value: true });
- const vitest_1 = require("vitest");
- const better_sqlite3_1 = __importDefault(require("better-sqlite3"));
- const test_utils_1 = require("./test-utils");
- const sync_1 = require("./sync");
- const test_utils_2 = require("@ophan/core/test-utils");
- const USER_A = "user-aaaa-aaaa-aaaa-aaaaaaaaaaaa";
- const USER_B = "user-bbbb-bbbb-bbbb-bbbbbbbbbbbb";
- const REPO_ID = "repo-1";
- function defaultRepoResponse() {
- return { data: { id: REPO_ID }, error: null };
- }
- // ============ syncToSupabase ============
- (0, vitest_1.describe)("syncToSupabase", () => {
- const cleanups = [];
- function tracked(result) {
- cleanups.push(result.cleanup);
- return result;
- }
- (0, vitest_1.afterEach)(() => {
- cleanups.forEach((fn) => fn());
- cleanups.length = 0;
- });
- (0, vitest_1.describe)("account change detection", () => {
- (0, vitest_1.it)("first sync — stores user_id, no reset", async () => {
- const { rootPath, dbPath } = tracked((0, test_utils_1.createTempDb)());
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- const db = new better_sqlite3_1.default(dbPath);
- const row = db
- .prepare("SELECT value FROM sync_meta WHERE key = 'last_synced_user_id'")
- .get();
- db.close();
- (0, vitest_1.expect)(row.value).toBe(USER_A);
- });
- (0, vitest_1.it)("same user re-sync — no reset of synced_at", async () => {
- const { rootPath, dbPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- db.prepare("INSERT INTO sync_meta (key, value) VALUES ('last_synced_user_id', ?)").run(USER_A);
- (0, test_utils_2.insertAnalysisPair)(db, "hash1", { syncedAt: 1000 });
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- // synced_at should still be set (not reset to NULL)
- const db = new better_sqlite3_1.default(dbPath);
- const row = db
- .prepare("SELECT synced_at FROM function_analysis WHERE content_hash = 'hash1' AND analysis_type = 'documentation'")
- .get();
- db.close();
- (0, vitest_1.expect)(row.synced_at).toBe(1000);
- });
- (0, vitest_1.it)("different user — resets synced_at to NULL and clears GC", async () => {
- const { rootPath, dbPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- db.prepare("INSERT INTO sync_meta (key, value) VALUES ('last_synced_user_id', ?)").run(USER_A);
- (0, test_utils_2.insertAnalysisPair)(db, "hash1", { syncedAt: 1000 });
- db.prepare("INSERT INTO function_gc (content_hash, analysis_type, gc_at) VALUES ('old', 'documentation', 999)").run();
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_B);
- const db = new better_sqlite3_1.default(dbPath);
- // synced_at should be NULL (reset) then re-set by the push
- const meta = db
- .prepare("SELECT value FROM sync_meta WHERE key = 'last_synced_user_id'")
- .get();
- const gcCount = db
- .prepare("SELECT count(*) as c FROM function_gc")
- .get();
- db.close();
- (0, vitest_1.expect)(meta.value).toBe(USER_B);
- (0, vitest_1.expect)(gcCount.c).toBe(0);
- });
- });
- (0, vitest_1.describe)("push batching", () => {
- (0, vitest_1.it)("pushes only unsynced rows (synced_at IS NULL)", async () => {
- const { rootPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- (0, test_utils_2.insertAnalysisPair)(db, "hash-synced", { syncedAt: 1000 });
- (0, test_utils_2.insertAnalysisPair)(db, "hash-unsynced", { syncedAt: null });
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- const result = await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- // Only hash-unsynced's 2 rows (doc + sec) should be pushed
- (0, vitest_1.expect)(result.pushed).toBe(2);
- });
- (0, vitest_1.it)("marks synced_at after push", async () => {
- const { rootPath, dbPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- (0, test_utils_2.insertAnalysisPair)(db, "hash1", { syncedAt: null });
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- const db = new better_sqlite3_1.default(dbPath);
- const rows = db
- .prepare("SELECT synced_at FROM function_analysis WHERE content_hash = 'hash1'")
- .all();
- db.close();
- for (const row of rows) {
- (0, vitest_1.expect)(row.synced_at).not.toBeNull();
- (0, vitest_1.expect)(row.synced_at).toBeGreaterThan(0);
- }
- });
- });
- (0, vitest_1.describe)("location full-sync", () => {
- (0, vitest_1.it)("DELETE + INSERT pattern for file_functions", async () => {
- const { rootPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- (0, test_utils_2.insertFileFunction)(db, "src/a.ts", "fn1", "hash1");
- (0, test_utils_2.insertFileFunction)(db, "src/b.ts", "fn2", "hash2");
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- const result = await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- (0, vitest_1.expect)(result.locations).toBe(2);
- // Verify DELETE was called before INSERT
- const locationCalls = mock.calls.filter((c) => c.table === "function_locations");
- const deleteIdx = locationCalls.findIndex((c) => c.method === "delete");
- const insertIdx = locationCalls.findIndex((c) => c.method === "insert");
- (0, vitest_1.expect)(deleteIdx).toBeLessThan(insertIdx);
- });
- (0, vitest_1.it)("empty file_functions — only deletes remote", async () => {
- const { rootPath } = tracked((0, test_utils_1.createTempDb)());
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- const result = await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- (0, vitest_1.expect)(result.locations).toBe(0);
- // Should still call delete (clear remote)
- const deleteCalls = mock.calls.filter((c) => c.table === "function_locations" && c.method === "delete");
- (0, vitest_1.expect)(deleteCalls.length).toBe(1);
- // Should NOT call insert
- const insertCalls = mock.calls.filter((c) => c.table === "function_locations" && c.method === "insert");
- (0, vitest_1.expect)(insertCalls.length).toBe(0);
- });
- });
- (0, vitest_1.describe)("GC tombstone propagation", () => {
- (0, vitest_1.it)("processes unsynced GC entries and cleans them up", async () => {
- const { rootPath, dbPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- db.prepare("INSERT INTO function_gc (content_hash, analysis_type, gc_at) VALUES (?, ?, ?)").run("dead-hash", "documentation", Math.floor(Date.now() / 1000));
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- const result = await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- (0, vitest_1.expect)(result.gcProcessed).toBe(1);
- // GC rows should be cleaned up (synced then deleted)
- const db = new better_sqlite3_1.default(dbPath);
- const gcCount = db
- .prepare("SELECT count(*) as c FROM function_gc")
- .get();
- db.close();
- (0, vitest_1.expect)(gcCount.c).toBe(0);
- });
- (0, vitest_1.it)("type-specific tombstone — deletes specific analysis_type", async () => {
- const { rootPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- db.prepare("INSERT INTO function_gc (content_hash, analysis_type, gc_at) VALUES (?, ?, ?)").run("dead-hash", "security", Math.floor(Date.now() / 1000));
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- // Should delete with analysis_type filter
- const deleteCalls = mock.calls.filter((c) => c.table === "function_analysis" && c.method === "delete");
- (0, vitest_1.expect)(deleteCalls.length).toBe(1);
- // Should have eq('analysis_type', 'security') in the chain
- const eqCalls = mock.calls.filter((c) => c.table === "function_analysis" &&
- c.method === "eq" &&
- c.args[0] === "analysis_type");
- (0, vitest_1.expect)(eqCalls.length).toBe(1);
- (0, vitest_1.expect)(eqCalls[0].args[1]).toBe("security");
- });
- (0, vitest_1.it)("NULL analysis_type (legacy) — deletes all types for hash", async () => {
- const { rootPath } = tracked((0, test_utils_1.createTempDb)((db) => {
- db.prepare("INSERT INTO function_gc (content_hash, analysis_type, gc_at) VALUES (?, ?, ?)").run("dead-hash", null, Math.floor(Date.now() / 1000));
- }));
- const mock = (0, test_utils_1.createMockSupabase)({
- "repos.upsert": defaultRepoResponse(),
- });
- await (0, sync_1.syncToSupabase)(rootPath, mock.client, USER_A);
- // Should delete WITHOUT analysis_type filter (no eq('analysis_type', ...))
- const eqCalls = mock.calls.filter((c) => c.table === "function_analysis" &&
- c.method === "eq" &&
- c.args[0] === "analysis_type");
- (0, vitest_1.expect)(eqCalls.length).toBe(0);
- });
- });
- });
- // ============ pullFromSupabase ============
- (0, vitest_1.describe)("pullFromSupabase", () => {
- const cleanups = [];
- function tracked(result) {
- cleanups.push(result.cleanup);
- return result;
- }
- (0, vitest_1.afterEach)(() => {
- cleanups.forEach((fn) => fn());
- cleanups.length = 0;
- });
- (0, vitest_1.it)("empty missing hashes — returns early with 0", async () => {
- const { rootPath } = tracked((0, test_utils_1.createTempDb)());
- const mock = (0, test_utils_1.createMockSupabase)();
- const result = await (0, sync_1.pullFromSupabase)(rootPath, mock.client, USER_A, REPO_ID, []);
- (0, vitest_1.expect)(result.pulled).toBe(0);
- // Should NOT call supabase at all
- (0, vitest_1.expect)(mock.calls.length).toBe(0);
- });
- (0, vitest_1.it)("applies defensive defaults for missing fields", async () => {
- const { rootPath, dbPath } = tracked((0, test_utils_1.createTempDb)());
- const mock = (0, test_utils_1.createMockSupabase)({
- "function_analysis.select": {
- data: [
- {
- content_hash: "hash1",
- analysis_type: "documentation",
- analysis: { description: "Test" },
- model_version: "claude-3",
- schema_version: null, // missing — should default to 1
- language: null, // missing — should default to 'typescript'
- entity_type: null, // missing — should default to 'function'
- },
- ],
- error: null,
- },
- });
- const result = await (0, sync_1.pullFromSupabase)(rootPath, mock.client, USER_A, REPO_ID, ["hash1"]);
- (0, vitest_1.expect)(result.pulled).toBe(1);
- // Verify defaults were applied
- const db = new better_sqlite3_1.default(dbPath);
- const row = db
- .prepare("SELECT language, entity_type, schema_version FROM function_analysis WHERE content_hash = 'hash1'")
- .get();
- db.close();
- (0, vitest_1.expect)(row.language).toBe("typescript");
- (0, vitest_1.expect)(row.entity_type).toBe("function");
- (0, vitest_1.expect)(row.schema_version).toBe(1);
- });
- (0, vitest_1.it)("handles string vs object analysis field", async () => {
- const { rootPath, dbPath } = tracked((0, test_utils_1.createTempDb)());
- const mock = (0, test_utils_1.createMockSupabase)({
- "function_analysis.select": {
- data: [
- {
- content_hash: "hash-obj",
- analysis_type: "documentation",
- analysis: { description: "Object form" }, // object — should be stringified
- model_version: "claude-3",
- schema_version: 1,
- language: "typescript",
- entity_type: "function",
- },
- {
- content_hash: "hash-str",
- analysis_type: "documentation",
- analysis: '{"description":"String form"}', // already string
- model_version: "claude-3",
- schema_version: 1,
- language: "typescript",
- entity_type: "function",
- },
- ],
- error: null,
- },
- });
- const result = await (0, sync_1.pullFromSupabase)(rootPath, mock.client, USER_A, REPO_ID, ["hash-obj", "hash-str"]);
- (0, vitest_1.expect)(result.pulled).toBe(2);
- // Both should be stored as valid JSON strings
- const db = new better_sqlite3_1.default(dbPath);
- const objRow = db
- .prepare("SELECT analysis FROM function_analysis WHERE content_hash = 'hash-obj'")
- .get();
- const strRow = db
- .prepare("SELECT analysis FROM function_analysis WHERE content_hash = 'hash-str'")
- .get();
- db.close();
- (0, vitest_1.expect)(JSON.parse(objRow.analysis).description).toBe("Object form");
- (0, vitest_1.expect)(JSON.parse(strRow.analysis).description).toBe("String form");
- });
- });
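The "push batching" tests above pin down the sync contract: only rows with `synced_at IS NULL` are pushed, `synced_at` is stamped after a successful push, and content-addressed rows never conflict. Here is a rough sketch of that push path, assuming better-sqlite3 and a standard `@supabase/supabase-js` client; the exact column set and the use of `upsert` are assumptions, not the package's confirmed implementation.

```ts
// Illustrative sketch of the push path the "push batching" tests describe —
// not the package's actual sync.js. Table and column names follow the tests;
// everything else is an assumption.
import Database from "better-sqlite3";
import type { SupabaseClient } from "@supabase/supabase-js";

async function pushUnsynced(dbPath: string, supabase: SupabaseClient, userId: string) {
  const db = new Database(dbPath);
  // Only rows never pushed before (synced_at IS NULL), per the test expectations.
  const rows = db
    .prepare("SELECT content_hash, analysis_type, analysis FROM function_analysis WHERE synced_at IS NULL")
    .all() as { content_hash: string; analysis_type: string; analysis: string }[];

  if (rows.length > 0) {
    // Content-addressed rows are immutable, so a conflict-free upsert is safe.
    const { error } = await supabase
      .from("function_analysis")
      .upsert(rows.map((r) => ({ ...r, user_id: userId })));
    if (error) throw error;

    // Mark pushed rows so the next sync skips them ("marks synced_at after push").
    db.prepare("UPDATE function_analysis SET synced_at = ? WHERE synced_at IS NULL")
      .run(Math.floor(Date.now() / 1000));
  }
  db.close();
  return { pushed: rows.length };
}
```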
package/dist/test-utils.js DELETED
@@ -1,161 +0,0 @@
- "use strict";
- var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
- if (k2 === undefined) k2 = k;
- var desc = Object.getOwnPropertyDescriptor(m, k);
- if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
- desc = { enumerable: true, get: function() { return m[k]; } };
- }
- Object.defineProperty(o, k2, desc);
- }) : (function(o, m, k, k2) {
- if (k2 === undefined) k2 = k;
- o[k2] = m[k];
- }));
- var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
- Object.defineProperty(o, "default", { enumerable: true, value: v });
- }) : function(o, v) {
- o["default"] = v;
- });
- var __importStar = (this && this.__importStar) || (function () {
- var ownKeys = function(o) {
- ownKeys = Object.getOwnPropertyNames || function (o) {
- var ar = [];
- for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
- return ar;
- };
- return ownKeys(o);
- };
- return function (mod) {
- if (mod && mod.__esModule) return mod;
- var result = {};
- if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
- __setModuleDefault(result, mod);
- return result;
- };
- })();
- var __importDefault = (this && this.__importDefault) || function (mod) {
- return (mod && mod.__esModule) ? mod : { "default": mod };
- };
- Object.defineProperty(exports, "__esModule", { value: true });
- exports.createTempDb = createTempDb;
- exports.createMockSupabase = createMockSupabase;
- const better_sqlite3_1 = __importDefault(require("better-sqlite3"));
- const path = __importStar(require("path"));
- const fs = __importStar(require("fs"));
- const os = __importStar(require("os"));
- const test_utils_1 = require("@ophan/core/test-utils");
- /**
- * Creates a temp directory with a real .ophan/index.db file.
- * Sync functions take rootPath and construct dbPath internally,
- * so we need an actual file on disk.
- */
- function createTempDb(setup) {
- const rootPath = fs.mkdtempSync(path.join(os.tmpdir(), "ophan-test-"));
- const ophanDir = path.join(rootPath, ".ophan");
- fs.mkdirSync(ophanDir, { recursive: true });
- const dbPath = path.join(ophanDir, "index.db");
- // Create a real file-based DB with the current schema
- const memDb = (0, test_utils_1.createTestDb)();
- // Serialize the in-memory DB to a file
- const db = new better_sqlite3_1.default(dbPath);
- db.exec(memDb.prepare("SELECT sql FROM sqlite_master WHERE type='table'")
- .all()
- .map((row) => row.sql + ";")
- .join("\n"));
- // Run setup callback if provided
- if (setup)
- setup(db);
- db.close();
- memDb.close();
- return {
- dbPath,
- rootPath,
- cleanup: () => {
- try {
- fs.rmSync(rootPath, { recursive: true, force: true });
- }
- catch {
- // best-effort cleanup
- }
- },
- };
- }
- /**
- * Creates a mock Supabase client that records calls and returns controlled responses.
- *
- * Usage:
- * const mock = createMockSupabase({
- * 'repos.upsert': { data: [{ id: 'repo-1' }], error: null },
- * 'function_analysis.select': { data: [], error: null },
- * })
- * await syncToSupabase(rootPath, mock.client, userId)
- * expect(mock.calls).toContainEqual(expect.objectContaining({ table: 'repos', method: 'upsert' }))
- */
- function createMockSupabase(responses = {}) {
- const calls = [];
- function createChain(table) {
- let primaryMethod = "";
- const resolve = () => {
- const key = `${table}.${primaryMethod}`;
- return responses[key] ?? { data: [], error: null };
- };
- const chain = {};
- const handler = {
- get(_, prop) {
- // Primary operations
- if (["select", "insert", "upsert", "update", "delete"].includes(prop)) {
- return (...args) => {
- primaryMethod = prop;
- calls.push({ table, method: prop, args });
- return new Proxy(chain, handler);
- };
- }
- // Filter/modifier methods — chainable
- if ([
- "eq",
- "in",
- "neq",
- "gt",
- "lt",
- "gte",
- "lte",
- "like",
- "ilike",
- "contains",
- "order",
- "limit",
- "range",
- "or",
- "not",
- "is",
- "head",
- ].includes(prop)) {
- return (...args) => {
- calls.push({ table, method: prop, args });
- return new Proxy(chain, handler);
- };
- }
- // Terminal methods
- if (prop === "single" || prop === "maybeSingle") {
- return (...args) => {
- calls.push({ table, method: prop, args });
- return Promise.resolve(resolve());
- };
- }
- // Thenable — allows `await supabase.from(...).select(...).eq(...)`
- if (prop === "then") {
- const result = resolve();
- return Promise.resolve(result).then.bind(Promise.resolve(result));
- }
- return new Proxy(chain, handler);
- },
- };
- return new Proxy(chain, handler);
- }
- const client = {
- from: (table) => {
- calls.push({ table, method: "from", args: [table] });
- return createChain(table);
- },
- };
- return { client: client, calls };
- }