@ophan/cli 0.0.1 → 0.0.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,82 +1,72 @@
  # @ophan/cli
 
- Command-line interface for analyzing codebases and managing Ophan databases.
+ AI-powered security analysis and documentation for your codebase. Detects vulnerabilities, maps data flows, and auto-documents every function — from the command line.
 
- ## Usage
+ Part of [Ophan](https://ophan.dev). Works standalone or alongside the [VS Code extension](https://marketplace.visualstudio.com/items?itemName=ophan.ophan).
 
- ### Repository Analysis
- ```bash
- # Analyze a repo (creates .ophan/index.db)
- npx ophan analyze --path /path/to/repo
+ ## Quick Start
 
- # Or as dev dependency
- npm install --save-dev ophan
- npx ophan analyze
+ ```bash
+ npx @ophan/cli analyze
  ```
 
- ### Commands
- - `ophan analyze` — Scan repo, extract functions, analyze with Claude, store by content hash
- - `ophan sync` — Push/pull analysis to/from Supabase (requires auth)
- - `ophan gc` — Garbage collect orphaned analysis entries (manual only, never automatic, 30-day grace period)
+ That's it. Ophan scans your repo, extracts every function, and sends them to Claude for security analysis and documentation. Results are stored locally in `.ophan/index.db`.
 
- ## How Analysis Works
+ You'll need an [Anthropic API key](https://console.anthropic.com/) set as `ANTHROPIC_API_KEY` in your environment or a `.env` file.
 
- ### Initial Run
- 1. Discovers all TypeScript/JavaScript source files
- 2. Extracts functions, computes SHA256 content hash for each
- 3. Checks local DB — skips functions with existing analysis for that hash
- 4. Sends new functions to Claude for analysis
- 5. Stores results in `function_analysis` (keyed by content_hash)
- 6. Updates `file_functions` with file path, function name, content_hash, mtime
+ ## What You Get
 
- ### Incremental Updates
- 1. Checks `file_mtime` in `file_functions` — skips entirely unchanged files (no parsing needed)
- 2. Re-parses changed files, re-hashes functions
- 3. Only analyzes functions with new/unknown hashes
- 4. Updates `file_functions` mappings
+ For every function in your codebase:
 
- ### Garbage Collection
- `ophan gc` is manual only; it never runs automatically. This protects against branch switching scenarios (e.g. function deleted on feature branch but still exists on main).
+ - **Security analysis** — SQL injection, XSS, hardcoded secrets, path traversal, unsanitized input
+ - **Data flow tags** — which functions touch user input, PII, credentials, databases, external APIs
+ - **Documentation** — plain-English descriptions, parameter docs, return type docs
 
- How it works:
- 1. Scans codebase, computes all current hashes
- 2. Compares to stored `function_analysis` entries
- 3. Only deletes entries not seen in grace period (default 30 days, configurable)
- 4. Updates `last_referenced_at` in Supabase for synced entries
- 5. If synced, GC'd entries can be re-downloaded on next sync instead of re-analyzed
+ ## Commands
 
- `file_functions` is ephemeral — rebuilt on every scan. `function_analysis` is persistent until explicit GC.
+ ```bash
+ npx @ophan/cli analyze          # Analyze current directory
+ npx @ophan/cli analyze --path . # Analyze a specific path
+ npx @ophan/cli sync             # Sync results to ophan.dev (optional)
+ npx @ophan/cli gc               # Clean up old analysis entries
+ ```
 
- ### Sync
- `ophan sync` pushes/pulls `function_analysis` rows to/from Supabase:
- - Insert-only (content-addressed = immutable, no conflicts)
- - Free users: scoped to `user_id`
- - Team users: scoped to `team_id` (new members pull existing analysis)
+ ### As a dev dependency
 
- ## Design Principles
+ ```bash
+ npm install --save-dev @ophan/cli
+ npx @ophan/cli analyze
+ ```
 
- ### Local-First
- All analysis stored in per-repo `.ophan/index.db` (gitignored). No cloud required for core functionality. Supabase sync is optional.
+ Add to your team's repo so everyone gets the CLI on `npm install`. Analysis is cached by content hash — unchanged functions are never re-analyzed.
 
- ### Dev Dependency Model
- Add to `devDependencies` — one engineer installs, whole team gets CLI on `npm install`. Bottom-up adoption path.
+ ## How It Works
 
- ### Non-Invasive
- Database in hidden `.ophan/` directory, gitignored. Does not modify source code or project configuration.
+ 1. Parses your source files using language-native ASTs (TypeScript compiler API, Python's `ast` module)
+ 2. Extracts every function and computes a SHA256 content hash
+ 3. Skips functions that haven't changed since last analysis
+ 4. Sends new/changed functions to Claude for security and documentation analysis
+ 5. Stores results locally in `.ophan/index.db` (gitignored)
 
- ### Content-Addressed
- Analysis keyed by function content hash, not file path. Branch-agnostic, merge-friendly, deduplication built in.
+ Supports **TypeScript**, **JavaScript**, and **Python**.
+
+ ## Cloud Sync
+
+ Optionally sync your analysis to [ophan.dev](https://ophan.dev) for a web dashboard, team sharing, and cross-machine access.
+
+ ```bash
+ npx @ophan/cli login   # Authenticate with ophan.dev
+ npx @ophan/cli analyze # Auto-pulls from cloud, then analyzes remaining
+ npx @ophan/cli sync    # Push new results to cloud
+ ```
 
- ## Output
+ ## Resources
 
- Creates `.ophan/index.db` containing:
- - `function_analysis` — content_hash, analysis JSON, model version, timestamp
- - `file_functions` — file path, function name, content_hash, file mtime
+ - [Documentation](https://docs.ophan.dev)
+ - [CLI Reference](https://docs.ophan.dev/cli/commands)
+ - [VS Code Extension](https://marketplace.visualstudio.com/items?itemName=ophan.ophan)
+ - [GitHub](https://github.com/nicholasgriffintn/ophan)
 
- ## CI/CD Integration
+ ## License
 
- ### GitHub Action (Planned)
- Run `ophan analyze` on PRs:
- - Comment with new function documentation
- - Fail if new security warnings introduced
- - Show data flow changes in PR review
+ MIT
package/dist/auth.js CHANGED
@@ -113,7 +113,7 @@ async function login(webappUrl, timeoutMs = 120000) {
  return new Promise((resolve, reject) => {
  const timeout = setTimeout(() => {
  server.close();
- reject(new Error("Login timed out. Please try again with `ophan login`."));
+ reject(new Error("Login timed out. Please try again with `npx @ophan/cli login`."));
  }, timeoutMs);
  const server = http.createServer(async (req, res) => {
  const url = new URL(req.url || "/", `http://localhost:${port}`);
@@ -196,7 +196,7 @@ async function login(webappUrl, timeoutMs = 120000) {
  async function getAuthenticatedClient() {
  const creds = readCredentials();
  if (!creds) {
- throw new Error("Not logged in. Run `ophan login` first.");
+ throw new Error("Not logged in. Run `npx @ophan/cli login` first.");
  }
  const supabase = (0, supabase_js_1.createClient)(creds.api_url, creds.api_key, {
  auth: {
@@ -224,7 +224,7 @@ async function getAuthenticatedClient() {
  refresh_token: creds.refresh_token,
  });
  if (error || !data.session) {
- throw new Error(`Session expired or invalid. Run \`ophan login\` again.\n ${error?.message || "No session returned"}`);
+ throw new Error(`Session expired or invalid. Run \`npx @ophan/cli login\` again.\n ${error?.message || "No session returned"}`);
  }
  // Persist the new tokens so next invocation can use the fresh access_token
  // without consuming the refresh token again
package/dist/index.js CHANGED
@@ -46,11 +46,16 @@ var __importStar = (this && this.__importStar) || (function () {
  return result;
  };
  })();
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
  Object.defineProperty(exports, "__esModule", { value: true });
  require("dotenv/config");
  const commander_1 = require("commander");
  const core_1 = require("@ophan/core");
+ const better_sqlite3_1 = __importDefault(require("better-sqlite3"));
  const path = __importStar(require("path"));
+ const fs = __importStar(require("fs"));
  const auth_1 = require("./auth");
  const sync_1 = require("./sync");
  const watch_1 = require("./watch");
@@ -64,7 +69,7 @@ program
  * Returns undefined if user isn't logged in or repo doesn't exist in Supabase.
  * Silent — never fails the analyze command.
  */
- async function createPullFn(rootPath) {
+ async function createPullFn(rootPath, onProgress) {
  const creds = (0, auth_1.readCredentials)();
  if (!creds)
  return undefined;
@@ -79,8 +84,14 @@ async function createPullFn(rootPath) {
  .single();
  if (!repo)
  return undefined;
+ const log = onProgress ?? ((step) => console.log(` ${step}`));
  return async (hashes) => {
- await (0, sync_1.pullFromSupabase)(rootPath, supabase, userId, repo.id, hashes, (step) => console.log(` ${step}`));
+ try {
+ await (0, sync_1.pullFromSupabase)(rootPath, supabase, userId, repo.id, hashes, log);
+ }
+ catch {
+ // Pull is a non-critical optimization — silently degrade
+ }
  };
  }
  catch {
@@ -96,7 +107,13 @@ async function runAnalyze(rootPath) {
  process.stdout.cursorTo(0);
  }
  process.stdout.write(` [${current}/${total}] ${file}\n`);
- }, pullFn);
+ }, pullFn, (analyzed, total) => {
+ if (process.stdout.isTTY) {
+ process.stdout.clearLine(0);
+ process.stdout.cursorTo(0);
+ }
+ process.stdout.write(` Analyzing: ${analyzed}/${total} functions`);
+ });
  console.log(`\n✅ Done! ${result.analyzed} analyzed, ${result.skipped} cached` +
  (result.pulled ? ` (${result.pulled} from cloud)` : "") +
  ` across ${result.files} files` +
@@ -115,10 +132,29 @@ program
  .description("Remove orphaned analysis entries (manual, safe for branch switching)")
  .option("-p, --path <path>", "Path to repository", process.cwd())
  .option("-d, --days <days>", "Grace period in days", "30")
+ .option("-f, --force", "Run GC even when not on default branch")
  .action(async (options) => {
  const rootPath = path.resolve(options.path);
  const maxAgeDays = parseInt(options.days, 10);
  const dbPath = path.join(rootPath, ".ophan", "index.db");
+ // Branch safety check — GC on a feature branch can orphan
+ // hashes that only exist on the default branch
+ const currentBranch = (0, sync_1.getGitBranch)(rootPath);
+ if (currentBranch !== null) {
+ const defaultBranch = (0, sync_1.getDefaultBranch)(rootPath);
+ if (currentBranch !== defaultBranch && !options.force) {
+ console.error(`⚠️ You're on branch '${currentBranch}', not '${defaultBranch}'.\n` +
+ ` Running GC on a feature branch can delete analysis for functions\n` +
+ ` that only exist on '${defaultBranch}'. If synced, these deletions\n` +
+ ` propagate to the cloud.\n\n` +
+ ` Switch to '${defaultBranch}' first, or use --force to proceed.`);
+ process.exit(1);
+ }
+ if (currentBranch !== defaultBranch && options.force) {
+ console.warn(`⚠️ Running GC on branch '${currentBranch}' (default: '${defaultBranch}').\n` +
+ ` Proceeding with --force.\n`);
+ }
+ }
  console.log("🧹 Refreshing file index...\n");
  await (0, core_1.refreshFileIndex)(rootPath, (current, total, file) => {
  if (process.stdout.isTTY) {
@@ -144,7 +180,7 @@ program
  try {
  await (0, auth_1.getAuthenticatedClient)();
  console.log(` Already logged in as ${existing.email}`);
- console.log(` Run \`ophan logout\` first to switch accounts.`);
+ console.log(` Run \`npx @ophan/cli logout\` first to switch accounts.`);
  return;
  }
  catch {
@@ -172,21 +208,51 @@ program
  // ========== SYNC ==========
  program
  .command("sync")
- .description("Sync analysis to ophan.dev")
+ .description("Sync analysis with ophan.dev (pull + push)")
  .option("-p, --path <path>", "Path to repository", process.cwd())
  .action(async (options) => {
  const rootPath = path.resolve(options.path);
+ const dbPath = path.join(rootPath, ".ophan", "index.db");
  try {
  const { supabase, userId } = await (0, auth_1.getAuthenticatedClient)();
- console.log("☁️ Syncing to ophan.dev...\n");
- const result = await (0, sync_1.syncToSupabase)(rootPath, supabase, userId, (step) => {
- console.log(` ${step}`);
- });
- console.log(`\n✅ Sync complete: ${result.pushed} analysis entries pushed, ` +
- `${result.locations} file locations synced` +
- (result.gcProcessed > 0
- ? `, ${result.gcProcessed} GC deletions applied`
- : ""));
+ console.log("☁️ Syncing with ophan.dev...\n");
+ // Pull first: fetch missing analysis from cloud
+ let pulled = 0;
+ const repoName = path.basename(rootPath);
+ const { data: repo } = await supabase
+ .from("repos")
+ .select("id")
+ .eq("user_id", userId)
+ .eq("name", repoName)
+ .single();
+ if (repo) {
+ const missingHashes = (0, core_1.findMissingHashes)(dbPath);
+ if (missingHashes.length > 0) {
+ const pullResult = await (0, sync_1.pullFromSupabase)(rootPath, supabase, userId, repo.id, missingHashes, (step) => console.log(` ${step}`));
+ pulled = pullResult.pulled;
+ }
+ else {
+ console.log(" No missing analyses to pull.");
+ }
+ }
+ // Then push local analysis to cloud
+ const result = await (0, sync_1.syncToSupabase)(rootPath, supabase, userId, (step) => console.log(` ${step}`));
+ const parts = [
+ `${pulled} pulled`,
+ `${result.pushed} pushed`,
+ `${result.locations} file locations synced`,
+ ];
+ if (result.practices > 0)
+ parts.push(`${result.practices} practices synced`);
+ if (result.communities > 0)
+ parts.push(`${result.communities} community memberships`);
+ if (result.communityEdges > 0)
+ parts.push(`${result.communityEdges} community edges`);
+ if (result.summaries > 0)
+ parts.push(`${result.summaries} summaries`);
+ if (result.gcProcessed > 0)
+ parts.push(`${result.gcProcessed} GC deletions applied`);
+ console.log(`\n✅ Sync complete: ${parts.join(", ")}`);
  }
  catch (err) {
  console.error(`\n❌ Sync failed: ${err.message}`);
@@ -202,7 +268,21 @@ program
  .option("--sync", "Auto-sync to cloud after each analysis batch")
  .action(async (options) => {
  const rootPath = path.resolve(options.path);
- const pullFn = await createPullFn(rootPath);
+ // Watch is long-running (used by extension host) — never crash on stray rejections.
+ // Common cause: Supabase server unreachable during pull/sync operations.
+ process.on("unhandledRejection", (reason) => {
+ const msg = reason?.message ?? String(reason);
+ if (options.json) {
+ process.stdout.write(JSON.stringify({ event: "error", message: `Unhandled rejection: ${msg}` }) + "\n");
+ }
+ else {
+ console.error(` ❌ Unexpected error: ${msg}`);
+ }
+ });
+ const pullProgress = options.json
+ ? (step) => process.stdout.write(JSON.stringify({ event: "pull_progress", message: step }) + "\n")
+ : undefined;
+ const pullFn = await createPullFn(rootPath, pullProgress);
  let syncFn;
  if (options.sync) {
  const creds = (0, auth_1.readCredentials)();
@@ -213,18 +293,18 @@ program
  }
  catch {
  if (options.json) {
- process.stdout.write(JSON.stringify({ event: "sync_warning", message: "Session expired — sync disabled. Run ophan login." }) + "\n");
+ process.stdout.write(JSON.stringify({ event: "sync_warning", message: "Session expired — sync disabled. Run npx @ophan/cli login." }) + "\n");
  }
  else {
- console.warn(" ⚠️ Session expired — sync disabled. Run \`ophan login\` to re-authenticate.");
+ console.warn(" ⚠️ Session expired — sync disabled. Run \`npx @ophan/cli login\` to re-authenticate.");
  }
  }
  }
  else if (options.json) {
- process.stdout.write(JSON.stringify({ event: "sync_warning", message: "Not logged in — sync disabled. Run ophan login." }) + "\n");
+ process.stdout.write(JSON.stringify({ event: "sync_warning", message: "Not logged in — sync disabled. Run npx @ophan/cli login." }) + "\n");
  }
  else {
- console.warn(" ⚠️ Not logged in — sync disabled. Run \`ophan login\` to enable cloud sync.");
+ console.warn(" ⚠️ Not logged in — sync disabled. Run \`npx @ophan/cli login\` to enable cloud sync.");
  }
  }
  await (0, watch_1.startWatch)({
@@ -234,12 +314,291 @@ program
  json: options.json,
  });
  });
- // Keep "init" as alias for "analyze" for backwards compat
+ // ========== INIT ==========
  program
  .command("init")
- .description("Alias for analyze")
+ .description("Initialize Ophan for this repository (fast, no network calls)")
  .option("-p, --path <path>", "Path to repository", process.cwd())
  .action(async (options) => {
- await runAnalyze(path.resolve(options.path));
+ const rootPath = path.resolve(options.path);
+ const dbPath = path.join(rootPath, ".ophan", "index.db");
+ console.log("Initializing Ophan...\n");
+ // Create .ophan dir + DB tables + gitignore
+ fs.mkdirSync(path.join(rootPath, ".ophan"), { recursive: true });
+ (0, core_1.ensureGitignore)(rootPath);
+ (0, core_1.initDb)(dbPath).close();
+ // Scan files and index all functions
+ let fileCount = 0;
+ await (0, core_1.refreshFileIndex)(rootPath, (current, total, file) => {
+ fileCount = total;
+ if (process.stdout.isTTY) {
+ process.stdout.clearLine(0);
+ process.stdout.cursorTo(0);
+ }
+ process.stdout.write(` [${current}/${total}] ${file}\n`);
+ });
+ // Count indexed functions
+ const db = new better_sqlite3_1.default(dbPath, { readonly: true });
+ const row = db.prepare("SELECT COUNT(*) as count FROM file_functions").get();
+ db.close();
+ console.log(`\n✅ Initialized! Found ${row.count} functions across ${fileCount} files.`);
+ console.log(` Database: .ophan/index.db\n`);
+ console.log(` Next: run \`npx @ophan/cli watch\` to start analysis.`);
+ });
+ // ========== GRAPH ==========
+ program
+ .command("graph")
+ .description("Build function relationship graph and detect communities")
+ .option("-p, --path <path>", "Path to repository", process.cwd())
+ .option("-a, --algorithm <algorithm>", "Community detection algorithm (louvain, leiden, label-propagation)", core_1.DEFAULT_GRAPH_CONFIG.algorithm)
+ .option("-r, --resolution <number>", "Louvain resolution parameter (higher = more communities)", String(core_1.DEFAULT_GRAPH_CONFIG.resolution))
+ .option("--min-size <number>", "Minimum community size (smaller clusters are dissolved)", String(core_1.DEFAULT_GRAPH_CONFIG.minCommunitySize))
+ .option("--max-size <number>", "Maximum community size (informational)", String(core_1.DEFAULT_GRAPH_CONFIG.maxCommunitySize))
+ .option("--json", "Output results as JSON")
+ .option("--summarize", "Generate documentation for detected communities using Claude")
+ .option("--raw-source", "Use raw function source code for documentation (requires --summarize)")
+ .option("--directory-decay <number>", "Directory distance decay factor (0=off, higher=stronger)", String(core_1.DEFAULT_GRAPH_CONFIG.directoryDecay))
+ .option("--compare", "Compare multiple algorithm configurations side by side")
+ .action(async (options) => {
+ const rootPath = path.resolve(options.path);
+ const dbPath = path.join(rootPath, ".ophan", "index.db");
+ if (!fs.existsSync(dbPath)) {
+ console.error("No .ophan/index.db found. Run `npx @ophan/cli init` first.");
+ process.exit(1);
+ }
+ if (options.rawSource && !options.summarize) {
+ console.error("--raw-source requires --summarize");
+ process.exit(1);
+ }
+ const config = {
+ algorithm: options.algorithm,
+ edgeWeights: core_1.DEFAULT_GRAPH_CONFIG.edgeWeights,
+ resolution: parseFloat(options.resolution),
+ minCommunitySize: parseInt(options.minSize, 10),
+ maxCommunitySize: parseInt(options.maxSize, 10),
+ directoryDecay: parseFloat(options.directoryDecay),
+ };
+ if (!options.json) {
+ console.log(`\nBuilding function graph (${config.algorithm}, resolution=${config.resolution})...\n`);
+ }
+ // Extract functions with relationship data
+ const functions = await (0, core_1.extractAllFunctions)(rootPath, !options.json ? (current, total, file) => {
+ if (process.stdout.isTTY) {
+ process.stdout.clearLine(0);
+ process.stdout.cursorTo(0);
+ }
+ process.stdout.write(` [${current}/${total}] ${file}`);
+ } : undefined);
+ if (!options.json) {
+ console.log(`\n\n ${functions.length} functions extracted, resolving edges...\n`);
+ }
+ // Populate file_functions index from extracted functions (needed for edge resolution)
+ const db = (0, core_1.initDb)(dbPath);
+ // Snapshot old hashes before populateFileIndex overwrites file_functions
+ // This enables incremental edge resolution (only recompute changed + neighbor edges)
+ const oldHashes = new Set(db.prepare("SELECT DISTINCT content_hash FROM file_functions").all()
+ .map((r) => r.content_hash));
+ (0, core_1.populateFileIndex)(db, functions);
+ if (!options.json) {
+ console.log(` Indexed ${functions.length} functions, detecting communities...\n`);
+ }
+ // --compare mode: resolve edges once, run multiple algorithm configs, print table
+ if (options.compare) {
+ const resolver = rootPath ? (0, core_1.buildModuleResolver)(rootPath, functions) : undefined;
+ const edges = (0, core_1.resolveEdges)(db, functions, config, undefined, resolver);
+ (0, core_1.storeEdges)(db, edges);
+ const edgesWithTransitive = (0, core_1.addTransitiveEdges)(edges, config);
+ // Build hash→filePath map for directory distance decay
+ let hashToFilePath;
+ if (rootPath && (config.directoryDecay ?? 0) > 0) {
+ hashToFilePath = new Map();
+ for (const fn of functions)
+ hashToFilePath.set(fn.contentHash, fn.filePath);
+ }
+ const allHashes = new Set(functions.map((f) => f.contentHash));
+ const metrics = (0, core_1.runComparison)(edgesWithTransitive, config, core_1.DEFAULT_COMPARISONS, hashToFilePath, rootPath, allHashes);
+ if (options.json) {
+ console.log(JSON.stringify({ nodes: allHashes.size, edges: edgesWithTransitive.length, comparisons: metrics }));
+ }
+ else {
+ console.log(` Algorithm Comparison (${allHashes.size} nodes, ${edgesWithTransitive.length} edges)\n`);
+ // Table header
+ const header = [
+ "Configuration".padEnd(25),
+ "Communities".padStart(12),
+ "Dissolved".padStart(14),
+ "Coverage".padStart(9),
+ "Min".padStart(5),
+ "Med".padStart(5),
+ "Max".padStart(5),
+ "Modularity".padStart(11),
+ ].join("");
+ console.log(` ${header}`);
+ console.log(` ${"─".repeat(header.length)}`);
+ for (const m of metrics) {
+ const dissolved = `${m.dissolvedCount} (${m.dissolvedPct.toFixed(0)}%)`;
+ const modularity = m.modularity !== null ? m.modularity.toFixed(4) : "---";
+ const row = [
+ m.label.padEnd(25),
+ String(m.communityCount).padStart(12),
+ dissolved.padStart(14),
+ `${m.coverage.toFixed(0)}%`.padStart(9),
+ String(m.minSize).padStart(5),
+ String(m.medianSize).padStart(5),
+ String(m.maxSize).padStart(5),
+ modularity.padStart(11),
+ ].join("");
+ console.log(` ${row}`);
+ }
+ console.log();
+ }
+ db.close();
+ return;
+ }
+ // Run full graph pipeline with hierarchical detection
+ const hierResult = (0, core_1.detectHierarchicalCommunities)(db, functions, config, oldHashes, rootPath);
+ const result = hierResult.l0;
+ // Build source map for raw-source summarization mode
+ let sourceMap;
+ if (options.rawSource) {
+ sourceMap = new Map();
+ for (const fn of functions)
+ sourceMap.set(fn.contentHash, fn.sourceCode);
+ }
+ if (options.json) {
+ const output = {
+ algorithm: config.algorithm,
+ resolution: config.resolution,
+ nodes: result.nodesInGraph,
+ edges: result.edgesInGraph,
+ communities: result.communityCount,
+ dissolved: result.dissolvedCount,
+ modularity: result.modularity,
+ assignments: result.assignments,
+ communityEdges: hierResult.communityEdges,
+ l1Groups: hierResult.l1GroupCount,
+ l1Assignments: hierResult.l1Assignments,
+ };
+ // Summarize if requested
+ if (options.summarize) {
+ const sumResult = await (0, core_1.summarizeCommunities)(db, {
+ config: { algorithm: config.algorithm },
+ sourceMap,
+ rootPath,
+ });
+ const l1 = (0, core_1.loadAllSummaries)(db, config.algorithm, 1);
+ const l2 = (0, core_1.loadAllSummaries)(db, config.algorithm, 2);
+ const l3 = (0, core_1.loadAllSummaries)(db, config.algorithm, 3);
+ const cc = (0, core_1.loadAllSummaries)(db, config.algorithm, 10);
+ output.summaries = {
+ l1: l1.map((s) => ({ communityId: s.communityId, ...s.summary })),
+ l2: l2.map((s) => ({ communityId: s.communityId, ...s.summary })),
+ l3: l3.map((s) => ({ communityId: s.communityId, ...s.summary })),
+ crossCutting: cc.map((s) => ({ communityId: s.communityId, ...s.summary })),
+ stats: sumResult,
+ };
+ }
+ console.log(JSON.stringify(output));
+ }
+ else {
+ console.log(` Nodes: ${result.nodesInGraph}`);
+ console.log(` Edges: ${result.edgesInGraph}`);
+ console.log(` Communities: ${result.communityCount}`);
+ if (result.dissolvedCount > 0) {
+ console.log(` Dissolved: ${result.dissolvedCount} (below min size ${config.minCommunitySize})`);
+ }
+ if (result.modularity !== null) {
+ console.log(` Modularity: ${result.modularity.toFixed(4)}`);
+ }
+ if (result.effectiveResolution && result.effectiveResolution !== config.resolution) {
+ console.log(` Resolution: ${result.effectiveResolution} (auto-scaled from ${config.resolution})`);
+ }
+ if (hierResult.l1GroupCount > 0) {
+ console.log(` Groups: ${hierResult.l1GroupCount} (hierarchical)`);
+ }
+ if (hierResult.communityEdges.length > 0) {
+ console.log(` Cross-edges: ${hierResult.communityEdges.length} (between communities)`);
+ }
+ // Show community breakdown
+ if (result.communityCount > 0) {
+ const communityMap = new Map();
+ for (const a of result.assignments) {
+ if (a.communityId === "__dissolved")
+ continue;
+ const members = communityMap.get(a.communityId) || [];
+ members.push(a.contentHash);
+ communityMap.set(a.communityId, members);
+ }
+ console.log(`\n Community breakdown:`);
+ // Look up function names from DB for display
+ const readDb = new better_sqlite3_1.default(dbPath, { readonly: true });
+ const lookupName = readDb.prepare("SELECT function_name, file_path FROM file_functions WHERE content_hash = ? LIMIT 1");
+ const sorted = [...communityMap.entries()].sort((a, b) => b[1].length - a[1].length);
+ for (const [communityId, hashes] of sorted) {
+ const overMax = hashes.length > config.maxCommunitySize;
+ const flag = overMax ? " (large)" : "";
+ console.log(`\n Community ${communityId}${flag} (${hashes.length} functions):`);
+ for (const hash of hashes.slice(0, 15)) {
+ const row = lookupName.get(hash);
+ if (row) {
+ const relPath = path.relative(rootPath, row.file_path);
+ console.log(` ${row.function_name} (${relPath})`);
+ }
+ else {
+ console.log(` ${hash.slice(0, 12)}...`);
+ }
+ }
+ if (hashes.length > 15) {
+ console.log(` ... and ${hashes.length - 15} more`);
+ }
+ }
+ readDb.close();
+ }
+ // Summarize if requested
+ if (options.summarize) {
+ // Check if any function analysis exists (for a useful warning in non-raw mode)
+ const analysisCount = db.prepare("SELECT COUNT(*) as count FROM function_analysis").get().count;
+ if (!options.rawSource && analysisCount === 0) {
+ console.log(" ⚠️ No function analysis found. Run `ophan analyze` first for best results.\n");
+ }
+ else if (result.communityCount > 0) {
+ console.log(options.rawSource
+ ? "\n📝 Generating documentation from source code...\n"
+ : "\n📝 Generating documentation...\n");
+ const sumResult = await (0, core_1.summarizeCommunities)(db, {
+ config: { algorithm: config.algorithm },
+ onProgress: (step) => console.log(` ${step}`),
+ sourceMap,
+ rootPath,
+ });
+ const generated = sumResult.l1Summarized + sumResult.l2Summarized + sumResult.l3Summarized;
+ const cached = sumResult.l1Cached + sumResult.l2Cached + sumResult.l3Cached;
+ const driftSkipped = sumResult.l1DriftSkipped;
+ const failed = sumResult.l1Failed ?? 0;
+ const parts = [`${generated} generated`, `${cached} cached`];
+ if (driftSkipped > 0)
+ parts.push(`${driftSkipped} drift-skipped`);
+ if (failed > 0)
+ parts.push(`${failed} failed`);
+ console.log(`\n✅ Documentation: ${parts.join(", ")}`);
+ if (sumResult.ccDetected > 0) {
+ console.log(` Cross-cutting: ${sumResult.ccDetected} detected, ${sumResult.ccSummarized} documented, ${sumResult.ccCached} cached`);
+ const ccSummaries = (0, core_1.loadAllSummaries)(db, config.algorithm, 10);
+ if (ccSummaries.length > 0) {
+ console.log(`\n Cross-Cutting Concerns:`);
+ for (const s of ccSummaries) {
+ const sum = s.summary;
+ const badge = sum.concernType === "security" ? "SEC" : "DATA";
+ const communities = (sum.affectedCommunities || [])
+ .map((c) => c.communityTitle).join(", ");
+ console.log(` [${badge}] ${sum.title} (${communities})`);
+ }
+ }
+ }
+ }
+ }
+ console.log();
+ }
+ db.close();
  });
  program.parse();
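One behavioral change worth noting in the `dist/index.js` hunks: `gc` now refuses to run off the default branch unless `--force` is passed, because a feature-branch GC could delete analysis for hashes that still exist on the default branch. The guard reduces to a small predicate; `shouldBlockGc` is a hypothetical distillation for illustration, not a function in the package (the real check also prints a warning and calls `process.exit(1)`):

```javascript
// Branch-safety guard for gc, distilled from the logic added in 0.0.3.
// currentBranch is null when the path is not a git repository.
function shouldBlockGc(currentBranch, defaultBranch, force) {
  if (currentBranch === null) return false;          // not a git repo: allow
  if (currentBranch === defaultBranch) return false; // on default branch: allow
  return !force;                                     // feature branch: block unless --force
}
```

Keeping the decision separate from the I/O (warnings, `process.exit`) is what makes it easy to state, and to test, as a truth table.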