@latent-space-labs/open-auto-doc 0.4.0 → 0.5.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +141 -0
  2. package/dist/index.js +241 -139
  3. package/package.json +1 -1
package/README.md ADDED
@@ -0,0 +1,141 @@
+ # open-auto-doc
+
+ **One command. Beautiful docs. Auto-deployed.**
+
+ Turn any GitHub repo into a fully hosted documentation site — powered by AI that actually reads your code.
+
+ ```bash
+ npx @latent-space-labs/open-auto-doc
+ ```
+
+ No config files. No manual writing. Just point it at your repos and get a production-ready docs site with architecture overviews, API references, component docs, and more.
+
+ ## Install
+
+ ```bash
+ # Run directly with npx (no install needed)
+ npx @latent-space-labs/open-auto-doc
+
+ # Or install globally
+ npm install -g @latent-space-labs/open-auto-doc
+ ```
+
+ ### Requirements
+
+ - **Node.js 18+**
+ - **A GitHub account** — works with public and private repos
+ - **An [Anthropic API key](https://console.anthropic.com/)** — powers the AI analysis
+
+ ## Quick Start
+
+ ```bash
+ # 1. Generate docs (interactive setup)
+ npx @latent-space-labs/open-auto-doc
+
+ # 2. Preview locally
+ cd docs-site && npm run dev
+
+ # 3. Deploy
+ open-auto-doc deploy
+
+ # 4. Auto-update on every push
+ open-auto-doc setup-ci
+ ```
+
+ The CLI walks you through GitHub auth, repo selection, and API key setup interactively.
+
+ ## What Gets Generated
+
+ | Section | Contents |
+ |---|---|
+ | **Architecture Overview** | Tech stack, module breakdown, data flow diagrams, entry points, key patterns |
+ | **Getting Started** | Prerequisites, install steps, quick start guide, config options |
+ | **API Reference** | Endpoints with methods, params, request/response bodies, auth |
+ | **Components** | UI components with props, usage examples, categories |
+ | **Data Models** | Schemas with field types, constraints, relationships, ER diagrams |
+ | **Business Logic** | Domain concepts, business rules, workflows |
+ | **Configuration** | Config files, env vars, settings reference |
+ | **Error Handling** | Error codes, common errors, debugging tips |
+
+ Multi-repo setups also get **cross-repo analysis** — shared dependencies, API contracts, and relationship diagrams.
+
+ ## All Commands
+
+ | Command | What it does |
+ |---|---|
+ | `open-auto-doc` | Full interactive setup: auth → pick repos → analyze → generate |
+ | `open-auto-doc init -o <dir>` | Same, with custom output directory (default: `docs-site`) |
+ | `open-auto-doc generate` | Re-analyze and regenerate using saved config |
+ | `open-auto-doc generate --incremental` | Only re-analyze changed files |
+ | `open-auto-doc deploy` | Create a GitHub repo for docs and push |
+ | `open-auto-doc setup-ci` | Add GitHub Actions workflow for auto-updates |
+ | `open-auto-doc setup-mcp` | Set up MCP server for Claude Code |
+ | `open-auto-doc login` | Authenticate with GitHub |
+ | `open-auto-doc logout` | Clear stored credentials |
+
+ ## MCP Server for AI Assistants
+
+ open-auto-doc includes an [MCP (Model Context Protocol)](https://modelcontextprotocol.io/) server that lets AI assistants like Claude Code query your documentation directly. Instead of reading raw source files, the AI gets structured knowledge about your architecture, APIs, components, and data models.
+
+ ### How it works
+
+ When you run `open-auto-doc init` or `generate`, the AI analysis results are cached as JSON in `<outputDir>/.autodoc-cache/`. The MCP server (`@latent-space-labs/open-auto-doc-mcp`) is a separate lightweight package that reads this cache and serves it over the Model Context Protocol via stdio.
+
+ ```
+ open-auto-doc init → .autodoc-cache/*.json → MCP Server (stdio) → Claude Code
+ ```
+
+ The server has no dependency on the analyzer or generator — it only reads JSON files, so `npx` startup is fast.
+
+ ### Setup
+
+ ```bash
+ # Set up after generating docs
+ open-auto-doc setup-mcp
+ ```
+
+ This creates `.mcp.json` in your project root. Claude Code will automatically discover the tools:
+
+ - `get_project_overview` — purpose, tech stack, summary stats
+ - `search_documentation` — full-text search across all sections
+ - `get_api_endpoints` — API endpoint details with params, auth, request/response
+ - `get_components` — UI component documentation with props and usage
+ - `get_data_models` — data model schemas with fields and relationships
+ - `get_architecture` — modules, data flow, patterns, entry points
+ - `get_diagram` — Mermaid diagrams (architecture, ER, flow)
+ - `get_business_rules` — domain concepts, rules, and workflows
+
+ Resources are also available at `docs://overview`, `docs://architecture`, `docs://getting-started`, and `docs://diagrams/{id}`.
+
+ You can also configure `.mcp.json` manually:
+
+ ```json
+ {
+ "mcpServers": {
+ "project-docs": {
+ "command": "npx",
+ "args": ["-y", "@latent-space-labs/open-auto-doc-mcp", "--project-dir", "."]
+ }
+ }
+ }
+ ```
+
+ ## Language Support
+
+ open-auto-doc is **language-agnostic**. It uses AI to understand code — not language-specific parsers. Works with TypeScript, JavaScript, Python, Go, Rust, Java, Kotlin, Ruby, PHP, C#, Swift, and more.
+
+ ## Privacy & Security
+
+ - Your Anthropic API key is **only sent to the Anthropic API**
+ - All code analysis runs **locally on your machine** (or in your own CI runner)
+ - Credentials stored at `~/.open-auto-doc/credentials.json` with `0600` permissions
+ - Run `open-auto-doc logout` to clear everything
+
+ ## Links
+
+ - [GitHub](https://github.com/Latent-Space-Labs/open-auto-doc) — source code, issues, contributing
+ - [Anthropic Console](https://console.anthropic.com/) — get an API key
+
+ ## License
+
+ MIT
package/dist/index.js CHANGED
@@ -4,10 +4,10 @@
  import { Command } from "commander";

  // src/commands/init.ts
- import * as p6 from "@clack/prompts";
- import fs9 from "fs";
+ import * as p7 from "@clack/prompts";
+ import fs10 from "fs";
  import net from "net";
- import path9 from "path";
+ import path10 from "path";
  import { fileURLToPath as fileURLToPath2 } from "url";
  import { spawn } from "child_process";

@@ -273,7 +273,8 @@ async function createAndPushDocsRepo(params) {
  owner = selected;
  }
  const isOrg = owner !== username;
- const defaultName = config?.repos?.[0] ? `${config.repos[0].name}-docs` : "my-project-docs";
+ const slug = config?.projectName ? config.projectName.toLowerCase().replace(/[^a-z0-9]+/g, "-").replace(/(^-|-$)/g, "") : config?.repos?.[0]?.name;
+ const defaultName = slug ? `${slug}-docs` : "my-project-docs";
  const repoName = await p3.text({
  message: "Name for the docs GitHub repo:",
  initialValue: defaultName,
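Editor's note: the new default-name logic above derives a repo slug from a free-form project name. As a sketch (the `slugify` wrapper is illustrative; the regex chain is taken verbatim from the diff):

```javascript
// Sketch of the slug rule this release introduces for the default docs
// repo name: lowercase, collapse non-alphanumeric runs to "-", then
// strip a leading or trailing dash.
function slugify(projectName) {
  return projectName
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/(^-|-$)/g, "");
}

console.log(slugify("My Project!") + "-docs"); // → my-project-docs
```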
@@ -527,7 +528,7 @@ function detectEntryFiles(flatFiles) {
  /^app\/page\.tsx$/,
  /^pages\/index\.\w+$/
  ];
- return flatFiles.filter((f) => entryPatterns.some((p12) => p12.test(f)));
+ return flatFiles.filter((f) => entryPatterns.some((p13) => p13.test(f)));
  }
  var DEP_FILES = [
  {
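Editor's note: the change in this hunk is only a renamed loop variable (`p12` → `p13`); the matching logic is unchanged. A minimal re-creation, using just the two entry patterns visible in the hunk (the package's full pattern list is longer):

```javascript
// Sketch of detectEntryFiles: keep files matching any entry pattern.
// Only the two patterns shown in this diff hunk are included here.
const entryPatterns = [/^app\/page\.tsx$/, /^pages\/index\.\w+$/];

function detectEntryFiles(flatFiles) {
  return flatFiles.filter((f) => entryPatterns.some((p) => p.test(f)));
}

console.log(detectEntryFiles(["app/page.tsx", "pages/index.js", "src/util.ts"]));
// → [ 'app/page.tsx', 'pages/index.js' ]
```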
@@ -1730,7 +1731,7 @@ function classifyChanges(entries, staticAnalysis) {
  const affected = /* @__PURE__ */ new Set();
  for (const entry of entries) {
  for (const [section, patterns] of Object.entries(SECTION_PATTERNS)) {
- if (patterns.some((p12) => p12.test(entry.filePath))) {
+ if (patterns.some((p13) => p13.test(entry.filePath))) {
  affected.add(section);
  }
  }
@@ -2808,9 +2809,85 @@ function showSecretsInstructions(multiRepo = false) {
  );
  }

- // ../generator/dist/index.js
- import fs8 from "fs-extra";
+ // src/commands/setup-mcp.ts
+ import * as p6 from "@clack/prompts";
+ import fs8 from "fs";
  import path8 from "path";
+ var MCP_SERVER_KEY = "project-docs";
+ function getMcpConfig() {
+ return {
+ command: "npx",
+ args: ["-y", "@latent-space-labs/open-auto-doc-mcp", "--project-dir", "."]
+ };
+ }
+ async function setupMcpConfig(opts) {
+ const cacheDir = path8.join(opts.outputDir, ".autodoc-cache");
+ if (!fs8.existsSync(cacheDir)) {
+ p6.log.warn("No analysis cache found \u2014 skipping MCP setup. Run setup-mcp after generating docs.");
+ return;
+ }
+ writeMcpJson(process.cwd());
+ }
+ async function setupMcpCommand() {
+ p6.intro("open-auto-doc \u2014 MCP Server Setup");
+ const config = loadConfig();
+ if (!config) {
+ p6.log.error(
+ "No .autodocrc.json found. Run `open-auto-doc init` first to generate documentation."
+ );
+ process.exit(1);
+ }
+ const cacheDir = path8.join(config.outputDir, ".autodoc-cache");
+ if (!fs8.existsSync(cacheDir)) {
+ p6.log.error(
+ `No analysis cache found at ${cacheDir}.
+ Run \`open-auto-doc init\` or \`open-auto-doc generate\` first.`
+ );
+ process.exit(1);
+ }
+ const cacheFiles = fs8.readdirSync(cacheDir).filter((f) => f.endsWith("-analysis.json"));
+ if (cacheFiles.length === 0) {
+ p6.log.error("Cache directory exists but contains no analysis files.");
+ process.exit(1);
+ }
+ writeMcpJson(process.cwd());
+ p6.log.success("MCP server configured!");
+ p6.note(
+ [
+ "The following tools are now available in Claude Code:",
+ "",
+ " get_project_overview \u2014 Project summary and tech stack",
+ " search_documentation \u2014 Full-text search across all docs",
+ " get_api_endpoints \u2014 API endpoint details",
+ " get_components \u2014 UI component documentation",
+ " get_data_models \u2014 Data model schemas",
+ " get_architecture \u2014 Architecture and patterns",
+ " get_diagram \u2014 Mermaid diagrams",
+ " get_business_rules \u2014 Domain concepts and workflows"
+ ].join("\n"),
+ "Available MCP tools"
+ );
+ p6.outro("Open Claude Code in this project to start using the tools.");
+ }
+ function writeMcpJson(projectRoot) {
+ const mcpPath = path8.join(projectRoot, ".mcp.json");
+ let existing = {};
+ if (fs8.existsSync(mcpPath)) {
+ try {
+ existing = JSON.parse(fs8.readFileSync(mcpPath, "utf-8"));
+ } catch {
+ }
+ }
+ const mcpServers = existing.mcpServers ?? {};
+ mcpServers[MCP_SERVER_KEY] = getMcpConfig();
+ const merged = { ...existing, mcpServers };
+ fs8.writeFileSync(mcpPath, JSON.stringify(merged, null, 2) + "\n");
+ p6.log.step(`Wrote ${path8.relative(projectRoot, mcpPath)}`);
+ }
+
+ // ../generator/dist/index.js
+ import fs9 from "fs-extra";
+ import path9 from "path";
  import { execSync as execSync6 } from "child_process";
  import fs23 from "fs-extra";
  import path23 from "path";
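Editor's note: the key behavior of the new `writeMcpJson` above is that it merges into an existing `.mcp.json` rather than clobbering it. A pure-function re-creation of that merge step (file I/O omitted; the server entry is copied from the diff):

```javascript
// Sketch of the merge in writeMcpJson: other configured servers survive,
// and only the "project-docs" entry is added or overwritten.
const MCP_SERVER_KEY = "project-docs";

function mergeMcpConfig(existing) {
  const mcpServers = existing.mcpServers ?? {};
  mcpServers[MCP_SERVER_KEY] = {
    command: "npx",
    args: ["-y", "@latent-space-labs/open-auto-doc-mcp", "--project-dir", "."]
  };
  return { ...existing, mcpServers };
}

console.log(JSON.stringify(mergeMcpConfig({ mcpServers: { other: { command: "foo" } } }), null, 2));
```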
@@ -2819,26 +2896,26 @@ import { fileURLToPath } from "url";
  import fs33 from "fs-extra";
  import path33 from "path";
  async function scaffoldSite(outputDir, projectName, templateDir) {
- await fs8.copy(templateDir, outputDir, {
+ await fs9.copy(templateDir, outputDir, {
  overwrite: true,
  filter: (src) => {
- const basename = path8.basename(src);
+ const basename = path9.basename(src);
  return basename !== "node_modules" && basename !== ".next" && basename !== ".source" && basename !== "dist" && basename !== ".turbo";
  }
  });
  const filesToProcess = await findTextFiles(outputDir);
  for (const filePath of filesToProcess) {
  try {
- let content = await fs8.readFile(filePath, "utf-8");
+ let content = await fs9.readFile(filePath, "utf-8");
  if (content.includes("{{projectName}}")) {
  content = content.replace(/\{\{projectName\}\}/g, projectName);
- await fs8.writeFile(filePath, content, "utf-8");
+ await fs9.writeFile(filePath, content, "utf-8");
  }
  } catch {
  }
  }
- const nodeModulesPath = path8.join(outputDir, "node_modules");
- if (!fs8.existsSync(nodeModulesPath)) {
+ const nodeModulesPath = path9.join(outputDir, "node_modules");
+ if (!fs9.existsSync(nodeModulesPath)) {
  try {
  execSync6("npm install --ignore-scripts", {
  cwd: outputDir,
@@ -2846,7 +2923,7 @@ async function scaffoldSite(outputDir, projectName, templateDir) {
  timeout: 12e4
  });
  } catch (err) {
- const hasNodeModules = fs8.existsSync(nodeModulesPath);
+ const hasNodeModules = fs9.existsSync(nodeModulesPath);
  if (!hasNodeModules) {
  throw new Error(`npm install failed: ${err instanceof Error ? err.message : err}`);
  }
@@ -2875,14 +2952,14 @@ async function findTextFiles(dir) {
  ]);
  const results = [];
  async function walk(currentDir) {
- const entries = await fs8.readdir(currentDir, { withFileTypes: true });
+ const entries = await fs9.readdir(currentDir, { withFileTypes: true });
  for (const entry of entries) {
- const fullPath = path8.join(currentDir, entry.name);
+ const fullPath = path9.join(currentDir, entry.name);
  if (entry.isDirectory()) {
  if (entry.name !== "node_modules" && entry.name !== ".next" && entry.name !== ".source") {
  await walk(fullPath);
  }
- } else if (textExtensions.has(path8.extname(entry.name))) {
+ } else if (textExtensions.has(path9.extname(entry.name))) {
  results.push(fullPath);
  }
  }
@@ -3614,12 +3691,12 @@ function buildRepoSummary(result) {
  }

  // src/commands/init.ts
- var __dirname2 = path9.dirname(fileURLToPath2(import.meta.url));
+ var __dirname2 = path10.dirname(fileURLToPath2(import.meta.url));
  async function initCommand(options) {
- p6.intro("open-auto-doc \u2014 AI-powered documentation generator");
+ p7.intro("open-auto-doc \u2014 AI-powered documentation generator");
  const templateDir = resolveTemplateDir();
- if (!fs9.existsSync(path9.join(templateDir, "package.json"))) {
- p6.log.error(
+ if (!fs10.existsSync(path10.join(templateDir, "package.json"))) {
+ p7.log.error(
  `Site template not found at: ${templateDir}
  This usually means the npm package was not built correctly.
  Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
@@ -3628,38 +3705,53 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  }
  let token = getGithubToken();
  if (!token) {
- p6.log.info("Let's connect your GitHub account.");
+ p7.log.info("Let's connect your GitHub account.");
  token = await authenticateWithGithub();
  setGithubToken(token);
  } else {
- p6.log.success("Using saved GitHub credentials.");
+ p7.log.success("Using saved GitHub credentials.");
  }
  const repos = await pickRepos(token);
- p6.log.info(`Selected ${repos.length} ${repos.length === 1 ? "repository" : "repositories"}`);
+ p7.log.info(`Selected ${repos.length} ${repos.length === 1 ? "repository" : "repositories"}`);
+ let projectName;
+ if (repos.length > 1) {
+ const nameInput = await p7.text({
+ message: "What would you like to name this project?",
+ placeholder: "My Project",
+ validate: (v) => {
+ if (!v || v.trim().length === 0) return "Project name is required";
+ }
+ });
+ if (p7.isCancel(nameInput)) {
+ p7.cancel("Operation cancelled");
+ process.exit(0);
+ }
+ projectName = nameInput;
+ }
  let apiKey = getAnthropicKey();
  if (!apiKey) {
- const keyInput = await p6.text({
+ const keyInput = await p7.text({
  message: "Enter your Anthropic API key",
  placeholder: "sk-ant-...",
  validate: (v) => {
  if (!v || !v.startsWith("sk-ant-")) return "Please enter a valid Anthropic API key";
  }
  });
- if (p6.isCancel(keyInput)) {
- p6.cancel("Operation cancelled");
+ if (p7.isCancel(keyInput)) {
+ p7.cancel("Operation cancelled");
  process.exit(0);
  }
  apiKey = keyInput;
- const saveKey = await p6.confirm({
+ const saveKey = await p7.confirm({
  message: "Save API key for future use?"
  });
- if (saveKey && !p6.isCancel(saveKey)) {
+ if (saveKey && !p7.isCancel(saveKey)) {
  setAnthropicKey(apiKey);
  }
  } else {
- p6.log.success("Using saved Anthropic API key.");
+ p7.log.success("Using saved Anthropic API key.");
  }
- const model = await p6.select({
+ const model = await p7.select({
  message: "Which model should analyze your repos?",
  options: [
  { value: "claude-sonnet-4-6", label: "Claude Sonnet 4.6", hint: "Fast & capable (recommended)" },
@@ -3667,12 +3759,12 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  { value: "claude-opus-4-6", label: "Claude Opus 4.6", hint: "Most capable, slowest" }
  ]
  });
- if (p6.isCancel(model)) {
- p6.cancel("Operation cancelled");
+ if (p7.isCancel(model)) {
+ p7.cancel("Operation cancelled");
  process.exit(0);
  }
- p6.log.info(`Using ${model}`);
- const cloneSpinner = p6.spinner();
+ p7.log.info(`Using ${model}`);
+ const cloneSpinner = p7.spinner();
  cloneSpinner.start(`Cloning ${repos.length} repositories...`);
  const clones = [];
  for (const repo of repos) {
@@ -3681,12 +3773,12 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  const cloned = cloneRepo(repo, token);
  clones.push(cloned);
  } catch (err) {
- p6.log.warn(`Failed to clone ${repo.name}: ${err instanceof Error ? err.message : err}`);
+ p7.log.warn(`Failed to clone ${repo.name}: ${err instanceof Error ? err.message : err}`);
  }
  }
  cloneSpinner.stop(`Cloned ${clones.length}/${repos.length} repositories`);
  if (clones.length === 0) {
- p6.log.error("No repositories were cloned.");
+ p7.log.error("No repositories were cloned.");
  process.exit(1);
  }
  const total = clones.length;
@@ -3714,7 +3806,7 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  } catch (err) {
  const errMsg = err instanceof Error ? err.message : String(err);
  progressTable.update(repoName, { status: "failed", error: errMsg });
- p6.log.warn(`[${repoName}] Analysis failed: ${errMsg}`);
+ p7.log.warn(`[${repoName}] Analysis failed: ${errMsg}`);
  return { repo: repoName, result: null };
  }
  });
@@ -3722,17 +3814,17 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  progressTable.stop();
  const results = settled.filter((s) => s.result !== null).map((s) => s.result);
  const { done, failed } = progressTable.getSummary();
- p6.log.step(
+ p7.log.step(
  `Analyzed ${done}/${total} repositories` + (failed > 0 ? ` (${failed} failed)` : "") + (results.length > 0 ? ` \u2014 ${results.reduce((n, r) => n + r.apiEndpoints.length, 0)} endpoints, ${results.reduce((n, r) => n + r.components.length, 0)} components, ${results.reduce((n, r) => n + r.diagrams.length, 0)} diagrams` : "")
  );
  if (results.length === 0) {
- p6.log.error("No repositories were successfully analyzed.");
+ p7.log.error("No repositories were successfully analyzed.");
  cleanup(clones);
  process.exit(1);
  }
  let crossRepo;
  if (results.length > 1) {
- const crossSpinner = p6.spinner();
+ const crossSpinner = p7.spinner();
  crossSpinner.start("Analyzing cross-repository relationships...");
  try {
  crossRepo = await analyzeCrossRepos(results, apiKey, model, (text4) => {
@@ -3741,31 +3833,33 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  crossSpinner.stop(`Cross-repo analysis complete \u2014 ${crossRepo.repoRelationships.length} relationships found`);
  } catch (err) {
  crossSpinner.stop("Cross-repo analysis failed (non-fatal)");
- p6.log.warn(`Cross-repo error: ${err instanceof Error ? err.message : err}`);
+ p7.log.warn(`Cross-repo error: ${err instanceof Error ? err.message : err}`);
  }
  }
- const outputDir = path9.resolve(options.output || "docs-site");
- const projectName = results.length === 1 ? results[0].repoName : "My Project";
- const genSpinner = p6.spinner();
+ const outputDir = path10.resolve(options.output || "docs-site");
+ if (!projectName) {
+ projectName = results.length === 1 ? results[0].repoName : "My Project";
+ }
+ const genSpinner = p7.spinner();
  try {
  genSpinner.start("Scaffolding documentation site...");
  await scaffoldSite(outputDir, projectName, templateDir);
  genSpinner.stop("Site scaffolded");
  } catch (err) {
  genSpinner.stop("Scaffold failed");
- p6.log.error(`Scaffold error: ${err instanceof Error ? err.stack || err.message : err}`);
+ p7.log.error(`Scaffold error: ${err instanceof Error ? err.stack || err.message : err}`);
  cleanup(clones);
  process.exit(1);
  }
  try {
  genSpinner.start("Writing documentation content...");
- const contentDir = path9.join(outputDir, "content", "docs");
+ const contentDir = path10.join(outputDir, "content", "docs");
  await writeContent(contentDir, results, crossRepo);
  await writeMeta(contentDir, results, crossRepo);
  genSpinner.stop("Documentation content written");
  } catch (err) {
  genSpinner.stop("Content writing failed");
- p6.log.error(`Content error: ${err instanceof Error ? err.stack || err.message : err}`);
+ p7.log.error(`Content error: ${err instanceof Error ? err.stack || err.message : err}`);
  cleanup(clones);
  process.exit(1);
  }
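Editor's note: the config change in the next hunk uses a conditional object spread, `...cond && { key }`, so `projectName` is only persisted when it differs from the first repo's name (spreading `false` adds nothing). A reduced sketch of the idiom (the `buildConfig` wrapper is illustrative):

```javascript
// Sketch of the conditional-spread idiom in this release's config save:
// spreading a falsy value is a no-op, so the key is added only when the
// condition holds.
function buildConfig(outputDir, projectName, firstRepoName) {
  return {
    outputDir,
    ...projectName !== firstRepoName && { projectName }
  };
}

console.log(buildConfig("docs-site", "my-app", "my-app"));
console.log(buildConfig("docs-site", "My Platform", "my-app"));
```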
@@ -3776,7 +3870,8 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  cloneUrl: r.cloneUrl,
  htmlUrl: r.htmlUrl
  })),
- outputDir
+ outputDir,
+ ...projectName !== results[0]?.repoName && { projectName }
  };
  try {
  saveConfig(config);
@@ -3786,31 +3881,37 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  try {
  await runBuildCheck({ docsDir: outputDir, apiKey, model });
  } catch (err) {
- p6.log.warn(`Build check skipped: ${err instanceof Error ? err.message : err}`);
+ p7.log.warn(`Build check skipped: ${err instanceof Error ? err.message : err}`);
+ }
+ p7.log.success("Documentation generated successfully!");
+ const shouldSetupMcp = await p7.confirm({
+ message: "Set up MCP server so Claude Code can query your docs?"
+ });
+ if (!p7.isCancel(shouldSetupMcp) && shouldSetupMcp) {
+ await setupMcpConfig({ outputDir });
  }
- p6.log.success("Documentation generated successfully!");
  let devServer;
  const devPort = await findFreePort(3e3);
  try {
  devServer = startDevServer(outputDir, devPort);
- p6.log.success(`Documentation site running at http://localhost:${devPort}`);
- p6.log.info("Open the link above to preview your docs site.");
+ p7.log.success(`Documentation site running at http://localhost:${devPort}`);
+ p7.log.info("Open the link above to preview your docs site.");
  } catch {
- p6.log.warn("Could not start preview server. You can run it manually:");
- p6.log.info(` cd ${path9.relative(process.cwd(), outputDir)} && npm run dev`);
+ p7.log.warn("Could not start preview server. You can run it manually:");
+ p7.log.info(` cd ${path10.relative(process.cwd(), outputDir)} && npm run dev`);
  }
- const shouldDeploy = await p6.confirm({
+ const shouldDeploy = await p7.confirm({
  message: "Would you like to deploy your docs to GitHub?"
  });
- if (p6.isCancel(shouldDeploy) || !shouldDeploy) {
+ if (p7.isCancel(shouldDeploy) || !shouldDeploy) {
  if (devServer) {
  killDevServer(devServer);
  }
- p6.note(
- `cd ${path9.relative(process.cwd(), outputDir)} && npm run dev`,
+ p7.note(
+ `cd ${path10.relative(process.cwd(), outputDir)} && npm run dev`,
  "To start the dev server again"
  );
- p6.outro("Done!");
+ p7.outro("Done!");
  return;
  }
  if (devServer) {
@@ -3822,26 +3923,26 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  config
  });
  if (!deployResult) {
- p6.note(
- `cd ${path9.relative(process.cwd(), outputDir)} && npm run dev`,
+ p7.note(
+ `cd ${path10.relative(process.cwd(), outputDir)} && npm run dev`,
  "Next steps"
  );
- p6.outro("Done!");
+ p7.outro("Done!");
  return;
  }
- const shouldSetupCi = await p6.confirm({
+ const shouldSetupCi = await p7.confirm({
  message: "Would you like to set up CI to auto-update docs on every push?"
  });
- if (p6.isCancel(shouldSetupCi) || !shouldSetupCi) {
+ if (p7.isCancel(shouldSetupCi) || !shouldSetupCi) {
  showVercelInstructions(deployResult.owner, deployResult.repoName);
- p6.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
+ p7.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
  return;
  }
  const gitRoot = getGitRoot();
  if (!gitRoot) {
- p6.log.warn("Not in a git repository \u2014 skipping CI setup. Run `open-auto-doc setup-ci` from your project root later.");
+ p7.log.warn("Not in a git repository \u2014 skipping CI setup. Run `open-auto-doc setup-ci` from your project root later.");
  showVercelInstructions(deployResult.owner, deployResult.repoName);
- p6.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
+ p7.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
  return;
  }
  const ciResult = await createCiWorkflow({
@@ -3852,24 +3953,24 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
  config
  });
  showVercelInstructions(deployResult.owner, deployResult.repoName);
- p6.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
+ p7.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
  }
  function resolveTemplateDir() {
  const candidates = [
- path9.resolve(__dirname2, "site-template"),
+ path10.resolve(__dirname2, "site-template"),
  // dist/site-template (npm global install)
- path9.resolve(__dirname2, "../../site-template"),
+ path10.resolve(__dirname2, "../../site-template"),
  // monorepo: packages/site-template
- path9.resolve(__dirname2, "../../../site-template"),
+ path10.resolve(__dirname2, "../../../site-template"),
  // monorepo alt
- path9.resolve(__dirname2, "../../../../packages/site-template")
+ path10.resolve(__dirname2, "../../../../packages/site-template")
  // monorepo from nested dist
  ];
  for (const candidate of candidates) {
- const pkgPath = path9.join(candidate, "package.json");
- if (fs9.existsSync(pkgPath)) return candidate;
+ const pkgPath = path10.join(candidate, "package.json");
+ if (fs10.existsSync(pkgPath)) return candidate;
  }
- return path9.resolve(__dirname2, "site-template");
+ return path10.resolve(__dirname2, "site-template");
  }
  function cleanup(clones) {
  for (const clone of clones) {
@@ -3912,26 +4013,26 @@ function killDevServer(child) {
  }

  // src/commands/generate.ts
- import * as p7 from "@clack/prompts";
- import path10 from "path";
+ import * as p8 from "@clack/prompts";
+ import path11 from "path";
  async function generateCommand(options) {
- p7.intro("open-auto-doc \u2014 Regenerating documentation");
+ p8.intro("open-auto-doc \u2014 Regenerating documentation");
  const config = loadConfig();
  if (!config) {
- p7.log.error("No .autodocrc.json found. Run `open-auto-doc init` first.");
+ p8.log.error("No .autodocrc.json found. Run `open-auto-doc init` first.");
  process.exit(1);
  }
  const token = getGithubToken();
  const apiKey = getAnthropicKey();
  if (!token) {
- p7.log.error("Not authenticated. Run `open-auto-doc login` or set GITHUB_TOKEN env var.");
+ p8.log.error("Not authenticated. Run `open-auto-doc login` or set GITHUB_TOKEN env var.");
  process.exit(1);
  }
  if (!apiKey) {
- p7.log.error("No Anthropic API key found. Run `open-auto-doc init` or set ANTHROPIC_API_KEY env var.");
+ p8.log.error("No Anthropic API key found. Run `open-auto-doc init` or set ANTHROPIC_API_KEY env var.");
  process.exit(1);
  }
- const model = await p7.select({
+ const model = await p8.select({
  message: "Which model should analyze your repos?",
  options: [
  { value: "claude-sonnet-4-6", label: "Claude Sonnet 4.6", hint: "Fast & capable (recommended)" },
@@ -3939,20 +4040,20 @@ async function generateCommand(options) {
  { value: "claude-opus-4-6", label: "Claude Opus 4.6", hint: "Most capable, slowest" }
  ]
  });
- if (p7.isCancel(model)) {
- p7.cancel("Cancelled.");
+ if (p8.isCancel(model)) {
+ p8.cancel("Cancelled.");
  process.exit(0);
  }
- p7.log.info(`Using ${model}`);
+ p8.log.info(`Using ${model}`);
  const incremental = options.incremental && !options.force;
- const cacheDir = path10.join(config.outputDir, ".autodoc-cache");
+ const cacheDir = path11.join(config.outputDir, ".autodoc-cache");
  const targetRepoName = options.repo;
  let reposToAnalyze = config.repos;
  const cachedResults = [];
  if (targetRepoName) {
  const targetRepo = config.repos.find((r) => r.name === targetRepoName);
  if (!targetRepo) {
- p7.log.error(`Repo "${targetRepoName}" not found in config. Available: ${config.repos.map((r) => r.name).join(", ")}`);
+ p8.log.error(`Repo "${targetRepoName}" not found in config. Available: ${config.repos.map((r) => r.name).join(", ")}`);
  process.exit(1);
  }
  reposToAnalyze = [targetRepo];
@@ -3961,13 +4062,13 @@ async function generateCommand(options) {
3961
4062
  const cached = loadCache(cacheDir, repo.name);
3962
4063
  if (cached) {
3963
4064
  cachedResults.push(cached.result);
3964
- p7.log.info(`Using cached analysis for ${repo.name}`);
4065
+ p8.log.info(`Using cached analysis for ${repo.name}`);
3965
4066
  } else {
3966
- p7.log.warn(`No cached analysis for ${repo.name} \u2014 its docs will be stale until it pushes`);
4067
+ p8.log.warn(`No cached analysis for ${repo.name} \u2014 its docs will be stale until it pushes`);
3967
4068
  }
3968
4069
  }
3969
4070
  }
3970
- const cloneSpinner = p7.spinner();
4071
+ const cloneSpinner = p8.spinner();
3971
4072
  cloneSpinner.start(`Cloning ${reposToAnalyze.length} ${reposToAnalyze.length === 1 ? "repository" : "repositories"}...`);
3972
4073
  const clones = [];
3973
4074
  for (const repo of reposToAnalyze) {
@@ -3986,12 +4087,12 @@ async function generateCommand(options) {
3986
4087
  );
3987
4088
  clones.push(cloned);
3988
4089
  } catch (err) {
3989
- p7.log.warn(`Failed to clone ${repo.name}: ${err instanceof Error ? err.message : err}`);
4090
+ p8.log.warn(`Failed to clone ${repo.name}: ${err instanceof Error ? err.message : err}`);
3990
4091
  }
3991
4092
  }
3992
4093
  cloneSpinner.stop(`Cloned ${clones.length}/${reposToAnalyze.length} ${reposToAnalyze.length === 1 ? "repository" : "repositories"}`);
3993
4094
  if (clones.length === 0) {
3994
- p7.log.error("No repositories were cloned.");
4095
+ p8.log.error("No repositories were cloned.");
3995
4096
  process.exit(1);
3996
4097
  }
3997
4098
  const total = clones.length;
@@ -4066,7 +4167,7 @@ async function generateCommand(options) {
4066
4167
  } catch (err) {
4067
4168
  const errMsg = err instanceof Error ? err.message : String(err);
4068
4169
  progressTable.update(repoName, { status: "failed", error: errMsg });
4069
- p7.log.warn(`[${repoName}] Analysis failed: ${errMsg}`);
4170
+ p8.log.warn(`[${repoName}] Analysis failed: ${errMsg}`);
4070
4171
  return { repo: repoName, result: null };
4071
4172
  }
4072
4173
  });
@@ -4074,14 +4175,14 @@ async function generateCommand(options) {
4074
4175
  progressTable.stop();
4075
4176
  const freshResults = settled.filter((s) => s.result !== null).map((s) => s.result);
4076
4177
  const { done: analyzedCount, failed: failedCount } = progressTable.getSummary();
4077
- p7.log.step(
4178
+ p8.log.step(
4078
4179
  `Analyzed ${analyzedCount}/${total} ${total === 1 ? "repository" : "repositories"}` + (failedCount > 0 ? ` (${failedCount} failed)` : "")
4079
4180
  );
4080
4181
  const results = [...freshResults, ...cachedResults];
4081
4182
  if (results.length > 0) {
4082
4183
  let crossRepo;
4083
4184
  if (results.length > 1) {
4084
- const crossSpinner = p7.spinner();
4185
+ const crossSpinner = p8.spinner();
4085
4186
  crossSpinner.start("Analyzing cross-repository relationships...");
4086
4187
  try {
4087
4188
  crossRepo = await analyzeCrossRepos(results, apiKey, model, (text4) => {
@@ -4090,66 +4191,66 @@ async function generateCommand(options) {
4090
4191
  crossSpinner.stop(`Cross-repo analysis complete \u2014 ${crossRepo.repoRelationships.length} relationships found`);
4091
4192
  } catch (err) {
4092
4193
  crossSpinner.stop("Cross-repo analysis failed (non-fatal)");
4093
- p7.log.warn(`Cross-repo error: ${err instanceof Error ? err.message : err}`);
4194
+ p8.log.warn(`Cross-repo error: ${err instanceof Error ? err.message : err}`);
4094
4195
  }
4095
4196
  }
4096
- const contentDir = path10.join(config.outputDir, "content", "docs");
4197
+ const contentDir = path11.join(config.outputDir, "content", "docs");
4097
4198
  await writeContent(contentDir, results, crossRepo, changelogs.size > 0 ? changelogs : void 0);
4098
4199
  await writeMeta(contentDir, results, crossRepo, changelogs.size > 0 ? changelogs : void 0);
4099
4200
  try {
4100
4201
  await runBuildCheck({ docsDir: config.outputDir, apiKey, model });
4101
4202
  } catch (err) {
4102
- p7.log.warn(`Build check skipped: ${err instanceof Error ? err.message : err}`);
4203
+ p8.log.warn(`Build check skipped: ${err instanceof Error ? err.message : err}`);
4103
4204
  }
4104
- p7.log.success("Documentation regenerated!");
4205
+ p8.log.success("Documentation regenerated!");
4105
4206
  }
4106
4207
  for (const clone of clones) {
4107
4208
  cleanupClone(clone);
4108
4209
  }
4109
- p7.outro("Done!");
4210
+ p8.outro("Done!");
4110
4211
  }
4111
4212
 
4112
4213
  // src/commands/deploy.ts
- import * as p8 from "@clack/prompts";
- import fs10 from "fs";
- import path11 from "path";
+ import * as p9 from "@clack/prompts";
+ import fs11 from "fs";
+ import path12 from "path";
  function resolveDocsDir(config, dirOption) {
  if (dirOption) {
- const resolved = path11.resolve(dirOption);
- if (!fs10.existsSync(resolved)) {
- p8.log.error(`Directory not found: ${resolved}`);
+ const resolved = path12.resolve(dirOption);
+ if (!fs11.existsSync(resolved)) {
+ p9.log.error(`Directory not found: ${resolved}`);
  process.exit(1);
  }
  return resolved;
  }
- if (config?.outputDir && fs10.existsSync(path11.resolve(config.outputDir))) {
- return path11.resolve(config.outputDir);
+ if (config?.outputDir && fs11.existsSync(path12.resolve(config.outputDir))) {
+ return path12.resolve(config.outputDir);
  }
- if (fs10.existsSync(path11.resolve("docs-site"))) {
- return path11.resolve("docs-site");
+ if (fs11.existsSync(path12.resolve("docs-site"))) {
+ return path12.resolve("docs-site");
  }
- p8.log.error(
+ p9.log.error(
  "Could not find docs site directory. Use --dir to specify the path, or run `open-auto-doc init` first."
  );
  process.exit(1);
  }
  async function deployCommand(options) {
- p8.intro("open-auto-doc \u2014 Deploy docs to GitHub");
+ p9.intro("open-auto-doc \u2014 Deploy docs to GitHub");
  const token = getGithubToken();
  if (!token) {
- p8.log.error("Not authenticated. Run `open-auto-doc login` first.");
+ p9.log.error("Not authenticated. Run `open-auto-doc login` first.");
  process.exit(1);
  }
  const config = loadConfig();
  const docsDir = resolveDocsDir(config, options.dir);
- p8.log.info(`Docs directory: ${docsDir}`);
+ p9.log.info(`Docs directory: ${docsDir}`);
  if (config?.docsRepo) {
- p8.log.info(`Docs repo already configured: ${config.docsRepo}`);
+ p9.log.info(`Docs repo already configured: ${config.docsRepo}`);
  const pushed = await pushUpdates({ token, docsDir, docsRepo: config.docsRepo });
  if (pushed) {
- p8.outro("Docs updated! Vercel will auto-deploy from the push.");
+ p9.outro("Docs updated! Vercel will auto-deploy from the push.");
  } else {
- p8.outro("Docs are up to date!");
+ p9.outro("Docs are up to date!");
  }
  return;
  }
@@ -4159,20 +4260,20 @@ async function deployCommand(options) {
  config: config || { repos: [], outputDir: docsDir }
  });
  if (!result) {
- p8.cancel("Deploy cancelled.");
+ p9.cancel("Deploy cancelled.");
  process.exit(0);
  }
  showVercelInstructions(result.owner, result.repoName);
- p8.outro(`Docs repo: https://github.com/${result.owner}/${result.repoName}`);
+ p9.outro(`Docs repo: https://github.com/${result.owner}/${result.repoName}`);
  }
 
  // src/commands/setup-ci.ts
- import * as p9 from "@clack/prompts";
+ import * as p10 from "@clack/prompts";
  async function setupCiCommand() {
- p9.intro("open-auto-doc \u2014 CI/CD Setup");
+ p10.intro("open-auto-doc \u2014 CI/CD Setup");
  const config = loadConfig();
  if (!config?.docsRepo) {
- p9.log.error(
+ p10.log.error(
  "No docs repo configured. Run `open-auto-doc deploy` first to create a docs GitHub repo."
  );
  process.exit(1);
@@ -4180,12 +4281,12 @@ async function setupCiCommand() {
  const token = getGithubToken();
  const isMultiRepo = config.repos.length > 1;
  if (isMultiRepo && !token) {
- p9.log.error("Not authenticated. Run `open-auto-doc login` first (needed to push workflows to source repos).");
+ p10.log.error("Not authenticated. Run `open-auto-doc login` first (needed to push workflows to source repos).");
  process.exit(1);
  }
  const gitRoot = getGitRoot();
  if (!isMultiRepo && !gitRoot) {
- p9.log.error("Not in a git repository. Run this command from your project root.");
+ p10.log.error("Not in a git repository. Run this command from your project root.");
  process.exit(1);
  }
  const result = await createCiWorkflow({
@@ -4196,50 +4297,51 @@ async function setupCiCommand() {
  config
  });
  if (!result) {
- p9.cancel("Setup cancelled.");
+ p10.cancel("Setup cancelled.");
  process.exit(0);
  }
  if ("repos" in result) {
- p9.outro("Per-repo CI workflows created! Add the required secrets to each source repo.");
+ p10.outro("Per-repo CI workflows created! Add the required secrets to each source repo.");
  } else {
- p9.outro("CI/CD workflow is ready! Commit and push to activate.");
+ p10.outro("CI/CD workflow is ready! Commit and push to activate.");
  }
  }
 
  // src/commands/login.ts
- import * as p10 from "@clack/prompts";
+ import * as p11 from "@clack/prompts";
  async function loginCommand() {
- p10.intro("open-auto-doc \u2014 GitHub Login");
+ p11.intro("open-auto-doc \u2014 GitHub Login");
  const existing = getGithubToken();
  if (existing) {
- const overwrite = await p10.confirm({
+ const overwrite = await p11.confirm({
  message: "You're already logged in. Re-authenticate?"
  });
- if (!overwrite || p10.isCancel(overwrite)) {
- p10.cancel("Keeping existing credentials");
+ if (!overwrite || p11.isCancel(overwrite)) {
+ p11.cancel("Keeping existing credentials");
  return;
  }
  }
  const token = await authenticateWithGithub();
  setGithubToken(token);
- p10.outro("Logged in successfully!");
+ p11.outro("Logged in successfully!");
  }
 
  // src/commands/logout.ts
- import * as p11 from "@clack/prompts";
+ import * as p12 from "@clack/prompts";
  async function logoutCommand() {
- p11.intro("open-auto-doc \u2014 Logout");
+ p12.intro("open-auto-doc \u2014 Logout");
  clearAll();
- p11.outro("Credentials cleared. You've been logged out.");
+ p12.outro("Credentials cleared. You've been logged out.");
  }
 
  // src/index.ts
  var program = new Command();
- program.name("open-auto-doc").description("Auto-generate beautiful documentation websites from GitHub repositories using AI").version("0.4.0");
+ program.name("open-auto-doc").description("Auto-generate beautiful documentation websites from GitHub repositories using AI").version("0.5.0");
  program.command("init", { isDefault: true }).description("Initialize and generate documentation for your repositories").option("-o, --output <dir>", "Output directory", "docs-site").action(initCommand);
  program.command("generate").description("Regenerate documentation using existing configuration").option("--incremental", "Only re-analyze changed files (uses cached results)").option("--force", "Force full regeneration (ignore cache)").option("--repo <name>", "Only analyze this repo (uses cache for others)").action(generateCommand);
  program.command("deploy").description("Create a GitHub repo for docs and push (connect to Vercel for auto-deploy)").option("-d, --dir <path>", "Docs site directory").action(deployCommand);
  program.command("setup-ci").description("Generate a GitHub Actions workflow for auto-updating docs").action(setupCiCommand);
+ program.command("setup-mcp").description("Set up MCP server so Claude Code can query your docs").action(setupMcpCommand);
  program.command("login").description("Authenticate with GitHub").action(loginCommand);
  program.command("logout").description("Clear stored credentials").action(logoutCommand);
  program.parse();
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@latent-space-labs/open-auto-doc",
- "version": "0.4.0",
+ "version": "0.5.0",
  "description": "Auto-generate beautiful documentation websites from GitHub repositories using AI",
  "type": "module",
  "bin": {