@latent-space-labs/open-auto-doc 0.4.1 → 0.5.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +141 -0
- package/dist/index.js +222 -139
- package/package.json +1 -1
package/README.md
ADDED
@@ -0,0 +1,141 @@
+# open-auto-doc
+
+**One command. Beautiful docs. Auto-deployed.**
+
+Turn any GitHub repo into a fully hosted documentation site — powered by AI that actually reads your code.
+
+```bash
+npx @latent-space-labs/open-auto-doc
+```
+
+No config files. No manual writing. Just point it at your repos and get a production-ready docs site with architecture overviews, API references, component docs, and more.
+
+## Install
+
+```bash
+# Run directly with npx (no install needed)
+npx @latent-space-labs/open-auto-doc
+
+# Or install globally
+npm install -g @latent-space-labs/open-auto-doc
+```
+
+### Requirements
+
+- **Node.js 18+**
+- **A GitHub account** — works with public and private repos
+- **An [Anthropic API key](https://console.anthropic.com/)** — powers the AI analysis
+
+## Quick Start
+
+```bash
+# 1. Generate docs (interactive setup)
+npx @latent-space-labs/open-auto-doc
+
+# 2. Preview locally
+cd docs-site && npm run dev
+
+# 3. Deploy
+open-auto-doc deploy
+
+# 4. Auto-update on every push
+open-auto-doc setup-ci
+```
+
+The CLI walks you through GitHub auth, repo selection, and API key setup interactively.
+
+## What Gets Generated
+
+| Section | Contents |
+|---|---|
+| **Architecture Overview** | Tech stack, module breakdown, data flow diagrams, entry points, key patterns |
+| **Getting Started** | Prerequisites, install steps, quick start guide, config options |
+| **API Reference** | Endpoints with methods, params, request/response bodies, auth |
+| **Components** | UI components with props, usage examples, categories |
+| **Data Models** | Schemas with field types, constraints, relationships, ER diagrams |
+| **Business Logic** | Domain concepts, business rules, workflows |
+| **Configuration** | Config files, env vars, settings reference |
+| **Error Handling** | Error codes, common errors, debugging tips |
+
+Multi-repo setups also get **cross-repo analysis** — shared dependencies, API contracts, and relationship diagrams.
+
+## All Commands
+
+| Command | What it does |
+|---|---|
+| `open-auto-doc` | Full interactive setup: auth → pick repos → analyze → generate |
+| `open-auto-doc init -o <dir>` | Same, with custom output directory (default: `docs-site`) |
+| `open-auto-doc generate` | Re-analyze and regenerate using saved config |
+| `open-auto-doc generate --incremental` | Only re-analyze changed files |
+| `open-auto-doc deploy` | Create a GitHub repo for docs and push |
+| `open-auto-doc setup-ci` | Add GitHub Actions workflow for auto-updates |
+| `open-auto-doc setup-mcp` | Set up MCP server for Claude Code |
+| `open-auto-doc login` | Authenticate with GitHub |
+| `open-auto-doc logout` | Clear stored credentials |
+
+## MCP Server for AI Assistants
+
+open-auto-doc includes an [MCP (Model Context Protocol)](https://modelcontextprotocol.io/) server that lets AI assistants like Claude Code query your documentation directly. Instead of reading raw source files, the AI gets structured knowledge about your architecture, APIs, components, and data models.
+
+### How it works
+
+When you run `open-auto-doc init` or `generate`, the AI analysis results are cached as JSON in `<outputDir>/.autodoc-cache/`. The MCP server (`@latent-space-labs/open-auto-doc-mcp`) is a separate lightweight package that reads this cache and serves it over the Model Context Protocol via stdio.
+
+```
+open-auto-doc init → .autodoc-cache/*.json → MCP Server (stdio) → Claude Code
+```
+
+The server has no dependency on the analyzer or generator — it only reads JSON files, so `npx` startup is fast.
+
+### Setup
+
+```bash
+# Set up after generating docs
+open-auto-doc setup-mcp
+```
+
+This creates `.mcp.json` in your project root. Claude Code will automatically discover the tools:
+
+- `get_project_overview` — purpose, tech stack, summary stats
+- `search_documentation` — full-text search across all sections
+- `get_api_endpoints` — API endpoint details with params, auth, request/response
+- `get_components` — UI component documentation with props and usage
+- `get_data_models` — data model schemas with fields and relationships
+- `get_architecture` — modules, data flow, patterns, entry points
+- `get_diagram` — Mermaid diagrams (architecture, ER, flow)
+- `get_business_rules` — domain concepts, rules, and workflows
+
+Resources are also available at `docs://overview`, `docs://architecture`, `docs://getting-started`, and `docs://diagrams/{id}`.
+
+You can also configure `.mcp.json` manually:
+
+```json
+{
+  "mcpServers": {
+    "project-docs": {
+      "command": "npx",
+      "args": ["-y", "@latent-space-labs/open-auto-doc-mcp", "--project-dir", "."]
+    }
+  }
+}
+```
+
+## Language Support
+
+open-auto-doc is **language-agnostic**. It uses AI to understand code — not language-specific parsers. Works with TypeScript, JavaScript, Python, Go, Rust, Java, Kotlin, Ruby, PHP, C#, Swift, and more.
+
+## Privacy & Security
+
+- Your Anthropic API key is **only sent to the Anthropic API**
+- All code analysis runs **locally on your machine** (or in your own CI runner)
+- Credentials stored at `~/.open-auto-doc/credentials.json` with `0600` permissions
+- Run `open-auto-doc logout` to clear everything
+
+## Links
+
+- [GitHub](https://github.com/Latent-Space-Labs/open-auto-doc) — source code, issues, contributing
+- [Anthropic Console](https://console.anthropic.com/) — get an API key
+
+## License
+
+MIT
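The README's MCP section above says `setup-mcp` writes a `project-docs` entry into `.mcp.json`. As a minimal sketch of that merge behavior (a hypothetical standalone helper written for illustration, not the package's actual export; the server key and `npx` args are taken from the README), any servers already configured in `.mcp.json` are preserved:

```javascript
// Sketch of the .mcp.json merge that `setup-mcp` performs, per the README
// above: a "project-docs" server entry is added (or replaced) while other
// configured MCP servers are left untouched.
function mergeMcpConfig(existing) {
  const mcpServers = { ...(existing.mcpServers ?? {}) };
  mcpServers["project-docs"] = {
    command: "npx",
    args: ["-y", "@latent-space-labs/open-auto-doc-mcp", "--project-dir", "."]
  };
  return { ...existing, mcpServers };
}

// Example: an existing config with an unrelated server survives the merge.
const merged = mergeMcpConfig({
  mcpServers: { "other-server": { command: "node", args: ["server.js"] } }
});
console.log(JSON.stringify(merged, null, 2));
```
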
package/dist/index.js
CHANGED
@@ -4,10 +4,10 @@
 import { Command } from "commander";

 // src/commands/init.ts
-import * as
-import
+import * as p7 from "@clack/prompts";
+import fs10 from "fs";
 import net from "net";
-import
+import path10 from "path";
 import { fileURLToPath as fileURLToPath2 } from "url";
 import { spawn } from "child_process";

@@ -528,7 +528,7 @@ function detectEntryFiles(flatFiles) {
     /^app\/page\.tsx$/,
     /^pages\/index\.\w+$/
   ];
-  return flatFiles.filter((f) => entryPatterns.some((
+  return flatFiles.filter((f) => entryPatterns.some((p13) => p13.test(f)));
 }
 var DEP_FILES = [
   {
@@ -1731,7 +1731,7 @@ function classifyChanges(entries, staticAnalysis) {
   const affected = /* @__PURE__ */ new Set();
   for (const entry of entries) {
     for (const [section, patterns] of Object.entries(SECTION_PATTERNS)) {
-      if (patterns.some((
+      if (patterns.some((p13) => p13.test(entry.filePath))) {
         affected.add(section);
       }
     }
@@ -2809,9 +2809,85 @@ function showSecretsInstructions(multiRepo = false) {
   );
 }

-//
-import
+// src/commands/setup-mcp.ts
+import * as p6 from "@clack/prompts";
+import fs8 from "fs";
 import path8 from "path";
+var MCP_SERVER_KEY = "project-docs";
+function getMcpConfig() {
+  return {
+    command: "npx",
+    args: ["-y", "@latent-space-labs/open-auto-doc-mcp", "--project-dir", "."]
+  };
+}
+async function setupMcpConfig(opts) {
+  const cacheDir = path8.join(opts.outputDir, ".autodoc-cache");
+  if (!fs8.existsSync(cacheDir)) {
+    p6.log.warn("No analysis cache found \u2014 skipping MCP setup. Run setup-mcp after generating docs.");
+    return;
+  }
+  writeMcpJson(process.cwd());
+}
+async function setupMcpCommand() {
+  p6.intro("open-auto-doc \u2014 MCP Server Setup");
+  const config = loadConfig();
+  if (!config) {
+    p6.log.error(
+      "No .autodocrc.json found. Run `open-auto-doc init` first to generate documentation."
+    );
+    process.exit(1);
+  }
+  const cacheDir = path8.join(config.outputDir, ".autodoc-cache");
+  if (!fs8.existsSync(cacheDir)) {
+    p6.log.error(
+      `No analysis cache found at ${cacheDir}.
+Run \`open-auto-doc init\` or \`open-auto-doc generate\` first.`
+    );
+    process.exit(1);
+  }
+  const cacheFiles = fs8.readdirSync(cacheDir).filter((f) => f.endsWith("-analysis.json"));
+  if (cacheFiles.length === 0) {
+    p6.log.error("Cache directory exists but contains no analysis files.");
+    process.exit(1);
+  }
+  writeMcpJson(process.cwd());
+  p6.log.success("MCP server configured!");
+  p6.note(
+    [
+      "The following tools are now available in Claude Code:",
+      "",
+      "  get_project_overview \u2014 Project summary and tech stack",
+      "  search_documentation \u2014 Full-text search across all docs",
+      "  get_api_endpoints \u2014 API endpoint details",
+      "  get_components \u2014 UI component documentation",
+      "  get_data_models \u2014 Data model schemas",
+      "  get_architecture \u2014 Architecture and patterns",
+      "  get_diagram \u2014 Mermaid diagrams",
+      "  get_business_rules \u2014 Domain concepts and workflows"
+    ].join("\n"),
+    "Available MCP tools"
+  );
+  p6.outro("Open Claude Code in this project to start using the tools.");
+}
+function writeMcpJson(projectRoot) {
+  const mcpPath = path8.join(projectRoot, ".mcp.json");
+  let existing = {};
+  if (fs8.existsSync(mcpPath)) {
+    try {
+      existing = JSON.parse(fs8.readFileSync(mcpPath, "utf-8"));
+    } catch {
+    }
+  }
+  const mcpServers = existing.mcpServers ?? {};
+  mcpServers[MCP_SERVER_KEY] = getMcpConfig();
+  const merged = { ...existing, mcpServers };
+  fs8.writeFileSync(mcpPath, JSON.stringify(merged, null, 2) + "\n");
+  p6.log.step(`Wrote ${path8.relative(projectRoot, mcpPath)}`);
+}
+
+// ../generator/dist/index.js
+import fs9 from "fs-extra";
+import path9 from "path";
 import { execSync as execSync6 } from "child_process";
 import fs23 from "fs-extra";
 import path23 from "path";
@@ -2820,26 +2896,26 @@ import { fileURLToPath } from "url";
 import fs33 from "fs-extra";
 import path33 from "path";
 async function scaffoldSite(outputDir, projectName, templateDir) {
-  await
+  await fs9.copy(templateDir, outputDir, {
     overwrite: true,
     filter: (src) => {
-      const basename =
+      const basename = path9.basename(src);
       return basename !== "node_modules" && basename !== ".next" && basename !== ".source" && basename !== "dist" && basename !== ".turbo";
     }
   });
   const filesToProcess = await findTextFiles(outputDir);
   for (const filePath of filesToProcess) {
     try {
-      let content = await
+      let content = await fs9.readFile(filePath, "utf-8");
       if (content.includes("{{projectName}}")) {
         content = content.replace(/\{\{projectName\}\}/g, projectName);
-        await
+        await fs9.writeFile(filePath, content, "utf-8");
       }
     } catch {
     }
   }
-  const nodeModulesPath =
-  if (!
+  const nodeModulesPath = path9.join(outputDir, "node_modules");
+  if (!fs9.existsSync(nodeModulesPath)) {
     try {
       execSync6("npm install --ignore-scripts", {
         cwd: outputDir,
@@ -2847,7 +2923,7 @@ async function scaffoldSite(outputDir, projectName, templateDir) {
         timeout: 12e4
       });
     } catch (err) {
-      const hasNodeModules =
+      const hasNodeModules = fs9.existsSync(nodeModulesPath);
       if (!hasNodeModules) {
         throw new Error(`npm install failed: ${err instanceof Error ? err.message : err}`);
       }
@@ -2876,14 +2952,14 @@ async function findTextFiles(dir) {
   ]);
   const results = [];
   async function walk(currentDir) {
-    const entries = await
+    const entries = await fs9.readdir(currentDir, { withFileTypes: true });
     for (const entry of entries) {
-      const fullPath =
+      const fullPath = path9.join(currentDir, entry.name);
       if (entry.isDirectory()) {
         if (entry.name !== "node_modules" && entry.name !== ".next" && entry.name !== ".source") {
           await walk(fullPath);
         }
-      } else if (textExtensions.has(
+      } else if (textExtensions.has(path9.extname(entry.name))) {
        results.push(fullPath);
      }
    }
@@ -3615,12 +3691,12 @@ function buildRepoSummary(result) {
 }

 // src/commands/init.ts
-var __dirname2 =
+var __dirname2 = path10.dirname(fileURLToPath2(import.meta.url));
 async function initCommand(options) {
-
+  p7.intro("open-auto-doc \u2014 AI-powered documentation generator");
   const templateDir = resolveTemplateDir();
-  if (!
-
+  if (!fs10.existsSync(path10.join(templateDir, "package.json"))) {
+    p7.log.error(
       `Site template not found at: ${templateDir}
 This usually means the npm package was not built correctly.
 Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
@@ -3629,53 +3705,53 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
   }
   let token = getGithubToken();
   if (!token) {
-
+    p7.log.info("Let's connect your GitHub account.");
     token = await authenticateWithGithub();
     setGithubToken(token);
   } else {
-
+    p7.log.success("Using saved GitHub credentials.");
   }
   const repos = await pickRepos(token);
-
+  p7.log.info(`Selected ${repos.length} ${repos.length === 1 ? "repository" : "repositories"}`);
   let projectName;
   if (repos.length > 1) {
-    const nameInput = await
+    const nameInput = await p7.text({
       message: "What would you like to name this project?",
       placeholder: "My Project",
       validate: (v) => {
         if (!v || v.trim().length === 0) return "Project name is required";
       }
     });
-    if (
-
+    if (p7.isCancel(nameInput)) {
+      p7.cancel("Operation cancelled");
       process.exit(0);
     }
     projectName = nameInput;
   }
   let apiKey = getAnthropicKey();
   if (!apiKey) {
-    const keyInput = await
+    const keyInput = await p7.text({
       message: "Enter your Anthropic API key",
       placeholder: "sk-ant-...",
       validate: (v) => {
         if (!v || !v.startsWith("sk-ant-")) return "Please enter a valid Anthropic API key";
       }
     });
-    if (
-
+    if (p7.isCancel(keyInput)) {
+      p7.cancel("Operation cancelled");
       process.exit(0);
     }
     apiKey = keyInput;
-    const saveKey = await
+    const saveKey = await p7.confirm({
       message: "Save API key for future use?"
     });
-    if (saveKey && !
+    if (saveKey && !p7.isCancel(saveKey)) {
       setAnthropicKey(apiKey);
     }
   } else {
-
+    p7.log.success("Using saved Anthropic API key.");
   }
-  const model = await
+  const model = await p7.select({
     message: "Which model should analyze your repos?",
     options: [
       { value: "claude-sonnet-4-6", label: "Claude Sonnet 4.6", hint: "Fast & capable (recommended)" },
@@ -3683,12 +3759,12 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
       { value: "claude-opus-4-6", label: "Claude Opus 4.6", hint: "Most capable, slowest" }
     ]
   });
-  if (
-
+  if (p7.isCancel(model)) {
+    p7.cancel("Operation cancelled");
     process.exit(0);
   }
-
-  const cloneSpinner =
+  p7.log.info(`Using ${model}`);
+  const cloneSpinner = p7.spinner();
   cloneSpinner.start(`Cloning ${repos.length} repositories...`);
   const clones = [];
   for (const repo of repos) {
@@ -3697,12 +3773,12 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
       const cloned = cloneRepo(repo, token);
       clones.push(cloned);
     } catch (err) {
-
+      p7.log.warn(`Failed to clone ${repo.name}: ${err instanceof Error ? err.message : err}`);
     }
   }
   cloneSpinner.stop(`Cloned ${clones.length}/${repos.length} repositories`);
   if (clones.length === 0) {
-
+    p7.log.error("No repositories were cloned.");
     process.exit(1);
   }
   const total = clones.length;
@@ -3730,7 +3806,7 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
     } catch (err) {
       const errMsg = err instanceof Error ? err.message : String(err);
       progressTable.update(repoName, { status: "failed", error: errMsg });
-
+      p7.log.warn(`[${repoName}] Analysis failed: ${errMsg}`);
       return { repo: repoName, result: null };
     }
   });
@@ -3738,17 +3814,17 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
   progressTable.stop();
   const results = settled.filter((s) => s.result !== null).map((s) => s.result);
   const { done, failed } = progressTable.getSummary();
-
+  p7.log.step(
     `Analyzed ${done}/${total} repositories` + (failed > 0 ? ` (${failed} failed)` : "") + (results.length > 0 ? ` \u2014 ${results.reduce((n, r) => n + r.apiEndpoints.length, 0)} endpoints, ${results.reduce((n, r) => n + r.components.length, 0)} components, ${results.reduce((n, r) => n + r.diagrams.length, 0)} diagrams` : "")
   );
   if (results.length === 0) {
-
+    p7.log.error("No repositories were successfully analyzed.");
     cleanup(clones);
     process.exit(1);
   }
   let crossRepo;
   if (results.length > 1) {
-    const crossSpinner =
+    const crossSpinner = p7.spinner();
     crossSpinner.start("Analyzing cross-repository relationships...");
     try {
       crossRepo = await analyzeCrossRepos(results, apiKey, model, (text4) => {
@@ -3757,33 +3833,33 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
       crossSpinner.stop(`Cross-repo analysis complete \u2014 ${crossRepo.repoRelationships.length} relationships found`);
     } catch (err) {
       crossSpinner.stop("Cross-repo analysis failed (non-fatal)");
-
+      p7.log.warn(`Cross-repo error: ${err instanceof Error ? err.message : err}`);
     }
   }
-  const outputDir =
+  const outputDir = path10.resolve(options.output || "docs-site");
   if (!projectName) {
     projectName = results.length === 1 ? results[0].repoName : "My Project";
   }
-  const genSpinner =
+  const genSpinner = p7.spinner();
   try {
     genSpinner.start("Scaffolding documentation site...");
     await scaffoldSite(outputDir, projectName, templateDir);
     genSpinner.stop("Site scaffolded");
   } catch (err) {
     genSpinner.stop("Scaffold failed");
-
+    p7.log.error(`Scaffold error: ${err instanceof Error ? err.stack || err.message : err}`);
     cleanup(clones);
     process.exit(1);
   }
   try {
     genSpinner.start("Writing documentation content...");
-    const contentDir =
+    const contentDir = path10.join(outputDir, "content", "docs");
     await writeContent(contentDir, results, crossRepo);
     await writeMeta(contentDir, results, crossRepo);
     genSpinner.stop("Documentation content written");
   } catch (err) {
     genSpinner.stop("Content writing failed");
-
+    p7.log.error(`Content error: ${err instanceof Error ? err.stack || err.message : err}`);
     cleanup(clones);
     process.exit(1);
   }
@@ -3805,31 +3881,37 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
   try {
     await runBuildCheck({ docsDir: outputDir, apiKey, model });
   } catch (err) {
-
+    p7.log.warn(`Build check skipped: ${err instanceof Error ? err.message : err}`);
+  }
+  p7.log.success("Documentation generated successfully!");
+  const shouldSetupMcp = await p7.confirm({
+    message: "Set up MCP server so Claude Code can query your docs?"
+  });
+  if (!p7.isCancel(shouldSetupMcp) && shouldSetupMcp) {
+    await setupMcpConfig({ outputDir });
   }
-  p6.log.success("Documentation generated successfully!");
   let devServer;
   const devPort = await findFreePort(3e3);
   try {
     devServer = startDevServer(outputDir, devPort);
-
-
+    p7.log.success(`Documentation site running at http://localhost:${devPort}`);
+    p7.log.info("Open the link above to preview your docs site.");
   } catch {
-
-
+    p7.log.warn("Could not start preview server. You can run it manually:");
+    p7.log.info(`  cd ${path10.relative(process.cwd(), outputDir)} && npm run dev`);
   }
-  const shouldDeploy = await
+  const shouldDeploy = await p7.confirm({
     message: "Would you like to deploy your docs to GitHub?"
   });
-  if (
+  if (p7.isCancel(shouldDeploy) || !shouldDeploy) {
     if (devServer) {
       killDevServer(devServer);
     }
-
-    `cd ${
+    p7.note(
+      `cd ${path10.relative(process.cwd(), outputDir)} && npm run dev`,
       "To start the dev server again"
     );
-
+    p7.outro("Done!");
     return;
   }
   if (devServer) {
@@ -3841,26 +3923,26 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
     config
   });
   if (!deployResult) {
-
-    `cd ${
+    p7.note(
+      `cd ${path10.relative(process.cwd(), outputDir)} && npm run dev`,
       "Next steps"
     );
-
+    p7.outro("Done!");
     return;
   }
-  const shouldSetupCi = await
+  const shouldSetupCi = await p7.confirm({
     message: "Would you like to set up CI to auto-update docs on every push?"
   });
-  if (
+  if (p7.isCancel(shouldSetupCi) || !shouldSetupCi) {
     showVercelInstructions(deployResult.owner, deployResult.repoName);
-
+    p7.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
     return;
   }
   const gitRoot = getGitRoot();
   if (!gitRoot) {
-
+    p7.log.warn("Not in a git repository \u2014 skipping CI setup. Run `open-auto-doc setup-ci` from your project root later.");
     showVercelInstructions(deployResult.owner, deployResult.repoName);
-
+    p7.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
     return;
   }
   const ciResult = await createCiWorkflow({
@@ -3871,24 +3953,24 @@ Try reinstalling: npm install -g @latent-space-labs/open-auto-doc`
     config
   });
   showVercelInstructions(deployResult.owner, deployResult.repoName);
-
+  p7.outro(`Docs repo: https://github.com/${deployResult.owner}/${deployResult.repoName}`);
 }
 function resolveTemplateDir() {
   const candidates = [
-
+    path10.resolve(__dirname2, "site-template"),
     // dist/site-template (npm global install)
-
+    path10.resolve(__dirname2, "../../site-template"),
     // monorepo: packages/site-template
-
+    path10.resolve(__dirname2, "../../../site-template"),
     // monorepo alt
-
+    path10.resolve(__dirname2, "../../../../packages/site-template")
     // monorepo from nested dist
   ];
   for (const candidate of candidates) {
-    const pkgPath =
-    if (
+    const pkgPath = path10.join(candidate, "package.json");
+    if (fs10.existsSync(pkgPath)) return candidate;
   }
-  return
+  return path10.resolve(__dirname2, "site-template");
 }
 function cleanup(clones) {
   for (const clone of clones) {
@@ -3931,26 +4013,26 @@ function killDevServer(child) {
 }

 // src/commands/generate.ts
-import * as
-import
+import * as p8 from "@clack/prompts";
+import path11 from "path";
 async function generateCommand(options) {
-
+  p8.intro("open-auto-doc \u2014 Regenerating documentation");
   const config = loadConfig();
   if (!config) {
-
+    p8.log.error("No .autodocrc.json found. Run `open-auto-doc init` first.");
     process.exit(1);
   }
   const token = getGithubToken();
   const apiKey = getAnthropicKey();
   if (!token) {
-
+    p8.log.error("Not authenticated. Run `open-auto-doc login` or set GITHUB_TOKEN env var.");
     process.exit(1);
   }
   if (!apiKey) {
-
+    p8.log.error("No Anthropic API key found. Run `open-auto-doc init` or set ANTHROPIC_API_KEY env var.");
     process.exit(1);
   }
-  const model = await
+  const model = await p8.select({
     message: "Which model should analyze your repos?",
     options: [
       { value: "claude-sonnet-4-6", label: "Claude Sonnet 4.6", hint: "Fast & capable (recommended)" },
@@ -3958,20 +4040,20 @@ async function generateCommand(options) {
       { value: "claude-opus-4-6", label: "Claude Opus 4.6", hint: "Most capable, slowest" }
     ]
   });
-  if (
-
+  if (p8.isCancel(model)) {
+    p8.cancel("Cancelled.");
     process.exit(0);
   }
-
+  p8.log.info(`Using ${model}`);
   const incremental = options.incremental && !options.force;
-  const cacheDir =
+  const cacheDir = path11.join(config.outputDir, ".autodoc-cache");
   const targetRepoName = options.repo;
   let reposToAnalyze = config.repos;
   const cachedResults = [];
   if (targetRepoName) {
     const targetRepo = config.repos.find((r) => r.name === targetRepoName);
     if (!targetRepo) {
-
+      p8.log.error(`Repo "${targetRepoName}" not found in config. Available: ${config.repos.map((r) => r.name).join(", ")}`);
       process.exit(1);
     }
     reposToAnalyze = [targetRepo];
@@ -3980,13 +4062,13 @@ async function generateCommand(options) {
       const cached = loadCache(cacheDir, repo.name);
       if (cached) {
         cachedResults.push(cached.result);
-
+        p8.log.info(`Using cached analysis for ${repo.name}`);
       } else {
-
+        p8.log.warn(`No cached analysis for ${repo.name} \u2014 its docs will be stale until it pushes`);
       }
     }
   }
-  const cloneSpinner =
+  const cloneSpinner = p8.spinner();
   cloneSpinner.start(`Cloning ${reposToAnalyze.length} ${reposToAnalyze.length === 1 ? "repository" : "repositories"}...`);
   const clones = [];
   for (const repo of reposToAnalyze) {
@@ -4005,12 +4087,12 @@ async function generateCommand(options) {
|
|
|
4005
4087
|
);
|
|
4006
4088
|
clones.push(cloned);
|
|
4007
4089
|
} catch (err) {
|
|
4008
|
-
|
|
4090
|
+
p8.log.warn(`Failed to clone ${repo.name}: ${err instanceof Error ? err.message : err}`);
|
|
4009
4091
|
}
|
|
4010
4092
|
}
|
|
4011
4093
|
cloneSpinner.stop(`Cloned ${clones.length}/${reposToAnalyze.length} ${reposToAnalyze.length === 1 ? "repository" : "repositories"}`);
|
|
4012
4094
|
if (clones.length === 0) {
|
|
4013
|
-
|
|
4095
|
+
p8.log.error("No repositories were cloned.");
|
|
4014
4096
|
process.exit(1);
|
|
4015
4097
|
}
|
|
4016
4098
|
const total = clones.length;
|
|
@@ -4085,7 +4167,7 @@ async function generateCommand(options) {
|
|
|
4085
4167
|
} catch (err) {
|
|
4086
4168
|
const errMsg = err instanceof Error ? err.message : String(err);
|
|
4087
4169
|
progressTable.update(repoName, { status: "failed", error: errMsg });
|
|
4088
|
-
|
|
4170
|
+
p8.log.warn(`[${repoName}] Analysis failed: ${errMsg}`);
|
|
4089
4171
|
return { repo: repoName, result: null };
|
|
4090
4172
|
}
|
|
4091
4173
|
});
|
|
@@ -4093,14 +4175,14 @@ async function generateCommand(options) {
|
|
|
4093
4175
|
progressTable.stop();
|
|
4094
4176
|
const freshResults = settled.filter((s) => s.result !== null).map((s) => s.result);
|
|
4095
4177
|
const { done: analyzedCount, failed: failedCount } = progressTable.getSummary();
|
|
4096
|
-
|
|
4178
|
+
p8.log.step(
|
|
4097
4179
|
`Analyzed ${analyzedCount}/${total} ${total === 1 ? "repository" : "repositories"}` + (failedCount > 0 ? ` (${failedCount} failed)` : "")
|
|
4098
4180
|
);
|
|
4099
4181
|
const results = [...freshResults, ...cachedResults];
|
|
4100
4182
|
if (results.length > 0) {
|
|
4101
4183
|
let crossRepo;
|
|
4102
4184
|
if (results.length > 1) {
|
|
4103
|
-
const crossSpinner =
|
|
4185
|
+
const crossSpinner = p8.spinner();
|
|
4104
4186
|
crossSpinner.start("Analyzing cross-repository relationships...");
|
|
4105
4187
|
try {
|
|
4106
4188
|
crossRepo = await analyzeCrossRepos(results, apiKey, model, (text4) => {
|
|
@@ -4109,66 +4191,66 @@ async function generateCommand(options) {
|
|
|
4109
4191
|
crossSpinner.stop(`Cross-repo analysis complete \u2014 ${crossRepo.repoRelationships.length} relationships found`);
|
|
4110
4192
|
} catch (err) {
|
|
4111
4193
|
crossSpinner.stop("Cross-repo analysis failed (non-fatal)");
|
|
4112
|
-
|
|
4194
|
+
p8.log.warn(`Cross-repo error: ${err instanceof Error ? err.message : err}`);
|
|
4113
4195
|
}
|
|
4114
4196
|
}
|
|
4115
|
-
const contentDir =
|
|
4197
|
+
const contentDir = path11.join(config.outputDir, "content", "docs");
|
|
4116
4198
|
await writeContent(contentDir, results, crossRepo, changelogs.size > 0 ? changelogs : void 0);
|
|
4117
4199
|
await writeMeta(contentDir, results, crossRepo, changelogs.size > 0 ? changelogs : void 0);
|
|
4118
4200
|
try {
|
|
4119
4201
|
await runBuildCheck({ docsDir: config.outputDir, apiKey, model });
|
|
4120
4202
|
} catch (err) {
|
|
4121
|
-
|
|
4203
|
+
p8.log.warn(`Build check skipped: ${err instanceof Error ? err.message : err}`);
|
|
4122
4204
|
}
|
|
4123
|
-
|
|
4205
|
+
p8.log.success("Documentation regenerated!");
|
|
4124
4206
|
}
|
|
4125
4207
|
for (const clone of clones) {
|
|
4126
4208
|
cleanupClone(clone);
|
|
4127
4209
|
}
|
|
4128
|
-
|
|
4210
|
+
p8.outro("Done!");
|
|
4129
4211
|
}
|
|
4130
4212
|
|
|
4131
4213
|
// src/commands/deploy.ts
|
|
4132
|
-
import * as
|
|
4133
|
-
import
|
|
4134
|
-
import
|
|
4214
|
+
import * as p9 from "@clack/prompts";
|
|
4215
|
+
import fs11 from "fs";
|
|
4216
|
+
import path12 from "path";
|
|
4135
4217
|
function resolveDocsDir(config, dirOption) {
|
|
4136
4218
|
if (dirOption) {
|
|
4137
|
-
const resolved =
|
|
4138
|
-
if (!
|
|
4139
|
-
|
|
4219
|
+
const resolved = path12.resolve(dirOption);
|
|
4220
|
+
if (!fs11.existsSync(resolved)) {
|
|
4221
|
+
p9.log.error(`Directory not found: ${resolved}`);
|
|
4140
4222
|
process.exit(1);
|
|
4141
4223
|
}
|
|
4142
4224
|
return resolved;
|
|
4143
4225
|
}
|
|
4144
|
-
if (config?.outputDir &&
|
|
4145
|
-
return
|
|
4226
|
+
if (config?.outputDir && fs11.existsSync(path12.resolve(config.outputDir))) {
|
|
4227
|
+
return path12.resolve(config.outputDir);
|
|
4146
4228
|
}
|
|
4147
|
-
if (
|
|
4148
|
-
return
|
|
4229
|
+
if (fs11.existsSync(path12.resolve("docs-site"))) {
|
|
4230
|
+
return path12.resolve("docs-site");
|
|
4149
4231
|
}
|
|
4150
|
-
|
|
4232
|
+
p9.log.error(
|
|
4151
4233
|
"Could not find docs site directory. Use --dir to specify the path, or run `open-auto-doc init` first."
|
|
4152
4234
|
);
|
|
4153
4235
|
process.exit(1);
|
|
4154
4236
|
}
|
|
4155
4237
|
async function deployCommand(options) {
|
|
4156
|
-
|
|
4238
|
+
p9.intro("open-auto-doc \u2014 Deploy docs to GitHub");
|
|
4157
4239
|
const token = getGithubToken();
|
|
4158
4240
|
if (!token) {
|
|
4159
|
-
|
|
4241
|
+
p9.log.error("Not authenticated. Run `open-auto-doc login` first.");
|
|
4160
4242
|
process.exit(1);
|
|
4161
4243
|
}
|
|
4162
4244
|
const config = loadConfig();
|
|
4163
4245
|
const docsDir = resolveDocsDir(config, options.dir);
|
|
4164
|
-
|
|
4246
|
+
p9.log.info(`Docs directory: ${docsDir}`);
|
|
4165
4247
|
if (config?.docsRepo) {
|
|
4166
|
-
|
|
4248
|
+
p9.log.info(`Docs repo already configured: ${config.docsRepo}`);
|
|
4167
4249
|
const pushed = await pushUpdates({ token, docsDir, docsRepo: config.docsRepo });
|
|
4168
4250
|
if (pushed) {
|
|
4169
|
-
|
|
4251
|
+
p9.outro("Docs updated! Vercel will auto-deploy from the push.");
|
|
4170
4252
|
} else {
|
|
4171
|
-
|
|
4253
|
+
p9.outro("Docs are up to date!");
|
|
4172
4254
|
}
|
|
4173
4255
|
return;
|
|
4174
4256
|
}
|
|
@@ -4178,20 +4260,20 @@ async function deployCommand(options) {
|
|
|
4178
4260
|
config: config || { repos: [], outputDir: docsDir }
|
|
4179
4261
|
});
|
|
4180
4262
|
if (!result) {
|
|
4181
|
-
|
|
4263
|
+
p9.cancel("Deploy cancelled.");
|
|
4182
4264
|
process.exit(0);
|
|
4183
4265
|
}
|
|
4184
4266
|
showVercelInstructions(result.owner, result.repoName);
|
|
4185
|
-
|
|
4267
|
+
p9.outro(`Docs repo: https://github.com/${result.owner}/${result.repoName}`);
|
|
4186
4268
|
}
|
|
4187
4269
|
|
|
4188
4270
|
// src/commands/setup-ci.ts
|
|
4189
|
-
import * as
|
|
4271
|
+
import * as p10 from "@clack/prompts";
|
|
4190
4272
|
async function setupCiCommand() {
|
|
4191
|
-
|
|
4273
|
+
p10.intro("open-auto-doc \u2014 CI/CD Setup");
|
|
4192
4274
|
const config = loadConfig();
|
|
4193
4275
|
if (!config?.docsRepo) {
|
|
4194
|
-
|
|
4276
|
+
p10.log.error(
|
|
4195
4277
|
"No docs repo configured. Run `open-auto-doc deploy` first to create a docs GitHub repo."
|
|
4196
4278
|
);
|
|
4197
4279
|
process.exit(1);
|
|
@@ -4199,12 +4281,12 @@ async function setupCiCommand() {
|
|
|
4199
4281
|
const token = getGithubToken();
|
|
4200
4282
|
const isMultiRepo = config.repos.length > 1;
|
|
4201
4283
|
if (isMultiRepo && !token) {
|
|
4202
|
-
|
|
4284
|
+
p10.log.error("Not authenticated. Run `open-auto-doc login` first (needed to push workflows to source repos).");
|
|
4203
4285
|
process.exit(1);
|
|
4204
4286
|
}
|
|
4205
4287
|
const gitRoot = getGitRoot();
|
|
4206
4288
|
if (!isMultiRepo && !gitRoot) {
|
|
4207
|
-
|
|
4289
|
+
p10.log.error("Not in a git repository. Run this command from your project root.");
|
|
4208
4290
|
process.exit(1);
|
|
4209
4291
|
}
|
|
4210
4292
|
const result = await createCiWorkflow({
|
|
@@ -4215,50 +4297,51 @@ async function setupCiCommand() {
|
|
|
4215
4297
|
config
|
|
4216
4298
|
});
|
|
4217
4299
|
if (!result) {
|
|
4218
|
-
|
|
4300
|
+
p10.cancel("Setup cancelled.");
|
|
4219
4301
|
process.exit(0);
|
|
4220
4302
|
}
|
|
4221
4303
|
if ("repos" in result) {
|
|
4222
|
-
|
|
4304
|
+
p10.outro("Per-repo CI workflows created! Add the required secrets to each source repo.");
|
|
4223
4305
|
} else {
|
|
4224
|
-
|
|
4306
|
+
p10.outro("CI/CD workflow is ready! Commit and push to activate.");
|
|
4225
4307
|
}
|
|
4226
4308
|
}
|
|
4227
4309
|
|
|
4228
4310
|
// src/commands/login.ts
|
|
4229
|
-
import * as
|
|
4311
|
+
import * as p11 from "@clack/prompts";
|
|
4230
4312
|
async function loginCommand() {
|
|
4231
|
-
|
|
4313
|
+
p11.intro("open-auto-doc \u2014 GitHub Login");
|
|
4232
4314
|
const existing = getGithubToken();
|
|
4233
4315
|
if (existing) {
|
|
4234
|
-
const overwrite = await
|
|
4316
|
+
const overwrite = await p11.confirm({
|
|
4235
4317
|
message: "You're already logged in. Re-authenticate?"
|
|
4236
4318
|
});
|
|
4237
|
-
if (!overwrite ||
|
|
4238
|
-
|
|
4319
|
+
if (!overwrite || p11.isCancel(overwrite)) {
|
|
4320
|
+
p11.cancel("Keeping existing credentials");
|
|
4239
4321
|
return;
|
|
4240
4322
|
}
|
|
4241
4323
|
}
|
|
4242
4324
|
const token = await authenticateWithGithub();
|
|
4243
4325
|
setGithubToken(token);
|
|
4244
|
-
|
|
4326
|
+
p11.outro("Logged in successfully!");
|
|
4245
4327
|
}
|
|
4246
4328
|
|
|
4247
4329
|
// src/commands/logout.ts
|
|
4248
|
-
import * as
|
|
4330
|
+
import * as p12 from "@clack/prompts";
|
|
4249
4331
|
async function logoutCommand() {
|
|
4250
|
-
|
|
4332
|
+
p12.intro("open-auto-doc \u2014 Logout");
|
|
4251
4333
|
clearAll();
|
|
4252
|
-
|
|
4334
|
+
p12.outro("Credentials cleared. You've been logged out.");
|
|
4253
4335
|
}
|
|
4254
4336
|
|
|
4255
4337
|
// src/index.ts
|
|
4256
4338
|
var program = new Command();
|
|
4257
|
-
program.name("open-auto-doc").description("Auto-generate beautiful documentation websites from GitHub repositories using AI").version("0.
|
|
4339
|
+
program.name("open-auto-doc").description("Auto-generate beautiful documentation websites from GitHub repositories using AI").version("0.5.0");
|
|
4258
4340
|
program.command("init", { isDefault: true }).description("Initialize and generate documentation for your repositories").option("-o, --output <dir>", "Output directory", "docs-site").action(initCommand);
|
|
4259
4341
|
program.command("generate").description("Regenerate documentation using existing configuration").option("--incremental", "Only re-analyze changed files (uses cached results)").option("--force", "Force full regeneration (ignore cache)").option("--repo <name>", "Only analyze this repo (uses cache for others)").action(generateCommand);
|
|
4260
4342
|
program.command("deploy").description("Create a GitHub repo for docs and push (connect to Vercel for auto-deploy)").option("-d, --dir <path>", "Docs site directory").action(deployCommand);
|
|
4261
4343
|
program.command("setup-ci").description("Generate a GitHub Actions workflow for auto-updating docs").action(setupCiCommand);
|
|
4344
|
+
program.command("setup-mcp").description("Set up MCP server so Claude Code can query your docs").action(setupMcpCommand);
|
|
4262
4345
|
program.command("login").description("Authenticate with GitHub").action(loginCommand);
|
|
4263
4346
|
program.command("logout").description("Clear stored credentials").action(logoutCommand);
|
|
4264
4347
|
program.parse();
|