wikigen 1.0.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (42)
  1. wikigen-1.0.0/LICENSE +21 -0
  2. wikigen-1.0.0/PKG-INFO +352 -0
  3. wikigen-1.0.0/README.md +317 -0
  4. wikigen-1.0.0/pyproject.toml +45 -0
  5. wikigen-1.0.0/setup.cfg +4 -0
  6. wikigen-1.0.0/setup.py +50 -0
  7. wikigen-1.0.0/wikigen/__init__.py +7 -0
  8. wikigen-1.0.0/wikigen/cli.py +690 -0
  9. wikigen-1.0.0/wikigen/config.py +526 -0
  10. wikigen-1.0.0/wikigen/defaults.py +78 -0
  11. wikigen-1.0.0/wikigen/flows/__init__.py +1 -0
  12. wikigen-1.0.0/wikigen/flows/flow.py +38 -0
  13. wikigen-1.0.0/wikigen/formatter/help_formatter.py +194 -0
  14. wikigen-1.0.0/wikigen/formatter/init_formatter.py +56 -0
  15. wikigen-1.0.0/wikigen/formatter/output_formatter.py +290 -0
  16. wikigen-1.0.0/wikigen/mcp/__init__.py +12 -0
  17. wikigen-1.0.0/wikigen/mcp/chunking.py +127 -0
  18. wikigen-1.0.0/wikigen/mcp/embeddings.py +69 -0
  19. wikigen-1.0.0/wikigen/mcp/output_resources.py +65 -0
  20. wikigen-1.0.0/wikigen/mcp/search_index.py +826 -0
  21. wikigen-1.0.0/wikigen/mcp/server.py +232 -0
  22. wikigen-1.0.0/wikigen/mcp/vector_index.py +297 -0
  23. wikigen-1.0.0/wikigen/metadata/__init__.py +35 -0
  24. wikigen-1.0.0/wikigen/metadata/logo.py +28 -0
  25. wikigen-1.0.0/wikigen/metadata/project.py +28 -0
  26. wikigen-1.0.0/wikigen/metadata/version.py +17 -0
  27. wikigen-1.0.0/wikigen/nodes/__init__.py +1 -0
  28. wikigen-1.0.0/wikigen/nodes/nodes.py +1080 -0
  29. wikigen-1.0.0/wikigen/utils/__init__.py +0 -0
  30. wikigen-1.0.0/wikigen/utils/adjust_headings.py +72 -0
  31. wikigen-1.0.0/wikigen/utils/call_llm.py +271 -0
  32. wikigen-1.0.0/wikigen/utils/crawl_github_files.py +450 -0
  33. wikigen-1.0.0/wikigen/utils/crawl_local_files.py +151 -0
  34. wikigen-1.0.0/wikigen/utils/llm_providers.py +101 -0
  35. wikigen-1.0.0/wikigen/utils/version_check.py +84 -0
  36. wikigen-1.0.0/wikigen.egg-info/PKG-INFO +352 -0
  37. wikigen-1.0.0/wikigen.egg-info/SOURCES.txt +40 -0
  38. wikigen-1.0.0/wikigen.egg-info/dependency_links.txt +1 -0
  39. wikigen-1.0.0/wikigen.egg-info/entry_points.txt +2 -0
  40. wikigen-1.0.0/wikigen.egg-info/not-zip-safe +1 -0
  41. wikigen-1.0.0/wikigen.egg-info/requires.txt +11 -0
  42. wikigen-1.0.0/wikigen.egg-info/top_level.txt +1 -0
wikigen-1.0.0/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2025 Mithun Ramesh
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
wikigen-1.0.0/PKG-INFO ADDED
@@ -0,0 +1,352 @@
+ Metadata-Version: 2.4
+ Name: wikigen
+ Version: 1.0.0
+ Summary: Wikis for nerds, by nerds
+ Author: Mithun Ramesh
+ License-Expression: MIT
+ Project-URL: Homepage, https://github.com/usesalt/wikigen
+ Project-URL: Repository, https://github.com/usesalt/wikigen
+ Project-URL: Issues, https://github.com/usesalt/wikigen/issues
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: Intended Audience :: Developers
+ Classifier: Operating System :: OS Independent
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: Programming Language :: Python :: 3.14
+ Classifier: Topic :: Software Development :: Documentation
+ Classifier: Topic :: Text Processing :: Markup
+ Requires-Python: >=3.12
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: pocketflow>=0.0.1
+ Requires-Dist: pyyaml>=6.0
+ Requires-Dist: requests>=2.28.0
+ Requires-Dist: gitpython>=3.1.0
+ Requires-Dist: google-genai>=1.9.0
+ Requires-Dist: pathspec>=0.11.0
+ Requires-Dist: keyring>=24.0.0
+ Requires-Dist: mcp>=1.19.0
+ Requires-Dist: faiss-cpu>=1.7.4
+ Requires-Dist: sentence-transformers>=2.2.0
+ Requires-Dist: numpy>=1.24.0
+ Dynamic: license-file
+ Dynamic: requires-python
+
+ ![WikiGen](assets/wikigen.png)
+
+ ## WIKIGEN
+
+ [![PyPI](https://img.shields.io/badge/pypi-v1.0.0-blue)](https://pypi.org/project/wikigen/) [![Python](https://img.shields.io/badge/python-3.12+-blue)](https://www.python.org/) [![Downloads](https://img.shields.io/badge/downloads-3k+-brightgreen)](https://pypi.org/project/wikigen/) [![License](https://img.shields.io/badge/license-MIT-green)](LICENSE) [![GitHub](https://img.shields.io/badge/github-usesalt%2Fwikigen-red)](https://github.com/usesalt/wikigen)
+
+ **WikiGen** (previously named "salt-docs") is a compact, human-readable documentation generator for codebases that minimizes tokens and makes structure easy for models to follow.
+ It's intended as **LLM input**: a drop-in, lossless representation of your existing codebase.
+
+ ## Installation
+
+ ### Option 1: Install from PyPI
+ ```bash
+ pip install wikigen
+ ```
+
+ ### Option 2: Install from source
+ ```bash
+ git clone https://github.com/usesalt/wikigen.git
+ cd wikigen
+ pip install -e .
+ ```
+
+ ## Quick Start
+
+ ### 1. Initial Setup
+ Run the setup wizard to configure your API keys and preferences:
+
+ ```bash
+ wikigen init
+ ```
+
+ ### 2. Generate Documentation
+
+ #### Analyze a GitHub repository
+ ```bash
+ wikigen run https://github.com/username/repo
+ ```
+
+ #### Analyze a local directory
+ ```bash
+ wikigen run /path/to/your/codebase
+ ```
+
+ #### With custom options
+ ```bash
+ wikigen run https://github.com/username/repo --output /custom/path --language spanish --max-abstractions 10
+ ```
+
+ ## Configuration
+
+ WikiGen stores configuration in a per-user config file and uses your system's keyring for secure API key storage.
+
+ - macOS/Linux: `~/.config/wikigen/config.json` (or `$XDG_CONFIG_HOME/wikigen/config.json`)
+ - Windows: `%APPDATA%\wikigen\config.json`
+
+ ### Configuration Options
+ - `llm_provider`: LLM provider to use (`gemini`, `openai`, `anthropic`, `openrouter`, `ollama`) - default: `gemini`
+ - `llm_model`: Model name to use (e.g., "gemini-2.5-flash", "gpt-4o-mini", "claude-3-5-sonnet-20241022") - default: `gemini-2.5-flash`
+ - `output_dir`: Default output directory
+ - `language`: Default language for generated docs
+ - `max_abstractions`: Default number of abstractions to identify
+ - `max_file_size`: Maximum file size in bytes
+ - `use_cache`: Enable/disable LLM response caching
+ - `include_patterns`: Default file patterns to include
+ - `exclude_patterns`: Default file patterns to exclude
+ - `ollama_base_url`: Custom Ollama base URL (optional, default: `http://localhost:11434`)
+
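The per-platform config lookup described above can be sketched as follows (a minimal illustration of the documented paths, not taken from wikigen's source):

```python
import os
import sys
from pathlib import Path


def config_path() -> Path:
    r"""Resolve the per-user config location described above (illustrative).

    Windows:     %APPDATA%\wikigen\config.json
    macOS/Linux: $XDG_CONFIG_HOME/wikigen/config.json if set,
                 otherwise ~/.config/wikigen/config.json
    """
    if sys.platform == "win32":
        base = Path(os.environ.get("APPDATA", str(Path.home() / "AppData" / "Roaming")))
    else:
        base = Path(os.environ.get("XDG_CONFIG_HOME") or str(Path.home() / ".config"))
    return base / "wikigen" / "config.json"
```

Anything else in `config.json` (the option names above) is read from that file, with API keys kept in the keyring rather than on disk.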
+ ### Managing Configuration
+
+ #### View Current Configuration
+ ```bash
+ wikigen config show
+ ```
+
+ #### Update API Keys
+ ```bash
+ # Update the API key for any provider (interactive)
+ wikigen config update-api-key gemini
+ wikigen config update-api-key openai
+ wikigen config update-api-key anthropic
+ wikigen config update-api-key openrouter
+
+ # Legacy command (still works; redirects to update-api-key)
+ wikigen config update-gemini-key
+
+ # Update the GitHub token (interactive)
+ wikigen config update-github-token
+
+ # Update the GitHub token directly
+ wikigen config update-github-token "your-token-here"
+ ```
+
+ #### Update Other Settings
+ ```bash
+ # Change the LLM provider
+ wikigen config set llm-provider openai
+
+ # Change the LLM model
+ wikigen config set llm-model gpt-4o-mini
+
+ # Change the default language
+ wikigen config set language spanish
+
+ # Change the maximum number of abstractions
+ wikigen config set max_abstractions 15
+
+ # Disable caching
+ wikigen config set use_cache false
+
+ # Update the output directory
+ wikigen config set output_dir /custom/path
+ ```
+
+ ---
+ ## CI/CD Integration
+
+ WikiGen can automatically generate and update documentation in your CI/CD pipeline, keeping docs in sync with code changes.
+
+ ### Quick Setup for GitHub Actions
+
+ 1. **Add a workflow file** at `.github/workflows/wikigen.yml`:
+
+ ```yaml
+ name: WikiGen
+
+ on:
+   push:
+     branches: [main]
+
+ jobs:
+   docs:
+     runs-on: ubuntu-latest
+     permissions:
+       contents: write
+       pull-requests: write
+     steps:
+       - uses: actions/checkout@v4
+       - uses: actions/setup-python@v5
+         with:
+           python-version: '3.12'
+       - run: pip install wikigen
+       - run: wikigen run . --ci --output-path docs/
+         env:
+           GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
+       - uses: peter-evans/create-pull-request@v6
+         with:
+           commit-message: 'docs: updated documentation for new changes'
+           branch: wikigen/update-${{ github.run_number }}
+           title: 'Update Documentation'
+ ```
+
+ 2. **Add your LLM API key** to GitHub Secrets:
+    - Go to **Settings** → **Secrets and variables** → **Actions**
+    - Add `GEMINI_API_KEY` (or `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.)
+
+ 3. **Push to main** - documentation is generated automatically and a PR is opened with the changes.
+
+ ### CI-Specific Flags
+
+ - `--ci` - Enable CI mode (non-interactive, better error messages)
+ - `--output-path <path>` - Custom output directory (e.g., `docs/`, `documentation/`)
+ - `--update` - Merge with existing docs instead of overwriting
+ - `--check-changes` - Exit with code 1 if docs changed, 0 if unchanged
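For example, `--check-changes` can gate a build on documentation drift. A hypothetical step sketch (job and checkout boilerplate as in the workflow above):

```yaml
# Hypothetical gating step: the job fails (exit code 1, per the flag
# semantics above) when regenerated docs differ from what's committed.
- run: pip install wikigen
- run: wikigen run . --ci --output-path docs/ --check-changes
  env:
    GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
```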
+
+ ### Learn More
+
+ See the complete [CI/CD Integration Guide](docs/ci-cd-integration.md) for:
+ - Advanced configuration options
+ - Multiple LLM provider setup
+ - Troubleshooting tips
+ - Best practices
+ - Future integrations (Confluence, Notion, etc.)
+
+ ---
+ ## MCP Server Setup
+
+ WikiGen includes an MCP (Model Context Protocol) server that exposes your generated documentation to AI assistants in IDEs like Cursor, Continue.dev, and Claude Desktop.
+
+ ### MCP Tools Available
+
+ The MCP server provides these tools:
+ - `list_docs` - List all available documentation files
+ - `get_docs` - Fetch the full content of a documentation file (by resource name or absolute path)
+ - `search_docs` - Full-text search across documentation (paths, names, and resource names)
+ - `index_directories` - Index directories for fast searching
+
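Conceptually, `search_docs` matches a query against paths, file names, and resource names. A toy stand-in for that behavior (illustrative only; wikigen's actual index lives in `wikigen/mcp/search_index.py` and its dependency list suggests it also supports embedding-based search):

```python
def search_docs(query: str, docs: list[dict]) -> list[str]:
    """Case-insensitive substring match over path, name, and resource name."""
    q = query.lower()
    return [
        d["resource"]
        for d in docs
        if q in " ".join((d["path"], d["name"], d["resource"])).lower()
    ]
```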
+ ### Setup Instructions
+
+ #### Cursor
+
+ 1. Open or create your MCP configuration file:
+    - **macOS/Linux**: `~/.cursor/mcp.json`
+    - **Windows**: `%APPDATA%\Cursor\mcp.json`
+
+ 2. Add the wikigen server configuration:
+
+ ```json
+ {
+   "mcpServers": {
+     "wikigen": {
+       "command": "wikigen",
+       "args": ["mcp"]
+     }
+   }
+ }
+ ```
+
+ 3. Restart Cursor to load the MCP server.
+
+ 4. The AI assistant in Cursor can now access your documentation with prompts like:
+    - "What documentation do we have?"
+    - "Get me the documentation for the 'WIKIGEN' project"
+    - "Read the README documentation"
+
+ #### Claude Desktop
+
+ 1. Open or create your Claude configuration file:
+    - **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
+    - **Windows**: `%APPDATA%\Claude\claude_desktop_config.json`
+    - **Linux**: `~/.config/Claude/claude_desktop_config.json`
+
+ 2. Add the wikigen server configuration:
+
+ ```json
+ {
+   "mcpServers": {
+     "wikigen": {
+       "command": "wikigen",
+       "args": ["mcp"]
+     }
+   }
+ }
+ ```
+
+ 3. Restart Claude Desktop to load the MCP server.
+
+ #### Troubleshooting
+
+ - **Command not found**: Make sure `wikigen` is on your PATH. Verify by running `wikigen --version` in your terminal.
+ - **Server not starting**: Ensure you've run `wikigen init` and have generated at least one documentation project.
+ - **No docs found**: The MCP server discovers docs from your configured `output_dir`. Run `wikigen config show` to check your output directory.
+
+ ### Testing the MCP Server
+
+ You can test the MCP server directly:
+
+ ```bash
+ wikigen mcp
+ ```
+
+ This starts the server in stdio mode (for MCP clients). To test locally, you can use the test scripts in the `tests/` directory.
+
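In stdio mode, an MCP client talks to the server in JSON-RPC, starting with an `initialize` request. The sketch below only builds that first message; the field names follow the MCP specification, while the values (client name, protocol date) are illustrative and not wikigen-specific:

```python
import json


def initialize_request(client_name: str = "test-client") -> str:
    """Build the JSON-RPC 'initialize' request an MCP client sends first."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",   # an MCP spec revision date
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.0.0"},
        },
    })
```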
+ ## LLM Provider Support
+
+ WikiGen supports multiple LLM providers, allowing you to choose the best option for your needs.
+
+ ### Supported Providers
+
+ 1. **Google Gemini** (default)
+    - Recommended models: gemini-2.5-pro, gemini-2.5-flash, gemini-1.5-pro, gemini-1.5-flash
+    - API key required: Yes (`GEMINI_API_KEY`)
+
+ 2. **OpenAI**
+    - Recommended models: gpt-4o-mini, gpt-4.1-mini, gpt-5-mini, gpt-5-nano
+    - API key required: Yes (`OPENAI_API_KEY`)
+    - Supports o1 models with reasoning capabilities
+
+ 3. **Anthropic Claude**
+    - Recommended models: claude-3-5-sonnet, claude-3-5-haiku, claude-3-7-sonnet (with extended thinking), claude-3-opus
+    - API key required: Yes (`ANTHROPIC_API_KEY`)
+
+ 4. **OpenRouter**
+    - Recommended models: google/gemini-2.5-flash:free, meta-llama/llama-3.1-8b-instruct:free, openai/gpt-4o-mini, anthropic/claude-3.5-sonnet
+    - API key required: Yes (`OPENROUTER_API_KEY`)
+    - Access multiple models through a single API
+
+ 5. **Ollama (local)**
+    - Recommended models: llama3.2, llama3.1, mistral, codellama, phi3
+    - API key required: No (runs locally)
+    - Default URL: `http://localhost:11434`
+    - Well suited to privacy-sensitive projects and offline use
+
+ ### Switching Providers
+
+ You can switch between providers at any time:
+
+ ```bash
+ # Switch to OpenAI
+ wikigen config set llm-provider openai
+ wikigen config set llm-model gpt-4o-mini
+ wikigen config update-api-key openai
+
+ # Switch to Ollama (local)
+ wikigen config set llm-provider ollama
+ wikigen config set llm-model llama3.2
+ # No API key needed for Ollama
+ ```
+
+ ## CLI Options
+
+ ### Required
+ - `run` - GitHub repo URL, the current directory, or a local directory path
+ - `--repo` or `--dir` - GitHub repo URL or local directory path (deprecated)
+
+ ### Optional
+ - `-n, --name` - Project name (derived from the repo/directory if omitted)
+ - `-t, --token` - GitHub personal access token
+ - `-o, --output` - Output directory (overrides the config default)
+ - `-i, --include` - File patterns to include (e.g., "*.py", "*.js")
+ - `-e, --exclude` - File patterns to exclude (e.g., "tests/*", "docs/*")
+ - `-s, --max-size` - Maximum file size in bytes (default: 100KB)
+ - `--language` - Language for generated docs (default: "english")
+ - `--no-cache` - Disable LLM response caching
+ - `--max-abstractions` - Maximum number of abstractions to identify (default: 10)
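The include/exclude patterns above behave like shell globs. A minimal sketch of that filtering logic (wikigen declares a `pathspec` dependency for pattern matching; this stand-in uses only the standard library and is not wikigen's code):

```python
from fnmatch import fnmatch


def keep(path: str, include: list[str], exclude: list[str]) -> bool:
    """Keep a file if it matches an include pattern (or none are given)
    and matches no exclude pattern."""
    included = any(fnmatch(path, p) for p in include) if include else True
    excluded = any(fnmatch(path, p) for p in exclude)
    return included and not excluded
```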