mcp-docs-scraper 0.1.0 → 0.1.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +186 -149
- package/dist/server.d.ts.map +1 -1
- package/dist/server.js +55 -97
- package/dist/server.js.map +1 -1
- package/package.json +1 -1
package/README.md
CHANGED
@@ -1,34 +1,44 @@
 # MCP Docs Scraper
 
-
+**Give your AI coding agent instant, structured access to any library's documentation.**
 
-
+[](https://www.npmjs.com/package/mcp-docs-scraper)
+[](https://opensource.org/licenses/MIT)
+[](https://nodejs.org/)
 
-
-- **Smart web scraping fallback** - Crawls and cleans docs sites when no repo is available
-- **Auto-detection** - Automatically detects GitHub repos from documentation URLs
-- **Full-text search** - Search within cached documentation with snippets
-- **Local caching** - No duplicate fetches, works offline after initial index
+## The Problem
 
-
+When your agent needs documentation, it typically:
 
-
-
-
-
+1. **Searches the web** → Gets 10 links, picks one
+2. **Fetches the page** → Downloads HTML, parses it
+3. **Realizes it needs more** → Goes back, fetches another page
+4. **Repeats 3-5 times** → Each step is a tool call
 
-
-pnpm install
+That's **5-15 tool calls** just to answer "how do I validate emails with Zod?" Each call adds latency and burns tokens on navigation overhead.
 
-
-
+## The Solution
+
+MCP Docs Scraper **indexes documentation once** and gives your agent **direct, structured access**:
+
+```
+┌─────────────────┐      ┌──────────────────┐      ┌─────────────┐
+│   GitHub Repo   │─────▶│     MCP Docs     │─────▶│  AI Agent   │
+│  or Docs Site   │      │     Scraper      │      │             │
+└─────────────────┘      │  ┌────────────┐  │      │  1 search   │
+                         │  │   Local    │  │      │  1 fetch    │
+                         │  │   Cache    │  │      │  Done.      │
+                         │  └────────────┘  │      └─────────────┘
+                         └──────────────────┘
 ```
 
-
+**Result:** 2 tool calls instead of 10. Faster responses, lower costs, better answers.
+
+## Quick Start
 
-
+**1. Add to your MCP config:**
 
-**Claude Desktop** (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
+For **Claude Desktop** (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
 
 ```json
 {
@@ -41,7 +51,7 @@ pnpm build
 }
 ```
 
-**Cursor** (`.cursor/mcp.json`):
+For **Cursor** (`.cursor/mcp.json` in your project or global config):
 
 ```json
 {
@@ -54,67 +64,82 @@ pnpm build
 }
 ```
 
-
+**2. Restart your AI client**
 
-
+**3. Ask your agent something like:**
+
+> "Index the Zod documentation and show me how to create custom validators"
+
+The agent will automatically use the tools to index, search, and retrieve exactly what it needs.
+
+## Features
+
+| Feature                         | Why It Matters                                                 |
+| ------------------------------- | -------------------------------------------------------------- |
+| **GitHub-first fetching**       | Pulls clean markdown directly from repos—no HTML parsing noise |
+| **Smart web scraping fallback** | Works even when there's no GitHub repo available               |
+| **Auto-detection**              | Point it at `zod.dev`, it finds the GitHub repo automatically  |
+| **Full-text search**            | Agent finds the right section in one call, not five            |
+| **Local caching**               | Index once, use forever. Works offline after first fetch       |
+| **Structured tree navigation**  | Agent sees what's available without loading everything         |
+
+## Available Tools
+
+| Tool                 | Description                                              |
+| -------------------- | -------------------------------------------------------- |
+| `index_docs`         | Fetch and cache documentation from GitHub or any website |
+| `get_docs_tree`      | Browse the structure of cached docs                      |
+| `search_docs`        | Full-text search with snippets                           |
+| `get_docs_content`   | Retrieve specific files from cache                       |
+| `detect_github_repo` | Find GitHub repo from a docs website URL                 |
+| `list_cached_docs`   | List all cached documentation                            |
+| `clear_cache`        | Remove cached documentation                              |
+
+## How Agents Use This
+
+Here's a typical interaction when you ask "How do I use Zod's transform feature?":
 
-```json
-{
-  "mcpServers": {
-    "docs-scraper": {
-      "command": "node",
-      "args": ["/absolute/path/to/mcp-docs-scraper/dist/index.js"]
-    }
-  }
-}
 ```
+Agent: [Calls search_docs for "colinhacks_zod" with query "transform"]
+  → Gets: [{path: "README.md", snippet: "...transform method allows..."}]
 
-
+Agent: [Calls get_docs_content for "README.md"]
+  → Gets: Full markdown content of that section
 
-
+Agent: "Here's how to use Zod's transform feature: ..."
+```
+
+**2 tool calls. Done.** Compare that to the web search → browse → scroll → click → read → go back loop.
+
+### First-Time Indexing
+
+If docs aren't cached yet, the agent indexes them first:
 
-```
-
-
-
-      "command": "npx",
-      "args": ["-y", "mcp-docs-scraper"],
-      "env": {
-        "GITHUB_TOKEN": "ghp_your_token_here"
-      }
-    }
-  }
-}
+```
+Agent: [Calls index_docs for "https://github.com/colinhacks/zod"]
+  → Fetches all markdown files, caches locally
+  → Returns: {id: "colinhacks_zod", pages: 15, ...}
 ```
 
-
+This happens once. After that, all access is instant from local cache.
 
-
-| -------------------- | ------------------------------------------------ |
-| `ping`               | Health check - returns pong                      |
-| `index_docs`         | Fetch and cache documentation from GitHub or web |
-| `get_docs_tree`      | Get hierarchical structure of cached docs        |
-| `get_docs_content`   | Retrieve content of specific doc files           |
-| `search_docs`        | Full-text search within cached docs              |
-| `detect_github_repo` | Find GitHub repo from a docs website URL         |
-| `list_cached_docs`   | List all cached documentation                    |
-| `clear_cache`        | Remove cached documentation                      |
+## Tool Details
 
 ### `index_docs`
 
 Fetch and cache documentation from a GitHub repository or website.
 
 ```typescript
-// Index from GitHub (
+// Index from GitHub (recommended)
 index_docs({ url: "https://github.com/colinhacks/zod" });
 
-// Index from
+// Index from a docs site (auto-detects GitHub if possible)
 index_docs({ url: "https://zod.dev" });
 
-// Force web scraping
-index_docs({ url: "https://docs.example.com", type: "scrape"
+// Force web scraping when GitHub isn't available
+index_docs({ url: "https://docs.example.com", type: "scrape" });
 
-// Re-index
+// Re-index to get latest changes
 index_docs({ url: "https://github.com/owner/repo", force_refresh: true });
 ```
 
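The two-call flow the README describes can be sketched as plain functions over an in-memory cache. This is an illustrative mock, not the package's real implementation; the cache shape and the `mock*` helper names are assumptions.

```typescript
// Illustrative mock of the two-call flow: a cache maps file paths to
// markdown, search returns path + snippet, retrieval returns the file.
// These helpers are assumptions for illustration, not the real tool code.
type DocsCache = Record<string, string>;

function mockSearchDocs(cache: DocsCache, query: string): { path: string; snippet: string }[] {
  const hits: { path: string; snippet: string }[] = [];
  for (const [path, content] of Object.entries(cache)) {
    const i = content.toLowerCase().indexOf(query.toLowerCase());
    if (i >= 0) {
      // Snippet: a little context around the first match
      hits.push({ path, snippet: content.slice(Math.max(0, i - 20), i + query.length + 20) });
    }
  }
  return hits;
}

function mockGetDocsContent(cache: DocsCache, path: string): string | undefined {
  return cache[path];
}

const cache: DocsCache = {
  "README.md": "# Zod\n\nThe transform method allows reshaping parsed output.",
};
const hits = mockSearchDocs(cache, "transform");
const content = hits.length > 0 ? mockGetDocsContent(cache, hits[0].path) : undefined;
```

One call to locate the file, one to load it; everything else is a lookup in local state rather than a network round-trip.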
@@ -125,27 +150,10 @@ index_docs({ url: "https://github.com/owner/repo", force_refresh: true });
   "id": "colinhacks_zod",
   "source": "github",
   "repo": "colinhacks/zod",
-  "
-  "stats": {
-    "pages": 15,
-    "total_size_bytes": 245000,
-    "indexed_at": "2025-01-07T..."
-  }
+  "stats": { "pages": 15, "total_size_bytes": 245000 }
 }
 ```
 
-### `get_docs_tree`
-
-Get the file tree for cached documentation.
-
-```typescript
-// Full tree
-get_docs_tree({ docs_id: "colinhacks_zod" });
-
-// Subtree only
-get_docs_tree({ docs_id: "colinhacks_zod", path: "docs/", max_depth: 2 });
-```
-
 ### `search_docs`
 
 Full-text search within cached documentation.
 
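The `stats` object in the response above can be derived from the cached files. A minimal sketch, assuming the cache is a plain path-to-text map (the real types may differ):

```typescript
// Sketch: derive the "stats" fields shown above from a path -> markdown map.
// The cache representation here is an assumption for illustration.
function computeStats(files: Record<string, string>): { pages: number; total_size_bytes: number } {
  const pages = Object.keys(files).length;
  const total_size_bytes = Object.values(files).reduce(
    (sum, text) => sum + new TextEncoder().encode(text).length, // UTF-8 byte length
    0,
  );
  return { pages, total_size_bytes };
}

const stats = computeStats({ "README.md": "# Zod", "docs/guide.md": "## Guide" });
```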
@@ -162,8 +170,6 @@ search_docs({
 
 ```json
 {
-  "docs_id": "colinhacks_zod",
-  "query": "custom validation",
   "results": [
     {
       "path": "README.md",
@@ -177,7 +183,7 @@ search_docs({
 
 ### `get_docs_content`
 
-Retrieve actual content of specific files
+Retrieve actual content of specific files.
 
 ```typescript
 get_docs_content({
@@ -190,11 +196,9 @@ get_docs_content({
 
 ```json
 {
-  "docs_id": "colinhacks_zod",
   "contents": {
     "README.md": {
       "content": "# Zod\n\nTypeScript-first schema validation...",
-      "title": "Zod",
       "headings": ["# Zod", "## Installation", "## Basic Usage"],
       "size_bytes": 15234
     }
@@ -203,6 +207,18 @@ get_docs_content({
 }
 ```
 
+### `get_docs_tree`
+
+Get the file structure of cached documentation.
+
+```typescript
+// Full tree
+get_docs_tree({ docs_id: "colinhacks_zod" });
+
+// Subtree only
+get_docs_tree({ docs_id: "colinhacks_zod", path: "docs/", max_depth: 2 });
+```
+
 ### `detect_github_repo`
 
 Find GitHub repository from a documentation website URL.
 
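The `headings` array in the `get_docs_content` response above could come from a simple line scan of the markdown. A sketch, not the package's actual parser:

```typescript
// Sketch: collect ATX headings ("#" through "######") from markdown text.
// Assumed approach for illustration; the real extractor may be smarter
// (e.g., skipping headings inside fenced code blocks).
function extractHeadings(markdown: string): string[] {
  return markdown.split("\n").filter((line) => /^#{1,6}\s/.test(line));
}

const headings = extractHeadings("# Zod\n\nText\n\n## Installation\n\n## Basic Usage");
```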
@@ -222,67 +238,66 @@ detect_github_repo({ url: "https://zod.dev" });
 }
 ```
 
-### `list_cached_docs`
-
-List all documentation sets in the local cache.
+### `list_cached_docs` / `clear_cache`
 
 ```typescript
+// See what's cached
 list_cached_docs();
-```
-
-### `clear_cache`
 
-
-
-```typescript
-// Clear specific entry
+// Clear specific docs
 clear_cache({ docs_id: "colinhacks_zod" });
 
-// Clear
+// Clear everything
 clear_cache({ all: true });
 ```
 
-##
-
-Here's a typical workflow for an AI coding agent:
-
-1. **Find the docs source:**
+## Configuration
 
-
-detect_github_repo({ url: "https://zod.dev" })
-→ { found: true, repo: "colinhacks/zod" }
-```
+### With GitHub Token (Recommended for Heavy Use)
 
-
+The GitHub API allows 60 requests/hour without authentication. For higher limits (5,000/hour), add a token:
 
-
-
-
-
+```json
+{
+  "mcpServers": {
+    "docs-scraper": {
+      "command": "npx",
+      "args": ["-y", "mcp-docs-scraper"],
+      "env": {
+        "GITHUB_TOKEN": "ghp_your_token_here"
+      }
+    }
+  }
+}
+```
 
-
+### Local Installation
 
-
-get_docs_tree({ docs_id: "colinhacks_zod" })
-→ Returns hierarchical file tree
-```
+If you prefer to install locally instead of using npx:
 
-
+```bash
+git clone https://github.com/kwiscion/mcp-docs-scraper.git
+cd mcp-docs-scraper
+pnpm install
+pnpm build
+```
 
-
-search_docs({ docs_id: "colinhacks_zod", query: "transform" })
-→ Returns matching files with snippets
-```
+Then configure:
 
-
-
-
-
-
+```json
+{
+  "mcpServers": {
+    "docs-scraper": {
+      "command": "node",
+      "args": ["/absolute/path/to/mcp-docs-scraper/dist/index.js"]
+    }
+  }
+}
+```
 
 ## Cache Location
 
-Documentation is cached locally
+Documentation is cached locally:
 
 - **macOS/Linux:** `~/.mcp-docs-cache/`
 - **Windows:** `%USERPROFILE%\.mcp-docs-cache\`
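The `owner_repo` and `domain_path` folder names under the cache directory suggest how a `docs_id` is derived from the source URL. A hypothetical sketch; the package's actual derivation rules are not shown in this diff:

```typescript
// Hypothetical docs_id derivation matching the cache folder names
// (owner_repo for GitHub sources, domain_path otherwise). Illustrative only.
function deriveDocsId(url: string): string {
  const u = new URL(url);
  if (u.hostname === "github.com") {
    const [owner, repo] = u.pathname.split("/").filter(Boolean);
    return `${owner}_${repo}`;
  }
  // Websites: sanitize host + path into a filesystem-safe id
  return (u.hostname + u.pathname).replace(/[^A-Za-z0-9]+/g, "_").replace(/_+$/, "");
}
```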
@@ -295,43 +310,34 @@ Structure:
 │   └── owner_repo/
 │       ├── meta.json
 │       ├── search-index.json
-│       └── content
-│           └── *.md
+│       └── content/*.md
 └── scraped/
     └── domain_path/
         ├── meta.json
         ├── search-index.json
-        └── content
-            └── *.md
+        └── content/*.md
 ```
 
 ## Troubleshooting
 
 ### "GitHub API rate limit exceeded"
 
-
+Add a `GITHUB_TOKEN` environment variable (see Configuration above).
 
 ### "Documentation not found in cache"
 
-
+Run `index_docs` first to fetch and cache the documentation.
 
 ### "No content found" when scraping
 
-
-
-- The site blocks automated access
-- The URL doesn't contain documentation content
-- Try a more specific URL (e.g., `/docs` instead of homepage)
+The site may block automated access, or the URL doesn't contain documentation. Try:
 
-
+1. Use `detect_github_repo` to find a GitHub source instead
+2. Try a more specific URL (e.g., `/docs` instead of homepage)
 
 ### "Website blocked automated access"
 
-
-
-1. Use `detect_github_repo` to find a GitHub alternative
-2. Try a different starting URL
-3. Use the GitHub source if available
+Some sites block scraping. Use `detect_github_repo` to find a GitHub alternative.
 
 ## Development
 
@@ -339,19 +345,50 @@ Structure:
 # Install dependencies
 pnpm install
 
-#
+# Development mode (with hot reload)
+pnpm dev
+
+# Build for production
 pnpm build
 
-# Run the server
+# Run the built server
 pnpm start
+```
+
+### Project Structure
 
-# Development mode (with tsx)
-pnpm dev
 ```
+src/
+├── index.ts      # Entry point
+├── server.ts     # MCP server setup
+├── tools/        # Tool implementations
+├── services/     # Core logic (GitHub, scraper, cache)
+├── types/        # TypeScript types
+└── utils/        # Helpers
+```
+
+## Contributing
+
+Contributions are welcome!
+
+1. Fork the repository
+2. Create a feature branch (`git checkout -b feature/amazing-feature`)
+3. Make your changes
+4. Run `pnpm build` to ensure it compiles
+5. Test with a real MCP client (Claude Desktop or Cursor)
+6. Submit a PR
+
+See the [`plan/`](./plan/) directory for architecture decisions and implementation details.
+
+## Acknowledgments
+
+Built with:
 
--
--
+- [Model Context Protocol](https://modelcontextprotocol.io/) — AI tool interoperability standard
+- [MiniSearch](https://github.com/lucaong/minisearch) — Lightweight full-text search
+- [Cheerio](https://github.com/cheeriojs/cheerio) — Fast HTML parsing
+- [Turndown](https://github.com/mixmark-io/turndown) — HTML to Markdown conversion
 
 ## License
 
-MIT License
+MIT License — see [LICENSE](LICENSE) for details.
package/dist/server.d.ts.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"server.d.ts","sourceRoot":"","sources":["../src/server.ts"],"names":[],"mappings":"AAcA,MAAM,WAAW,iBAAiB;IAChC,GAAG,IAAI,OAAO,CAAC,IAAI,CAAC,CAAC;IACrB,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC,CAAC;CACxB;AAED,wBAAgB,YAAY,IAAI,iBAAiB,
+{"version":3,"file":"server.d.ts","sourceRoot":"","sources":["../src/server.ts"],"names":[],"mappings":"AAcA,MAAM,WAAW,iBAAiB;IAChC,GAAG,IAAI,OAAO,CAAC,IAAI,CAAC,CAAC;IACrB,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC,CAAC;CACxB;AAED,wBAAgB,YAAY,IAAI,iBAAiB,CAoOhD"}
package/dist/server.js
CHANGED
@@ -8,215 +8,173 @@ export function createServer() {
         name: "mcp-docs-scraper",
         version: "0.1.0",
     });
-    //
-
-
-        description: "Health check tool - returns pong",
-        inputSchema: {},
-    }, async () => {
-        return {
-            content: [
-                {
-                    type: "text",
-                    text: JSON.stringify({ message: "pong" }, null, 2),
-                },
-            ],
-        };
-    });
-    // Register list_cached_docs tool
+    // ===========================================================================
+    // list_cached_docs
+    // ===========================================================================
     server.registerTool("list_cached_docs", {
         title: "List Cached Docs",
-        description: "List all documentation sets
+        description: "List all cached documentation sets. Use to find docs_id values for other tools, or check if docs need indexing. Returns: id, source (github/scraped), repo or base_url, indexed_at, page_count, total_size_bytes.",
         inputSchema: {},
     }, async () => {
         const result = await listCachedDocs();
         return {
-            content: [
-                {
-                    type: "text",
-                    text: JSON.stringify(result, null, 2),
-                },
-            ],
+            content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
         };
     });
-    //
+    // ===========================================================================
+    // clear_cache
+    // ===========================================================================
     server.registerTool("clear_cache", {
         title: "Clear Cache",
-        description: "Remove cached documentation.
+        description: "Remove cached documentation. Use docs_id for specific entry, or all:true to clear everything. Returns cleared IDs and remaining count.",
         inputSchema: {
             docs_id: z
                 .string()
                 .optional()
-                .describe("Specific docs ID to clear (
-            all: z
-                .boolean()
-                .optional()
-                .describe("Clear all cached docs (default: false)"),
+                .describe("Specific docs ID to clear (e.g., 'colinhacks_zod')"),
+            all: z.boolean().optional().describe("Clear all cached docs"),
         },
     }, async ({ docs_id, all }) => {
         const result = await clearCache({ docs_id, all });
         return {
-            content: [
-                {
-                    type: "text",
-                    text: JSON.stringify(result, null, 2),
-                },
-            ],
+            content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
         };
     });
-    //
+    // ===========================================================================
+    // index_docs
+    // ===========================================================================
     server.registerTool("index_docs", {
         title: "Index Docs",
-        description: "Fetch and cache documentation from
+        description: "Fetch and cache documentation from GitHub or website. REQUIRED before search_docs/get_docs_content. Auto mode tries GitHub first (cleaner), falls back to scraping. Returns docs_id for subsequent operations.",
         inputSchema: {
             url: z
                 .string()
-                .describe("GitHub
+                .describe("GitHub repo URL (https://github.com/owner/repo) or docs website URL"),
             type: z
                 .enum(["github", "scrape", "auto"])
                 .optional()
-                .describe(
+                .describe("Source type (default: auto)"),
             force_refresh: z
                 .boolean()
                 .optional()
-                .describe("
+                .describe("Re-fetch even if cached"),
         },
     }, async ({ url, type, force_refresh }) => {
         try {
             const result = await indexDocs({ url, type, force_refresh });
             return {
-                content: [
-                    {
-                        type: "text",
-                        text: JSON.stringify(result, null, 2),
-                    },
-                ],
+                content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
             };
         }
         catch (error) {
             return createErrorResponse(error);
         }
     });
-    //
+    // ===========================================================================
+    // get_docs_tree
+    // ===========================================================================
     server.registerTool("get_docs_tree", {
         title: "Get Docs Tree",
-        description: "Get
+        description: "Get file/folder structure of cached docs. Use to discover file paths before get_docs_content. Optionally filter by path or limit depth.",
         inputSchema: {
             docs_id: z
                 .string()
-                .describe("
-            path: z
-
-                .optional()
-                .describe("Subtree path to filter (optional, default: root)"),
-            max_depth: z
-                .number()
-                .optional()
-                .describe("Maximum depth to return (optional, default: unlimited)"),
+                .describe("Docs ID from index_docs or list_cached_docs"),
+            path: z.string().optional().describe("Subtree path (e.g., 'docs/api')"),
+            max_depth: z.number().optional().describe("Max folder depth to return"),
         },
     }, async ({ docs_id, path, max_depth }) => {
         try {
             const result = await getDocsTree({ docs_id, path, max_depth });
             return {
-                content: [
-                    {
-                        type: "text",
-                        text: JSON.stringify(result, null, 2),
-                    },
-                ],
+                content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
             };
         }
         catch (error) {
             return createErrorResponse(error);
         }
     });
-    //
+    // ===========================================================================
+    // get_docs_content
+    // ===========================================================================
     server.registerTool("get_docs_content", {
         title: "Get Docs Content",
-        description: "Retrieve
+        description: "Retrieve markdown content of specific files. Use after search_docs to get full content of relevant files. Returns content, title, headings, and size for each path.",
         inputSchema: {
             docs_id: z
                 .string()
-                .describe("
+                .describe("Docs ID from index_docs or list_cached_docs"),
             paths: z
                 .array(z.string())
-                .describe("
+                .describe("File paths to retrieve (e.g., ['README.md', 'docs/guide.md'])"),
             format: z
                 .enum(["markdown", "raw"])
                 .optional()
-                .describe(
+                .describe("Output format (default: markdown)"),
         },
     }, async ({ docs_id, paths, format }) => {
         try {
             const result = await getDocsContent({ docs_id, paths, format });
             return {
-                content: [
-                    {
-                        type: "text",
-                        text: JSON.stringify(result, null, 2),
-                    },
-                ],
+                content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
             };
         }
         catch (error) {
             return createErrorResponse(error);
         }
     });
-    //
+    // ===========================================================================
+    // search_docs
+    // ===========================================================================
     server.registerTool("search_docs", {
         title: "Search Docs",
-        description: "Full-text search within cached
+        description: "Full-text search within cached docs. FASTEST way to find information—use before get_docs_content. Returns ranked results with file paths and snippets.",
         inputSchema: {
             docs_id: z
                 .string()
-                .describe("
-            query: z
+                .describe("Docs ID from index_docs or list_cached_docs"),
+            query: z
+                .string()
+                .describe("Search query—natural language works well (e.g., 'validate email')"),
             limit: z
                 .number()
                 .optional()
-                .describe("Max results
+                .describe("Max results (default: 10, max: 50)"),
         },
     }, async ({ docs_id, query, limit }) => {
         try {
             const result = await searchDocs({ docs_id, query, limit });
             return {
-                content: [
-                    {
-                        type: "text",
-                        text: JSON.stringify(result, null, 2),
-                    },
-                ],
+                content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
             };
         }
         catch (error) {
             return createErrorResponse(error);
         }
     });
-    //
+    // ===========================================================================
+    // detect_github_repo
+    // ===========================================================================
     server.registerTool("detect_github_repo", {
         title: "Detect GitHub Repo",
-        description: "Find GitHub repository from a
+        description: "Find GitHub repository from a docs website URL. Use before index_docs to check if cleaner GitHub source exists. Returns repo in 'owner/repo' format with confidence level.",
         inputSchema: {
-            url: z
-                .string()
-                .describe("Docs website URL to analyze (e.g., https://zod.dev)"),
+            url: z.string().describe("Docs website URL (e.g., 'https://zod.dev')"),
         },
     }, async ({ url }) => {
         try {
             const result = await detectGitHub({ url });
             return {
-                content: [
-                    {
-                        type: "text",
-                        text: JSON.stringify(result, null, 2),
-                    },
-                ],
+                content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
             };
         }
         catch (error) {
             return createErrorResponse(error);
         }
     });
+    // ===========================================================================
+    // Server Transport
+    // ===========================================================================
     const transport = new StdioServerTransport();
     return {
         async run() {
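The refactor above collapses each verbose `content` array into a single line. That shared pattern can be sketched in isolation, with the MCP SDK's response types mocked; only the `JSON.stringify` shape mirrors the diff, and `jsonResponse` is a hypothetical helper name, not a function in the package.

```typescript
// Mocked response types; the real ones come from the MCP SDK.
type TextContent = { type: "text"; text: string };
type ToolResponse = { content: TextContent[] };

// The one-line response shape the refactored handlers all share.
function jsonResponse(result: unknown): ToolResponse {
  return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
}

const res = jsonResponse({ message: "pong" });
```

Factoring every handler down to this shape is what accounts for most of the -97/+55 line change in `dist/server.js`.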
package/dist/server.js.map
CHANGED
@@ -1 +1 @@
-{"version":3,"file":"server.js","sourceRoot":"","sources":["../src/server.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,SAAS,EAAE,MAAM,yCAAyC,CAAC;AACpE,OAAO,EAAE,oBAAoB,EAAE,MAAM,2CAA2C,CAAC;AACjF,OAAO,EAAE,CAAC,EAAE,MAAM,KAAK,CAAC;AACxB,OAAO,EACL,cAAc,EACd,UAAU,EACV,SAAS,EACT,WAAW,EACX,cAAc,EACd,UAAU,EACV,YAAY,GACb,MAAM,kBAAkB,CAAC;AAC1B,OAAO,EAAE,mBAAmB,EAAE,MAAM,mBAAmB,CAAC;AAOxD,MAAM,UAAU,YAAY;IAC1B,MAAM,MAAM,GAAG,IAAI,SAAS,CAAC;QAC3B,IAAI,EAAE,kBAAkB;QACxB,OAAO,EAAE,OAAO;KACjB,CAAC,CAAC;IAEH,
+
{"version":3,"file":"server.js","sourceRoot":"","sources":["../src/server.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,SAAS,EAAE,MAAM,yCAAyC,CAAC;AACpE,OAAO,EAAE,oBAAoB,EAAE,MAAM,2CAA2C,CAAC;AACjF,OAAO,EAAE,CAAC,EAAE,MAAM,KAAK,CAAC;AACxB,OAAO,EACL,cAAc,EACd,UAAU,EACV,SAAS,EACT,WAAW,EACX,cAAc,EACd,UAAU,EACV,YAAY,GACb,MAAM,kBAAkB,CAAC;AAC1B,OAAO,EAAE,mBAAmB,EAAE,MAAM,mBAAmB,CAAC;AAOxD,MAAM,UAAU,YAAY;IAC1B,MAAM,MAAM,GAAG,IAAI,SAAS,CAAC;QAC3B,IAAI,EAAE,kBAAkB;QACxB,OAAO,EAAE,OAAO;KACjB,CAAC,CAAC;IAEH,8EAA8E;IAC9E,mBAAmB;IACnB,8EAA8E;IAC9E,MAAM,CAAC,YAAY,CACjB,kBAAkB,EAClB;QACE,KAAK,EAAE,kBAAkB;QACzB,WAAW,EACT,mNAAmN;QACrN,WAAW,EAAE,EAAE;KAChB,EACD,KAAK,IAAI,EAAE;QACT,MAAM,MAAM,GAAG,MAAM,cAAc,EAAE,CAAC;QACtC,OAAO;YACL,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;SACnE,CAAC;IACJ,CAAC,CACF,CAAC;IAEF,8EAA8E;IAC9E,cAAc;IACd,8EAA8E;IAC9E,MAAM,CAAC,YAAY,CACjB,aAAa,EACb;QACE,KAAK,EAAE,aAAa;QACpB,WAAW,EACT,wIAAwI;QAC1I,WAAW,EAAE;YACX,OAAO,EAAE,CAAC;iBACP,MAAM,EAAE;iBACR,QAAQ,EAAE;iBACV,QAAQ,CAAC,oDAAoD,CAAC;YACjE,GAAG,EAAE,CAAC,CAAC,OAAO,EAAE,CAAC,QAAQ,EAAE,CAAC,QAAQ,CAAC,uBAAuB,CAAC;SAC9D;KACF,EACD,KAAK,EAAE,EAAE,OAAO,EAAE,GAAG,EAAE,EAAE,EAAE;QACzB,MAAM,MAAM,GAAG,MAAM,UAAU,CAAC,EAAE,OAAO,EAAE,GAAG,EAAE,CAAC,CAAC;QAClD,OAAO;YACL,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;SACnE,CAAC;IACJ,CAAC,CACF,CAAC;IAEF,8EAA8E;IAC9E,aAAa;IACb,8EAA8E;IAC9E,MAAM,CAAC,YAAY,CACjB,YAAY,EACZ;QACE,KAAK,EAAE,YAAY;QACnB,WAAW,EACT,gNAAgN;QAClN,WAAW,EAAE;YACX,GAAG,EAAE,CAAC;iBACH,MAAM,EAAE;iBACR,QAAQ,CACP,qEAAqE,CACtE;YACH,IAAI,EAAE,CAAC;iBACJ,IAAI,CAAC,CAAC,QAAQ,EAAE,QAAQ,EAAE,MAAM,CAAC,CAAC;iBAClC,QAAQ,EAAE;iBACV,QAAQ,CAAC,6BAA6B,CAAC;YAC1C,aAAa,EAAE,CAAC;iBACb,OAAO,EAAE;iBACT,QAAQ,EAAE;iBACV,QAAQ,CAAC,yBAAyB,CAAC;SACvC;KACF,EACD,KAAK,EAAE,EAAE,GAAG,EAAE,IAAI,EAAE,aAAa,EAAE,EAAE,EAAE;QACrC,IAAI,CAAC;YACH,MAAM,MAAM,GAAG,MAAM,SAAS,CAAC,EAAE,GAAG,EAAE,IAAI,EAAE,aAAa,EAAE,CAAC,
CAAC;YAC7D,OAAO;gBACL,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;aACnE,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO,mBAAmB,CAAC,KAAK,CAAC,CAAC;QACpC,CAAC;IACH,CAAC,CACF,CAAC;IAEF,8EAA8E;IAC9E,gBAAgB;IAChB,8EAA8E;IAC9E,MAAM,CAAC,YAAY,CACjB,eAAe,EACf;QACE,KAAK,EAAE,eAAe;QACtB,WAAW,EACT,yIAAyI;QAC3I,WAAW,EAAE;YACX,OAAO,EAAE,CAAC;iBACP,MAAM,EAAE;iBACR,QAAQ,CAAC,6CAA6C,CAAC;YAC1D,IAAI,EAAE,CAAC,CAAC,MAAM,EAAE,CAAC,QAAQ,EAAE,CAAC,QAAQ,CAAC,iCAAiC,CAAC;YACvE,SAAS,EAAE,CAAC,CAAC,MAAM,EAAE,CAAC,QAAQ,EAAE,CAAC,QAAQ,CAAC,4BAA4B,CAAC;SACxE;KACF,EACD,KAAK,EAAE,EAAE,OAAO,EAAE,IAAI,EAAE,SAAS,EAAE,EAAE,EAAE;QACrC,IAAI,CAAC;YACH,MAAM,MAAM,GAAG,MAAM,WAAW,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,SAAS,EAAE,CAAC,CAAC;YAC/D,OAAO;gBACL,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;aACnE,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO,mBAAmB,CAAC,KAAK,CAAC,CAAC;QACpC,CAAC;IACH,CAAC,CACF,CAAC;IAEF,8EAA8E;IAC9E,mBAAmB;IACnB,8EAA8E;IAC9E,MAAM,CAAC,YAAY,CACjB,kBAAkB,EAClB;QACE,KAAK,EAAE,kBAAkB;QACzB,WAAW,EACT,qKAAqK;QACvK,WAAW,EAAE;YACX,OAAO,EAAE,CAAC;iBACP,MAAM,EAAE;iBACR,QAAQ,CAAC,6CAA6C,CAAC;YAC1D,KAAK,EAAE,CAAC;iBACL,KAAK,CAAC,CAAC,CAAC,MAAM,EAAE,CAAC;iBACjB,QAAQ,CACP,+DAA+D,CAChE;YACH,MAAM,EAAE,CAAC;iBACN,IAAI,CAAC,CAAC,UAAU,EAAE,KAAK,CAAC,CAAC;iBACzB,QAAQ,EAAE;iBACV,QAAQ,CAAC,mCAAmC,CAAC;SACjD;KACF,EACD,KAAK,EAAE,EAAE,OAAO,EAAE,KAAK,EAAE,MAAM,EAAE,EAAE,EAAE;QACnC,IAAI,CAAC;YACH,MAAM,MAAM,GAAG,MAAM,cAAc,CAAC,EAAE,OAAO,EAAE,KAAK,EAAE,MAAM,EAAE,CAAC,CAAC;YAChE,OAAO;gBACL,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;aACnE,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO,mBAAmB,CAAC,KAAK,CAAC,CAAC;QACpC,CAAC;IACH,CAAC,CACF,CAAC;IAEF,8EAA8E;IAC9E,cAAc;IACd,8EAA8E;IAC9E,MAAM,CAAC,YAAY,CACjB,aAAa,EACb;QACE,KAAK,EAAE,aAAa;QACpB,WAAW,EACT,wJAAwJ;QAC1J,WAAW,EAAE;YACX,OAAO,EAAE,CAAC;iBACP,MAAM,EAAE;iBACR,QAAQ,
CAAC,6CAA6C,CAAC;YAC1D,KAAK,EAAE,CAAC;iBACL,MAAM,EAAE;iBACR,QAAQ,CACP,mEAAmE,CACpE;YACH,KAAK,EAAE,CAAC;iBACL,MAAM,EAAE;iBACR,QAAQ,EAAE;iBACV,QAAQ,CAAC,oCAAoC,CAAC;SAClD;KACF,EACD,KAAK,EAAE,EAAE,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,EAAE,EAAE;QAClC,IAAI,CAAC;YACH,MAAM,MAAM,GAAG,MAAM,UAAU,CAAC,EAAE,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CAAC;YAC3D,OAAO;gBACL,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;aACnE,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO,mBAAmB,CAAC,KAAK,CAAC,CAAC;QACpC,CAAC;IACH,CAAC,CACF,CAAC;IAEF,8EAA8E;IAC9E,qBAAqB;IACrB,8EAA8E;IAC9E,MAAM,CAAC,YAAY,CACjB,oBAAoB,EACpB;QACE,KAAK,EAAE,oBAAoB;QAC3B,WAAW,EACT,4KAA4K;QAC9K,WAAW,EAAE;YACX,GAAG,EAAE,CAAC,CAAC,MAAM,EAAE,CAAC,QAAQ,CAAC,4CAA4C,CAAC;SACvE;KACF,EACD,KAAK,EAAE,EAAE,GAAG,EAAE,EAAE,EAAE;QAChB,IAAI,CAAC;YACH,MAAM,MAAM,GAAG,MAAM,YAAY,CAAC,EAAE,GAAG,EAAE,CAAC,CAAC;YAC3C,OAAO;gBACL,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;aACnE,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,OAAO,mBAAmB,CAAC,KAAK,CAAC,CAAC;QACpC,CAAC;IACH,CAAC,CACF,CAAC;IAEF,8EAA8E;IAC9E,mBAAmB;IACnB,8EAA8E;IAC9E,MAAM,SAAS,GAAG,IAAI,oBAAoB,EAAE,CAAC;IAE7C,OAAO;QACL,KAAK,CAAC,GAAG;YACP,MAAM,MAAM,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC;YAChC,OAAO,CAAC,KAAK,CAAC,0CAA0C,CAAC,CAAC;QAC5D,CAAC;QAED,KAAK,CAAC,KAAK;YACT,MAAM,MAAM,CAAC,KAAK,EAAE,CAAC;QACvB,CAAC;KACF,CAAC;AACJ,CAAC"}