@arabold/docs-mcp-server 1.12.1 → 1.12.3
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +150 -161
- package/db/migrations/001-add-indexed-at-column.sql +1 -1
- package/dist/{DocumentManagementService-_qCZ1Hi2.js → DocumentManagementService-BupnR1eC.js} +3 -3
- package/dist/{DocumentManagementService-_qCZ1Hi2.js.map → DocumentManagementService-BupnR1eC.js.map} +1 -1
- package/dist/{EmbeddingFactory-BJMbJvje.js → EmbeddingFactory-DZKXkqOe.js} +2 -2
- package/dist/{EmbeddingFactory-BJMbJvje.js.map → EmbeddingFactory-DZKXkqOe.js.map} +1 -1
- package/dist/{FindVersionTool-CH1c3Tyu.js → FindVersionTool-BcnLvjlo.js} +2 -2
- package/dist/{FindVersionTool-CH1c3Tyu.js.map → FindVersionTool-BcnLvjlo.js.map} +1 -1
- package/dist/{RemoveTool-DmB1YJTA.js → RemoveTool-Bqpr8F9m.js} +2 -2
- package/dist/{RemoveTool-DmB1YJTA.js.map → RemoveTool-Bqpr8F9m.js.map} +1 -1
- package/dist/cli.js +3 -3
- package/dist/server.js +3 -3
- package/dist/web.js +2 -2
- package/package.json +1 -1
package/README.md
CHANGED
@@ -1,12 +1,14 @@
-# Docs MCP Server:
+# Docs MCP Server: Your AI's Up-to-Date Documentation Expert
 
 AI coding assistants often struggle with outdated documentation, leading to incorrect suggestions or hallucinated code examples. Verifying AI responses against specific library versions can be time-consuming and inefficient.
 
-The **Docs MCP Server**
+The **Docs MCP Server** solves this by acting as a personal, always-current knowledge base for your AI assistant. Its primary purpose is to **index 3rd party documentation** – the libraries you actually use in your codebase. It scrapes websites, GitHub repositories, package managers (npm, PyPI), and even local files, cataloging the docs locally. It then provides powerful search tools via the Model Context Protocol (MCP) to your coding agent.
+
+This enables your LLM agent to access the **latest official documentation** for any library you add, dramatically improving the quality and reliability of generated code and integration details.
 
 By grounding AI responses in accurate, version-aware context, the Docs MCP Server enables you to receive concise and relevant integration details and code snippets, improving the reliability and efficiency of LLM-assisted development.
 
-It's **free**, **open-source**, runs **locally** for privacy, and integrates seamlessly
+It's **free**, **open-source**, runs **locally** for privacy, and integrates seamlessly into your development workflow.
 
 ## Why Use the Docs MCP Server?
 
@@ -39,17 +41,101 @@ LLM-assisted coding promises speed and efficiency, but often falls short due to:
 - **Simple Deployment:** Easy setup via Docker or `npx`.
 - **Seamless Integration:** Works with MCP-compatible clients (like Claude, Cline, Roo).
 
-##
+## How to Run the Docs MCP Server
+
+Get up and running quickly! We recommend using Docker Desktop (Docker Compose) for the easiest setup and management.
+
+- [Recommended: Docker Desktop](#recommended-docker-desktop)
+- [Alternative: Using Docker](#alternative-using-docker)
+- [Alternative: Using npx](#alternative-using-npx)
+
+## Recommended: Docker Desktop
+
+This method provides a persistent local setup by running the server and web interface using Docker Compose. It requires cloning the repository but simplifies managing both services together.
+
+1. **Ensure Docker and Docker Compose are installed and running.**
+2. **Clone the repository:**
+   ```bash
+   git clone https://github.com/arabold/docs-mcp-server.git
+   cd docs-mcp-server
+   ```
+3. **Set up your environment:**
+   Copy the example environment file and edit it to add your OpenAI API key (required):
+
+   ```bash
+   cp .env.example .env
+   # Edit the .env file and set your OpenAI API key:
+   ```
+
+   Example `.env`:
+
+   ```
+   OPENAI_API_KEY=your-api-key-here
+   ```
+
+   For additional configuration options (e.g., other providers, advanced settings), see the [Configuration](#configuration) section.
+
+4. **Launch the services:**
+   Run this command from the repository's root directory. It will build the images (if necessary) and start the server and web interface in the background.
+
+   ```bash
+   docker compose up -d
+   ```
 
-
+   - `-d`: Runs the containers in detached mode (in the background). Omit this to see logs directly in your terminal.
 
-
-- [Option 2: Using npx](#option-2-using-npx)
-- [Option 3: Using Docker Compose](#option-3-using-docker-compose)
+   **Note:** If you pull updates for the repository (e.g., using `git pull`), you'll need to rebuild the Docker images to include the changes by running `docker compose up -d --build`.
 
-
+5. **Configure your MCP client:**
+   Add the following configuration block to your MCP settings file (e.g., for Claude, Cline, Roo):
 
-
+   ```json
+   {
+     "mcpServers": {
+       "docs-mcp-server": {
+         "url": "http://localhost:6280/sse", // Connects via HTTP to the Docker Compose service
+         "disabled": false,
+         "autoApprove": []
+       }
+     }
+   }
+   ```
+
+   Restart your AI assistant application after updating the configuration.
+
+6. **Access the Web Interface:**
+   The web interface will be available at `http://localhost:6281`.
+
+   **Benefits of this method:**
+
+   - Runs both the server and web UI with a single command.
+   - Uses the local source code (rebuilds automatically if code changes and you run `docker compose up --build`).
+   - Persistent data storage via the `docs-mcp-data` Docker volume.
+   - Easy configuration management via the `.env` file.
+
+To stop the services, run `docker compose down` from the repository directory.
+
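Step 5 edits the client's settings file by hand. As a scripted alternative, here is a minimal Python sketch; the `mcp_settings.json` filename is hypothetical (each MCP client keeps its settings in its own location), and strict JSON cannot carry the `//` comment shown in the snippet, so the sketch writes plain JSON:

```python
import json
from pathlib import Path

# Hypothetical settings path; adjust to wherever your MCP client stores its config.
settings_path = Path("mcp_settings.json")

# Load any existing settings (or start fresh) and merge in the server entry.
settings = json.loads(settings_path.read_text()) if settings_path.exists() else {}
servers = settings.setdefault("mcpServers", {})
servers["docs-mcp-server"] = {
    "url": "http://localhost:6280/sse",  # SSE endpoint exposed by the Docker Compose service
    "disabled": False,
    "autoApprove": [],
}
settings_path.write_text(json.dumps(settings, indent=2))
```

Using `setdefault` preserves any other servers already registered in the file.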
+### Adding Library Documentation
+
+
+
+Once the Docs MCP Server is running, you can use the Web Interface to **add new documentation** to be indexed or **search existing documentation**.
+
+1. **Open the Web Interface:** If you used the recommended Docker Compose setup, navigate your browser to `http://localhost:6281`.
+2. **Find the "Queue New Scrape Job" Form:** This is usually prominently displayed on the main page.
+3. **Enter the Details:**
+   - **URL:** Provide the starting URL for the documentation you want to index (e.g., `https://react.dev/reference/react`, `https://github.com/expressjs/express`, `https://docs.python.org/3/`).
+   - **Library Name:** Give it a short, memorable name (e.g., `react`, `express`, `python`). This is how you'll refer to it in searches.
+   - **Version (Optional):** If you want to index a specific version, enter it here (e.g., `18.2.0`, `4.17.1`, `3.11`). If left blank, the server often tries to detect the latest version or indexes it as unversioned.
+   - **(Optional) Advanced Settings:** Adjust `Scope` (e.g., 'Subpages', 'Hostname', 'Domain'), `Max Pages`, `Max Depth`, and `Follow Redirects` if needed. Defaults are usually sufficient.
+4. **Click "Queue Job":** The server will start a background job to fetch, process, and index the documentation. You can monitor its progress in the "Job Queue" section of the Web UI.
+5. **Repeat:** Repeat steps 3-4 for every library whose documentation you want the server to manage.
+
+**That's it!** Once a job completes successfully, the documentation for that library and version becomes available for searching through your connected AI coding assistant (using the `search_docs` tool) or directly in the Web UI by clicking on the library name in the "Indexed Documentation" section.
+
+## Alternative: Using Docker
+
+This approach is easy, straightforward, and doesn't require cloning the repository.
 
 1. **Ensure Docker is installed and running.**
 2. **Configure your MCP settings:**
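The scraper's exact behavior is internal to the server, but the interaction of the `Max Pages` and `Max Depth` settings in the scrape-job form can be illustrated as a budgeted breadth-first crawl. A conceptual sketch only (the real crawler also handles scope, redirects, and content extraction):

```python
from collections import deque

def crawl(start: str, links: dict[str, list[str]], max_pages: int, max_depth: int) -> list[str]:
    """Breadth-first crawl that honors page and depth budgets."""
    seen = {start}
    order = []
    queue = deque([(start, 0)])
    while queue and len(order) < max_pages:
        url, depth = queue.popleft()
        order.append(url)
        if depth == max_depth:
            continue  # don't follow links past the depth budget
        for nxt in links.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return order

# Toy link graph standing in for a docs site.
site = {
    "/": ["/a", "/b"],
    "/a": ["/a/1", "/a/2"],
    "/b": ["/b/1"],
}
print(crawl("/", site, max_pages=4, max_depth=1))  # → ['/', '/a', '/b']
```

With `max_depth=1`, links on the second-level pages are never followed, so only three pages are visited even though the page budget allows four.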
@@ -148,7 +234,53 @@ docker run -i --rm \
   ghcr.io/arabold/docs-mcp-server:latest
 ```
 
-###
+### Launching Web Interface
+
+You can access a web-based GUI at `http://localhost:6281` to manage and search library documentation through your browser.
+
+If you're running the server with Docker, use Docker for the web interface as well:
+
+```bash
+docker run --rm \
+  -e OPENAI_API_KEY="your-openai-api-key-here" \
+  -v docs-mcp-data:/data \
+  -p 6281:6281 \
+  ghcr.io/arabold/docs-mcp-server:latest \
+  docs-web
+```
+
+Make sure to:
+
+- Use the same volume name (`docs-mcp-data` in this example) as your server
+- Map port 6281 with `-p 6281:6281`
+- Pass any configuration environment variables with `-e` flags
+
+### Using the CLI
+
+You can use the CLI to manage documentation directly via Docker.
+
+```bash
+docker run --rm \
+  -e OPENAI_API_KEY="your-openai-api-key-here" \
+  -v docs-mcp-data:/data \
+  ghcr.io/arabold/docs-mcp-server:latest \
+  docs-cli <command> [options]
+```
+
+Make sure to use the same volume name (`docs-mcp-data` in this example) as you did for the server. Any of the configuration environment variables (see [Configuration](#configuration) above) can be passed using `-e` flags, just like with the server.
+
+The main commands available are:
+
+- `scrape`: Scrapes and indexes documentation from a URL.
+- `search`: Searches the indexed documentation.
+- `list`: Lists all indexed libraries.
+- `remove`: Removes indexed documentation.
+- `fetch-url`: Fetches a single URL and converts it to Markdown.
+- `find-version`: Finds the best matching version for a library.
+
+See the [CLI Command Reference](#cli-command-reference) below for detailed command usage.
+
+## Alternative: Using npx
 
 This approach is useful when you need local file access (e.g., indexing documentation from your local file system). While this can also be achieved by mounting paths into a Docker container, using `npx` is simpler but requires a Node.js installation.
 
@@ -178,87 +310,7 @@ This approach is useful when you need local file access (e.g., indexing document
 
 3. **That's it!** The server will now be available to your AI assistant.
 
-###
-
-This method provides a persistent local setup by running the server and web interface using Docker Compose. It requires cloning the repository but simplifies managing both services together.
-
-1. **Ensure Docker and Docker Compose are installed and running.**
-2. **Clone the repository:**
-   ```bash
-   git clone https://github.com/arabold/docs-mcp-server.git
-   cd docs-mcp-server
-   ```
-3. **Set up your environment:**
-   Copy the example environment file and **edit it** to add your necessary API keys (e.g., `OPENAI_API_KEY`).
-   ```bash
-   cp .env.example .env
-   # Now, edit the .env file with your editor
-   ```
-   Refer to the [Configuration](#configuration) section for details on available environment variables.
-4. **Launch the services:**
-   Run this command from the repository's root directory. It will build the images (if necessary) and start the server and web interface in the background.
-
-   ```bash
-   docker compose up -d
-   ```
-
-   - `-d`: Runs the containers in detached mode (in the background). Omit this to see logs directly in your terminal.
-
-   **Note:** If you pull updates for the repository (e.g., using `git pull`), you'll need to rebuild the Docker images to include the changes by running `docker compose up -d --build`.
-
-5. **Configure your MCP client:**
-   Add the following configuration block to your MCP settings file (e.g., for Claude, Cline, Roo):
-
-   ```json
-   {
-     "mcpServers": {
-       "docs-mcp-server": {
-         "url": "http://localhost:6280/sse", // Connects via HTTP to the Docker Compose service
-         "disabled": false,
-         "autoApprove": []
-       }
-     }
-   }
-   ```
-
-   Restart your AI assistant application after updating the configuration.
-
-6. **Access the Web Interface:**
-   The web interface will be available at `http://localhost:6281`.
-
-   **Benefits of this method:**
-
-   - Runs both the server and web UI with a single command.
-   - Uses the local source code (rebuilds automatically if code changes and you run `docker compose up --build`).
-   - Persistent data storage via the `docs-mcp-data` Docker volume.
-   - Easy configuration management via the `.env` file.
-
-To stop the services, run `docker compose down` from the repository directory.
-
-## Using the Web Interface
-
-You can access a web-based GUI at `http://localhost:6281` to manage and search library documentation through your browser. **Important: Use the same method (Docker or npx) for both the server and web interface to ensure access to the same indexed documentation.**
-
-### Using Docker Web Interface
-
-If you're running the server with Docker, use Docker for the web interface as well:
-
-```bash
-docker run --rm \
-  -e OPENAI_API_KEY="your-openai-api-key-here" \
-  -v docs-mcp-data:/data \
-  -p 3000:3000 \
-  ghcr.io/arabold/docs-mcp-server:latest \
-  docs-web
-```
-
-Make sure to:
-
-- Use the same volume name (`docs-mcp-data` in this example) as your server
-- Map port 6281 with `-p 6281:3000`
-- Pass any configuration environment variables with `-e` flags
-
-### Using `npx Web Interface
+### Launching Web Interface
 
 If you're running the server with `npx`, use `npx` for the web interface as well:
 
@@ -270,27 +322,7 @@ You can specify a different port using the `--port` flag.
 
 The `npx` approach will use the default data directory on your system (typically in your home directory), ensuring consistency between server and web interface.
 
-
-
-You can use the CLI to manage documentation directly, either via Docker or npx. **Important: Use the same method (Docker or npx) for both the server and CLI to ensure access to the same indexed documentation.**
-
-Here's how to invoke the CLI:
-
-### Using Docker CLI
-
-If you're running the server with Docker, use Docker for the CLI as well:
-
-```bash
-docker run --rm \
-  -e OPENAI_API_KEY="your-openai-api-key-here" \
-  -v docs-mcp-data:/data \
-  ghcr.io/arabold/docs-mcp-server:latest \
-  docs-cli <command> [options]
-```
-
-Make sure to use the same volume name (`docs-mcp-data` in this example) as you did for the server. Any of the configuration environment variables (see [Configuration](#configuration) above) can be passed using `-e` flags, just like with the server.
-
-### Using `npx CLI
+### Using the CLI
 
 If you're running the server with npx, use `npx` for the CLI as well:
 
@@ -300,20 +332,11 @@ npx -y --package=@arabold/docs-mcp-server docs-cli <command> [options]
 
 The `npx` approach will use the default data directory on your system (typically in your home directory), ensuring consistency between server and CLI.
 
-The main commands available are:
-
-- `scrape`: Scrapes and indexes documentation from a URL.
-- `search`: Searches the indexed documentation.
-- `list`: Lists all indexed libraries.
-- `remove`: Removes indexed documentation.
-- `fetch-url`: Fetches a single URL and converts to Markdown.
-- `find-version`: Finds the best matching version for a library.
-
 See the [CLI Command Reference](#cli-command-reference) below for detailed command usage.
 
 ## Configuration
 
-The following environment variables are supported to configure the embedding model behavior
+The following environment variables are supported to configure the embedding model behavior. Specify them in your `.env` file or pass them as `-e` flags when running the server via Docker or npx.
 
 ### Embedding Model Configuration
 
@@ -351,13 +374,11 @@ The database schema uses a fixed dimension of 1536 for embedding vectors. Only m
 
 For OpenAI-compatible APIs (like Ollama), use the `openai` provider with `OPENAI_API_BASE` pointing to your endpoint.
 
-
-
-## Development & Advanced Setup
+## Development
 
 This section covers running the server/CLI directly from the source code for development purposes. The primary usage method is now via the public Docker image as described in "Method 2".
 
-### Running from Source
+### Running from Source
 
 This provides an isolated environment and exposes the server via HTTP endpoints.
 
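As a concrete illustration of the Ollama case mentioned above, an `.env` along these lines should work; the values here are assumptions (Ollama's OpenAI-compatible API conventionally lives at port 11434 under `/v1`), and the model must still satisfy the fixed 1536-dimension schema noted earlier:

```
# Hypothetical .env for an OpenAI-compatible endpoint such as Ollama
OPENAI_API_KEY=dummy-key                  # local endpoints typically ignore the key, but the client expects one
OPENAI_API_BASE=http://localhost:11434/v1 # Ollama's OpenAI-compatible API
# DOCS_MCP_EMBEDDING_MODEL=<an embedding model served by your endpoint>
```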
@@ -378,46 +399,14 @@ This method is useful for contributing to the project or running un-published ve
    npm run build
    ```
 4. **Setup Environment:**
-   Create and configure your `.env` file as described in
+   Create and configure your `.env` file as described in the [Configuration](#configuration) section. This is crucial for providing the `OPENAI_API_KEY`.
 
 5. **Run:**
    - **Server (Development Mode):** `npm run dev:server` (builds, watches, and restarts)
    - **Server (Production Mode):** `npm run start` (runs pre-built code)
    - **CLI:** `npm run cli -- <command> [options]` or `node dist/cli.js <command> [options]`
 
-###
-
-**Note:** This `.env` file setup is primarily needed when running the server from source or using the Docker method. When using the `npx` integration method, the `OPENAI_API_KEY` is set directly in the MCP configuration file.
-
-1. Create a `.env` file based on `.env.example`:
-   ```bash
-   cp .env.example .env
-   ```
-2. Update your OpenAI API key in `.env`:
-
-   ```
-   # Required: Your OpenAI API key for generating embeddings.
-   OPENAI_API_KEY=your-api-key-here
-
-   # Optional: Your OpenAI Organization ID (handled automatically by LangChain if set)
-   OPENAI_ORG_ID=
-
-   # Optional: Custom base URL for OpenAI API (e.g., for Azure OpenAI or compatible APIs)
-   OPENAI_API_BASE=
-
-   # Optional: Embedding model name (defaults to "text-embedding-3-small")
-   # Examples: text-embedding-3-large, text-embedding-ada-002
-   DOCS_MCP_EMBEDDING_MODEL=
-
-   # Optional: Specify a custom directory to store the SQLite database file (documents.db).
-   # If set, this path takes precedence over the default locations.
-   # Default behavior (if unset):
-   # 1. Uses './.store/' in the project root if it exists (legacy).
-   # 2. Falls back to OS-specific data directory (e.g., ~/Library/Application Support/docs-mcp-server on macOS).
-   # DOCS_MCP_STORE_PATH=/path/to/your/desired/storage/directory
-   ```
-
-### Testing (from Source)
+### Testing
 
 Since MCP servers communicate over stdio when run directly via Node.js, debugging can be challenging. We recommend using the [MCP Inspector](https://github.com/modelcontextprotocol/inspector), which is available as a package script after building:
 
package/db/migrations/001-add-indexed-at-column.sql
CHANGED
@@ -1,6 +1,6 @@
 -- Add indexed_at column to track when documents were last indexed
 -- Step 1: Add the column allowing NULLs (SQLite limitation workaround)
-ALTER TABLE documents ADD COLUMN indexed_at DATETIME;
+ALTER TABLE documents ADD COLUMN IF NOT EXISTS indexed_at DATETIME;
 
 -- Step 2: Update existing rows to set the timestamp
 UPDATE documents SET indexed_at = CURRENT_TIMESTAMP WHERE indexed_at IS NULL;
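The migration above makes the `ALTER TABLE` idempotent so re-running it against an already-migrated database does not fail. Support for guards like this varies by SQLite build; a portable way to express the same "add column only if missing" check is to consult `pragma_table_info` first. A minimal sketch using Python's stdlib `sqlite3` (table and column names taken from the migration; the helper itself is hypothetical):

```python
import sqlite3

def add_column_if_missing(db: sqlite3.Connection, table: str, column: str, decl: str) -> bool:
    """Add a column only if it is not already present; returns True if it was added."""
    existing = {row[1] for row in db.execute(f"PRAGMA table_info({table})")}  # row[1] is the column name
    if column in existing:
        return False
    db.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")
    return True

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, content TEXT)")

# First run adds the column; a second run is a no-op instead of an error.
assert add_column_if_missing(db, "documents", "indexed_at", "DATETIME") is True
assert add_column_if_missing(db, "documents", "indexed_at", "DATETIME") is False

# Mirrors step 2 of the migration.
db.execute("UPDATE documents SET indexed_at = CURRENT_TIMESTAMP WHERE indexed_at IS NULL")
```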
package/dist/{DocumentManagementService-_qCZ1Hi2.js → DocumentManagementService-BupnR1eC.js}
RENAMED
@@ -2758,7 +2758,7 @@ class DocumentStore {
    */
   async initializeEmbeddings() {
     const modelSpec = process.env.DOCS_MCP_EMBEDDING_MODEL || "text-embedding-3-small";
-    const { createEmbeddingModel } = await import("./EmbeddingFactory-BJMbJvje.js");
+    const { createEmbeddingModel } = await import("./EmbeddingFactory-DZKXkqOe.js");
     this.embeddings = createEmbeddingModel(modelSpec);
     const testVector = await this.embeddings.embedQuery("test");
     this.modelDimension = testVector.length;
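Aside from the renamed chunk import, this hunk leaves the dimension probe intact: the store embeds a short test string and records the vector length, which can then be checked against the schema's fixed 1536-wide embeddings column. A stdlib-only sketch of the same idea (the stub embedder and helper names are hypothetical stand-ins for the real model):

```python
VECTOR_DIMENSION = 1536  # fixed width of the embeddings column in the schema

class StubEmbedder:
    """Stands in for a real embedding model; returns fixed-width vectors."""
    def __init__(self, dim: int) -> None:
        self.dim = dim

    def embed_query(self, text: str) -> list[float]:
        return [0.0] * self.dim

def probe_dimension(embeddings) -> int:
    # Mirrors initializeEmbeddings(): embed a test string and record its length.
    return len(embeddings.embed_query("test"))

# A 1536-wide model matches the schema; a narrower one can be rejected up front,
# before any rows are written with the wrong vector width.
assert probe_dimension(StubEmbedder(1536)) == VECTOR_DIMENSION
assert probe_dimension(StubEmbedder(768)) != VECTOR_DIMENSION
```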
@@ -2779,8 +2779,8 @@ class DocumentStore {
    */
   async initialize() {
     try {
-      applyMigrations(this.db);
       sqliteVec.load(this.db);
+      applyMigrations(this.db);
       this.prepareStatements();
       await this.initializeEmbeddings();
     } catch (error) {
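This hunk swaps the order of `applyMigrations` and `sqliteVec.load`, so the sqlite-vec extension is loaded before migrations run — presumably because a migration may reference functionality the extension provides. Why that ordering matters can be demonstrated with a stdlib sketch that simulates the extension with a custom SQL function (all names here are hypothetical, not the package's actual code):

```python
import sqlite3

def load_extension(db: sqlite3.Connection) -> None:
    # Simulates sqliteVec.load(): registers SQL callables that migrations rely on.
    db.create_function("vec_version", 0, lambda: "v0.1-sim")

def apply_migrations(db: sqlite3.Connection) -> None:
    # A migration that calls into the "extension".
    db.execute("CREATE TABLE meta AS SELECT vec_version() AS vec")

db = sqlite3.connect(":memory:")

# Migrating first fails: vec_version() is not registered yet.
try:
    apply_migrations(db)
except sqlite3.OperationalError as err:
    assert "no such function" in str(err)

# Loading the extension first, then migrating, succeeds.
load_extension(db)
apply_migrations(db)
```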
@@ -3406,4 +3406,4 @@ export {
   DimensionError as v,
   VECTOR_DIMENSION as w
 };
-//# sourceMappingURL=DocumentManagementService-_qCZ1Hi2.js.map
+//# sourceMappingURL=DocumentManagementService-BupnR1eC.js.map