@pleaseai/context-please-mcp 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +24 -0
- package/README.md +688 -0
- package/dist/config.d.ts +55 -0
- package/dist/config.d.ts.map +1 -0
- package/dist/config.js +172 -0
- package/dist/config.js.map +1 -0
- package/dist/embedding.d.ts +5 -0
- package/dist/embedding.d.ts.map +1 -0
- package/dist/embedding.js +77 -0
- package/dist/embedding.js.map +1 -0
- package/dist/handlers.d.ts +74 -0
- package/dist/handlers.d.ts.map +1 -0
- package/dist/handlers.js +686 -0
- package/dist/handlers.js.map +1 -0
- package/dist/index.d.ts +3 -0
- package/dist/index.d.ts.map +1 -0
- package/dist/index.js +280 -0
- package/dist/index.js.map +1 -0
- package/dist/snapshot.d.ts +95 -0
- package/dist/snapshot.d.ts.map +1 -0
- package/dist/snapshot.js +439 -0
- package/dist/snapshot.js.map +1 -0
- package/dist/sync.d.ts +11 -0
- package/dist/sync.d.ts.map +1 -0
- package/dist/sync.js +123 -0
- package/dist/sync.js.map +1 -0
- package/dist/utils.d.ts +10 -0
- package/dist/utils.d.ts.map +1 -0
- package/dist/utils.js +27 -0
- package/dist/utils.js.map +1 -0
- package/package.json +42 -0
package/LICENSE
ADDED
@@ -0,0 +1,24 @@
MIT License

Copyright (c) 2025 PleaseAI

This project is a fork of claude-context (https://github.com/zilliztech/claude-context)
Original Copyright (c) 2025 Zilliz

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md
ADDED
@@ -0,0 +1,688 @@
# @pleaseai/context-please-mcp

Model Context Protocol (MCP) integration for Context Please - a powerful MCP server that enables AI assistants and agents to index and search codebases using semantic search.

> **Note:** This is a fork of [@zilliz/claude-context-mcp](https://www.npmjs.com/package/@zilliz/claude-context-mcp) by Zilliz, maintained by PleaseAI.

[](https://www.npmjs.com/package/@pleaseai/context-please-mcp)
[](https://www.npmjs.com/package/@pleaseai/context-please-mcp)

> 📖 **New to Context Please?** Check out the [main project README](../../README.md) for an overview and setup instructions.

## 🚀 Use Claude Context as an MCP Server in Claude Code and Others

Model Context Protocol (MCP) allows you to integrate Claude Context with your favorite AI coding assistants, e.g., Claude Code.

## Quick Start

### Prerequisites

Before using the MCP server, make sure you have:

- An API key for your chosen embedding provider (OpenAI, VoyageAI, or Gemini), or a local Ollama setup
- A Milvus vector database (local or cloud)

> 💡 **Setup Help:** See the [main project setup guide](../../README.md#-quick-start) for detailed installation instructions.

### Prepare Environment Variables

#### Embedding Provider Configuration

Claude Context MCP supports multiple embedding providers. Choose the one that best fits your needs:

> 📋 **Quick Reference**: For a complete list of environment variables and their descriptions, see the [Environment Variables Guide](../../docs/getting-started/environment-variables.md).

```bash
# Supported providers: OpenAI, VoyageAI, Gemini, Ollama
EMBEDDING_PROVIDER=OpenAI
```

<details>
<summary><strong>1. OpenAI Configuration (Default)</strong></summary>

OpenAI provides high-quality embeddings with excellent performance for code understanding.

```bash
# Required: Your OpenAI API key
OPENAI_API_KEY=sk-your-openai-api-key

# Optional: Specify embedding model (default: text-embedding-3-small)
EMBEDDING_MODEL=text-embedding-3-small

# Optional: Custom API base URL (for Azure OpenAI or other compatible services)
OPENAI_BASE_URL=https://api.openai.com/v1
```

**Available Models:**
See `getSupportedModels` in [`openai-embedding.ts`](https://github.com/zilliztech/claude-context/blob/master/packages/core/src/embedding/openai-embedding.ts) for the full list of supported models.

**Getting an API Key:**

1. Visit the [OpenAI Platform](https://platform.openai.com/api-keys)
2. Sign in or create an account
3. Generate a new API key
4. Set up billing if needed

</details>

<details>
<summary><strong>2. VoyageAI Configuration</strong></summary>

VoyageAI offers specialized code embeddings optimized for programming languages.

```bash
# Required: Your VoyageAI API key
VOYAGEAI_API_KEY=pa-your-voyageai-api-key

# Optional: Specify embedding model (default: voyage-code-3)
EMBEDDING_MODEL=voyage-code-3
```

**Available Models:**
See `getSupportedModels` in [`voyageai-embedding.ts`](https://github.com/zilliztech/claude-context/blob/master/packages/core/src/embedding/voyageai-embedding.ts) for the full list of supported models.

**Getting an API Key:**

1. Visit the [VoyageAI Console](https://dash.voyageai.com/)
2. Sign up for an account
3. Navigate to the API Keys section
4. Create a new API key

</details>

<details>
<summary><strong>3. Gemini Configuration</strong></summary>

Google's Gemini provides competitive embeddings with good multilingual support.

```bash
# Required: Your Gemini API key
GEMINI_API_KEY=your-gemini-api-key

# Optional: Specify embedding model (default: gemini-embedding-001)
EMBEDDING_MODEL=gemini-embedding-001

# Optional: Custom API base URL (for custom endpoints)
GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta
```

**Available Models:**
See `getSupportedModels` in [`gemini-embedding.ts`](https://github.com/zilliztech/claude-context/blob/master/packages/core/src/embedding/gemini-embedding.ts) for the full list of supported models.

**Getting an API Key:**

1. Visit [Google AI Studio](https://aistudio.google.com/)
2. Sign in with your Google account
3. Go to the "Get API key" section
4. Create a new API key

</details>

<details>
<summary><strong>4. Ollama Configuration (Local/Self-hosted)</strong></summary>

Ollama allows you to run embeddings locally without sending data to external services.

```bash
# Required: Specify which Ollama model to use
EMBEDDING_MODEL=nomic-embed-text

# Optional: Specify Ollama host (default: http://127.0.0.1:11434)
OLLAMA_HOST=http://127.0.0.1:11434
```

**Setup Instructions:**

1. Install Ollama from [ollama.ai](https://ollama.ai/)
2. Pull the embedding model:

```bash
ollama pull nomic-embed-text
```

3. Ensure Ollama is running:

```bash
ollama serve
```

</details>

#### Get a free vector database on Zilliz Cloud

Claude Context needs a vector database. You can [sign up](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) on Zilliz Cloud to get an API key.

Copy your Personal Key to replace `your-zilliz-cloud-api-key` in the configuration examples.

```bash
MILVUS_TOKEN=your-zilliz-cloud-api-key
```

#### Embedding Batch Size

You can set the embedding batch size to optimize the performance of the MCP server, depending on your embedding model's throughput. The default value is 100.

```bash
EMBEDDING_BATCH_SIZE=512
```

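To make the effect of this setting concrete, here is an illustrative batching sketch (not the package's actual code; the variable names below are hypothetical): chunks are grouped into slices of at most `EMBEDDING_BATCH_SIZE` items, and each slice becomes one embedding API request.

```python
import os

def batches(items, size):
    """Yield consecutive slices of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Hypothetical usage: fall back to the documented default of 100 when unset.
batch_size = int(os.environ.get("EMBEDDING_BATCH_SIZE", "100"))
chunks = [f"chunk-{i}" for i in range(250)]
groups = list(batches(chunks, batch_size))
# With the default of 100, 250 chunks yield batches of 100, 100, and 50,
# i.e. three embedding requests instead of 250.
```

A larger batch size means fewer requests per indexing run, at the cost of larger payloads per request.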
#### Custom File Processing (Optional)

You can configure custom file extensions and ignore patterns globally via environment variables:

```bash
# Additional file extensions to include beyond the defaults
CUSTOM_EXTENSIONS=.vue,.svelte,.astro,.twig

# Additional ignore patterns to exclude files/directories
CUSTOM_IGNORE_PATTERNS=temp/**,*.backup,private/**,uploads/**
```

These settings work in combination with the tool parameters: patterns from both sources are merged.

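The merge semantics can be sketched roughly as follows (illustrative only, not the package's actual implementation; `merge_settings` and `normalize_ext` are hypothetical names): environment values and tool parameters are combined into one de-duplicated set, and extensions get a dot prefix if it is missing.

```python
def normalize_ext(ext: str) -> str:
    """Ensure an extension carries its dot prefix (".vue" and "vue" are equivalent)."""
    return ext if ext.startswith(".") else "." + ext

def merge_settings(env_csv: str, tool_values: list[str], is_extension: bool = False) -> list[str]:
    """Union of env-configured values (comma-separated) and per-call tool parameters."""
    env_values = [v.strip() for v in env_csv.split(",") if v.strip()]
    merged = env_values + list(tool_values)
    if is_extension:
        merged = [normalize_ext(v) for v in merged]
    return sorted(set(merged))

# Hypothetical example mirroring the variables above: duplicates collapse,
# and "astro" is normalized to ".astro".
exts = merge_settings(".vue,.svelte", ["astro", ".vue"], is_extension=True)
patterns = merge_settings("temp/**,*.backup", ["private/**"])
```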
## Usage with MCP Clients

<details>
<summary><strong>Claude Code</strong></summary>

Use the command line interface to add the Claude Context MCP server:

```bash
# Add the Claude Context MCP server
claude mcp add claude-context -e OPENAI_API_KEY=your-openai-api-key -e MILVUS_TOKEN=your-zilliz-cloud-api-key -- npx @pleaseai/context-please-mcp@latest
```

See the [Claude Code MCP documentation](https://docs.anthropic.com/en/docs/claude-code/mcp) for more details about MCP server management.

</details>

<details>
<summary><strong>OpenAI Codex CLI</strong></summary>

Codex CLI uses TOML configuration files:

1. Create or edit the `~/.codex/config.toml` file.

2. Add the following configuration:

```toml
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.claude-context]
command = "npx"
args = ["@pleaseai/context-please-mcp@latest"]
env = { "OPENAI_API_KEY" = "your-openai-api-key", "MILVUS_TOKEN" = "your-zilliz-cloud-api-key" }
# Optional: override the default 10s startup timeout
startup_timeout_ms = 20000
```

3. Save the file and restart Codex CLI to apply the changes.

</details>

<details>
<summary><strong>Gemini CLI</strong></summary>

Gemini CLI requires manual configuration through a JSON file:

1. Create or edit the `~/.gemini/settings.json` file.

2. Add the following configuration:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

3. Save the file and restart Gemini CLI to apply the changes.

</details>

<details>
<summary><strong>Qwen Code</strong></summary>

Create or edit the `~/.qwen/settings.json` file and add the following configuration:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

</details>

<details>
<summary><strong>Cursor</strong></summary>

Go to: `Settings` -> `Cursor Settings` -> `MCP` -> `Add new global MCP server`

The recommended approach is to paste the following configuration into your Cursor `~/.cursor/mcp.json` file. You can also install the server for a specific project by creating `.cursor/mcp.json` in that project's folder. See the [Cursor MCP docs](https://docs.cursor.com/context/model-context-protocol) for more info.

**OpenAI Configuration (Default):**

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "OpenAI",
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

**VoyageAI Configuration:**

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "VoyageAI",
        "VOYAGEAI_API_KEY": "your-voyageai-api-key",
        "EMBEDDING_MODEL": "voyage-code-3",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

**Gemini Configuration:**

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "Gemini",
        "GEMINI_API_KEY": "your-gemini-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

**Ollama Configuration:**

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "Ollama",
        "EMBEDDING_MODEL": "nomic-embed-text",
        "OLLAMA_HOST": "http://127.0.0.1:11434",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

</details>

<details>
<summary><strong>Void</strong></summary>

Go to: `Settings` -> `MCP` -> `Add MCP Server`

Add the following configuration to your Void MCP settings:

```json
{
  "mcpServers": {
    "code-context": {
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

</details>

<details>
<summary><strong>Claude Desktop</strong></summary>

Add to your Claude Desktop configuration:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

</details>

<details>
<summary><strong>Windsurf</strong></summary>

Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

</details>

<details>
<summary><strong>VS Code</strong></summary>

The Claude Context MCP server can be used with VS Code through MCP-compatible extensions. Add the following configuration to your VS Code MCP settings:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

</details>

<details>
<summary><strong>Cherry Studio</strong></summary>

Cherry Studio allows for visual MCP server configuration through its settings interface. While it doesn't directly support manual JSON configuration, you can add a new server via the GUI:

1. Navigate to **Settings → MCP Servers → Add Server**.
2. Fill in the server details:
   - **Name**: `claude-context`
   - **Type**: `STDIO`
   - **Command**: `npx`
   - **Arguments**: `["@pleaseai/context-please-mcp@latest"]`
   - **Environment Variables**:
     - `OPENAI_API_KEY`: `your-openai-api-key`
     - `MILVUS_TOKEN`: `your-zilliz-cloud-api-key`
3. Save the configuration to activate the server.

</details>

<details>
<summary><strong>Cline</strong></summary>

Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:

1. Open Cline and click on the **MCP Servers** icon in the top navigation bar.

2. Select the **Installed** tab, then click **Advanced MCP Settings**.

3. In the `cline_mcp_settings.json` file, add the following configuration:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

4. Save the file.

</details>

<details>
<summary><strong>Augment</strong></summary>

To configure Claude Context MCP in Augment Code, you can use either the graphical interface or manual configuration.

#### **A. Using the Augment Code UI**

1. Click the hamburger menu.

2. Select **Settings**.

3. Navigate to the **Tools** section.

4. Click the **+ Add MCP** button.

5. Enter the following command:

```
npx @pleaseai/context-please-mcp@latest
```

6. Name the MCP: **Claude Context**.

7. Click the **Add** button.

------

#### **B. Manual Configuration**

1. Press Cmd/Ctrl+Shift+P or go to the hamburger menu in the Augment panel.
2. Select **Edit Settings**.
3. Under **Advanced**, click **Edit in settings.json**.
4. Add the server configuration to the `mcpServers` array in the `augment.advanced` object:

```json
"augment.advanced": {
  "mcpServers": [
    {
      "name": "claude-context",
      "command": "npx",
      "args": ["-y", "@pleaseai/context-please-mcp@latest"]
    }
  ]
}
```

</details>

<details>
<summary><strong>Roo Code</strong></summary>

Roo Code utilizes a JSON configuration file for MCP servers:

1. Open Roo Code and navigate to **Settings → MCP Servers → Edit Global Config**.

2. In the `mcp_settings.json` file, add the following configuration:

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
```

3. Save the file to activate the server.

</details>

<details>
<summary><strong>Zencoder</strong></summary>

Zencoder offers support for MCP tools and servers in both its JetBrains and VS Code plugin versions.

1. Go to the Zencoder menu (...)
2. From the dropdown menu, select `Tools`
3. Click on `Add Custom MCP`
4. Add a name (e.g. `Claude Context`) and the server configuration from below:

```json
{
  "command": "npx",
  "args": ["@pleaseai/context-please-mcp@latest"],
  "env": {
    "OPENAI_API_KEY": "your-openai-api-key",
    "MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
    "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
  }
}
```

5. Save the server by hitting the `Install` button.

</details>

<details>
<summary><strong>LangChain/LangGraph</strong></summary>

For LangChain/LangGraph integration examples, see [this example](https://github.com/zilliztech/claude-context/blob/643796a0d30e706a2a0dff3d55621c9b5d831807/evaluation/retrieval/custom.py#L88).

</details>

<details>
<summary><strong>Other MCP Clients</strong></summary>

The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:

```bash
npx @pleaseai/context-please-mcp@latest
```

</details>

## Features

- 🔌 **MCP Protocol Compliance**: Full compatibility with MCP-enabled AI assistants and agents
- 🔍 **Hybrid Code Search**: Natural language queries using advanced hybrid search (BM25 + dense vector) to find relevant code snippets
- 📁 **Codebase Indexing**: Index entire codebases for fast hybrid search across millions of lines of code
- 🔄 **Incremental Indexing**: Efficiently re-index only changed files, using Merkle trees for auto-sync
- 🧩 **Intelligent Code Chunking**: AST-based code analysis for syntax-aware chunking with automatic fallback
- 🗄️ **Scalable**: Integrates with Zilliz Cloud for scalable vector search, no matter how large your codebase is
- 🛠️ **Customizable**: Configure file extensions, ignore patterns, and embedding models
- ⚡ **Real-time**: Interactive indexing and searching with progress feedback

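The incremental-indexing idea above can be sketched in miniature. This is an illustrative toy, not the package's actual Merkle implementation: hash each file, combine the per-file hashes into a root digest, and re-index only the paths whose hashes changed since the last snapshot.

```python
import hashlib

def file_hash(content: bytes) -> str:
    """Content digest for one file."""
    return hashlib.sha256(content).hexdigest()

def merkle_root(hashes: dict[str, str]) -> str:
    """Combine per-file hashes (sorted by path) into a single root digest."""
    joined = "".join(f"{path}:{digest}" for path, digest in sorted(hashes.items()))
    return hashlib.sha256(joined.encode()).hexdigest()

def changed_files(old: dict[str, str], new: dict[str, str]) -> set[str]:
    """Paths that were added, removed, or modified between two snapshots."""
    return {p for p in old.keys() | new.keys() if old.get(p) != new.get(p)}

# Toy snapshots: only b.py changes between runs, so only b.py needs re-indexing.
before = {"a.py": file_hash(b"print(1)"), "b.py": file_hash(b"print(2)")}
after = {"a.py": file_hash(b"print(1)"), "b.py": file_hash(b"print(3)")}
```

Comparing root digests gives a cheap "anything changed?" check before walking the per-file hashes.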
## Available Tools

### 1. `index_codebase`

Index a codebase directory for hybrid search (BM25 + dense vector).

**Parameters:**

- `path` (required): Absolute path to the codebase directory to index
- `force` (optional): Force re-indexing even if the codebase is already indexed (default: false)
- `splitter` (optional): Code splitter to use - 'ast' for syntax-aware splitting with automatic fallback, 'langchain' for character-based splitting (default: "ast")
- `customExtensions` (optional): Additional file extensions to include beyond the defaults (e.g., ['.vue', '.svelte', '.astro']). Extensions should include the dot prefix; if omitted, it is added automatically (default: [])
- `ignorePatterns` (optional): Additional ignore patterns to exclude specific files/directories beyond the defaults (e.g., ['static/**', '*.tmp', 'private/**']) (default: [])

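The difference between the two `splitter` options can be illustrated with a toy, Python-only sketch (the real package handles many languages; none of this is its actual code): try syntax-aware splitting at top-level definitions, and fall back to fixed-size character chunks when parsing fails.

```python
import ast

def split_code(source: str, chunk_size: int = 200) -> list[str]:
    """AST-based splitting with automatic fallback to character-based chunks."""
    try:
        tree = ast.parse(source)
        lines = source.splitlines()
        # One chunk per top-level statement, using the node's line span.
        return ["\n".join(lines[node.lineno - 1:node.end_lineno]) for node in tree.body]
    except SyntaxError:
        # Fallback: plain character windows, as a character-based splitter would produce.
        return [source[i:i + chunk_size] for i in range(0, len(source), chunk_size)]

good = "def a():\n    return 1\n\ndef b():\n    return 2\n"
chunks = split_code(good)          # two chunks, one per function
broken = "def oops(:\n"
fallback = split_code(broken, 5)   # parse fails -> character chunks
```

Syntax-aware chunks keep whole functions together, which tends to produce more useful search hits than arbitrary character windows.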
### 2. `search_code`

Search the indexed codebase using natural language queries with hybrid search (BM25 + dense vector).

**Parameters:**

- `path` (required): Absolute path to the codebase directory to search in
- `query` (required): Natural language query to search for in the codebase
- `limit` (optional): Maximum number of results to return (default: 10, max: 50)
- `extensionFilter` (optional): List of file extensions to filter results (e.g., ['.ts', '.py']) (default: [])

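For reference, a `search_code` invocation follows the standard MCP `tools/call` request shape; the argument values below are hypothetical, and your MCP client normally builds this message for you:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_code",
    "arguments": {
      "path": "/home/user/projects/my-app",
      "query": "where is the authentication middleware registered",
      "limit": 5,
      "extensionFilter": [".ts"]
    }
  }
}
```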
### 3. `clear_index`

Clear the search index for a specific codebase.

**Parameters:**

- `path` (required): Absolute path to the codebase directory to clear the index for

### 4. `get_indexing_status`

Get the current indexing status of a codebase. Shows progress percentage for actively indexing codebases and completion status for indexed codebases.

**Parameters:**

- `path` (required): Absolute path to the codebase directory to check status for

## Contributing

This package is part of the Claude Context monorepo. Please see:

- [Main Contributing Guide](../../CONTRIBUTING.md) - General contribution guidelines
- [MCP Package Contributing](CONTRIBUTING.md) - Specific development guide for this package

## Related Projects

- **[@pleaseai/context-please-core](../core)** - Core indexing engine used by this MCP server
- **[VSCode Extension](../vscode-extension)** - Alternative VSCode integration
- [Model Context Protocol](https://modelcontextprotocol.io/) - Official MCP documentation

## License

MIT - See [LICENSE](../../LICENSE) for details