@autodev/codebase 0.0.4 → 0.0.5
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +165 -60
- package/dist/index.js +1983 -1981
- package/dist/index.js.map +1 -1
- package/package.json +2 -1
package/README.md
CHANGED
@@ -32,10 +32,25 @@ brew install ollama
 ollama serve
 
 # In a new terminal, pull the embedding model
-ollama pull
+ollama pull dengcao/Qwen3-Embedding-0.6B:Q8_0
 ```
 
-### 2. Install
+### 2. Install ripgrep
+
+`ripgrep` is required for fast codebase indexing. Install it with:
+
+```bash
+# Install ripgrep (macOS)
+brew install ripgrep
+
+# Or on Ubuntu/Debian
+sudo apt-get install ripgrep
+
+# Or on Arch Linux
+sudo pacman -S ripgrep
+```
+
+### 3. Install and Start Qdrant
 
 Start Qdrant using Docker:
 
@@ -53,7 +68,7 @@ tar -xzf qdrant-x86_64-unknown-linux-gnu.tar.gz
 ./qdrant
 ```
 
-###
+### 4. Verify Services Are Running
 
 ```bash
 # Check Ollama
@@ -62,7 +77,7 @@ curl http://localhost:11434/api/tags
 # Check Qdrant
 curl http://localhost:6333/collections
 ```
-###
+### 5. Install Autodev-codebase
 
 ```bash
 npm install -g @autodev/codebase
@@ -106,53 +121,89 @@ codebase mcp-server --port=3001 --host=localhost
 codebase mcp-server --path=/workspace --port=3002
 ```
 
-### IDE Integration (Cursor/Claude)
 
-
+## ⚙️ Configuration
+
+### Configuration Files & Priority
+
+The library uses a layered configuration system, allowing you to customize settings at different levels. The priority order (highest to lowest) is:
+
+1. **CLI Parameters** (e.g., `--model`, `--ollama-url`, `--qdrant-url`, `--config`, etc.)
+2. **Project Config File** (`./autodev-config.json`)
+3. **Global Config File** (`~/.autodev-cache/autodev-config.json`)
+4. **Built-in Defaults**
+
+Settings specified at a higher level override those at lower levels, letting you tailor behavior for your environment or project.
+
+**Config file locations:**
+- Global: `~/.autodev-cache/autodev-config.json`
+- Project: `./autodev-config.json`
+- CLI: Pass parameters directly when running commands
+
+
+#### Global Configuration
+
+Create a global configuration file at `~/.autodev-cache/autodev-config.json`:
 
 ```json
 {
-"
+  "isEnabled": true,
+  "embedder": {
+    "provider": "ollama",
+    "model": "dengcao/Qwen3-Embedding-0.6B:Q8_0",
+    "dimension": 1024,
+    "baseUrl": "http://localhost:11434"
+  },
+  "qdrantUrl": "http://localhost:6333",
+  "qdrantApiKey": "your-api-key-if-needed",
+  "searchMinScore": 0.4
 }
 ```
 
-
+#### Project Configuration
+
+Create a project-specific configuration file at `./autodev-config.json`:
 
 ```json
 {
-"
-"
-}
+  "embedder": {
+    "provider": "openai-compatible",
+    "apiKey": "sk-xxxxx",
+    "baseUrl": "http://localhost:2302/v1",
+    "model": "openai/text-embedding-3-small",
+    "dimension": 1536
+  },
+  "qdrantUrl": "http://localhost:6334"
 }
 ```
 
-
+#### Configuration Options
+
+| Option | Type | Description | Default |
+|--------|------|-------------|---------|
+| `isEnabled` | boolean | Enable/disable code indexing feature | `true` |
+| `embedder.provider` | string | Embedding provider (`ollama`, `openai`, `openai-compatible`) | `ollama` |
+| `embedder.model` | string | Embedding model name | `dengcao/Qwen3-Embedding-0.6B:Q8_0` |
+| `embedder.dimension` | number | Vector dimension size | `1024` |
+| `embedder.baseUrl` | string | Provider API base URL | `http://localhost:11434` |
+| `embedder.apiKey` | string | API key (for OpenAI/compatible providers) | - |
+| `qdrantUrl` | string | Qdrant vector database URL | `http://localhost:6333` |
+| `qdrantApiKey` | string | Qdrant API key (if authentication enabled) | - |
+| `searchMinScore` | number | Minimum similarity score for search results | `0.4` |
 
+**Note**: The `isConfigured` field is calculated automatically from the completeness of your configuration and should not be set manually. The system determines whether the configuration is valid based on the required fields for your chosen provider.
+
+#### Configuration Priority Examples
+
+```bash
+# Use global config defaults
+codebase
 
-storageOptions: { /* ... */ },
-loggerOptions: { /* ... */ },
-configOptions: { /* ... */ }
-})
+# Override model via CLI (highest priority)
+codebase --model="custom-model"
 
-await manager.startIndexing()
+# Use project config with CLI overrides
+codebase --config=./my-config.json --qdrant-url=http://remote:6333
 ```
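The layered lookup described in the configuration section above amounts to a deep merge in which later layers win. A minimal sketch, assuming a hypothetical `mergeConfigs` helper and made-up sample layers (this is illustrative, not the package's actual implementation):

```javascript
// Illustrative sketch of layered config resolution (hypothetical helper,
// not the package's real code): later layers override earlier ones,
// and nested objects such as `embedder` are merged key by key.
function mergeConfigs(...layers) {
  return layers.reduce((merged, layer) => {
    for (const [key, value] of Object.entries(layer ?? {})) {
      merged[key] =
        value !== null && typeof value === "object" && !Array.isArray(value)
          ? mergeConfigs(merged[key] ?? {}, value)
          : value;
    }
    return merged;
  }, {});
}

// Built-in defaults < global config < project config < CLI parameters
const defaults = { embedder: { provider: "ollama", dimension: 1024 }, searchMinScore: 0.4 };
const globalConfig = { qdrantUrl: "http://localhost:6333" };
const projectConfig = { embedder: { provider: "openai-compatible", dimension: 1536 } };
const cliArgs = { qdrantUrl: "http://remote:6333" };

const effective = mergeConfigs(defaults, globalConfig, projectConfig, cliArgs);
// effective.embedder.provider === "openai-compatible" (project beats default)
// effective.qdrantUrl === "http://remote:6333"        (CLI beats global)
```

Unset keys fall through to the next layer down, which is why `searchMinScore` still comes from the built-in defaults here.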
 
 ## 🔧 CLI Options
@@ -160,6 +211,7 @@ await manager.startIndexing()
 ### Global Options
 - `--path=<path>` - Workspace path (default: current directory)
 - `--demo` - Create demo files in workspace
+- `--force` - Ignore the cache and force re-indexing
 - `--ollama-url=<url>` - Ollama API URL (default: http://localhost:11434)
 - `--qdrant-url=<url>` - Qdrant vector DB URL (default: http://localhost:6333)
 - `--model=<model>` - Embedding model (default: nomic-embed-text)
@@ -167,12 +219,43 @@ await manager.startIndexing()
 - `--storage=<path>` - Storage directory path
 - `--cache=<path>` - Cache directory path
 - `--log-level=<level>` - Log level: error|warn|info|debug (default: error)
 - `--help, -h` - Show help
 
 ### MCP Server Options
 - `--port=<port>` - HTTP server port (default: 3001)
 - `--host=<host>` - HTTP server host (default: localhost)
 
+
+### IDE Integration (Cursor/Claude)
+
+Configure your IDE to connect to the MCP server:
+
+```json
+{
+  "mcpServers": {
+    "codebase": {
+      "url": "http://localhost:3001/sse"
+    }
+  }
+}
+```
+
+For clients that do not support SSE MCP, use the following configuration instead:
+
+```json
+{
+  "mcpServers": {
+    "codebase": {
+      "command": "codebase",
+      "args": [
+        "stdio-adapter",
+        "--server-url=http://localhost:3001/sse"
+      ]
+    }
+  }
+}
+```
 ## 🌐 MCP Server Features
 
 ### Web Interface
@@ -184,8 +267,7 @@ await manager.startIndexing()
 - **`search_codebase`** - Semantic search through your codebase
   - Parameters: `query` (string), `limit` (number), `filters` (object)
   - Returns: Formatted search results with file paths, scores, and code blocks
-
-- **`configure_search`** - Configure search parameters at runtime
+
 
 
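MCP tools such as `search_codebase` are invoked with a JSON-RPC 2.0 `tools/call` request, per the MCP specification. A sketch of the payload shape an MCP client would build (the query and limit values here are made up for illustration):

```javascript
// Sketch of the JSON-RPC 2.0 request an MCP client sends to invoke
// `search_codebase` (illustrative argument values; envelope fields
// follow the MCP `tools/call` shape).
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_codebase",
    arguments: {
      query: "where is the embedding dimension configured?",
      limit: 5,
    },
  },
};

// Serialized form is what actually travels over SSE or stdio.
console.log(JSON.stringify(request));
```

The SSE and stdio-adapter transports shown later carry exactly this kind of message; only the wire transport differs.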
 ### Scripts
@@ -206,26 +288,49 @@ npm run demo-tui
 npm run mcp-server
 ```
 
-##
+## Embedding Models Comparison
+
+**Mainstream Embedding Models Performance**
+
+| Model | Dimension | Avg Precision@3 | Avg Precision@5 | Good Queries (≥66.7%) | Failed Queries (0%) |
+| ------------------------------------------------ | --------- | --------------- | --------------- | --------------------- | ------------------- |
+| siliconflow/Qwen/Qwen3-Embedding-8B | 4096 | **76.7%** | 66.0% | 5/10 | 0/10 |
+| siliconflow/Qwen/Qwen3-Embedding-4B | 2560 | **73.3%** | 54.0% | 5/10 | 1/10 |
+| voyage/voyage-code-3 | 1024 | **73.3%** | 52.0% | 6/10 | 1/10 |
+| siliconflow/Qwen/Qwen3-Embedding-0.6B | 1024 | **63.3%** | 42.0% | 4/10 | 1/10 |
+| morph-embedding-v2 | 1536 | **56.7%** | 44.0% | 3/10 | 1/10 |
+| openai/text-embedding-ada-002 | 1536 | **53.3%** | 38.0% | 2/10 | 1/10 |
+| voyage/voyage-3-large | 1024 | **53.3%** | 42.0% | 3/10 | 2/10 |
+| openai/text-embedding-3-large | 3072 | **46.7%** | 38.0% | 1/10 | 3/10 |
+| voyage/voyage-3.5 | 1024 | **43.3%** | 38.0% | 1/10 | 2/10 |
+| voyage/voyage-3.5-lite | 1024 | **36.7%** | 28.0% | 1/10 | 2/10 |
+| openai/text-embedding-3-small | 1536 | **33.3%** | 28.0% | 1/10 | 4/10 |
+| siliconflow/BAAI/bge-large-en-v1.5 | 1024 | **30.0%** | 28.0% | 0/10 | 3/10 |
+| siliconflow/Pro/BAAI/bge-m3 | 1024 | **26.7%** | 24.0% | 0/10 | 2/10 |
+| ollama/nomic-embed-text | 768 | **16.7%** | 18.0% | 0/10 | 6/10 |
+| siliconflow/netease-youdao/bce-embedding-base_v1 | 1024 | **13.3%** | 16.0% | 0/10 | 6/10 |
+
+------
+
+**Ollama-based Embedding Models Performance**
+
+| Model | Dimension | Precision@3 | Precision@5 | Good Queries (≥66.7%) | Failed Queries (0%) |
+| -------------------------------------------------------- | --------- | ----------- | ----------- | --------------------- | ------------------- |
+| ollama/dengcao/Qwen3-Embedding-4B:Q4_K_M | 2560 | 66.7% | 48.0% | 4/10 | 1/10 |
+| ollama/dengcao/Qwen3-Embedding-0.6B:f16 | 1024 | 63.3% | 44.0% | 3/10 | 0/10 |
+| ollama/dengcao/Qwen3-Embedding-0.6B:Q8_0 | 1024 | 63.3% | 44.0% | 3/10 | 0/10 |
+| ollama/dengcao/Qwen3-Embedding-4B:Q8_0 | 2560 | 60.0% | 48.0% | 3/10 | 1/10 |
+| lmstudio/taylor-jones/bge-code-v1-Q8_0-GGUF | 1536 | 60.0% | 54.0% | 4/10 | 1/10 |
+| ollama/dengcao/Qwen3-Embedding-8B:Q4_K_M | 4096 | 56.7% | 42.0% | 2/10 | 2/10 |
+| ollama/hf.co/nomic-ai/nomic-embed-code-GGUF:Q4_K_M | 3584 | 53.3% | 44.0% | 2/10 | 0/10 |
+| ollama/bge-m3:f16 | 1024 | 26.7% | 24.0% | 0/10 | 2/10 |
+| ollama/hf.co/nomic-ai/nomic-embed-text-v2-moe-GGUF:f16 | 768 | 26.7% | 20.0% | 0/10 | 2/10 |
+| ollama/granite-embedding:278m-fp16 | 768 | 23.3% | 18.0% | 0/10 | 4/10 |
+| ollama/unclemusclez/jina-embeddings-v2-base-code:f16 | 768 | 23.3% | 16.0% | 0/10 | 5/10 |
+| lmstudio/awhiteside/CodeRankEmbed-Q8_0-GGUF | 768 | 23.3% | 16.0% | 0/10 | 5/10 |
+| lmstudio/wsxiaoys/jina-embeddings-v2-base-code-Q8_0-GGUF | 768 | 23.3% | 16.0% | 0/10 | 5/10 |
+| ollama/dengcao/Dmeta-embedding-zh:F16 | 768 | 20.0% | 20.0% | 0/10 | 6/10 |
+| ollama/znbang/bge:small-en-v1.5-q8_0 | 384 | 16.7% | 16.0% | 0/10 | 6/10 |
+| lmstudio/nomic-ai/nomic-embed-text-v1.5-GGUF@Q4_K_M | 768 | 16.7% | 14.0% | 0/10 | 6/10 |
+| ollama/nomic-embed-text:f16 | 768 | 16.7% | 18.0% | 0/10 | 6/10 |
+| ollama/snowflake-arctic-embed2:568m:f16 | 1024 | 16.7% | 18.0% | 0/10 | 5/10 |
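Precision@k in the tables above is the fraction of the top-k retrieved results that are relevant, averaged over the ten benchmark queries. A quick sketch of the metric (the helper names and sample data are made up; this is not the benchmark harness itself):

```javascript
// Precision@k: fraction of the top-k ranked results that are relevant.
// `relevance` is an array of 0/1 flags for one query's ranked results.
function precisionAtK(relevance, k) {
  const topK = relevance.slice(0, k);
  return topK.filter(Boolean).length / k;
}

// Average over queries, as in the "Avg Precision@3" column (sample data).
function avgPrecisionAtK(queries, k) {
  return queries.reduce((sum, r) => sum + precisionAtK(r, k), 0) / queries.length;
}

// One query where 2 of the top 3 results are relevant:
precisionAtK([1, 0, 1, 0, 0], 3); // ≈ 0.667, i.e. 66.7%
```

A "Good Query" in the tables is then simply a query whose Precision@3 reaches 66.7% (at least 2 of the top 3 relevant), and a "Failed Query" one where it is 0%.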