@iceinvein/code-intelligence-mcp 0.1.24 → 0.2.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +134 -0
  2. package/package.json +1 -1
package/README.md ADDED
@@ -0,0 +1,134 @@
+ # Code Intelligence MCP Server
+
+ > **Semantic search and code navigation for LLM agents.**
+
+ [![NPM Version](https://img.shields.io/npm/v/@iceinvein/code-intelligence-mcp?style=flat-square&color=blue)](https://www.npmjs.com/package/@iceinvein/code-intelligence-mcp)
+ [![License](https://img.shields.io/badge/license-MIT-green?style=flat-square)](LICENSE)
+ [![MCP](https://img.shields.io/badge/MCP-Enabled-orange?style=flat-square)](https://modelcontextprotocol.io)
+
+ ---
+
+ This server indexes your codebase locally to provide **fast, semantic, and structure-aware** code navigation to tools like OpenCode, Trae, and Cursor.
+
+ ## Why Use This Server?
+
+ Unlike basic text search, this server builds a local knowledge graph to understand your code.
+
+ * 🔍 **Hybrid Search**: Combines **Tantivy** (keyword) + **LanceDB** (semantic vector) + **FastEmbed** (local embedding model).
+ * 🚀 **Production First**: Ranking heuristics prioritize implementation code over tests and glue code (`index.ts`).
+ * 🧠 **Developer Aware**: Handles common acronyms and casing (e.g., "db" matches "database" and "DBConnection").
+ * ⚡ **Fast & Local**: Written in **Rust**. Uses Metal GPU acceleration on macOS. Indexes are stored locally within your project.
+
+ ---
+
+ ## Quick Start
+
+ The server runs directly via `npx`, with no local Rust toolchain required.
+
+ ### OpenCode / Trae
+
+ Add to your `opencode.json` (or global config):
+
+ ```json
+ {
+   "mcp": {
+     "code-intelligence": {
+       "type": "local",
+       "command": ["npx", "-y", "@iceinvein/code-intelligence-mcp"],
+       "enabled": true
+     }
+   }
+ }
+ ```
+
+ *The server will automatically download the embedding model (~300MB) and index your project in the background.*
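+
+ ### Cursor
+
+ The intro above also mentions Cursor; this package's docs only show the OpenCode / Trae entry, so the following is a rough equivalent using Cursor's standard `mcpServers` config (file location and exact schema are assumptions, not taken from this package):
+
+ ```json
+ {
+   "mcpServers": {
+     "code-intelligence": {
+       "command": "npx",
+       "args": ["-y", "@iceinvein/code-intelligence-mcp"]
+     }
+   }
+ }
+ ```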
+
+ ---
+
+ ## Capabilities
+
+ Available tools for the agent (an example call follows the table):
+
+ | Tool | Description |
+ | :--- | :--- |
+ | `search_code` | **Primary Search.** Finds code by meaning ("how does auth work?") or structure ("class User"). |
+ | `get_definition` | Retrieves the definition of a specific symbol. |
+ | `find_references` | Finds all usages of a function, class, or variable. |
+ | `get_call_hierarchy` | Shows upstream callers and downstream callees. |
+ | `get_type_graph` | Explores inheritance and interface implementations. |
+ | `get_usage_examples` | Returns real-world examples of how a symbol is used in the codebase. |
+
+ ---
+
+ ## Smart Ranking
+
+ The ranking engine optimizes results for relevance using several heuristics:
+
+ 1. **Test Penalty**: Test files (`*.test.ts`, `__tests__`) are ranked lower by default, but are boosted if the query intent implies testing (e.g. "verify login").
+ 2. **Glue Code Filtering**: Re-export files (e.g., `index.ts`) are deprioritized in favor of the actual implementation.
+ 3. **Acronym Expansion**: Queries are normalized so "nav bar" matches `NavBar`, `Navigation`, and `NavigationBar`.
+ 4. **Intent Detection**:
+    * "struct User" → Boosts definitions.
+    * "who calls login" → Triggers graph lookup.
+    * "verify login" → Boosts test files.
+
+ ---
+
+ ## Configuration (Optional)
+
+ The server works without configuration by default. You can customize behavior via environment variables; a complete entry combining these with the Quick Start config is sketched after the fragment below:
+
+ ```json
+ "env": {
+   "WATCH_MODE": "true",          // Watch for file changes? (Default: false)
+   "EMBEDDINGS_DEVICE": "cpu",    // Force CPU if Metal fails (Default: metal on macOS)
+   "INDEX_PATTERNS": "**/*.go",   // Add custom file types
+   "MAX_CONTEXT_BYTES": "50000"   // Limit context window
+ }
+ ```
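+
+ Combined with the Quick Start entry above, a full server entry with overrides might look like the sketch below; it assumes the `env` block nests directly inside the server entry, which this README does not spell out:
+
+ ```json
+ {
+   "mcp": {
+     "code-intelligence": {
+       "type": "local",
+       "command": ["npx", "-y", "@iceinvein/code-intelligence-mcp"],
+       "enabled": true,
+       "env": {
+         "WATCH_MODE": "true",
+         "EMBEDDINGS_DEVICE": "cpu",
+         "INDEX_PATTERNS": "**/*.go",
+         "MAX_CONTEXT_BYTES": "50000"
+       }
+     }
+   }
+ }
+ ```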
+
+ ---
+
+ ## Architecture
+
+ ```mermaid
+ flowchart LR
+     Client[MCP Client] <==> Tools
+
+     subgraph Server [Code Intelligence Server]
+         direction TB
+         Tools[Tool Router]
+
+         subgraph Indexer [Indexing Pipeline]
+             direction TB
+             Scan[File Scan] --> Parse[Tree-Sitter]
+             Parse --> Extract[Symbol Extraction]
+             Extract --> Embed[FastEmbed Model]
+         end
+
+         subgraph Storage [Storage Engine]
+             direction TB
+             SQLite[(SQLite)]
+             Tantivy[(Tantivy)]
+             Lance[(LanceDB)]
+         end
+
+         %% Data Flow
+         Tools -- Index --> Scan
+         Embed --> SQLite
+         Embed --> Tantivy
+         Embed --> Lance
+
+         Tools -- Query --> SQLite
+         Tools -- Query --> Tantivy
+         Tools -- Query --> Lance
+     end
+ ```
+
+ ---
+
+ ## Development
+
+ 1. **Prerequisites**: Rust (stable), `protobuf`.
+ 2. **Build**: `cargo build --release`
+ 3. **Run**: `./scripts/start_mcp.sh`
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@iceinvein/code-intelligence-mcp",
-   "version": "0.1.24",
+   "version": "0.2.1",
    "description": "Code Intelligence MCP Server - Smart context for your LLM coding agent",
    "bin": {
      "code-intelligence-mcp": "bin/run.js"