neuromcp 0.1.1 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -22,7 +22,23 @@ npx neuromcp
  npx neuromcp
  ```
 
- The server starts on stdio, creates `~/.neuromcp/memory.db` on first run, and downloads the ONNX embedding model automatically.
+ The server starts on stdio, creates `~/.neuromcp/memory.db` on first run, and downloads a fallback ONNX model automatically.
+
+ ### Recommended: Install Ollama for real semantic search
+
+ neuromcp works out of the box with a built-in ONNX model, but for **real semantic understanding** (finding "how does auth work?" when the memory says "JWT tokens are validated in middleware"), install [Ollama](https://ollama.com) with `nomic-embed-text`:
+
+ ```bash
+ # Install Ollama from https://ollama.com, then:
+ ollama pull nomic-embed-text
+ ```
+
+ neuromcp auto-detects Ollama at `localhost:11434` and upgrades automatically. No config needed.
+
+ | Provider | Semantic Quality | Setup |
+ |----------|-----------------|-------|
+ | **Ollama + nomic-embed-text** | Excellent — real semantic understanding, 8K context | `ollama pull nomic-embed-text` |
+ | ONNX (built-in fallback) | Basic — keyword overlap only, no true semantic search | Zero config |
 
  ## Installation
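The auto-detection the new README text describes (probe Ollama at `localhost:11434`, upgrade if available, otherwise fall back to the built-in ONNX model) could be sketched roughly as follows. This is a hypothetical illustration, not neuromcp's actual implementation: `detectOllama` and its fallback logic are assumptions; only the `GET /api/tags` endpoint (which lists locally pulled models) is real Ollama API.

```typescript
// Hypothetical sketch of Ollama auto-detection; not neuromcp's real code.
// Assumes Node 18+ (global fetch, AbortSignal.timeout).
async function detectOllama(host = "http://localhost:11434"): Promise<boolean> {
  try {
    // /api/tags is Ollama's endpoint for listing locally pulled models.
    const res = await fetch(`${host}/api/tags`, {
      signal: AbortSignal.timeout(1000), // fail fast if nothing is listening
    });
    if (!res.ok) return false;
    const { models } = (await res.json()) as { models: { name: string }[] };
    // Only "upgrade" if the embedding model has actually been pulled.
    return models.some((m) => m.name.startsWith("nomic-embed-text"));
  } catch {
    // Ollama not running or unreachable: caller falls back to the ONNX model.
    return false;
  }
}
```

Failing closed here matters: any probe error simply returns `false`, so the server still starts with the ONNX fallback when Ollama is absent, matching the "zero config" behavior the table advertises.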