@aeriondyseti/vector-memory-mcp 0.8.0 → 0.9.0-dev.1
- package/README.md +58 -382
- package/package.json +3 -2
- package/src/mcp/tools.ts +56 -16
package/README.md
CHANGED

@@ -1,453 +1,129 @@
 # Vector Memory MCP Server
 
->
+> Semantic memory storage for AI assistants. Store decisions, patterns, and context that persists across sessions.
 
-A
-
-**Perfect for:** Software teams maintaining architectural knowledge, developers juggling multiple projects, and anyone building with MCP-compatible AI assistants.
+A local-first MCP server that provides vector-based memory storage. Uses local embeddings and LanceDB for fast, private semantic search.
 
 [](https://opensource.org/licenses/MIT)
-[](https://bun.sh/)
-[](https://modelcontextprotocol.io)
+[](https://www.npmjs.com/package/@aeriondyseti/vector-memory-mcp)
 
 ---
 
-##
-
-### 🔒 **Local-First & Private**
-- All embeddings generated locally (no cloud APIs)
-- Data stored in local LanceDB databases
-- Complete privacy and control over your memories
-
-### 🎯 **Intelligent Semantic Search**
-- Vector similarity with multi-factor scoring
-- Considers relevance, recency, priority, and usage frequency
-- Context-aware retrieval based on conversation flow
-
-### 📊 **Smart Memory Storage**
-- Stores memories in `~/.local/share/vector-memory-mcp/memories.db`
-- Fast LanceDB-based storage with vector search capabilities
-- Memories persist across sessions and projects
-
-### ⚡ **High Performance**
-- Sub-100ms search latency for 1000+ memories
-- Efficient storage (<10MB per 1000 memories)
-- CPU-optimized local embeddings (no GPU required)
+## Features
 
-
--
-
-
--
-- Built with Bun for maximum performance
-- Simple MCP tools for storing and searching
-- TypeScript for type safety
+- **Local & Private** - All embeddings generated locally, data stored in local LanceDB
+- **Semantic Search** - Vector similarity search with configurable scoring
+- **Batch Operations** - Store, update, delete, and retrieve multiple memories at once
+- **Session Handoffs** - Save and restore project context between sessions
+- **MCP Native** - Standard protocol, works with any MCP-compatible client
 
 ---
 
-##
+## Quick Start
 
 ### Prerequisites
 
 - [Bun](https://bun.sh/) 1.0+
-- An MCP-compatible client
-
-> **Note:** This server requires Bun to run.
+- An MCP-compatible client (Claude Code, Claude Desktop, etc.)
 
-###
+### Install
 
-#### Option 1: Global Install (Recommended)
-
-**Install:**
 ```bash
 bun install -g @aeriondyseti/vector-memory-mcp
 ```
 
->
-
-**Configure your MCP client** (example config for clients that use `~/.claude/config.json`):
-```json
-{
-  "mcpServers": {
-    "memory": {
-      "type": "stdio",
-      "command": "bunx",
-      "args": [
-        "--bun",
-        "@aeriondyseti/vector-memory-mcp"
-      ],
-      "env": {}
-    }
-  }
-}
-```
+> First install downloads ML models (~90MB). This may take a minute.
 
-
+### Configure
 
-
-```bash
-git clone https://github.com/AerionDyseti/vector-memory-mcp.git
-cd vector-memory-mcp
-bun install
-```
+Add to your MCP client config (e.g., `~/.claude/settings.json`):
 
-**Configure your MCP client** (example config for clients that use `~/.claude/config.json`):
 ```json
 {
   "mcpServers": {
-    "memory": {
-      "
-      "
+    "vector-memory": {
+      "type": "stdio",
+      "command": "bunx",
+      "args": ["--bun", "@aeriondyseti/vector-memory-mcp"]
     }
   }
 }
 ```
-*Replace `/absolute/path/to/` with your actual installation path.*
 
-
+### Use
 
-
-- The vector-memory-mcp package and all dependencies
-- Native binaries for ONNX Runtime (~32MB) and image processing (~10MB)
-- ML model files automatically downloaded during installation (~90MB, cached in `~/.cache/huggingface/`)
-- **Total first-time setup:** ~130MB of downloads
+Restart your MCP client. You now have access to:
 
-
-
-
-
-
-
-
-
-
-- `delete_memories` - Remove memories (always pass array of IDs)
-- `store_handoff` - Store a handoff-style project snapshot
-- `get_handoff` - Retrieve the latest handoff (includes referenced memories)
+| Tool | Description |
+|------|-------------|
+| `store_memories` | Save memories (accepts array) |
+| `search_memories` | Find relevant memories semantically |
+| `get_memories` | Retrieve memories by ID (accepts array) |
+| `update_memories` | Update existing memories |
+| `delete_memories` | Remove memories (accepts array) |
+| `store_handoff` | Save session context for later |
+| `get_handoff` | Restore session context |
 
 ---
 
-##
-
-### Storing Memories
-
-Ask your MCP client/agent to remember things for you:
+## Usage
 
+**Store a memory:**
 ```
 You: "Remember that we use Drizzle ORM for database access"
-
-```
-
-Or your MCP client/agent can store memories directly:
-```json
-{
-  "content": "Use Drizzle ORM for type-safe database access",
-  "metadata": {
-    "tags": ["architecture", "database"],
-    "category": "tooling"
-  }
-}
+Assistant: [calls store_memories]
 ```
 
-
-
-Your MCP client/agent can automatically search memories when relevant, or you can ask:
-
+**Search memories:**
 ```
 You: "What did we decide about the database?"
-
-```
-
-Search parameters:
-```json
-{
-  "query": "authentication strategy",
-  "limit": 10
-}
-```
-
-### Managing Memories
-
-Retrieve a specific memory:
-```json
-{
-  "id": "memory-id-here"
-}
-```
-
-Delete a memory:
-```json
-{
-  "id": "memory-id-here"
-}
-```
-
----
-
-## 🏗️ Architecture
-
-```
-vector-memory-mcp/
-├── src/
-│   ├── index.ts                   # Entry point
-│   ├── config/                    # Configuration management
-│   ├── db/                        # Database layer (LanceDB)
-│   ├── services/
-│   │   ├── embeddings.service.ts  # Embeddings via @huggingface/transformers
-│   │   └── memory.service.ts      # Core memory operations
-│   └── mcp/
-│       ├── server.ts              # MCP server setup
-│       ├── tools.ts               # MCP tool definitions
-│       └── handlers.ts            # Tool request handlers
-├── tests/
-│   ├── memory.test.ts
-│   └── embeddings.test.ts
-├── bin/
-│   └── vector-memory-mcp.js       # Executable entry point
-└── package.json
-```
-
-### Technology Stack
-
-- **MCP Framework**: @modelcontextprotocol/sdk (official SDK)
-- **Vector Database**: LanceDB (fast, local, vector search)
-- **Embeddings**: [@huggingface/transformers](https://huggingface.co/docs/transformers.js) (Xenova/all-MiniLM-L6-v2, 384 dimensions)
-- **Language**: TypeScript 5.0+
-- **Runtime**: Bun 1.0+
-- **Testing**: Bun test
-
----
-
-## 🎨 How It Works
-
-### 1. Memory Storage
-
+Assistant: [calls search_memories with relevant query]
 ```
-An MCP client calls store_memories tool
-        ↓
-Content → @huggingface/transformers → 384d vector
-        ↓
-Store in LanceDB with metadata
-        ↓
-~/.local/share/vector-memory-mcp/memories.db
-```
-
-### 2. Memory Retrieval
 
+**Session handoffs:**
 ```
-
-Query → @huggingface/transformers → 384d vector
-        ↓
-Vector search in LanceDB
-        ↓
-Vector similarity scoring
-        ↓
-Return top N relevant memories
+You: "Save context for next session"
+Assistant: [calls store_handoff with summary, completed items, next steps]
 ```
 
 ---
 
-##
-
-The server uses environment variables for configuration:
+## Configuration
 
-
+Environment variables:
 
-
-
-
-
-
-export VECTOR_MEMORY_DB_PATH="/path/to/custom/memories.db"
-export VECTOR_MEMORY_MODEL="Xenova/all-MiniLM-L6-v2"
-```
-
-Or in your MCP client config:
-```json
-{
-  "mcpServers": {
-    "memory": {
-      "command": "vector-memory-mcp",
-      "env": {
-        "VECTOR_MEMORY_DB_PATH": "/custom/path/memories.db"
-      }
-    }
-  }
-}
-```
+| Variable | Default | Description |
+|----------|---------|-------------|
+| `VECTOR_MEMORY_DB_PATH` | `.vector-memory/memories.db` | Database location |
+| `VECTOR_MEMORY_MODEL` | `Xenova/all-MiniLM-L6-v2` | Embedding model |
+| `VECTOR_MEMORY_HTTP_PORT` | `3271` | HTTP server port |
 
 ---
 
-##
-
-### Running Tests
-
-```bash
-# Run all tests (recommended - includes model preload)
-bun run test
-
-# Run tests directly (skips 19 embedding tests, faster)
-bun test
-
-# Run with coverage
-bun test --coverage
-
-# Type checking
-bun run typecheck
-```
-
-> **Note:** `bun run test` uses a wrapper that preloads the embedding model, running all 98 tests. `bun test` directly is faster but skips embedding-specific tests.
-
-### Development Mode
+## Development
 
 ```bash
-
-
-
-# Run directly without building
-bun run src/index.ts
-```
-
-### Building
-
-```bash
-# Build for production
-bun run build
+git clone https://github.com/AerionDyseti/vector-memory-mcp.git
+cd vector-memory-mcp
+bun install
 
-
+bun run test       # Run all tests
+bun run dev        # Watch mode
+bun run typecheck  # Type checking
 ```
 
-
-
-## 🗺️ Roadmap
-
-### ✅ Phase 1: Foundation (Current)
-- ✅ Core database with LanceDB
-- ✅ Embedding generation with @huggingface/transformers
-- ✅ Basic MCP tools (store, search, get, delete)
-- ✅ TypeScript implementation with Drizzle ORM
-
-### 🚧 Phase 2: Enhanced Search & Scoring
-- Multi-factor scoring algorithm (similarity, recency, priority, usage frequency)
-- Configurable scoring weights
-- Priority levels for memories
-- Usage tracking and frequency-based ranking
-- Metadata filtering and advanced tagging
-
-### 📋 Phase 3: Dual-Level Memory System
-- Project-specific memories (`.memory/db` in repo)
-- Global memories (`~/.local/share/vector-memory-mcp/`)
-- Automatic precedence handling (project overrides global)
-- Project detection and context switching
-
-### 🎯 Phase 4: Smart Automation
-- Auto-detect architectural decisions
-- Capture bug fixes and solutions automatically
-- Generate session-end summaries
-- Natural language trigger detection (85%+ accuracy)
-- Continuous conversation monitoring
-
-### 🔮 Phase 5: Advanced Features
-- Memory deduplication with similarity threshold
-- Batch operations (import/export)
-- Markdown import/export
-- Memory clustering and visualization
-- Cross-project insights
-- Multi-modal memories (images, diagrams)
-- Session hooks for automatic context injection
-- Multi-CLI support (Cursor, Windsurf, etc.)
-- Smart priority suggestions
-
----
-
-## 🤝 Contributing
-
-Contributions are welcome! This project is in active development.
-
-### Areas We'd Love Help With:
-- Testing and bug reports
-- Documentation improvements
-- Performance optimizations
-- New feature ideas
-
-See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines *(coming soon)*.
-
----
-
-## 📄 License
-
-MIT License - see [LICENSE](LICENSE) for details.
-
----
-
-## 🙏 Acknowledgments
-
-- Built with [@modelcontextprotocol/sdk](https://github.com/modelcontextprotocol/typescript-sdk) - Official MCP TypeScript SDK
-- Uses [LanceDB](https://lancedb.com/) for fast, local vector search
-- Powered by [@huggingface/transformers](https://huggingface.co/docs/transformers.js) for local embeddings
-- Database layer via [Drizzle ORM](https://orm.drizzle.team/)
-- Inspired by [doobidoo's mcp-memory-service](https://github.com/doobidoo/mcp-memory-service)
-
----
-
-## 🔗 Related Projects
-
-- [Model Context Protocol](https://modelcontextprotocol.io) - Official MCP specification
-- Any MCP-compatible client
-- [LanceDB](https://lancedb.com/) - Fast, local vector search
-- [Transformers.js](https://huggingface.co/docs/transformers.js) - Run transformers in JavaScript
+See [CHANGELOG.md](CHANGELOG.md) for release history and [ROADMAP.md](ROADMAP.md) for planned features.
 
 ---
 
-##
+## Contributing
 
-
-- **Discussions**: [GitHub Discussions](https://github.com/AerionDyseti/vector-memory-mcp/discussions)
-- **Documentation**: Check the `docs/` directory
+Contributions welcome! See [issues](https://github.com/AerionDyseti/vector-memory-mcp/issues) for areas we'd love help with.
 
-
-
-## ⚡ Quick Examples
-
-### Example 1: Storing a Decision
+## License
 
-
-You: "Remember that we decided to use Drizzle ORM for type-safe database access"
-Claude: I'll store that for you.
-[Calls store_memories tool with content and metadata]
-✓ Memory stored successfully
-```
-
-### Example 2: Searching Memories
-
-```
-You: "What did we decide about database tooling?"
-Claude: Let me search for that...
-[Calls search_memories with query about database]
-Found: "Use Drizzle ORM for type-safe database access"
-
-Based on our previous decision, we're using Drizzle ORM...
-```
-
-### Example 3: Managing Memories
-
-```
-You: "Show me what you remember about authentication"
-Claude: [Searches for authentication-related memories]
-Found 3 memories:
-1. "Use JWT tokens for API authentication"
-2. "Store refresh tokens in httpOnly cookies"
-3. "Implement rate limiting on auth endpoints"
-```
+MIT - see [LICENSE](LICENSE)
 
 ---
 
-
-
-**[⬆ Back to Top](#vector-memory-mcp-server)**
-
-Made with ❤️ for developers who value context continuity
-
-</div>
+Built with [MCP SDK](https://github.com/modelcontextprotocol/typescript-sdk), [LanceDB](https://lancedb.com/), and [Transformers.js](https://huggingface.co/docs/transformers.js)
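The new README's configuration table pairs each environment variable with a default. As an illustration of that contract, the lookup-with-fallback logic can be sketched in TypeScript; `resolveConfig` is a hypothetical helper, not the package's actual internals.

```typescript
// Hypothetical helper mirroring the README's documented defaults;
// the real package may resolve its configuration differently.
function resolveConfig(env: Record<string, string | undefined>) {
  return {
    dbPath: env.VECTOR_MEMORY_DB_PATH ?? ".vector-memory/memories.db",
    model: env.VECTOR_MEMORY_MODEL ?? "Xenova/all-MiniLM-L6-v2",
    httpPort: Number(env.VECTOR_MEMORY_HTTP_PORT ?? "3271"),
  };
}

// Defaults apply when no variables are set:
console.log(resolveConfig({}));
// An explicit value overrides its default:
console.log(resolveConfig({ VECTOR_MEMORY_HTTP_PORT: "4000" }).httpPort);
```

In an MCP client config, these variables would go in the server entry's `env` object, alongside `command` and `args`.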
package/package.json
CHANGED

@@ -1,6 +1,6 @@
 {
   "name": "@aeriondyseti/vector-memory-mcp",
-  "version": "0.8.0",
+  "version": "0.9.0-dev.1",
   "description": "A zero-configuration RAG memory server for MCP clients",
   "type": "module",
   "main": "src/index.ts",
@@ -36,7 +36,8 @@
     "warmup": "bun run scripts/warmup.ts",
     "postinstall": "bun run scripts/warmup.ts",
     "publish:check": "bun run scripts/publish.ts --dry-run",
-    "publish:npm": "bun run scripts/publish.ts"
+    "publish:npm": "bun run scripts/publish.ts",
+    "publish:dev": "npm publish --access public --tag dev"
   },
   "keywords": [
     "mcp",
package/src/mcp/tools.ts
CHANGED

@@ -2,10 +2,28 @@ import type { Tool } from "@modelcontextprotocol/sdk/types.js";
 
 export const storeMemoriesTool: Tool = {
   name: "store_memories",
-  description:
-
-
-
+  description: `Store memories that persist across conversations. Use after making decisions or learning something worth remembering.
+
+RULES:
+- 1 concept per memory, 1-3 sentences (20-75 words)
+- Self-contained with explicit subjects (no "it", "this", "the project")
+- Include dates/versions when relevant
+- Be concrete, not vague
+
+MEMORY TYPES (use as metadata.type):
+- decision: what was chosen + why ("Chose libSQL over PostgreSQL for vector support and simpler deployment")
+- implementation: what was built + where + patterns used
+- insight: learning + why it matters
+- blocker: problem encountered + resolution
+- next-step: TODO item + suggested approach
+- context: background info + constraints
+
+DON'T STORE: machine-specific paths, local env details, ephemeral states, pleasantries
+
+GOOD: "Aerion chose libSQL over PostgreSQL for Resonance (Dec 2024) because of native vector support and simpler deployment."
+BAD: "Uses SQLite" (no context, no subject, no reasoning)
+
+For long content (>1000 chars), provide embedding_text with a searchable summary.`,
   inputSchema: {
     type: "object",
     properties: {
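An arguments object following the description above might look like this. The top-level `memories` field name is an assumption, since the tool's inputSchema properties are truncated in this diff; the content and `metadata.type` value come from the description's own GOOD example.

```typescript
// Hypothetical store_memories arguments; the "memories" field name is
// assumed (the inputSchema is truncated in this diff).
const storeArgs = {
  memories: [
    {
      content:
        "Aerion chose libSQL over PostgreSQL for Resonance (Dec 2024) because of native vector support and simpler deployment.",
      metadata: { type: "decision" },
    },
  ],
};

console.log(storeArgs.memories.length); // 1
```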
@@ -61,9 +79,13 @@ export const deleteMemoriesTool: Tool = {
 
 const updateMemoriesTool: Tool = {
   name: "update_memories",
-  description:
-
-
+  description: `Update existing memories in place. Prefer over delete+create when updating the same conceptual item.
+
+BEHAVIOR:
+- Fields omitted/null: left untouched
+- Fields provided: completely overwrite existing value (no merge)
+
+Use to correct content, refine embedding text, or replace metadata without changing the memory ID.`,
   inputSchema: {
     type: "object",
     properties: {
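The update rule in the new description (omitted or null fields left untouched, provided fields overwritten wholesale, no merge) can be sketched as a pure function. The types here are illustrative, not the package's real record schema.

```typescript
// Illustrative types; the package's actual record shape may differ.
interface MemoryRecord {
  id: string;
  content: string;
  metadata: Record<string, unknown>;
}

interface MemoryPatch {
  content?: string | null;
  metadata?: Record<string, unknown> | null;
}

// Omitted or null fields keep the existing value; provided fields
// overwrite the whole value (note: no deep merge of metadata).
function applyUpdate(existing: MemoryRecord, patch: MemoryPatch): MemoryRecord {
  return {
    ...existing,
    content: patch.content ?? existing.content,
    metadata: patch.metadata ?? existing.metadata,
  };
}
```

Replacing `metadata` drops any keys not present in the patch, which is exactly the "no merge" behavior the description warns about.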
@@ -101,12 +123,22 @@ const updateMemoriesTool: Tool = {
 
 export const searchMemoriesTool: Tool = {
   name: "search_memories",
-  description:
-
-
-
-
-
+  description: `Search stored memories semantically. Use PROACTIVELY—don't wait to be asked.
+
+WHEN TO SEARCH:
+- At conversation start / returning to a project
+- Before making decisions (check for prior decisions on same topic)
+- When user references past work ("what did we decide", "as discussed", "remember when")
+- Before suggesting solutions (check if problem was solved before)
+- When debugging (check for prior blockers/resolutions)
+- When uncertain about patterns or preferences
+
+When in doubt, search. Missing relevant context is costlier than an extra query.
+
+QUERY TIPS:
+- Include project name, technical terms, or domain keywords
+- Search for concepts, not exact phrases
+- Try multiple queries if first doesn't return useful results`,
   inputSchema: {
     type: "object",
     properties: {
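Semantic search of this kind typically ranks embedding vectors by cosine similarity. A generic sketch of that scoring, not taken from this package's source:

```typescript
// Generic cosine similarity between two embedding vectors, as commonly
// used in vector search; illustrative only, not this package's code.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Vectors pointing the same way score 1, orthogonal vectors score 0:
console.log(cosineSimilarity([1, 0], [2, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```

Because the score depends on direction rather than exact values, conceptually related queries land near related memories even when no words match, which is why the description advises searching for concepts rather than exact phrases.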
@@ -149,9 +181,17 @@ export const getMemoriesTool: Tool = {
 
 export const storeHandoffTool: Tool = {
   name: "store_handoff",
-  description:
-
-
+  description: `Save session state for seamless resumption later. Use at end of work sessions or before context switches.
+
+Creates a structured snapshot with:
+- summary: 2-3 sentences on goal and current status
+- completed: what got done (include file paths)
+- in_progress_blocked: work in flight or stuck
+- key_decisions: choices made and WHY (crucial for future context)
+- next_steps: concrete, actionable items
+- memory_ids: link to related memories stored this session
+
+Retrievable via get_handoff. Only one handoff per project—new handoffs overwrite previous.`,
   inputSchema: {
     type: "object",
     properties: {