better-mem0-mcp 1.1.4__py3-none-any.whl → 1.1.6__py3-none-any.whl

This diff shows the changes between two publicly released versions of this package, as they appear in their public registry. It is provided for informational purposes only.
better_mem0_mcp/docs/memory.md CHANGED
@@ -17,6 +17,8 @@ Save information to long-term memory. Mem0 automatically:
  {"action": "add", "content": "User prefers dark mode and uses FastAPI"}
  ```
 
+ **Response:** `Saved: {results: [{id, memory, event}]}`
+
  ### search
  Semantic search across stored memories. Combines vector search with graph context.
 
@@ -24,6 +26,8 @@ Semantic search across stored memories. Combines vector search with graph contex
  {"action": "search", "query": "coding preferences", "limit": 5}
  ```
 
+ **Response:** Formatted list of matching memories with optional graph context.
+
  ### list
  Get all stored memories for a user.
 
@@ -31,6 +35,8 @@ Get all stored memories for a user.
  {"action": "list"}
  ```
 
+ **Response:** `Memories (N): - [id] memory text`
+
  ### delete
  Remove a memory by ID.
 
@@ -38,10 +44,17 @@ Remove a memory by ID.
  {"action": "delete", "memory_id": "abc12345-..."}
  ```
 
+ **Response:** `Deleted: {memory_id}`
+
  ## Parameters
  - `action` - Required: add, search, list, delete
  - `content` - Required for add: information to remember
  - `query` - Required for search: what to search for
  - `memory_id` - Required for delete: ID of memory to remove
  - `limit` - Optional for search: max results (default: 5)
- - `user_id` - Optional: scope memories to a specific user
+ - `user_id` - Optional: scope memories to a specific user (default: "default")
+
+ ## Technical Notes
+ - Mem0 API returns `{"results": [...]}` for both `search()` and `get_all()`
+ - Graph context is automatically included in search results when available
+ - Embeddings use 1536 dimensions for pgvector HNSW compatibility
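The response strings documented above can be reproduced with mem0's open-source `Memory` client. Below is a minimal sketch of the `list` formatting, assuming the `{"results": [...]}` shape the technical notes describe; the function name and formatting details are illustrative, not the package's actual server code:

```python
from mem0 import Memory  # mem0ai client; construction/config omitted here

def format_list_response(client: Memory, user_id: str = "default") -> str:
    """Render get_all() output as 'Memories (N): - [id] memory text' (sketch only)."""
    response = client.get_all(user_id=user_id)
    # Per the notes above, search() and get_all() both return {"results": [...]}
    results = response.get("results", []) if isinstance(response, dict) else response
    lines = "\n".join(f"- [{m.get('id')}] {m.get('memory')}" for m in results)
    return f"Memories ({len(results)}):\n{lines}"
```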
better_mem0_mcp/server.py CHANGED
@@ -108,10 +108,10 @@ async def memory(
      if not query:
          return "Error: 'query' required for search action"
 
-     # Vector search - returns {"memories": [...]}
+     # Vector search - returns {"results": [...]} (same as get_all)
      response = _memory.search(query, user_id=user_id, limit=limit)
      memories = (
-         response.get("memories", []) if isinstance(response, dict) else response
+         response.get("results", []) if isinstance(response, dict) else response
      )
 
      # Add graph context
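The hunk above only changes which key the search response is unpacked from. Taken on its own, the normalization pattern it relies on looks roughly like this; the helper name is an assumption for illustration, not the package's exact code:

```python
from typing import Any

def extract_results(response: Any) -> list[dict]:
    """Normalize a mem0 search()/get_all() response to a plain list of memories.

    Current mem0 releases return {"results": [...]}; 1.1.4 read a "memories" key,
    which against that shape always fell back to an empty list.
    """
    if isinstance(response, dict):
        return response.get("results", [])
    return response  # some code paths may already hand back a bare list
```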
better_mem0_mcp-1.1.4.dist-info/METADATA → better_mem0_mcp-1.1.6.dist-info/METADATA CHANGED
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: better-mem0-mcp
- Version: 1.1.4
+ Version: 1.1.6
  Summary: Zero-setup MCP Server for AI memory - works with Neon/Supabase
  Project-URL: Homepage, https://github.com/n24q02m/better-mem0-mcp
  Project-URL: Repository, https://github.com/n24q02m/better-mem0-mcp.git
@@ -29,16 +29,28 @@ Description-Content-Type: text/markdown
 
  # better-mem0-mcp
 
- **Zero-setup** MCP Server for AI memory. Works with Neon/Supabase free tier.
+ **Self-hosted MCP Server for AI memory with PostgreSQL (pgvector).**
 
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
+ [![PyPI](https://img.shields.io/pypi/v/better-mem0-mcp)](https://pypi.org/project/better-mem0-mcp/)
+ [![Docker](https://img.shields.io/docker/v/n24q02m/better-mem0-mcp?label=docker)](https://hub.docker.com/r/n24q02m/better-mem0-mcp)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](LICENSE)
+
+ ## Features
+
+ - **Self-hosted PostgreSQL** - Your data stays with you (Neon/Supabase free tier supported)
+ - **Graph Memory** - SQL-based relationship tracking alongside vector memory
+ - **Multi-provider LLM** - Gemini, OpenAI, Anthropic, Groq, DeepSeek, Mistral
+ - **Fallback chains** - Multi-key per provider + multi-model fallback
+ - **Zero manual setup** - Just `DATABASE_URL` + `API_KEYS`
+
+ ---
 
  ## Quick Start
 
  ### 1. Get Prerequisites
 
- - **Database**: [Neon](https://neon.tech) or [Supabase](https://supabase.com) (free tier)
- - **API Key**: [Google AI Studio](https://aistudio.google.com/apikey) (free tier)
+ - **Database**: [Neon](https://neon.tech) or [Supabase](https://supabase.com) (free tier works)
+ - **API Key**: Any supported provider ([Google AI Studio](https://aistudio.google.com/apikey) is free)
 
  ### 2. Add to mcp.json
 
@@ -78,7 +90,7 @@ Description-Content-Type: text/markdown
 
  ### 3. Done!
 
- Ask Claude: "Remember that I prefer dark mode and use FastAPI"
+ Ask your AI: "Remember that I prefer dark mode and use FastAPI"
 
  ---
 
@@ -86,22 +98,26 @@ Ask Claude: "Remember that I prefer dark mode and use FastAPI"
 
  | Variable | Required | Description |
  |----------|----------|-------------|
- | `DATABASE_URL` | Yes | PostgreSQL connection string |
- | `API_KEYS` | Yes | `provider:key,...` (multi-key per provider OK) |
- | `LLM_MODELS` | No | `provider/model,...` (fallback chain) |
- | `EMBEDDER_MODELS` | No | `provider/model,...` (fallback chain) |
+ | `DATABASE_URL` | Yes | PostgreSQL with pgvector extension |
+ | `API_KEYS` | Yes | `provider:key` pairs, comma-separated |
+ | `LLM_MODELS` | No | Model fallback chain |
+ | `EMBEDDER_MODELS` | No | Embedding model chain |
+
+ ### Supported Providers
+
+ `gemini`, `openai`, `anthropic`, `groq`, `deepseek`, `mistral`
 
  ### Examples
 
- **Minimal (Gemini only):**
- ```
+ **Single provider:**
+ ```bash
  API_KEYS=gemini:AIza...
  ```
 
  **Multi-key with fallback:**
- ```
+ ```bash
  API_KEYS=gemini:AIza-1,gemini:AIza-2,openai:sk-xxx
- LLM_MODELS=gemini/gemini-2.5-flash,openai/gpt-4o-mini
+ LLM_MODELS=gemini/gemini-3-flash-preview,openai/gpt-4o-mini
  EMBEDDER_MODELS=gemini/gemini-embedding-001,openai/text-embedding-3-small
  ```
 
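These variables are flat comma-separated strings. Below is a hedged sketch of parsing them, following the documented `provider:key` and `provider/model` formats; the function names are illustrative and not taken from the package's `config.py`:

```python
from collections import defaultdict

def parse_api_keys(raw: str) -> dict[str, list[str]]:
    """Parse 'provider:key,provider:key' into provider -> [keys] (multi-key per provider)."""
    keys: dict[str, list[str]] = defaultdict(list)
    for pair in (p.strip() for p in raw.split(",") if p.strip()):
        provider, _, key = pair.partition(":")
        keys[provider].append(key)
    return dict(keys)

def parse_model_chain(raw: str) -> list[tuple[str, str]]:
    """Parse 'provider/model,provider/model' into an ordered fallback chain."""
    chain: list[tuple[str, str]] = []
    for item in (p.strip() for p in raw.split(",") if p.strip()):
        provider, _, model = item.partition("/")
        chain.append((provider, model))
    return chain

# With the README example:
# parse_api_keys("gemini:AIza-1,gemini:AIza-2,openai:sk-xxx")
#   -> {"gemini": ["AIza-1", "AIza-2"], "openai": ["sk-xxx"]}
```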
@@ -109,7 +125,7 @@ EMBEDDER_MODELS=gemini/gemini-embedding-001,openai/text-embedding-3-small
 
  | Setting | Default |
  |---------|---------|
- | `LLM_MODELS` | `gemini/gemini-2.5-flash` |
+ | `LLM_MODELS` | `gemini/gemini-3-flash-preview` |
  | `EMBEDDER_MODELS` | `gemini/gemini-embedding-001` |
 
  ---
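The defaults above are single-entry chains; the Features list advertises multi-key and multi-model fallback once more entries are configured. A minimal, purely illustrative sketch of that idea (not the package's implementation), with the actual provider call supplied by the caller:

```python
from typing import Callable, Sequence

def call_with_fallback(
    chain: Sequence[tuple[str, str]],
    keys_by_provider: dict[str, list[str]],
    invoke: Callable[[str, str, str], str],
) -> str:
    """Try each (provider, model) in order, and each key for that provider, until one succeeds.

    `invoke(provider, model, api_key)` is a caller-supplied stand-in for a real LLM call.
    """
    last_error: Exception | None = None
    for provider, model in chain:
        for api_key in keys_by_provider.get(provider, []):
            try:
                return invoke(provider, model, api_key)
            except Exception as exc:  # e.g. quota or auth error; move on to the next option
                last_error = exc
    raise RuntimeError(f"all fallback options failed: {last_error}")
```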
@@ -118,32 +134,41 @@ EMBEDDER_MODELS=gemini/gemini-embedding-001,openai/text-embedding-3-small
 
  | Tool | Description |
  |------|-------------|
- | `memory` | `action`: add, search, list, delete |
- | `help` | Detailed documentation |
+ | `memory` | Memory operations: `add`, `search`, `list`, `delete` |
+ | `help` | Get full documentation for tools |
 
- ### Usage
+ ### Usage Examples
 
  ```json
  {"action": "add", "content": "I prefer TypeScript over JavaScript"}
- {"action": "search", "query": "preferences"}
+ {"action": "search", "query": "programming preferences"}
  {"action": "list"}
  {"action": "delete", "memory_id": "abc123"}
  ```
 
  ---
 
- ## Why better-mem0-mcp?
+ ## Build from Source
+
+ ```bash
+ git clone https://github.com/n24q02m/better-mem0-mcp
+ cd better-mem0-mcp
 
- | Feature | Official mem0-mcp | better-mem0-mcp |
- |---------|-------------------|-----------------|
- | Storage | Mem0 Cloud | **Self-hosted PostgreSQL** |
- | Graph Memory | No | **Yes (SQL-based)** |
- | LLM Provider | OpenAI only | **Any (Gemini/OpenAI/Ollama/...)** |
- | Fallback | No | **Yes (multi-key + multi-model)** |
- | Setup | API Key | **DATABASE_URL + API_KEYS** |
+ # Setup (requires mise: https://mise.jdx.dev/)
+ mise run setup
+
+ # Run
+ uv run better-mem0-mcp
+ ```
+
+ **Requirements:** Python 3.13+
 
  ---
 
+ ## Contributing
+
+ See [CONTRIBUTING.md](CONTRIBUTING.md)
+
  ## License
 
- MIT
+ MIT - See [LICENSE](LICENSE)
better_mem0_mcp-1.1.4.dist-info/RECORD → better_mem0_mcp-1.1.6.dist-info/RECORD CHANGED
@@ -3,10 +3,10 @@ better_mem0_mcp/__main__.py,sha256=IeeRidmZF4oBaamjQM6FUbhXpsSdE4sQF_6y_D2GEE4,1
  better_mem0_mcp/config.py,sha256=vgKLIw3jHyeBLTzzBnuHV5x6Nra0Rbav1IJRL8rLCuk,5576
  better_mem0_mcp/graph.py,sha256=rE9z6XECiAktEqDNgmwqCpFpvKSn3azO9H4sRBhj8UU,6195
  better_mem0_mcp/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- better_mem0_mcp/server.py,sha256=feOliZQqPYqACXIrWCnxqO9mGk1KzR9OAGAo-vh2Cnc,5483
- better_mem0_mcp/docs/memory.md,sha256=jqoxBYHVh2N9TZbt77qY5zhL1vSU-Ro-yuaczT1r7Mo,1126
- better_mem0_mcp-1.1.4.dist-info/METADATA,sha256=WJcMNmrNrKmC_vCE2_qnsEGd1XVH_SShaYEKououjJk,3888
- better_mem0_mcp-1.1.4.dist-info/WHEEL,sha256=WLgqFyCfm_KASv4WHyYy0P3pM_m7J5L9k2skdKLirC8,87
- better_mem0_mcp-1.1.4.dist-info/entry_points.txt,sha256=2b7E3D6yo94mQXP2Ms0bhUlWkK9f664f0GrstImOq30,57
- better_mem0_mcp-1.1.4.dist-info/licenses/LICENSE,sha256=d7xQ6sRyeGus6gnvwgqiQtSY7XdFw0Jd0w5-Co_xHnk,1064
- better_mem0_mcp-1.1.4.dist-info/RECORD,,
+ better_mem0_mcp/server.py,sha256=QYLP46pz0U9gkxQKB_g8FMga77ywElMMjSCU-hGZQ-M,5499
+ better_mem0_mcp/docs/memory.md,sha256=198dDuAGccG5Ca7rhEXIU03ZhxNKK44B_Brl2glurGc,1608
+ better_mem0_mcp-1.1.6.dist-info/METADATA,sha256=g0v4rbGy4eUlqCMCgxOSralR0r80Lmp4RU97oDTjYL4,4540
+ better_mem0_mcp-1.1.6.dist-info/WHEEL,sha256=WLgqFyCfm_KASv4WHyYy0P3pM_m7J5L9k2skdKLirC8,87
+ better_mem0_mcp-1.1.6.dist-info/entry_points.txt,sha256=2b7E3D6yo94mQXP2Ms0bhUlWkK9f664f0GrstImOq30,57
+ better_mem0_mcp-1.1.6.dist-info/licenses/LICENSE,sha256=d7xQ6sRyeGus6gnvwgqiQtSY7XdFw0Jd0w5-Co_xHnk,1064
+ better_mem0_mcp-1.1.6.dist-info/RECORD,,