superlocalmemory 2.4.2 → 2.5.0

package/CHANGELOG.md CHANGED
@@ -16,6 +16,52 @@ SuperLocalMemory V2 - Intelligent local memory system for AI coding assistants.
 
  ---
 
+ ## [2.5.0] - 2026-02-12
+
+ **Release Type:** Major Feature Release — "Your AI Memory Has a Heartbeat"
+ **Backward Compatible:** Yes (additive tables, columns, and modules only)
+
+ ### The Big Picture
+ SuperLocalMemory transforms from passive storage to **active coordination layer**. Every memory operation now triggers real-time events. Agents are tracked. Trust is scored. The dashboard is live.
+
+ ### Added
+ - **DbConnectionManager** (`src/db_connection_manager.py`): SQLite WAL mode, 5s busy timeout, thread-local read pool, serialized write queue. Fixes "database is locked" errors when multiple agents write simultaneously
+ - **Event Bus** (`src/event_bus.py`): Real-time event broadcasting — every memory write/update/delete fires events to SSE, WebSocket, and Webhook subscribers. Tiered retention (hot 48h, warm 14d, cold 30d)
+ - **Subscription Manager** (`src/subscription_manager.py`): Durable + ephemeral event subscriptions with filters (event type, importance, protocol)
+ - **Webhook Dispatcher** (`src/webhook_dispatcher.py`): Background HTTP POST delivery with exponential backoff retry (3 attempts)
+ - **Agent Registry** (`src/agent_registry.py`): Tracks which AI agents connect — protocol, write/recall counters, last seen, metadata
+ - **Provenance Tracker** (`src/provenance_tracker.py`): Tracks who created each memory — created_by, source_protocol, trust_score, provenance_chain
+ - **Trust Scorer** (`src/trust_scorer.py`): Bayesian trust signal collection — positive (cross-agent recall, high importance), negative (quick delete, burst write). Silent collection in v2.5, enforcement planned for v2.6
+ - **SSE endpoint** (`/events/stream`): Real-time Server-Sent Events with cross-process DB polling, Last-Event-ID replay, and event-type filtering
+ - **Dashboard: Live Events tab**: Real-time event stream with color-coded types, filter dropdown, stats counters
+ - **Dashboard: Agents tab**: Connected-agents table with protocol badges, trust scores, write/recall counters, trust overview
+ - **4 new database tables**: `memory_events`, `subscriptions`, `agent_registry`, `trust_signals`
+ - **4 new columns on `memories`**: `created_by`, `source_protocol`, `trust_score`, `provenance_chain`
+ - **28 API endpoints** across 8 modular route files
+ - **Architecture document**: `docs/ARCHITECTURE-V2.5.md`
+
+ ### Changed
+ - **`memory_store_v2.py`**: Replaced 15 direct `sqlite3.connect()` calls with `DbConnectionManager` (read pool + write queue). Added EventBus, Provenance, and Trust integration. Fully backward compatible — falls back to direct SQLite if the new modules are unavailable
+ - **`mcp_server.py`**: Replaced 9 per-call `MemoryStoreV2(DB_PATH)` instantiations with a shared singleton. Added agent registration and provenance tracking for MCP calls
+ - **`ui_server.py`**: Refactored from a 2008-line monolith into a 194-line app shell plus 9 route modules in the `routes/` directory using FastAPI APIRouter
+ - **`ui/app.js`**: Split from a 1588-line monolith into 13 modular files in the `ui/js/` directory (largest: 230 lines)
+
+ ### Fixed
+ - **"Database is locked" errors**: WAL mode + write serialization eliminates concurrent-access failures
+ - **Fresh-database crashes**: `/api/stats`, `/api/graph`, and `/api/clusters` now return graceful empty responses when the `graph_nodes`/`graph_edges`/`sessions` tables don't exist yet
+ - **Per-call overhead**: MCP tool handlers no longer rebuild the TF-IDF vectorizer on every call
+
+ ### Testing
+ - 63 pytest tests (unit, concurrency, regression, security, edge cases, Docker/new-user scenarios)
+ - 27 end-to-end tests (all v2.5 components + 50 concurrent writes + reads during writes)
+ - 15 fresh-database edge-case tests
+ - 17 backward-compatibility tests against a live v2.4.2 database
+ - All CLI commands and skill scripts verified
+ - Profile isolation verified across 2 profiles
+ - Playwright dashboard testing (all 8 tabs, SSE stream, profile switching)
+
+ ---
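The fresh-database fix listed above amounts to probing `sqlite_master` before touching optional tables. A minimal sketch of the pattern (helper and endpoint names are illustrative, not the package's actual code):

```python
import sqlite3

def table_exists(conn, name):
    """Probe sqlite_master so a fresh database never raises OperationalError."""
    row = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = ?", (name,)
    ).fetchone()
    return row is not None

def graph_stats(conn):
    # Graceful empty response when the graph tables were never created
    if not (table_exists(conn, "graph_nodes") and table_exists(conn, "graph_edges")):
        return {"nodes": 0, "edges": 0}
    nodes = conn.execute("SELECT COUNT(*) FROM graph_nodes").fetchone()[0]
    edges = conn.execute("SELECT COUNT(*) FROM graph_edges").fetchone()[0]
    return {"nodes": nodes, "edges": edges}

conn = sqlite3.connect(":memory:")  # a brand-new, empty database
print(graph_stats(conn))  # {'nodes': 0, 'edges': 0}
```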
+
  ## [2.4.2] - 2026-02-11
 
  **Release Type:** Bug Fix Release
package/README.md CHANGED
@@ -39,6 +39,27 @@
 
  ---
 
+ ## NEW: v2.5 — "Your AI Memory Has a Heartbeat"
+
+ > **SuperLocalMemory is no longer passive storage — it's a real-time coordination layer.**
+
+ | What's New | Why It Matters |
+ |------------|----------------|
+ | **Real-Time Event Stream** | See every memory operation live in the dashboard — no refresh needed. SSE-powered, cross-process. |
+ | **No More "Database Locked"** | WAL mode + serialized write queue. 50 concurrent agents writing? Zero errors. |
+ | **Agent Tracking** | Know exactly which AI tool wrote what. Claude, Cursor, Windsurf, CLI — all tracked automatically. |
+ | **Trust Scoring** | Bayesian trust signals detect spam, quick-deletes, and cross-agent validation. Silent in v2.5, enforced in v2.6. |
+ | **Memory Provenance** | Every memory records who created it, via which protocol, with full derivation lineage. |
+ | **Production-Grade Code** | 28 API endpoints across 8 modular route files. 13 modular JS files. 63 pytest tests. |
+
+ **Upgrade:** `npm install -g superlocalmemory@latest`
+
+ **Dashboard:** `python3 ~/.claude-memory/ui_server.py` then open `http://localhost:8765`
+
+ [Architecture Doc](docs/ARCHITECTURE-V2.5.md) | [Full Changelog](CHANGELOG.md)
+
+ ---
+
  ## Install in One Command
 
  ```bash
package/docs/ARCHITECTURE-V2.5.md ADDED
@@ -0,0 +1,190 @@
+ # SuperLocalMemory V2.5 Architecture — "Your AI Memory Has a Heartbeat"
+
+ **Version:** 2.5.0 | **Date:** February 12, 2026 | **Author:** Varun Pratap Bhardwaj
+
+ ---
+
+ ## What Changed in v2.5
+
+ SuperLocalMemory transforms from **passive storage** (filing cabinet) to **active coordination layer** (nervous system). Every memory write, update, delete, or recall now triggers real-time events visible across all connected tools.
+
+ ### Before v2.5 (Passive)
+ ```
+ Claude writes memory → saved to SQLite → done (silent)
+ Cursor reads memory → returned → done (silent)
+ ```
+
+ ### After v2.5 (Active)
+ ```
+ Claude writes memory → saved to SQLite → Event Bus fires
+   ├── SSE → Dashboard shows it live
+   ├── Agent registered in registry
+   ├── Trust signal recorded
+   ├── Provenance tracked
+   └── Webhook dispatched (if configured)
+ ```
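The "active" flow above is essentially an event bus that persists every event and then fans it out to subscribers. A simplified, stdlib-only illustration of that shape (class and table layout simplified; not the actual `event_bus.py` API):

```python
import json
import sqlite3
import time
from collections import defaultdict

class EventBus:
    """Minimal event bus: persist each event, then fan out to subscribers."""

    def __init__(self, conn):
        self.conn = conn
        self.subscribers = defaultdict(list)  # event_type -> list of callbacks
        conn.execute(
            "CREATE TABLE IF NOT EXISTS memory_events ("
            "id INTEGER PRIMARY KEY, event_type TEXT, payload TEXT, created_at REAL)"
        )

    def subscribe(self, event_type, callback):
        self.subscribers[event_type].append(callback)

    def emit(self, event_type, **payload):
        # Persist first (the durable log), then notify live subscribers
        self.conn.execute(
            "INSERT INTO memory_events (event_type, payload, created_at) VALUES (?, ?, ?)",
            (event_type, json.dumps(payload), time.time()),
        )
        for callback in self.subscribers[event_type]:
            callback(event_type, payload)

bus = EventBus(sqlite3.connect(":memory:"))
seen = []
bus.subscribe("memory.created", lambda etype, p: seen.append(p["memory_id"]))
bus.emit("memory.created", memory_id=42)
print(seen)  # [42]
```

In the real system the subscriber side is the SSE/WebSocket/Webhook layer rather than an in-process callback list.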
+
+ ---
+
+ ## System Architecture
+
+ ```
+ ┌─────────────────────────────────────────────────────────────┐
+ │ ACCESS LAYER                                                │
+ │ MCP Server │ CLI │ REST API │ Skills │ Python Import │ A2A  │
+ └──────────────────────┬──────────────────────────────────────┘
+
+ ┌──────────────────────┴──────────────────────────────────────┐
+ │ MEMORY STORE V2                                             │
+ │ ┌─────────────────────────────────────────────────────────┐ │
+ │ │ DbConnectionManager (Singleton)                         │ │
+ │ │ ├── WAL Mode (concurrent reads + serialized writes)     │ │
+ │ │ ├── Busy Timeout (5s wait, not fail)                    │ │
+ │ │ ├── Thread-Local Read Pool                              │ │
+ │ │ ├── Serialized Write Queue (threading.Queue)            │ │
+ │ │ └── Post-Write Hooks → Event Bus                        │ │
+ │ └─────────────────────────────────────────────────────────┘ │
+ │                                                             │
+ │ On every write:                                             │
+ │ ├── EventBus.emit() → memory_events table + SSE + WS + WH   │
+ │ ├── ProvenanceTracker → created_by, source_protocol columns │
+ │ ├── AgentRegistry → agent tracking + write/recall counters  │
+ │ └── TrustScorer → signal collection (silent, no enforcement)│
+ └──────────────────────┬──────────────────────────────────────┘
+
+ ┌──────────────────────┴──────────────────────────────────────┐
+ │ STORAGE LAYER                                               │
+ │ SQLite (single file: ~/.claude-memory/memory.db)            │
+ │                                                             │
+ │ Tables:                                                     │
+ │ ├── memories (+ provenance columns: created_by,             │
+ │ │     source_protocol, trust_score, provenance_chain)       │
+ │ ├── memory_events (event log with tiered retention)         │
+ │ ├── subscriptions (durable + ephemeral event subscriptions) │
+ │ ├── agent_registry (connected agents + statistics)          │
+ │ ├── trust_signals (trust signal audit trail)                │
+ │ ├── graph_nodes, graph_edges, graph_clusters (knowledge     │
+ │ │     graph)                                                │
+ │ ├── identity_patterns (learned preferences)                 │
+ │ ├── sessions, creator_metadata                              │
+ │ └── memories_fts (FTS5 full-text search index)              │
+ └──────────────────────┬──────────────────────────────────────┘
+
+ ┌──────────────────────┴──────────────────────────────────────┐
+ │ DASHBOARD LAYER                                             │
+ │ FastAPI UI Server (modular routes)                          │
+ │ ├── 8 route modules in routes/ directory                    │
+ │ ├── 28 API endpoints                                        │
+ │ ├── SSE /events/stream (real-time, cross-process)           │
+ │ ├── WebSocket /ws/updates                                   │
+ │ ├── 13 modular JS files in ui/js/                           │
+ │ └── 8 dashboard tabs (Graph, Memories, Clusters, Patterns,  │
+ │     Timeline, Live Events, Agents, Settings)                │
+ └─────────────────────────────────────────────────────────────┘
+ ```
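The DbConnectionManager box above combines three standard SQLite techniques: WAL journaling, a busy timeout, and a single writer thread fed by a queue. A simplified sketch of that combination (method names and error handling are illustrative, not the real module's API):

```python
import queue
import sqlite3
import threading

class DbConnectionManager:
    """Sketch: WAL reads from any thread; all writes funneled through one worker."""

    def __init__(self, path):
        self.path = path
        self._local = threading.local()   # one read connection per thread
        self._writes = queue.Queue()      # serialized write queue
        threading.Thread(target=self._writer_loop, daemon=True).start()

    def _connect(self):
        conn = sqlite3.connect(self.path, timeout=5.0)   # 5s busy timeout: wait, don't fail
        conn.execute("PRAGMA journal_mode=WAL")          # readers don't block the writer
        return conn

    def read_conn(self):
        if not hasattr(self._local, "conn"):
            self._local.conn = self._connect()
        return self._local.conn

    def execute_write(self, sql, params=()):
        done = threading.Event()
        self._writes.put((sql, params, done))
        done.wait()   # block the caller until the worker has committed

    def _writer_loop(self):
        conn = self._connect()   # the only connection that ever writes
        while True:
            sql, params, done = self._writes.get()
            conn.execute(sql, params)
            conn.commit()
            done.set()
```

Because only one connection ever writes, SQLite never sees two competing write transactions, which is what eliminates the "database is locked" failures.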
+
+ ---
+
+ ## New Components (v2.5)
+
+ | Component | File | Purpose |
+ |-----------|------|---------|
+ | **DbConnectionManager** | `src/db_connection_manager.py` | WAL mode, busy timeout, read pool, write queue, post-write hooks |
+ | **EventBus** | `src/event_bus.py` | Event emission, persistence, in-memory buffer, tiered retention |
+ | **SubscriptionManager** | `src/subscription_manager.py` | Durable + ephemeral event subscriptions with filters |
+ | **WebhookDispatcher** | `src/webhook_dispatcher.py` | Background HTTP POST delivery with retry logic |
+ | **AgentRegistry** | `src/agent_registry.py` | Agent tracking, write/recall counters, protocol detection |
+ | **ProvenanceTracker** | `src/provenance_tracker.py` | Memory origin tracking (who, how, trust, lineage) |
+ | **TrustScorer** | `src/trust_scorer.py` | Bayesian trust scoring with decay, burst detection |
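The WebhookDispatcher's retry logic is plain exponential backoff. A sketch with an injectable sender, so the delay schedule is visible without network access (the real module's signatures may differ):

```python
import time

def deliver_with_backoff(send, payload, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Try a delivery up to `attempts` times, doubling the delay between tries.

    `send` is any callable that raises on failure (e.g. an HTTP POST); it is
    injected here so the schedule is testable.
    """
    for attempt in range(attempts):
        try:
            send(payload)
            return True
        except Exception:
            if attempt < attempts - 1:
                sleep(base_delay * (2 ** attempt))   # 1s, then 2s, then give up
    return False
```

With `attempts=3`, a payload that fails twice is still delivered on the third try, after waiting 1s and then 2s.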
+
+ ## New Database Tables (v2.5)
+
+ | Table | Columns | Purpose |
+ |-------|---------|---------|
+ | `memory_events` | id, event_type, memory_id, source_agent, source_protocol, payload, importance, tier, created_at | Event log with tiered retention |
+ | `subscriptions` | id, subscriber_id, channel, filter, webhook_url, durable, last_event_id | Event subscriptions |
+ | `agent_registry` | id, agent_id, agent_name, protocol, first_seen, last_seen, memories_written, memories_recalled, trust_score, metadata | Agent tracking |
+ | `trust_signals` | id, agent_id, signal_type, delta, old_score, new_score, context, created_at | Trust audit trail |
+
+ ## New Columns on `memories` Table (v2.5)
+
+ | Column | Type | Default | Purpose |
+ |--------|------|---------|---------|
+ | `created_by` | TEXT | 'user' | Agent ID that created this memory |
+ | `source_protocol` | TEXT | 'cli' | Protocol used (mcp, cli, rest, python, a2a) |
+ | `trust_score` | REAL | 1.0 | Trust score at creation time |
+ | `provenance_chain` | TEXT | '[]' | JSON derivation lineage |
+
+ ---
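Because the v2.5 schema changes are additive, the upgrade reduces to idempotent `ALTER TABLE ... ADD COLUMN` statements guarded by `PRAGMA table_info`. A sketch consistent with the column table above (not the package's actual migration code):

```python
import sqlite3

NEW_COLUMNS = [
    ("created_by", "TEXT DEFAULT 'user'"),
    ("source_protocol", "TEXT DEFAULT 'cli'"),
    ("trust_score", "REAL DEFAULT 1.0"),
    ("provenance_chain", "TEXT DEFAULT '[]'"),
]

def migrate(conn):
    """Additive-only migration: add each column only if it is not already present."""
    existing = {row[1] for row in conn.execute("PRAGMA table_info(memories)")}
    for name, decl in NEW_COLUMNS:
        if name not in existing:
            conn.execute(f"ALTER TABLE memories ADD COLUMN {name} {decl}")
    conn.commit()
```

SQLite backfills existing rows with the declared default, so a v2.4.2 database keeps working with every row reading as `created_by='user'`, `trust_score=1.0`, and so on.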
+
+ ## Event Types
+
+ | Event | Trigger |
+ |-------|---------|
+ | `memory.created` | New memory written |
+ | `memory.updated` | Existing memory modified |
+ | `memory.deleted` | Memory removed |
+ | `memory.recalled` | Memory retrieved by an agent |
+ | `graph.updated` | Knowledge graph rebuilt |
+ | `pattern.learned` | New pattern detected |
+ | `agent.connected` | New agent connects |
+ | `agent.disconnected` | Agent disconnects |
+
+ ## Event Retention Tiers
+
+ | Tier | Window | Kept |
+ |------|--------|------|
+ | Hot | 0-48h | All events |
+ | Warm | 2-14d | Importance >= 5 only |
+ | Cold | 14-30d | Daily aggregates |
+ | Archive | 30d+ | Pruned (stats in pattern_learner) |
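A retention pass over these tiers could look like the sketch below; the exact queries are assumptions, and the cold-tier daily aggregation step is omitted for brevity:

```python
import sqlite3
import time

HOUR, DAY = 3600, 86400

def prune_events(conn, now=None):
    """Sketch of a tiered retention pass over memory_events."""
    now = time.time() if now is None else now
    # Warm tier (48h to 14d old): keep only events with importance >= 5
    conn.execute(
        "DELETE FROM memory_events"
        " WHERE created_at < ? AND created_at >= ? AND importance < 5",
        (now - 48 * HOUR, now - 14 * DAY),
    )
    # Archive (older than 30d): drop raw events entirely
    # (the cold tier's daily aggregates would be written elsewhere first)
    conn.execute("DELETE FROM memory_events WHERE created_at < ?", (now - 30 * DAY,))
    conn.commit()
```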
+
+ ---
+
+ ## Trust Scoring (Silent Collection — v2.5)
+
+ All agents start at trust 1.0. Signals adjust the score with a Bayesian decay factor that stabilizes over time.
+
+ | Signal | Delta | When |
+ |--------|-------|------|
+ | `high_importance_write` | +0.015 | Memory with importance >= 7 |
+ | `memory_recalled_by_others` | +0.02 | Another agent finds this memory useful |
+ | `quick_delete` | -0.03 | Memory deleted within 1 hour |
+ | `high_volume_burst` | -0.02 | >20 writes in 5 minutes |
+
+ **v2.5:** Silent collection only. **v2.6:** Trust-weighted recall ranking. **v3.0:** Active enforcement.
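The decay behavior described above can be illustrated with a toy update rule. The signal deltas come from the table; the decay schedule and the floor at 0.0 are assumptions, since the exact formula is not published here:

```python
SIGNAL_DELTAS = {
    "high_importance_write": +0.015,
    "memory_recalled_by_others": +0.02,
    "quick_delete": -0.03,
    "high_volume_burst": -0.02,
}

def update_trust(score, delta, signal_count):
    """Apply one signal with a decay factor: early signals move the score more
    than later ones, so it stabilizes over time. The 0.1 decay rate and the
    0.0 floor are assumed placeholders, not the package's actual constants."""
    decay = 1.0 / (1.0 + 0.1 * signal_count)
    return max(0.0, score + delta * decay)

# A brand-new agent (no prior signals) deleting a memory within an hour:
print(round(update_trust(1.0, SIGNAL_DELTAS["quick_delete"], signal_count=0), 3))   # 0.97

# The same signal after 10 prior signals moves the score only half as far:
print(round(update_trust(1.0, SIGNAL_DELTAS["quick_delete"], signal_count=10), 3))  # 0.985
```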
+
+ ---
+
+ ## API Endpoints (28 total)
+
+ | Route | Method | Module | Purpose |
+ |-------|--------|--------|---------|
+ | `/` | GET | ui_server.py | Dashboard |
+ | `/health` | GET | ui_server.py | Health check |
+ | `/api/memories` | GET | routes/memories.py | List memories |
+ | `/api/graph` | GET | routes/memories.py | Knowledge graph data |
+ | `/api/search` | POST | routes/memories.py | Semantic search |
+ | `/api/clusters` | GET | routes/memories.py | Cluster info |
+ | `/api/clusters/{id}` | GET | routes/memories.py | Cluster detail |
+ | `/api/stats` | GET | routes/stats.py | System statistics |
+ | `/api/timeline` | GET | routes/stats.py | Timeline aggregation |
+ | `/api/patterns` | GET | routes/stats.py | Learned patterns |
+ | `/api/profiles` | GET | routes/profiles.py | List profiles |
+ | `/api/profiles/{n}/switch` | POST | routes/profiles.py | Switch profile |
+ | `/api/profiles/create` | POST | routes/profiles.py | Create profile |
+ | `/api/profiles/{n}` | DELETE | routes/profiles.py | Delete profile |
+ | `/api/export` | GET | routes/data_io.py | Export memories |
+ | `/api/import` | POST | routes/data_io.py | Import memories |
+ | `/api/backup/*` | GET/POST | routes/backup.py | Backup management |
+ | `/events/stream` | GET | routes/events.py | SSE real-time stream |
+ | `/api/events` | GET | routes/events.py | Event polling |
+ | `/api/events/stats` | GET | routes/events.py | Event statistics |
+ | `/api/agents` | GET | routes/agents.py | Agent list |
+ | `/api/agents/stats` | GET | routes/agents.py | Agent statistics |
+ | `/api/trust/stats` | GET | routes/agents.py | Trust overview |
+ | `/api/trust/signals/{id}` | GET | routes/agents.py | Agent trust signals |
+ | `/ws/updates` | WS | routes/ws.py | WebSocket |
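SSE replay via `Last-Event-ID` falls out naturally when events have monotonically increasing IDs: the server simply queries rows newer than the client's last seen ID. A stdlib-only sketch of the frame generation (the real `/events/stream` endpoint wraps this in a polling loop behind FastAPI, seeding `last_event_id` from the request header):

```python
import sqlite3

def sse_frames(conn, last_event_id=0):
    """Format memory_events rows newer than last_event_id as SSE frames."""
    rows = conn.execute(
        "SELECT id, event_type, payload FROM memory_events WHERE id > ? ORDER BY id",
        (last_event_id,),
    )
    for eid, etype, payload in rows:
        # "id:" enables client-side resume; "event:" enables type filtering
        yield f"id: {eid}\nevent: {etype}\ndata: {payload}\n\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE memory_events (id INTEGER PRIMARY KEY, event_type TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO memory_events (event_type, payload) VALUES (?, ?)",
    [("memory.created", "{}"), ("memory.updated", "{}"), ("memory.deleted", "{}")],
)
# A reconnecting client that last saw event 1 replays only events 2 and 3:
frames = list(sse_frames(conn, last_event_id=1))
print(len(frames))  # 2
```

Polling the shared SQLite file (rather than an in-process queue) is what makes the stream work across processes: the MCP server and the UI server see the same event log.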
+
+ ---
+
+ *SuperLocalMemory V2.5 — Created by Varun Pratap Bhardwaj. MIT License.*
package/mcp_server.py CHANGED
@@ -46,6 +46,14 @@ except ImportError as e:
      print(f"Ensure SuperLocalMemory V2 is installed at {MEMORY_DIR}", file=sys.stderr)
      sys.exit(1)
 
+ # Agent Registry + Provenance (v2.5+)
+ try:
+     from agent_registry import AgentRegistry
+     from provenance_tracker import ProvenanceTracker
+     PROVENANCE_AVAILABLE = True
+ except ImportError:
+     PROVENANCE_AVAILABLE = False
+
  # Parse command line arguments early (needed for port in constructor)
  import argparse as _argparse
  _parser = _argparse.ArgumentParser(add_help=False)
@@ -63,6 +71,80 @@ mcp = FastMCP(
  # Database path
  DB_PATH = MEMORY_DIR / "memory.db"
 
+ # ============================================================================
+ # Shared singleton instances (v2.5 — fixes per-call instantiation overhead)
+ # All MCP tool handlers share one MemoryStoreV2 instance instead of creating
+ # a new one per call. This means one ConnectionManager, one TF-IDF vectorizer,
+ # one write queue — shared across all concurrent MCP requests.
+ # ============================================================================
+
+ _store = None
+ _graph_engine = None
+ _pattern_learner = None
+
+
+ def get_store() -> MemoryStoreV2:
+     """Get or create the shared MemoryStoreV2 singleton."""
+     global _store
+     if _store is None:
+         _store = MemoryStoreV2(DB_PATH)
+     return _store
+
+
+ def get_graph_engine() -> GraphEngine:
+     """Get or create the shared GraphEngine singleton."""
+     global _graph_engine
+     if _graph_engine is None:
+         _graph_engine = GraphEngine(DB_PATH)
+     return _graph_engine
+
+
+ def get_pattern_learner() -> PatternLearner:
+     """Get or create the shared PatternLearner singleton."""
+     global _pattern_learner
+     if _pattern_learner is None:
+         _pattern_learner = PatternLearner(DB_PATH)
+     return _pattern_learner
+
+
+ _agent_registry = None
+ _provenance_tracker = None
+
+
+ def get_agent_registry():
+     """Get shared AgentRegistry singleton (v2.5+). Returns None if unavailable."""
+     global _agent_registry
+     if not PROVENANCE_AVAILABLE:
+         return None
+     if _agent_registry is None:
+         _agent_registry = AgentRegistry.get_instance(DB_PATH)
+     return _agent_registry
+
+
+ def get_provenance_tracker():
+     """Get shared ProvenanceTracker singleton (v2.5+). Returns None if unavailable."""
+     global _provenance_tracker
+     if not PROVENANCE_AVAILABLE:
+         return None
+     if _provenance_tracker is None:
+         _provenance_tracker = ProvenanceTracker.get_instance(DB_PATH)
+     return _provenance_tracker
+
+
+ def _register_mcp_agent(agent_name: str = "mcp-client"):
+     """Register the calling MCP agent and record activity. Non-blocking."""
+     registry = get_agent_registry()
+     if registry:
+         try:
+             registry.register_agent(
+                 agent_id=f"mcp:{agent_name}",
+                 agent_name=agent_name,
+                 protocol="mcp",
+             )
+         except Exception:
+             pass
+
+
  # ============================================================================
  # MCP TOOLS (Functions callable by AI)
  # ============================================================================
@@ -103,8 +185,11 @@ async def remember(
          remember("JWT auth with refresh tokens", tags="security,auth", importance=8)
      """
      try:
+         # Register MCP agent (v2.5 — agent tracking)
+         _register_mcp_agent()
+
          # Use existing MemoryStoreV2 class (no duplicate logic)
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
 
          # Call existing add_memory method
          memory_id = store.add_memory(
@@ -114,6 +199,22 @@ async def remember(
              importance=importance
          )
 
+         # Record provenance (v2.5 — who created this memory)
+         prov = get_provenance_tracker()
+         if prov:
+             try:
+                 prov.record_provenance(memory_id, created_by="mcp:client", source_protocol="mcp")
+             except Exception:
+                 pass
+
+         # Track write in agent registry
+         registry = get_agent_registry()
+         if registry:
+             try:
+                 registry.record_write("mcp:mcp-client")
+             except Exception:
+                 pass
+
          # Format response
          preview = content[:100] + "..." if len(content) > 100 else content
 
@@ -174,7 +275,7 @@ async def recall(
      """
      try:
          # Use existing MemoryStoreV2 class
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
 
          # Call existing search method
          results = store.search(query, limit=limit)
@@ -223,7 +324,7 @@ async def list_recent(limit: int = 10) -> dict:
      """
      try:
          # Use existing MemoryStoreV2 class
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
 
          # Call existing list_all method
          memories = store.list_all(limit=limit)
@@ -263,7 +364,7 @@ async def get_status() -> dict:
      """
      try:
          # Use existing MemoryStoreV2 class
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
 
          # Call existing get_stats method
          stats = store.get_stats()
@@ -303,7 +404,7 @@ async def build_graph() -> dict:
      """
      try:
          # Use existing GraphEngine class
-         engine = GraphEngine(DB_PATH)
+         engine = get_graph_engine()
 
          # Call existing build_graph method
          stats = engine.build_graph()
@@ -461,7 +562,7 @@ async def search(query: str) -> dict:
          {"results": [{"id": str, "title": str, "text": str, "url": str}]}
      """
      try:
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
          raw_results = store.search(query, limit=20)
 
          results = []
@@ -504,7 +605,7 @@ async def fetch(id: str) -> dict:
          {"id": str, "title": str, "text": str, "url": str, "metadata": dict|null}
      """
      try:
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
          mem = store.get_by_id(int(id))
 
          if not mem:
@@ -549,7 +650,7 @@ async def get_recent_memories_resource(limit: str) -> str:
      Usage: memory://recent/10
      """
      try:
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
          memories = store.list_all(limit=int(limit))
          return json.dumps(memories, indent=2)
      except Exception as e:
@@ -564,7 +665,7 @@ async def get_stats_resource() -> str:
      Usage: memory://stats
      """
      try:
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
          stats = store.get_stats()
          return json.dumps(stats, indent=2)
      except Exception as e:
@@ -579,7 +680,7 @@ async def get_clusters_resource() -> str:
      Usage: memory://graph/clusters
      """
      try:
-         engine = GraphEngine(DB_PATH)
+         engine = get_graph_engine()
          stats = engine.get_stats()
          clusters = stats.get('clusters', [])
          return json.dumps(clusters, indent=2)
@@ -595,7 +696,7 @@ async def get_coding_identity_resource() -> str:
      Usage: memory://patterns/identity
      """
      try:
-         learner = PatternLearner(DB_PATH)
+         learner = get_pattern_learner()
          patterns = learner.get_identity_context(min_confidence=0.5)
          return json.dumps(patterns, indent=2)
      except Exception as e:
@@ -615,7 +716,7 @@ async def coding_identity_prompt() -> str:
          based on learned preferences and patterns.
      """
      try:
-         learner = PatternLearner(DB_PATH)
+         learner = get_pattern_learner()
          patterns = learner.get_identity_context(min_confidence=0.6)
 
          if not patterns:
@@ -656,7 +757,7 @@ async def project_context_prompt(project_name: str) -> str:
          Formatted prompt with relevant project memories
      """
      try:
-         store = MemoryStoreV2(DB_PATH)
+         store = get_store()
 
          # Search for project-related memories
          memories = store.search(f"project:{project_name}", limit=20)
@@ -711,7 +812,7 @@ if __name__ == "__main__":
      # Print startup message to stderr (stdout is used for MCP protocol)
      print("=" * 60, file=sys.stderr)
      print("SuperLocalMemory V2 - MCP Server", file=sys.stderr)
-     print("Version: 2.4.1", file=sys.stderr)
+     print("Version: 2.5.0", file=sys.stderr)
      print("=" * 60, file=sys.stderr)
      print("Created by: Varun Pratap Bhardwaj (Solution Architect)", file=sys.stderr)
      print("Repository: https://github.com/varun369/SuperLocalMemoryV2", file=sys.stderr)
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "superlocalmemory",
-   "version": "2.4.2",
+   "version": "2.5.0",
    "description": "Your AI Finally Remembers You - Local-first intelligent memory system for AI assistants. Works with Claude, Cursor, Windsurf, VS Code/Copilot, Codex, and 16+ AI tools. 100% local, zero cloud dependencies.",
    "keywords": [
      "ai-memory",