superlocalmemory 2.3.4 → 2.3.6

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -16,6 +16,28 @@ SuperLocalMemory V2 - Intelligent local memory system for AI coding assistants.
 
  ---
 
+ ## [2.3.5] - 2026-02-09
+
+ ### Added
+ - **ChatGPT Connector Support**: `search(query)` and `fetch(id)` MCP tools per OpenAI spec
+ - **Streamable HTTP transport**: `slm serve --transport streamable-http` for ChatGPT 2026+
+ - **UI: Memory detail modal**: Click any memory row to see full content, tags, metadata
+ - **UI: Dark mode toggle**: Sun/moon icon in navbar, saved to localStorage, respects system preference
+ - **UI: Export buttons**: Export All (JSON/JSONL), Export Search Results, Export individual memory as Markdown
+ - **UI: Search score bars**: Color-coded relevance bars (red/yellow/green) in search results
+ - **UI: Animated stat counters**: Numbers animate up on page load with ease-out cubic
+ - **UI: Loading spinners and empty states**: Professional feedback across all tabs
+ - npm keywords: chatgpt, chatgpt-connector, openai, deep-research
+
+ ### Fixed
+ - **XSS vulnerability**: Replaced JSON-injecting inline onclick handlers with safe event delegation
+ - **UI: Content preview**: Increased from 80 to 100 characters
+
+ ### Changed
+ - npm package now includes `ui/`, `ui_server.py`, `api_server.py`
+
+ ---
+
  ## [2.3.0] - 2026-02-08
 
  **Release Type:** Universal Integration Release
package/README.md CHANGED
@@ -268,7 +268,7 @@ slm recall "API design patterns"
  | **OpenCode** | ✅ MCP | Native MCP tools |
  | **Perplexity** | ✅ MCP | Native MCP tools |
  | **Antigravity** | ✅ MCP + Skills | Native MCP tools |
- | **ChatGPT** | ✅ HTTP Transport | `slm serve` + tunnel |
+ | **ChatGPT** | ✅ MCP Connector | `search()` + `fetch()` via HTTP tunnel |
  | **Aider** | ✅ Smart Wrapper | `aider-smart` with context |
  | **Any Terminal** | ✅ Universal CLI | `slm remember "content"` |
 
@@ -1,7 +1,16 @@
  {
-   "_README": "ChatGPT requires HTTP transport, not stdio. See docs/MCP-MANUAL-SETUP.md#chatgpt-desktop-app",
-   "_step1": "Run: slm serve --port 8001",
-   "_step2": "Expose via: ngrok http 8001 (or cloudflared tunnel)",
-   "_step3": "Add the HTTPS URL to ChatGPT -> Settings -> Apps & Connectors -> Developer Mode",
-   "_note": "100% local - your MCP server runs on YOUR machine. The tunnel just makes it reachable."
+   "_README": "ChatGPT Integration: SuperLocalMemory V2 as ChatGPT Connector",
+   "_requires": "ChatGPT Pro/Plus/Business/Enterprise with Developer Mode enabled",
+   "_tools": "search(query) and fetch(id) required by OpenAI MCP spec for Connectors",
+   "_setup": {
+     "step1": "Start MCP server: slm serve --port 8417",
+     "step2": "Expose via tunnel: ngrok http 8417",
+     "step3": "Copy the HTTPS URL from ngrok (e.g. https://abc123.ngrok.app)",
+     "step4": "In ChatGPT: Settings → Connectors → Advanced → Enable Developer Mode",
+     "step5": "Create new Connector → paste URL with /sse/ suffix (e.g. https://abc123.ngrok.app/sse/)",
+     "step6": "Name it 'SuperLocalMemory' and click Create",
+     "step7": "In any chat, click + → More → select SuperLocalMemory connector"
+   },
+   "_alternative_transport": "For Streamable HTTP (recommended): slm serve --transport streamable-http --port 8417",
+   "_note": "100% local — your MCP server runs on YOUR machine. The tunnel just makes it reachable by ChatGPT servers. All data stays in ~/.claude-memory/memory.db"
  }
@@ -85,13 +85,15 @@ Simple Storage → Intelligent Organization → Adaptive Learning
 
  The MCP server provides native integration with modern AI tools:
 
- **6 Tools:**
+ **8 Tools:**
  - `remember(content, tags, project)` - Save memories
  - `recall(query, limit)` - Search memories
  - `list_recent(limit)` - Recent memories
  - `get_status()` - System status
  - `build_graph()` - Build knowledge graph
  - `switch_profile(name)` - Change profile
+ - `search(query)` - Search memories (OpenAI MCP spec for ChatGPT Connectors)
+ - `fetch(id)` - Fetch memory by ID (OpenAI MCP spec for ChatGPT Connectors)
 
  **4 Resources:**
  - `memory://recent` - Recent memories feed
@@ -106,6 +108,7 @@ The MCP server provides native integration with modern AI tools:
  **Transport Support:**
  - stdio (default) - For local IDE integration
  - HTTP - For remote/network access
+ - streamable-http - For ChatGPT 2026+ Connectors
 
  **Key Design:** Zero duplication - calls existing `memory_store_v2.py` functions
 
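The `search`/`fetch` pair added above follows a simple contract: `search` returns lightweight hits, and `fetch` resolves one hit's `id` to full content. A minimal sketch of the two shapes (field names as described above; all sample values are hypothetical):

```python
# Hypothetical example payloads; real values come from the local memory DB.
search_response = {
    "results": [
        {"id": "42",
         "title": "decision: Use JWT for auth",
         "text": "Use JWT for auth because it keeps the API stateless...",
         "url": "memory://local/42"},
    ]
}

fetch_response = {
    "id": "42",
    "title": "decision: Use JWT for auth",
    "text": "Use JWT for auth because it keeps the API stateless.",
    "url": "memory://local/42",
    "metadata": {"tags": ["auth", "api"]},  # may be None when nothing extra is known
}

# Every search hit carries the four keys that fetch() is later addressed by.
for hit in search_response["results"]:
    assert {"id", "title", "text", "url"} <= set(hit)
```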
@@ -390,34 +390,75 @@ PYTHONPATH = "/Users/yourusername/.claude-memory"
 
  ---
 
- ### ChatGPT Desktop App
+ ### ChatGPT Desktop App / ChatGPT Connectors
 
- **Important:** ChatGPT requires HTTP transport, not stdio. You need to run a local HTTP server and expose it via a tunnel.
+ **Important:** ChatGPT requires HTTP transport, not stdio. You need to run a local HTTP server and expose it via a tunnel. As of v2.3.5, SuperLocalMemory includes `search(query)` and `fetch(id)` MCP tools required by OpenAI's MCP spec for ChatGPT Connectors and Deep Research.
 
- **Steps:**
+ **Requirements:**
+ - ChatGPT Plus, Team, or Enterprise plan
+ - **Developer Mode** must be enabled in ChatGPT settings
+ - A tunnel tool: `cloudflared` (recommended, free) or `ngrok`
+ - Reference: https://platform.openai.com/docs/mcp
+
+ **Available Tools in ChatGPT:**
+
+ | Tool | Purpose |
+ |------|---------|
+ | `search(query)` | Search memories (required by OpenAI MCP spec) |
+ | `fetch(id)` | Fetch a specific memory by ID (required by OpenAI MCP spec) |
+ | `remember(content, tags, project)` | Save a new memory |
+ | `recall(query, limit)` | Search memories with full options |
+
+ **Step-by-Step Setup:**
 
  1. **Start the MCP HTTP server:**
     ```bash
-    slm serve --port 8001
-    # or: python3 ~/.claude-memory/mcp_server.py --transport http --port 8001
+    slm serve --port 8417
+    # or with streamable-http transport (ChatGPT 2026+):
+    slm serve --port 8417 --transport streamable-http
+    # or using Python directly:
+    python3 ~/.claude-memory/mcp_server.py --transport http --port 8417
     ```
 
- 2. **Expose via tunnel** (in another terminal):
+ 2. **Expose via cloudflared tunnel** (in another terminal):
     ```bash
-    ngrok http 8001
-    # or: cloudflared tunnel --url http://localhost:8001
-    ```
+    # Install cloudflared (if not installed)
+    # macOS: brew install cloudflared
+    # Linux: sudo apt install cloudflared
 
- 3. **Copy the HTTPS URL** from ngrok/cloudflared output
-
- 4. **Add to ChatGPT:**
-    - Open ChatGPT Desktop
-    - Go to **Settings → Apps & Connectors → Developer Mode**
-    - Add the HTTPS URL as an MCP endpoint
+    cloudflared tunnel --url http://localhost:8417
+    ```
+    Cloudflared will output a URL like `https://random-name.trycloudflare.com`
 
- 5. In a new chat, look for MCP tools in the tool selector
+ **Alternative: ngrok**
+    ```bash
+    ngrok http 8417
+    ```
 
- **Note:** 100% local — your MCP server runs on YOUR machine. The tunnel just makes it reachable by ChatGPT.
+ 3. **Copy the HTTPS URL** from cloudflared/ngrok output
+
+ 4. **Add to ChatGPT as a Connector:**
+    - Open ChatGPT (desktop or web)
+    - Go to **Settings → Connectors** (or **Settings → Apps & Connectors → Developer Mode**)
+    - Click **"Add Connector"**
+    - Paste the HTTPS URL with the `/sse/` suffix:
+      ```
+      https://random-name.trycloudflare.com/sse/
+      ```
+    - Name it: `SuperLocalMemory`
+    - Click **Save**
+
+ 5. **Verify in a new chat:**
+    - Start a new conversation in ChatGPT
+    - Look for the SuperLocalMemory connector in the tool selector
+    - Try: "Search my memories for authentication decisions"
+    - ChatGPT will call `search()` and return your local memories
+
+ **Important Notes:**
+ - The `/sse/` suffix on the URL is **required** by ChatGPT's MCP implementation
+ - 100% local — your MCP server runs on YOUR machine. The tunnel just makes it reachable by ChatGPT. Your data is served on demand and never stored by OpenAI beyond the conversation.
+ - The tunnel URL changes each time you restart cloudflared (unless you set up a named tunnel)
+ - For persistent URLs, configure a cloudflared named tunnel: `cloudflared tunnel create slm`
 
  **Auto-configured by install.sh:** ❌ No (requires HTTP transport + tunnel)
 
@@ -690,7 +731,7 @@ In your IDE/app, check:
 
  ## Available MCP Tools
 
- Once configured, these 6 tools are available:
+ Once configured, these 8 tools are available:
 
  | Tool | Purpose | Example Usage |
  |------|---------|---------------|
@@ -700,6 +741,10 @@ Once configured, these 6 tools are available:
  | `get_status()` | System health | "How many memories do I have?" |
  | `build_graph()` | Build knowledge graph | "Build the knowledge graph" |
  | `switch_profile()` | Change profile | "Switch to work profile" |
+ | `search()` | Search memories (OpenAI MCP spec) | Used by ChatGPT Connectors and Deep Research |
+ | `fetch()` | Fetch memory by ID (OpenAI MCP spec) | Used by ChatGPT Connectors and Deep Research |
+
+ **Note:** `search()` and `fetch()` are required by OpenAI's MCP specification for ChatGPT Connectors. They are available in all transports but primarily used by ChatGPT.
 
  Plus **2 MCP prompts** and **4 MCP resources** for advanced use.
 
package/docs/UI-SERVER.md CHANGED
@@ -22,7 +22,15 @@ Web-based visualization interface for exploring the SuperLocalMemory knowledge g
  - **Cluster Analysis**: Visual breakdown of thematic clusters with top entities
  - **Pattern Viewer**: Display learned preferences and coding styles
  - **Timeline**: Chart showing memory creation over time
- - **Statistics Dashboard**: Real-time system metrics
+ - **Statistics Dashboard**: Real-time system metrics with animated stat counters (ease-out cubic on page load)
+ - **Memory Detail Modal**: Click any memory row to see full content, metadata, and tags in an overlay modal
+ - **Dark Mode Toggle**: Sun/moon icon in the navbar; preference saved to localStorage and respects system preference (`prefers-color-scheme`)
+ - **Export Functionality**:
+   - **Export All**: Download entire memory database as JSON or JSONL
+   - **Export Search Results**: Export current search results to JSON
+   - **Export as Markdown**: Export an individual memory as a Markdown file
+ - **Search Score Bars**: Color-coded relevance bars in search results (red for low, yellow for medium, green for high relevance)
+ - **Loading Spinners and Empty States**: Professional loading feedback and informative empty states across all tabs
 
  ## Quick Start
 
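The export list above distinguishes JSON from JSONL: JSON is a single array document, while JSONL puts one object per line, which suits streaming and line-oriented tools. A quick illustration with made-up records:

```python
import json

# Hypothetical memory records, standing in for the exported database rows.
memories = [
    {"id": 1, "content": "Prefer pytest over unittest"},
    {"id": 2, "content": "API rate limit is 100 req/min"},
]

as_json = json.dumps(memories, indent=2)               # one pretty-printed array
as_jsonl = "\n".join(json.dumps(m) for m in memories)  # one compact object per line

# JSONL round-trips line by line, without parsing the whole file.
assert len(as_jsonl.splitlines()) == len(memories)
assert json.loads(as_jsonl.splitlines()[0])["id"] == 1
```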
@@ -242,9 +250,9 @@ This UI server is intended for **local development** only.
 
  - Add authentication for multi-user access
  - Implement real-time updates via WebSockets
- - Add export functionality (JSON, CSV)
  - Create memory editing interface
  - Build cluster visualization with hierarchical layout
+ - Add CSV export format alongside existing JSON/JSONL/Markdown exports
 
  ## Support
 
@@ -231,6 +231,62 @@ aider-smart "Add authentication to the API"
  3. Passes context to Aider
  4. Aider gets relevant memories without you asking
 
+ ### ChatGPT (Connectors / Deep Research) ✅
+
+ **How It Works:** HTTP transport with `search()` and `fetch()` MCP tools per OpenAI spec. Requires a tunnel to expose the local server to ChatGPT.
+
+ **Requirements:**
+ - ChatGPT Plus, Team, or Enterprise plan
+ - Developer Mode enabled in ChatGPT settings
+ - `cloudflared` (recommended) or `ngrok` for tunneling
+ - Reference: https://platform.openai.com/docs/mcp
+
+ **Setup:**
+
+ ```bash
+ # Terminal 1: Start MCP server
+ slm serve --port 8417
+
+ # Terminal 2: Start tunnel
+ cloudflared tunnel --url http://localhost:8417
+ ```
+
+ Then in ChatGPT:
+ 1. Go to **Settings → Connectors**
+ 2. Click **"Add Connector"**
+ 3. Paste the HTTPS URL from cloudflared with `/sse/` suffix:
+    ```
+    https://random-name.trycloudflare.com/sse/
+    ```
+ 4. Name it `SuperLocalMemory` and save
+
+ **Available Tools in ChatGPT:**
+
+ | Tool | Purpose |
+ |------|---------|
+ | `search(query)` | Search memories (required by OpenAI MCP spec) |
+ | `fetch(id)` | Fetch a specific memory by ID (required by OpenAI MCP spec) |
+ | `remember(content, tags, project)` | Save a new memory |
+ | `recall(query, limit)` | Search memories with full options |
+
+ **Usage Examples:**
+ ```
+ User: "Search my memories for database decisions"
+ ChatGPT: [calls search("database decisions")]
+ → Returns matching memories from your local database
+
+ User: "What's memory #42 about?"
+ ChatGPT: [calls fetch(42)]
+ → Returns full content, tags, and metadata for memory 42
+ ```
+
+ **Notes:**
+ - 100% local — your data is served on demand and never stored beyond the conversation
+ - The tunnel URL changes on restart unless you configure a named cloudflared tunnel
+ - For streamable-http transport (ChatGPT 2026+): `slm serve --port 8417 --transport streamable-http`
+
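The usage examples above describe a two-step flow: ChatGPT calls `search()` to get candidate ids, then `fetch()` for full content. A self-contained sketch of that flow against an in-memory stub (the stub and its sample record are hypothetical; the real tools wrap the local database):

```python
# Hypothetical in-memory stand-in for the local memory database.
MEMORIES = {
    42: {"content": "Database decision: chose PostgreSQL for JSONB support",
         "tags": ["database", "architecture"]},
}

def search(query: str) -> dict:
    """Return lightweight hits for any memory matching a query word."""
    words = query.lower().split()
    hits = [
        {"id": str(i), "title": m["content"][:60],
         "text": m["content"][:200], "url": f"memory://local/{i}"}
        for i, m in MEMORIES.items()
        if any(w in m["content"].lower() for w in words)
    ]
    return {"results": hits}

def fetch(id: str) -> dict:
    """Resolve a hit id from search() to the full record."""
    m = MEMORIES[int(id)]
    return {"id": str(id), "title": m["content"][:60], "text": m["content"],
            "url": f"memory://local/{id}", "metadata": {"tags": m["tags"]}}

# Two-step flow: search first, then fetch by the returned id.
hit = search("database decisions")["results"][0]
full = fetch(hit["id"])
assert "PostgreSQL" in full["text"]
```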
+ ---
+
  ### Any Terminal / Script ✅
 
  **How It Works:** Universal CLI wrapper
@@ -0,0 +1,70 @@
+ #!/usr/bin/env node
+ /**
+  * SuperLocalMemory V2 - Session Start Context Hook
+  * Copyright (c) 2026 Varun Pratap Bhardwaj
+  * Licensed under MIT License
+  *
+  * Loads recent memories and learned patterns on Claude Code session start.
+  * Outputs context to stderr (Claude Code reads hook stderr as context).
+  * Fails gracefully — never blocks session start if DB is missing or errors occur.
+  */
+
+ const { execFile } = require('child_process');
+ const { promisify } = require('util');
+ const path = require('path');
+ const fs = require('fs');
+
+ const execFileAsync = promisify(execFile);
+
+ const MEMORY_DIR = path.join(process.env.HOME, '.claude-memory');
+ const DB_PATH = path.join(MEMORY_DIR, 'memory.db');
+ const MEMORY_SCRIPT = path.join(MEMORY_DIR, 'memory_store_v2.py');
+
+ async function loadSessionContext() {
+   // Fail gracefully if not installed
+   if (!fs.existsSync(DB_PATH)) {
+     return;
+   }
+
+   if (!fs.existsSync(MEMORY_SCRIPT)) {
+     return;
+   }
+
+   try {
+     // Get stats (memory_store_v2.py stats → JSON output)
+     const { stdout: statsOutput } = await execFileAsync('python3', [
+       MEMORY_SCRIPT, 'stats'
+     ], { timeout: 5000 });
+
+     // Get recent memories (memory_store_v2.py list <limit>)
+     const { stdout: recentOutput } = await execFileAsync('python3', [
+       MEMORY_SCRIPT, 'list', '5'
+     ], { timeout: 5000 });
+
+     // Build context output
+     let context = '';
+
+     if (statsOutput && statsOutput.trim()) {
+       try {
+         const stats = JSON.parse(statsOutput.trim());
+         const total = stats.total_memories || 0;
+         const clusters = stats.total_clusters || 0;
+         if (total > 0) {
+           context += 'SuperLocalMemory: ' + total + ' memories, ' + clusters + ' clusters loaded.\n';
+         }
+       } catch (e) {
+         // Stats output wasn't JSON — use first line as-is
+         context += 'SuperLocalMemory: ' + statsOutput.trim().split('\n')[0] + '\n';
+       }
+     }
+
+     // Include the recent memories fetched above (otherwise recentOutput is unused)
+     if (recentOutput && recentOutput.trim()) {
+       context += 'Recent memories:\n' + recentOutput.trim() + '\n';
+     }
+
+     if (context) {
+       process.stderr.write(context);
+     }
+   } catch (error) {
+     // Never fail — session start must not be blocked
+     // Silently ignore errors (timeout, missing python, etc.)
+   }
+ }
+
+ loadSessionContext();
package/mcp_server.py CHANGED
@@ -374,6 +374,108 @@ async def switch_profile(name: str) -> dict:
  }
 
 
+ # ============================================================================
+ # CHATGPT CONNECTOR TOOLS (search + fetch — required by OpenAI MCP spec)
+ # These two tools are required for ChatGPT Connectors and Deep Research.
+ # They wrap existing SuperLocalMemory search/retrieval logic.
+ # Ref: https://platform.openai.com/docs/mcp
+ # ============================================================================
+
+ @mcp.tool(annotations=ToolAnnotations(
+     readOnlyHint=True,
+     destructiveHint=False,
+     openWorldHint=False,
+ ))
+ async def search(query: str) -> dict:
+     """
+     Search for documents in SuperLocalMemory.
+
+     Required by ChatGPT Connectors and Deep Research.
+     Returns a list of search results with id, title, text snippet, and url.
+
+     Args:
+         query: Search query string. Natural language queries work best.
+
+     Returns:
+         {"results": [{"id": str, "title": str, "text": str, "url": str}]}
+     """
+     try:
+         store = MemoryStoreV2(DB_PATH)
+         raw_results = store.search(query, limit=20)
+
+         results = []
+         for r in raw_results:
+             if r.get('score', 0) < 0.2:
+                 continue
+             content = r.get('content', '') or r.get('summary', '') or ''
+             snippet = content[:200] + "..." if len(content) > 200 else content
+             mem_id = str(r.get('id', ''))
+             title = r.get('category', 'Memory') + ': ' + (content[:60].replace('\n', ' ') if content else 'Untitled')
+             results.append({
+                 "id": mem_id,
+                 "title": title,
+                 "text": snippet,
+                 "url": f"memory://local/{mem_id}"
+             })
+
+         return {"results": results}
+
+     except Exception as e:
+         return {"results": [], "error": str(e)}
+
+
+ @mcp.tool(annotations=ToolAnnotations(
+     readOnlyHint=True,
+     destructiveHint=False,
+     openWorldHint=False,
+ ))
+ async def fetch(id: str) -> dict:
+     """
+     Retrieve full content of a memory by ID.
+
+     Required by ChatGPT Connectors and Deep Research.
+     Use after search() to get complete document content for analysis and citation.
+
+     Args:
+         id: Memory ID from search results.
+
+     Returns:
+         {"id": str, "title": str, "text": str, "url": str, "metadata": dict|null}
+     """
+     try:
+         store = MemoryStoreV2(DB_PATH)
+         mem = store.get_by_id(int(id))
+
+         if not mem:
+             raise ValueError(f"Memory with ID {id} not found")
+
+         content = mem.get('content', '') or mem.get('summary', '') or ''
+         title = (mem.get('category', 'Memory') or 'Memory') + ': ' + (content[:60].replace('\n', ' ') if content else 'Untitled')
+
+         metadata = {}
+         if mem.get('tags'):
+             metadata['tags'] = mem['tags']
+         if mem.get('project_name'):
+             metadata['project'] = mem['project_name']
+         if mem.get('importance'):
+             metadata['importance'] = mem['importance']
+         if mem.get('cluster_id'):
+             metadata['cluster_id'] = mem['cluster_id']
+         if mem.get('created_at'):
+             metadata['created_at'] = mem['created_at']
+
+         return {
+             "id": str(id),
+             "title": title,
+             "text": content,
+             "url": f"memory://local/{id}",
+             "metadata": metadata if metadata else None
+         }
+
+     except Exception as e:
+         raise ValueError(f"Failed to fetch memory {id}: {str(e)}")
+
+
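The snippet/title shaping inside `search()` above can be exercised on its own; a standalone sketch mirroring that logic (the sample record is hypothetical, and no `MemoryStoreV2` is needed):

```python
def shape_result(r: dict) -> dict:
    # Same truncation and title rules as the search() tool above.
    content = r.get('content', '') or r.get('summary', '') or ''
    snippet = content[:200] + "..." if len(content) > 200 else content
    mem_id = str(r.get('id', ''))
    title = r.get('category', 'Memory') + ': ' + (
        content[:60].replace('\n', ' ') if content else 'Untitled')
    return {"id": mem_id, "title": title, "text": snippet,
            "url": f"memory://local/{mem_id}"}

# Hypothetical record resembling a memory row.
sample = {"id": 7, "category": "decision",
          "content": "Use JWT for auth\nbecause it is stateless"}
shaped = shape_result(sample)
assert shaped["title"] == "decision: Use JWT for auth because it is stateless"
assert shaped["url"] == "memory://local/7"
```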
  # ============================================================================
  # MCP RESOURCES (Data endpoints)
  # ============================================================================
@@ -532,9 +634,9 @@ if __name__ == "__main__":
      )
      parser.add_argument(
          "--transport",
-         choices=["stdio", "http", "sse"],
+         choices=["stdio", "http", "sse", "streamable-http"],
          default="stdio",
-         help="Transport method: stdio for local IDEs (default), http/sse for remote access"
+         help="Transport method: stdio for local IDEs (default), sse/streamable-http for ChatGPT and remote access"
      )
      parser.add_argument(
          "--port",
@@ -565,6 +667,8 @@ if __name__ == "__main__":
      print("MCP Tools Available:", file=sys.stderr)
      print(" - remember(content, tags, project, importance)", file=sys.stderr)
      print(" - recall(query, limit, min_score)", file=sys.stderr)
+     print(" - search(query) [ChatGPT Connector]", file=sys.stderr)
+     print(" - fetch(id) [ChatGPT Connector]", file=sys.stderr)
      print(" - list_recent(limit)", file=sys.stderr)
      print(" - get_status()", file=sys.stderr)
      print(" - build_graph()", file=sys.stderr)
@@ -588,8 +692,14 @@ if __name__ == "__main__":
      if args.transport == "stdio":
          # stdio transport for local IDEs (default)
          mcp.run(transport="stdio")
+     elif args.transport == "streamable-http":
+         # Streamable HTTP transport (recommended for ChatGPT 2026+)
+         print(f"Streamable HTTP server at http://localhost:{args.port}", file=sys.stderr)
+         print("ChatGPT setup: expose via ngrok, paste URL in Settings > Connectors", file=sys.stderr)
+         mcp.run(transport="streamable-http")
      else:
          # SSE transport for remote access (ChatGPT, web clients)
          # "http" is accepted as alias for "sse"
          print(f"HTTP/SSE server will be available at http://localhost:{args.port}", file=sys.stderr)
+         print("ChatGPT setup: expose via ngrok, paste URL in Settings > Connectors", file=sys.stderr)
          mcp.run(transport="sse")
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "superlocalmemory",
-   "version": "2.3.4",
+   "version": "2.3.6",
    "description": "Your AI Finally Remembers You - Local-first intelligent memory system for AI assistants. Works with Claude, Cursor, Windsurf, VS Code/Copilot, Codex, and 16+ AI tools. 100% local, zero cloud dependencies.",
    "keywords": [
      "ai-memory",
@@ -18,7 +18,11 @@
      "claude-desktop",
      "ai-tools",
      "memory-extension",
-     "local-ai"
+     "local-ai",
+     "chatgpt",
+     "chatgpt-connector",
+     "openai",
+     "deep-research"
    ],
    "author": {
      "name": "Varun Pratap Bhardwaj",