prism-mcp-server 12.5.4 → 12.5.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -146,7 +146,7 @@ Prism v11.6.0 introduces **production-grade agent infrastructure** for running m
 
 ## 🔬 <a name="deep-research-intelligence"></a>v11.5.1 Deep Research Intelligence (Auto-Scholar)
 
- Prism v11.5.1 transforms your AI agent from a "Coder" into a "Clinical Scientist." It features a **Tavily-Enhanced Multi-Provider Discovery Pipeline** that grounds Gemini 2.5 Flash's thinking in real-world empirical data.
+ Prism v11.5.1 transforms your AI agent from a "Coder" into a "Clinical Scientist." It features a **Multi-Provider Discovery Pipeline** (Brave + Firecrawl + Google Scholar) that grounds Gemini 2.5 Flash's thinking in real-world empirical data.
 
 ### 🥊 The Global Benchmarks: Prism v11 vs. Standard RAG
 
@@ -176,7 +176,7 @@ Prism features a cutting-edge **Zero-Search Retrieval** system for its cognitive
 
 ### 🔍 Supported Discovery Engines & Databases
 
- 1. **Tavily AI** (Elite): Primary discovery engine for AI-native deep crawling and PDF/Abstract extraction.
+ 1. **Brave Search + Firecrawl** (Primary): Web search + deep scraping for full-text article extraction.
 2. **PubMed (NCBI)** (Clinical): The world's largest biomedical database for clinical citations.
 3. **ERIC (Education Research)** (Behavioral): The definitive database for ABA and pediatric interventions.
 4. **Semantic Scholar** (Academic): AI-powered research tool providing "TLDR" summaries of 200M+ papers.
@@ -187,6 +187,16 @@ Prism features a cutting-edge **Zero-Search Retrieval** system for its cognitive
 
 ### 🏥 Flagship Implementation: [Synalux](https://synalux.ai)
 **Synalux** is a high-compliance, local-first Practice Management System for ABA and Pediatrics. It is the flagship implementation of the Prism v11.5.1 engine, utilizing **Zero-Search Retrieval** and **Parallel Academic Discovery** to provide clinicians with real-time, evidence-based reasoning.
 
+ | Resource | Link |
+ |----------|------|
+ | **Synalux Docs** | [github.com/dcostenco/synalux-docs](https://github.com/dcostenco/synalux-docs) |
+ | **Synalux Platform** | [synalux.ai](https://synalux.ai) |
+ | **Prism Coder 7B (HuggingFace)** | [huggingface.co/dcostenco/prism-coder-7b](https://huggingface.co/dcostenco/prism-coder-7b) |
+ | **BFCL Submission** | [PR #1332](https://github.com/ShishirPatil/gorilla/pull/1332) |
+ | **Changelog** | [synalux-docs/CHANGELOG.md](https://github.com/dcostenco/synalux-docs/blob/main/CHANGELOG.md) |
+ | **Roadmap** | [synalux-docs/ROADMAP.md](https://github.com/dcostenco/synalux-docs/blob/main/ROADMAP.md) |
+ | **PrismAAC** | [github.com/dcostenco/prism-aac](https://github.com/dcostenco/prism-aac) — AAC web app for children with motor impairments |
+
 ---
 
 <details>
@@ -287,7 +297,7 @@ Then open `http://localhost:3001` instead.
 | **Ledger compaction** | ✅ `prism-coder:7b` via Ollama | ✅ Text provider key |
 | **Task routing (LLM tiebreaker)** | ✅ `prism-coder:7b` via Ollama | N/A (heuristic-only) |
 | Morning Briefings | ❌ | ✅ Text provider key |
- | Web Scholar research | ❌ | ✅ [`BRAVE_API_KEY`](#environment-variables) + [`FIRECRAWL_API_KEY`](#environment-variables) (or `TAVILY_API_KEY`) |
+ | Web Scholar research | ❌ | ✅ [`BRAVE_API_KEY`](#environment-variables) + [`FIRECRAWL_API_KEY`](#environment-variables) |
 | VLM image captioning | ❌ | ✅ Provider key |
 | Autonomous Pipelines (Dark Factory) | ❌ | ✅ Text provider key |
 
@@ -318,6 +328,9 @@ Then open `http://localhost:3001` instead.
 
 ---
 
+ <details>
+ <summary><strong>📖 Setup Guides</strong> — MCP client configs, schema migrations, troubleshooting <em>(technical · click to expand)</em></summary>
+
 ## <a name="setup-guides"></a>📖 Setup Guides
 
 <details>
@@ -599,6 +612,8 @@ Prism can be deployed natively to cloud platforms like [Render](https://render.c
 > ```
 > Claude Code users can use the `.clauderules` auto-load hook shown in the [Setup Guides](#setup-guides). Prism also has a **server-side fallback** (v5.2.1+) that auto-pushes context after 10 seconds if no load is detected.
 
+ </details>
+
 ---
 
 ## <a name="universal-import-bring-your-history"></a>📥 Universal Import: Bring Your History
@@ -1338,6 +1353,9 @@ prism scm dora --repo synalux/portal --period 2024-Q4
 
 ---
 
+ <details>
+ <summary><strong>💻 CLI Reference</strong> — commands for CI/CD, scripts, and non-MCP environments <em>(technical · click to expand)</em></summary>
+
 ## <a name="cli-reference"></a>💻 CLI Reference
 
 Prism includes a CLI for environments where MCP tools aren't available (CI/CD pipelines, Bash scripts, non-MCP IDEs like Antigravity).
@@ -1369,6 +1387,13 @@ prism verify generate # Bless current rubric as canonic
 
 ---
 
+ </details>
+
+ ---
+
+ <details>
+ <summary><strong>🔧 Tool Reference</strong> — full schema for all 30+ MCP tools <em>(technical · click to expand)</em></summary>
+
 ## <a name="tool-reference"></a>🔧 Tool Reference
 
 Prism ships 30+ tools, but **90% of your workflow uses just three:**
@@ -1502,6 +1527,13 @@ Requires `PRISM_DARK_FACTORY_ENABLED=true`.
 
 ---
 
+ </details>
+
+ ---
+
+ <details>
+ <summary><strong>⚙️ Environment Variables</strong> — full config reference, env vars, dashboard settings <em>(technical · click to expand)</em></summary>
+
 ## <a name="environment-variables"></a>Environment Variables
 
 > **🚦 TL;DR — Just want the best experience fast?** Two options:
@@ -1512,7 +1544,7 @@ Requires `PRISM_DARK_FACTORY_ENABLED=true`.
 > # Option B: Cloud-powered (best quality)
 > GOOGLE_API_KEY=... # Unlocks: Gemini embeddings, Morning Briefings, auto-compaction
 > BRAVE_API_KEY=... # Unlocks: Web Scholar research + Brave Answers
- > FIRECRAWL_API_KEY=... # Unlocks: Web Scholar deep scraping (or use TAVILY_API_KEY instead)
+ > FIRECRAWL_API_KEY=... # Unlocks: Web Scholar deep scraping
 > ```
 > **Zero keys = zero problem.** Core session memory, keyword search, semantic search (local embeddings), time travel, and the full dashboard work 100% offline. Cloud keys are optional power-ups.
 
@@ -1522,8 +1554,7 @@ Requires `PRISM_DARK_FACTORY_ENABLED=true`.
 | Variable | Required | Description |
 |----------|----------|-------------|
 | `BRAVE_API_KEY` | No | Brave Search Pro API key |
- | `FIRECRAWL_API_KEY` | No | Firecrawl API key — required for Web Scholar (unless using Tavily) |
- | `TAVILY_API_KEY` | No | Tavily Search API key — alternative to Brave+Firecrawl for Web Scholar |
+ | `FIRECRAWL_API_KEY` | No | Firecrawl API key — required for Web Scholar deep scraping |
 | `PRISM_STORAGE` | No | `"local"` (default) or `"supabase"` — restart required |
 | `PRISM_ENABLE_HIVEMIND` | No | `"true"` to enable multi-agent tools — restart required |
 | `PRISM_INSTANCE` | No | Instance name for multi-server PID isolation |
@@ -1567,6 +1598,13 @@ Some configurations are stored dynamically in SQLite (`system_settings` table) a
 
 ---
 
+ </details>
+
+ ---
+
+ <details>
+ <summary><strong>🏗️ Architecture</strong> — startup sequence, storage layers, auto-load mechanics <em>(technical · click to expand)</em></summary>
+
 ## <a name="architecture"></a>Architecture
 
 Prism is a **stdio-based MCP server** that manages persistent agent memory. Here's how the pieces fit together:
@@ -1643,6 +1681,13 @@ All platforms benefit from the **server-side fallback** (v5.2.1): if `session_lo
 
 ---
 
+ </details>
+
+ ---
+
+ <details>
+ <summary><strong>🧬 Scientific Foundation</strong> — cognitive science citations, ACT-R, HRR, peer-reviewed models <em>(technical · click to expand)</em></summary>
+
 ## <a name="scientific-foundation"></a>🧬 Scientific Foundation
 
 Prism has evolved from smart session logging into a **cognitive memory architecture** — grounded in real research, not marketing. Every retrieval decision is backed by peer-reviewed models from cognitive psychology, neuroscience, and distributed computing.
@@ -1703,6 +1748,10 @@ The core unbinding engine is verified via Synalux's cognitive testing suite:
 
 ---
 
+ </details>
+
+ ---
+
 ## 💼 B2B Consulting & Enterprise Support
 
 Prism MCP is open-source and free for individual developers. For teams and enterprises building autonomous AI workflows or integrating MCP-native memory at scale, we offer professional consulting and setup packages.
@@ -100,9 +100,10 @@ export function buildVSCodePrompt(identity) {
 ].join('\n');
 }
 // ─── Input Sanitization ───────────────────────────────────────
- /** Strip XML-like tags that could hijack system instructions */
+ import { sanitizeMcpOutput } from './utils/sanitizer.js';
+ /** Strip XML-like tags that could hijack system instructions — delegates to shared sanitizer */
 export function sanitizeUserInput(text) {
- return text.replace(/<\/?(?:anti_pattern|desired_pattern|system|user_input|instruction)[^>]*>/gi, '');
+ return sanitizeMcpOutput(text);
 }
 /** Wrap user input in <user_input> tags after sanitization */
 export function wrapUserInput(text) {
package/dist/config.js CHANGED
@@ -78,17 +78,25 @@ export const VOYAGE_API_KEY = process.env.VOYAGE_API_KEY;
 // Get yours at: https://developers.google.com/custom-search/v1/overview
 export const GOOGLE_SEARCH_API_KEY = process.env.GOOGLE_SEARCH_API_KEY;
 export const GOOGLE_SEARCH_CX = process.env.GOOGLE_SEARCH_CX;
- // ─── v2.0 / v12.1: Storage Backend Selection ────────────────
- // Both "local" (SQLite) and "supabase" (PostgreSQL) are implemented.
+ // ─── v2.0 / v12.1 / v13: Storage Backend Selection ──────────
+ // Three backends are implemented:
+ //   "local"    — SQLite, fully offline. Free-tier default.
+ //   "supabase" — direct Supabase REST. Legacy direct-write path. Deprecated for paid tiers.
+ //   "synalux"  — thin HTTP client of the synalux portal. Paid-tier default. Mediates project
+ //                validation, tier gating, and audit. See SynaluxStorage.
 //
- // v12.1: Default changed from "local" to "auto".
- //   "auto" = prefer Supabase when credentials are resolvable, else local.
- //   "local" = forced local SQLite (free tier, HIPAA/PRISM_STRICT_LOCAL_MODE).
- //   "supabase" = forced Supabase (error if credentials missing).
- //
- // Set PRISM_STORAGE=local to force local SQLite.
- // Set PRISM_STORAGE=supabase to force Supabase REST API.
+ // Auto-resolution (PRISM_STORAGE=auto, the default) picks in this order:
+ //   1. PRISM_FORCE_LOCAL=true → "local" (overrides everything)
+ //   2. PRISM_SYNALUX_BASE_URL + PRISM_SYNALUX_API_KEY set → "synalux"
+ //   3. SUPABASE_URL + SUPABASE_KEY set → "supabase" (legacy)
+ //   4. else → "local"
 export const PRISM_STORAGE = process.env.PRISM_STORAGE || "auto";
+ /**
+  * Hard override — when true, forces local SQLite regardless of any cloud
+  * credentials. Used by free-tier installs and HIPAA deployments that must
+  * never touch the network for memory operations.
+  */
+ export const PRISM_FORCE_LOCAL = process.env.PRISM_FORCE_LOCAL === "true";
 // Logged at debug level — see debug() at bottom of file
 // ─── Optional: Supabase (Session Memory Module) ───────────────
 // When both SUPABASE_URL and SUPABASE_KEY are set, session memory tools
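The four-step auto-resolution order documented above can be sketched as a pure function. `resolveStorageBackend` is an illustrative name, not an export of the package; the shipped resolver lives in the storage layer and reads the exported config constants rather than `process.env` directly.

```javascript
function resolveStorageBackend(env) {
  // An explicit PRISM_STORAGE value wins outright ("local" | "supabase" | "synalux").
  if (env.PRISM_STORAGE && env.PRISM_STORAGE !== "auto") return env.PRISM_STORAGE;
  if (env.PRISM_FORCE_LOCAL === "true") return "local";                          // 1. hard override
  if (env.PRISM_SYNALUX_BASE_URL && env.PRISM_SYNALUX_API_KEY) return "synalux"; // 2. paid-tier default
  if (env.SUPABASE_URL && env.SUPABASE_KEY) return "supabase";                   // 3. legacy direct-write
  return "local";                                                                // 4. offline fallback
}
```

Because the check order is strict, setting `PRISM_FORCE_LOCAL=true` keeps a HIPAA deployment on SQLite even if cloud credentials are present in the environment.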
@@ -117,6 +125,16 @@ export const SUPABASE_KEY = sanitizeEnv(process.env.SUPABASE_KEY);
 export const SUPABASE_CONFIGURED = !!SUPABASE_URL &&
 !!SUPABASE_KEY &&
 isHttpUrl(SUPABASE_URL);
+ // ─── Synalux Cloud Backend (thin-client mode) ───────────────
+ // When PRISM_SYNALUX_BASE_URL + PRISM_SYNALUX_API_KEY are set, the MCP
+ // becomes a thin HTTP client of the synalux portal. This is the paid-tier
+ // default. Synalux portal owns project validation, tier gating, audit logs,
+ // and hivemind agent coordination.
+ export const PRISM_SYNALUX_BASE_URL = sanitizeEnv(process.env.PRISM_SYNALUX_BASE_URL);
+ export const PRISM_SYNALUX_API_KEY = sanitizeEnv(process.env.PRISM_SYNALUX_API_KEY);
+ export const SYNALUX_CONFIGURED = !!PRISM_SYNALUX_BASE_URL &&
+ !!PRISM_SYNALUX_API_KEY &&
+ isHttpUrl(PRISM_SYNALUX_BASE_URL);
 if (process.env.SUPABASE_URL && !SUPABASE_URL) {
 console.error("Warning: SUPABASE_URL appears unresolved/empty (e.g. template placeholder). Falling back to local storage unless explicitly fixed.");
 }
@@ -179,13 +197,10 @@ export const PRISM_SCHEDULER_INTERVAL_MS = parseInt(process.env.PRISM_SCHEDULER_
 );
 // ─── v5.4: Autonomous Web Scholar ─────────────────────────────
 // Background LLM research pipeline powered by Brave Search + Firecrawl.
- // Tavily can be used as an alternative when TAVILY_API_KEY is set.
- // Defaults are conservative to prevent runaway API costs.
 export const FIRECRAWL_API_KEY = process.env.FIRECRAWL_API_KEY;
- export const TAVILY_API_KEY = process.env.TAVILY_API_KEY;
- export const PRISM_SCHOLAR_ENABLED = process.env.PRISM_SCHOLAR_ENABLED === "true"; // Opt-in
- if (PRISM_SCHOLAR_ENABLED && !FIRECRAWL_API_KEY && !TAVILY_API_KEY) {
- console.error("Warning: Neither FIRECRAWL_API_KEY nor TAVILY_API_KEY is set. Web Scholar will fall back to free search.");
+ export const PRISM_SCHOLAR_ENABLED = process.env.PRISM_SCHOLAR_ENABLED === "true";
+ if (PRISM_SCHOLAR_ENABLED && !FIRECRAWL_API_KEY) {
+ console.error("Warning: FIRECRAWL_API_KEY not set. Web Scholar will fall back to free search.");
 }
 export const PRISM_SCHOLAR_INTERVAL_MS = parseInt(process.env.PRISM_SCHOLAR_INTERVAL_MS || "0", 10 // Default manual-only
 );
@@ -17,6 +17,9 @@ let config = {
 const activeTasks = new Map();
 const taskHistory = [];
 export function configureDelegate(updates) {
+ if (updates.endpoint && !updates.endpoint.startsWith('https://') && !updates.endpoint.includes('localhost')) {
+     throw new Error('Cloud delegate endpoint must use HTTPS');
+ }
 config = { ...config, ...updates };
 debugLog(`Cloud Delegate: Configured → ${config.endpoint}`);
 }
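The new guard in `configureDelegate` can be exercised in isolation. `assertSafeDelegateEndpoint` is a hypothetical helper name mirroring the inlined check: plain-HTTP endpoints are rejected unless they point at localhost (useful for local delegates during development).

```javascript
// Standalone sketch of the inlined endpoint guard (illustrative name).
function assertSafeDelegateEndpoint(endpoint) {
  if (endpoint && !endpoint.startsWith('https://') && !endpoint.includes('localhost')) {
    throw new Error('Cloud delegate endpoint must use HTTPS');
  }
  return endpoint; // unchanged when valid (or absent)
}
```

Note the check is substring-based: any endpoint whose URL merely contains "localhost" passes, so it is a guard against accidental plaintext configuration rather than a hard security boundary.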
@@ -116,6 +119,8 @@ export async function dispatchTask(taskId) {
 }
 // Move to history
 taskHistory.push({ ...task });
+ if (taskHistory.length > 1000)
+     taskHistory.splice(0, taskHistory.length - 1000);
 activeTasks.delete(taskId);
 debugLog(`Cloud Delegate: Task ${taskId} → ${task.status}`);
 return task;
@@ -129,6 +134,8 @@ export function cancelTask(taskId) {
 return false;
 task.status = "cancelled";
 taskHistory.push({ ...task });
+ if (taskHistory.length > 1000)
+     taskHistory.splice(0, taskHistory.length - 1000);
 activeTasks.delete(taskId);
 debugLog(`Cloud Delegate: Cancelled task ${taskId}`);
 return true;
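Both call sites above apply the same splice-based trim so `taskHistory` cannot grow without bound in a long-lived server. Factored into a hypothetical helper for illustration:

```javascript
// Push an item, then drop the oldest entries past the cap (1000 in the release).
function pushBounded(history, item, max = 1000) {
  history.push(item);
  if (history.length > max) history.splice(0, history.length - max);
  return history;
}
```

After 1500 pushes with a cap of 1000, only the most recent 1000 items remain.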
@@ -187,7 +187,7 @@ return false;}
 }
 }
 else {
- res.setHeader("Access-Control-Allow-Origin", "*");
+ res.setHeader("Access-Control-Allow-Origin", `http://localhost:${PORT}`);
 }
 res.setHeader("Access-Control-Allow-Methods", "GET, POST, DELETE, OPTIONS");
 res.setHeader("Access-Control-Allow-Headers", "Content-Type, Authorization");
@@ -217,7 +217,7 @@ return false;}
 loginRateLimiter.reset(clientIP);
 res.writeHead(200, {
 "Content-Type": "application/json",
- "Set-Cookie": `prism_session=${token}; Path=/; HttpOnly; SameSite=Strict; Max-Age=${SESSION_TTL_MS / 1000}`,
+ "Set-Cookie": `prism_session=${token}; Path=/; HttpOnly; SameSite=Strict; Max-Age=${SESSION_TTL_MS / 1000}${(process.env.PRISM_DASHBOARD_ORIGIN?.startsWith("https://") || process.env.PRISM_DASHBOARD_SECURE) ? "; Secure" : ""}`,
 });
 return res.end(JSON.stringify({ ok: true }));
 }
@@ -237,7 +237,7 @@ return false;}
 res.writeHead(200, {
 "Content-Type": "application/json",
 // Clear the cookie on the client side
- "Set-Cookie": `prism_session=; Path=/; HttpOnly; SameSite=Strict; Max-Age=0`,
+ "Set-Cookie": `prism_session=; Path=/; HttpOnly; SameSite=Strict; Max-Age=0${(process.env.PRISM_DASHBOARD_ORIGIN?.startsWith("https://") || process.env.PRISM_DASHBOARD_SECURE) ? "; Secure" : ""}`,
 });
 return res.end(JSON.stringify({ ok: true }));
 }
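The `Secure` attribute in the login and logout hunks above is conditional on purpose: a `Secure` cookie is never sent back over plain-HTTP localhost, so hard-coding it would break local dashboards. A hypothetical `buildSessionCookie` helper makes the rule explicit:

```javascript
// Illustrative reduction of the two Set-Cookie sites; env is the process env.
function buildSessionCookie(token, maxAgeSeconds, env) {
  const secure = env.PRISM_DASHBOARD_ORIGIN?.startsWith('https://') ||
    Boolean(env.PRISM_DASHBOARD_SECURE); // explicit opt-in flag
  return `prism_session=${token}; Path=/; HttpOnly; SameSite=Strict; Max-Age=${maxAgeSeconds}` +
    (secure ? '; Secure' : '');
}
```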
@@ -635,10 +635,7 @@ return false;}
 // SECURITY: Allowlist of dashboard-settable keys to prevent
 // credential overwrite (SUPABASE_KEY, STRIPE_SECRET_KEY, etc.)
 const SETTABLE_KEYS = new Set([
- "PRISM_STORAGE", "SUPABASE_URL", "SUPABASE_KEY",
- "BRAVE_API_KEY", "BRAVE_ANSWERS_API_KEY",
- "GOOGLE_API_KEY", "VOYAGE_API_KEY",
- "FIRECRAWL_API_KEY", "TAVILY_API_KEY",
+ "PRISM_STORAGE",
 "embedding_provider", "embedding_model",
 "PRISM_ENABLE_HIVEMIND", "PRISM_DARK_FACTORY_ENABLED",
 "PRISM_TASK_ROUTER_ENABLED", "PRISM_SCHOLAR_ENABLED",
@@ -830,7 +827,7 @@ return false;}
 // Only allow files from home directory, /tmp, and current working directory.
 const resolvedPath = path.resolve(filePath);
 const homeDir = os.homedir();
- const allowedPrefixes = [homeDir, os.tmpdir(), process.cwd()];
+ const allowedPrefixes = [path.join(homeDir, ".prism-mcp", "imports"), os.tmpdir()];
 const isAllowed = allowedPrefixes.some(prefix => resolvedPath.startsWith(prefix + path.sep) || resolvedPath === prefix);
 if (!isAllowed) {
 res.writeHead(403, { "Content-Type": "application/json" });
@@ -1362,7 +1359,8 @@ self.addEventListener('message', (e) => {
 reject(err);
 };
 httpServer.on("error", onError);
- httpServer.listen(port, () => {
+ const bindHost = AUTH_ENABLED ? "0.0.0.0" : "127.0.0.1";
+ httpServer.listen(port, bindHost, () => {
 httpServer.removeListener("error", onError);
 // Re-register a permanent error handler for runtime errors
 httpServer.on("error", (err) => {
package/dist/lifecycle.js CHANGED
@@ -141,13 +141,41 @@ export function acquireLock() {
 log(`Warning: Failed to process existing PID file: ${err instanceof Error ? err.message : String(err)}`);
 }
 }
- // Claim the lock for this process
+ // Claim the lock for this process — use 'wx' for atomic exclusive creation (prevents TOCTOU)
 try {
- fs.writeFileSync(PID_FILE, process.pid.toString(), "utf8");
+ fs.writeFileSync(PID_FILE, process.pid.toString(), { encoding: "utf8", flag: "wx" });
 log(`Acquired singleton lock (PID ${process.pid})`);
 }
- catch (err) {
- log(`Warning: Failed to write PID file: ${err instanceof Error ? err.message : String(err)}`);
+ catch (wxErr) {
+ if (wxErr.code === "EEXIST") {
+     // File was created between our check and write — re-read and verify
+     try {
+         const racePid = parseInt(fs.readFileSync(PID_FILE, "utf8").trim(), 10);
+         let raceAlive = false;
+         try {
+             process.kill(racePid, 0);
+             raceAlive = true;
+         }
+         catch {
+             raceAlive = false;
+         }
+         if (!raceAlive || isOrphanProcess(racePid)) {
+             // Dead or orphan — safe to overwrite
+             fs.writeFileSync(PID_FILE, process.pid.toString(), "utf8");
+             log(`Acquired singleton lock (PID ${process.pid}) — stale PID ${racePid} replaced`);
+         }
+         else {
+             log(`Existing server (PID ${racePid}) is active. Coexisting...`);
+             return;
+         }
+     }
+     catch (innerErr) {
+         log(`Warning: Failed to handle PID race: ${innerErr instanceof Error ? innerErr.message : String(innerErr)}`);
+     }
+ }
+ else {
+     log(`Warning: Failed to write PID file: ${wxErr instanceof Error ? wxErr.message : String(wxErr)}`);
+ }
 }
 }
 /**
@@ -12,7 +12,8 @@ export async function searchYahooFree(query, limit = 5) {
 method: 'GET',
 headers: {
 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36'
- }
+ },
+ signal: AbortSignal.timeout(15_000),
 });
 if (!response.ok) {
 throw new Error(`Yahoo Search failed with status: ${response.status}`);
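This release threads `AbortSignal.timeout(15_000)` through every outbound fetch (Yahoo, PubMed, ERIC, Semantic Scholar, SCM, GitHub) so one hung host cannot stall the scholar pipeline. A tiny illustrative helper showing the pattern (the release simply inlines the option at each call site):

```javascript
// Merge a fresh 15 s timeout signal into arbitrary fetch options.
function timedFetchOptions(options = {}, ms = 15_000) {
  return { ...options, signal: AbortSignal.timeout(ms) };
}
```

Usage would be `fetch(url, timedFetchOptions({ method: 'GET' }))`; when the timer fires, the pending fetch rejects with a `TimeoutError` `DOMException`. Requires Node 17.3+ for `AbortSignal.timeout`.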
@@ -46,10 +47,27 @@ export async function searchYahooFree(query, limit = 5) {
 * and converts it to Markdown using Turndown.
 */
 export async function scrapeArticleLocal(url) {
+ // SSRF protection: reject private/internal URLs
+ try {
+     const parsed = new URL(url);
+     if (parsed.protocol !== 'https:' && parsed.protocol !== 'http:')
+         throw new Error('Invalid protocol');
+     const host = parsed.hostname.toLowerCase();
+     if (host === 'localhost' || host === '127.0.0.1' || host === '::1' ||
+         host.startsWith('10.') || host.startsWith('192.168.') || host.startsWith('169.254.') ||
+         /^172\.(1[6-9]|2\d|3[01])\./.test(host) || host.endsWith('.internal') || host.endsWith('.local')) {
+         throw new Error('Internal URLs not allowed');
+     }
+ }
+ catch (e) {
+     throw new Error(`Invalid URL: ${e instanceof Error ? e.message : 'unknown'}`);
+ }
 const response = await fetch(url, {
 headers: {
 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36'
- }
+ },
+ redirect: 'error',
+ signal: AbortSignal.timeout(15_000),
 });
 if (!response.ok) {
 throw new Error(`Failed to fetch article HTML: ${response.statusText}`);
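The SSRF guard above blocks loopback addresses, the RFC 1918 private ranges, the 169.254 link-local block, and `.internal`/`.local` hostnames. Factored into a standalone predicate for illustration (hypothetical `isPrivateHost` name; the release inlines the same conditions in `scrapeArticleLocal`):

```javascript
// True when the hostname denotes a loopback, private-range, or internal target.
function isPrivateHost(hostname) {
  const host = hostname.toLowerCase();
  return host === 'localhost' || host === '127.0.0.1' || host === '::1' ||
    host.startsWith('10.') || host.startsWith('192.168.') || host.startsWith('169.254.') ||
    /^172\.(1[6-9]|2\d|3[01])\./.test(host) || // 172.16.0.0/12 only
    host.endsWith('.internal') || host.endsWith('.local');
}
```

Being purely string-based, the check cannot catch a public DNS name that resolves to a private address; the `redirect: 'error'` option added in the same hunk at least closes the redirect variant of that hole.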
@@ -1,10 +1,9 @@
- import { BRAVE_API_KEY, FIRECRAWL_API_KEY, TAVILY_API_KEY, GOOGLE_SEARCH_API_KEY, GOOGLE_SEARCH_CX, SEMANTIC_SCHOLAR_API_KEY, PRISM_SCHOLAR_MAX_ARTICLES_PER_RUN, PRISM_USER_ID, PRISM_SCHOLAR_TOPICS, PRISM_ENABLE_HIVEMIND } from "../config.js";
+ import { BRAVE_API_KEY, FIRECRAWL_API_KEY, GOOGLE_SEARCH_API_KEY, GOOGLE_SEARCH_CX, SEMANTIC_SCHOLAR_API_KEY, PRISM_SCHOLAR_MAX_ARTICLES_PER_RUN, PRISM_USER_ID, PRISM_SCHOLAR_TOPICS, PRISM_ENABLE_HIVEMIND } from "../config.js";
 import { getStorage } from "../storage/index.js";
 import { debugLog } from "../utils/logger.js";
 import { getLLMProvider } from "../utils/llm/factory.js";
 import { randomUUID } from "node:crypto";
 import { performWebSearchRaw } from "../utils/braveApi.js";
- import { performTavilySearch, performTavilyExtract } from "../utils/tavilyApi.js";
 import { performGoogleSearch } from "../utils/googleSearchApi.js";
 import { getTracer } from "../utils/telemetry.js";
 import { searchYahooFree, scrapeArticleLocal } from "./freeSearch.js";
@@ -91,8 +90,7 @@ export async function runWebScholar(overrideTopic, overrideProject) {
 try {
 const useGoogle = !!(GOOGLE_SEARCH_API_KEY && GOOGLE_SEARCH_CX);
 const useBraveFirecrawl = !useGoogle && !!(BRAVE_API_KEY && FIRECRAWL_API_KEY);
- const useTavily = !useGoogle && !useBraveFirecrawl && !!TAVILY_API_KEY;
- const useFreeFallback = !useGoogle && !useBraveFirecrawl && !useTavily;
+ const useFreeFallback = !useGoogle && !useBraveFirecrawl;
 const topic = overrideTopic || await selectTopic();
 const project = overrideProject || SCHOLAR_PROJECT;
 if (!topic) {
@@ -112,10 +110,6 @@ export async function runWebScholar(overrideTopic, overrideProject) {
 const braveData = JSON.parse(braveResponse);
 urls = (braveData.web?.results || []).map((r) => r.url).filter(Boolean);
 }
- else if (useTavily) {
- const tavilyResults = await performTavilySearch(TAVILY_API_KEY, topic, PRISM_SCHOLAR_MAX_ARTICLES_PER_RUN);
- urls = tavilyResults.map(r => r.url).filter(Boolean);
- }
 else {
 // Parallel Academic Discovery (PubMed + ERIC + Semantic Scholar)
 const academicCount = Math.ceil(PRISM_SCHOLAR_MAX_ARTICLES_PER_RUN / 2);
@@ -135,21 +129,12 @@ export async function runWebScholar(overrideTopic, overrideProject) {
 return `No articles found for "${topic}"`;
 await hivemindHeartbeat(`Scraping ${urls.length} articles on: ${topic}`);
 const scrapedTexts = [];
- if (useTavily) {
- const extracted = await performTavilyExtract(TAVILY_API_KEY, urls);
- for (const item of extracted) {
- if (item.rawContent)
- scrapedTexts.push(`Source: ${item.url}\n\n${item.rawContent.slice(0, 15_000)}`);
- }
- }
- else {
- for (const url of urls) {
- try {
- const article = await scrapeArticleLocal(url);
- scrapedTexts.push(`Source: ${url}\nTitle: ${article.title}\n\n${article.content.slice(0, 15_000)}`);
- }
- catch { }
+ for (const url of urls) {
+ try {
+ const article = await scrapeArticleLocal(url);
+ scrapedTexts.push(`Source: ${url}\nTitle: ${article.title}\n\n${article.content.slice(0, 15_000)}`);
 }
+ catch { }
 }
 if (scrapedTexts.length === 0)
 return "All scrapes failed";
@@ -187,7 +172,7 @@ export async function runWebScholar(overrideTopic, overrideProject) {
 async function searchPubMed(query, count) {
 const searchUrl = `https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?db=pubmed&term=${encodeURIComponent(query)}&retmode=json&retmax=${count}`;
 try {
- const searchResp = await fetch(searchUrl);
+ const searchResp = await fetch(searchUrl, { signal: AbortSignal.timeout(15_000) });
 if (!searchResp.ok)
 throw new Error("PubMed Search failed");
 const searchData = await searchResp.json();
@@ -202,7 +187,7 @@ async function searchPubMed(query, count) {
 async function searchERIC(query, count) {
 const url = `https://api.ies.ed.gov/eric/?search=${encodeURIComponent(query)}&rows=${count}&format=json`;
 try {
- const resp = await fetch(url);
+ const resp = await fetch(url, { signal: AbortSignal.timeout(15_000) });
 if (!resp.ok)
 throw new Error("ERIC Search failed");
 const data = await resp.json();
@@ -219,7 +204,7 @@ async function searchSemanticScholar(query, count) {
 const headers = {};
 if (SEMANTIC_SCHOLAR_API_KEY)
 headers["x-api-key"] = SEMANTIC_SCHOLAR_API_KEY;
- const resp = await fetch(url, { headers });
+ const resp = await fetch(url, { headers, signal: AbortSignal.timeout(15_000) });
 if (!resp.ok)
 throw new Error("Semantic Scholar Search failed");
 const data = await resp.json();
@@ -11,6 +11,7 @@
 * SYNALUX_API_URL — Base URL (default: https://synalux.ai)
 * SYNALUX_API_KEY — API key for authentication (required for paid tiers)
 */
+ const SAFE_SCM_NAME = /^[a-zA-Z0-9._-]+$/;
 export class ScmClient {
 baseUrl;
 apiKey;
@@ -18,13 +19,20 @@ export class ScmClient {
 this.baseUrl = (baseUrl || process.env.SYNALUX_API_URL || 'https://synalux.ai').replace(/\/$/, '');
 this.apiKey = apiKey || process.env.SYNALUX_API_KEY;
 }
+ parseRepo(repo) {
+ const [owner, name] = repo.split('/');
+ if (!owner || !name || !SAFE_SCM_NAME.test(owner) || !SAFE_SCM_NAME.test(name)) {
+ throw new Error(`Invalid repo identifier: "${repo}". Owner and name must match /^[a-zA-Z0-9._-]+$/.`);
+ }
+ return [owner, name];
+ }
 async request(path, options) {
 const url = `${this.baseUrl}/api/v1/scm${path}`;
 const headers = {
 'Content-Type': 'application/json',
 ...(this.apiKey ? { Authorization: `Bearer ${this.apiKey}` } : {}),
 };
- const res = await fetch(url, { ...options, headers: { ...headers, ...options?.headers } });
+ const res = await fetch(url, { ...options, headers: { ...headers, ...options?.headers }, signal: AbortSignal.timeout(15_000) });
 if (!res.ok) {
 const body = await res.text().catch(() => '');
 throw new Error(`SCM API ${res.status}: ${body || res.statusText}`);
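The new `parseRepo` above stops a crafted `repo` string from being interpolated raw into the `/repos/${owner}/${name}/…` route. A standalone mirror of its semantics (hypothetical `parseRepoId` name):

```javascript
const SAFE_SCM_NAME = /^[a-zA-Z0-9._-]+$/;

// Accepts "owner/name" where both parts match the safe character class.
function parseRepoId(repo) {
  const [owner, name] = repo.split('/');
  if (!owner || !name || !SAFE_SCM_NAME.test(owner) || !SAFE_SCM_NAME.test(name)) {
    throw new Error(`Invalid repo identifier: "${repo}"`);
  }
  return [owner, name];
}
```

One caveat worth noting: `..` itself matches the `[a-zA-Z0-9._-]+` class, so the regex alone does not rule out dot-dot segments and the server-side route normalization still matters.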
@@ -33,7 +41,7 @@ export class ScmClient {
 }
 }
 // ── Code Search ───────────────────────────────────────────────
 async search(repo, query) {
- const [owner, name] = repo.split('/');
+ const [owner, name] = this.parseRepo(repo);
 return this.request(`/repos/${owner}/${name}/search`, {
 method: 'POST',
 body: JSON.stringify(query),
@@ -41,7 +49,7 @@ export class ScmClient {
 }
 // ── AI Review ─────────────────────────────────────────────────
 async review(repo, files, options) {
- const [owner, name] = repo.split('/');
+ const [owner, name] = this.parseRepo(repo);
 return this.request(`/repos/${owner}/${name}/review`, {
 method: 'POST',
 body: JSON.stringify({ files, hipaa: options?.hipaa }),
@@ -49,7 +57,7 @@ export class ScmClient {
 }
 // ── Security Scan ─────────────────────────────────────────────
 async scan(repo, files) {
- const [owner, name] = repo.split('/');
+ const [owner, name] = this.parseRepo(repo);
 return this.request(`/repos/${owner}/${name}/security`, {
 method: 'POST',
 body: JSON.stringify({ files }),
@@ -57,7 +65,7 @@ export class ScmClient {
 }
 // ── DORA Metrics ──────────────────────────────────────────────
 async dora(repo, period) {
- const [owner, name] = repo.split('/');
+ const [owner, name] = this.parseRepo(repo);
 const qs = period ? `?period=${encodeURIComponent(period)}` : '';
 return this.request(`/repos/${owner}/${name}/dora${qs}`);
 }
@@ -47,7 +47,7 @@ async function githubFetch(path, method = "GET", body) {
 if (body) {
 options.body = JSON.stringify(body);
 }
- const response = await fetch(url, options);
+ const response = await fetch(url, { ...options, signal: AbortSignal.timeout(15_000) });
 const data = await response.json();
 return { status: response.status, data };
 }