opik-mcp 0.1.2 → 2.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -40,7 +40,8 @@ A Model Context Protocol (MCP) implementation for the <a href="https://github.co
  </a>
  </p>

- > **⚠️ Notice:** SSE (Server-Sent Events) transport support is currently experimental and untested. For production use, we recommend using the direct process execution approach shown in the IDE integration examples.
+ > **Note:** This repository provides the MCP server implementation. We do not currently provide a hosted remote MCP service for Opik.
+ > If you run streamable-http remotely, authentication is fail-closed by default.

  ## 🚀 What is Opik MCP Server?

@@ -50,12 +51,12 @@ Opik MCP Server is an open-source implementation of the Model Context Protocol f

  You can use Opik MCP Server for:
  * **IDE Integration:**
- * Seamlessly integrate with Cursor and other compatible IDEs
+ * Seamlessly integrate with Cursor, VS Code, Windsurf and other compatible IDEs
  * Provide direct access to Opik's capabilities from your development environment

  * **Unified API Access:**
  * Access all Opik features through a standardized protocol
- * Leverage multiple transport options (stdio, SSE) for different integration scenarios
+ * Leverage multiple transport options (stdio, streamable-http) for different integration scenarios

  * **Platform Management:**
  * Manage prompts, projects, traces, and metrics through a consistent interface
@@ -67,6 +68,7 @@ You can use Opik MCP Server for:
  - **Projects/Workspaces Management**: Organize and manage projects
  - **Traces**: Track and analyze trace data
  - **Metrics**: Gather and query metrics data
+ - **MCP Resources**: Read-only resources for workspace/project metadata plus resource templates for prompts, datasets, and traces

  ## Quick Start

@@ -102,6 +104,65 @@ Alternatively, you can create a `.cursor/mcp.json` in your project and add:
  Note: If you are using the Open-Source version of Opik, you will need to specify
  the `apiBaseUrl` parameter as `http://localhost:5173/api`.

+ #### VS Code Integration (GitHub Copilot)
+
+ To integrate Opik with VS Code (GitHub Copilot), you need to add the MCP server
+ configuration to your workspace or user settings.
+
+ 1. Create or open the `.vscode/mcp.json` file in your workspace (or run the
+ **MCP: Open User Configuration** command to add it globally).
+
+ 2. Add the Opik MCP server configuration:
+
+ ```json
+ {
+ "inputs": [
+ {
+ "type": "promptString",
+ "id": "opik-api-key",
+ "description": "Opik API Key",
+ "password": true
+ }
+ ],
+ "servers": {
+ "opik-mcp": {
+ "type": "stdio",
+ "command": "npx",
+ "args": [
+ "-y",
+ "opik-mcp",
+ "--apiKey",
+ "${input:opik-api-key}"
+ ]
+ }
+ }
+ }
+ ```
+
+ 3. When you start the MCP server for the first time, VS Code will prompt you
+ to enter your Opik API key. The value is securely stored for subsequent use.
+
+ Note: If you are using the Open-Source version of Opik, add the `--apiBaseUrl`
+ argument and remove the `--apiKey` argument:
+
+ ```json
+ {
+ "servers": {
+ "opik-mcp": {
+ "type": "stdio",
+ "command": "npx",
+ "args": [
+ "-y",
+ "opik-mcp",
+ "--apiBaseUrl",
+ "http://localhost:5173/api"
+ ]
+ }
+ },
+ "inputs": []
+ }
+ ```
+
  #### Windsurf Installation

  To install the MCP server in Windsurf, you will need to open the Windsurf settings
@@ -153,8 +214,8 @@ cp .env.example .env
  # Start with stdio transport (default)
  npm run start:stdio

- # Start with SSE transport for network access (experimental)
- npm run start:sse
+ # Start with streamable-http transport for remote/self-hosted access
+ npm run start:http
  ```

  ## Transport Options
@@ -167,15 +228,34 @@ Ideal for local integration where the client and server run on the same machine.
  make start-stdio
  ```

- ### Server-Sent Events (SSE)
+ ### Streamable HTTP
+
+ Enables remote/self-hosted MCP over the standard Streamable HTTP endpoint (`/mcp`).
+
+ Remote auth behavior:
+ - `Authorization: Bearer <OPIK_API_KEY>` or `x-api-key` is required by default.
+ - Workspace is resolved server-side (recommended via token mapping). Header workspaces are not trusted by default.
+ - In remote mode, request-context workspace takes precedence over tool `workspaceName` args.
+ - Missing/invalid auth returns HTTP `401`.

- Enables remote access and multiple simultaneous clients over HTTP. Note that this transport option is experimental.
+ Remote auth environment flags:
+ - `STREAMABLE_HTTP_REQUIRE_AUTH` (default `true`): require auth headers on `/mcp`.
+ - `STREAMABLE_HTTP_VALIDATE_REMOTE_AUTH` (default `true`, except test env): validate bearer/API key against Opik before accepting requests.
+ - `REMOTE_TOKEN_WORKSPACE_MAP`: JSON map of token -> workspace for server-side tenant routing.
+ - `STREAMABLE_HTTP_TRUST_WORKSPACE_HEADERS` (default `false`): allow workspace headers when token map is not configured.
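The fail-closed precedence these flags describe can be pictured with a small standalone sketch. `resolveRemoteContext`, its fallback to a default workspace, and the `comet-workspace` header name are illustrative assumptions of mine, not the package's actual internals:

```javascript
// Hypothetical sketch of the documented precedence: credentials are
// required, the token -> workspace map wins, workspace headers are
// honored only when explicitly trusted, and missing auth fails closed.
function resolveRemoteContext(headers, opts = {}) {
  const {
    tokenWorkspaceMap = {},      // parsed REMOTE_TOKEN_WORKSPACE_MAP
    trustWorkspaceHeaders = false, // STREAMABLE_HTTP_TRUST_WORKSPACE_HEADERS
    defaultWorkspace = 'default',
  } = opts;
  const bearer = /^Bearer\s+(.+)$/i.exec(headers['authorization'] || '');
  const token = (bearer && bearer[1]) || headers['x-api-key'];
  if (!token) return { status: 401 };                    // missing auth -> HTTP 401
  const mapped = tokenWorkspaceMap[token];
  if (mapped) return { status: 200, workspace: mapped }; // server-side tenant routing
  if (trustWorkspaceHeaders && headers['comet-workspace']) {
    return { status: 200, workspace: headers['comet-workspace'] }; // opt-in only
  }
  return { status: 200, workspace: defaultWorkspace };
}
```

The point of the sketch is the ordering: the request's own workspace claim is only consulted when the operator has explicitly opted in.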

  ```bash
- make start-sse
+ npm run start:http
  ```

- For detailed information about the SSE transport, see [docs/sse-transport.md](docs/sse-transport.md).
+ For detailed information about streamable-http transport, see [docs/streamable-http-transport.md](docs/streamable-http-transport.md).
+
+ ## Resources and Prompts Capabilities
+
+ - `resources/list` exposes static resources (for example, `opik://workspace-info`, `opik://projects-list`).
+ - `resources/templates/list` exposes dynamic URI templates (for example, `opik://projects/{page}/{size}`, `opik://prompt/{name}`).
+ - `resources/read` supports both static URIs and filled template URIs.
+ - `prompts/list` and `prompts/get` expose workflow prompts (for example, `opik-triage-workflow`).
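For example, a client reads one of the templated resources with a standard MCP `resources/read` call; an illustrative JSON-RPC request filling the `opik://projects/{page}/{size}` template (the page/size values are examples):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "resources/read",
  "params": { "uri": "opik://projects/1/10" }
}
```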

  ## Development

@@ -186,7 +266,7 @@ For detailed information about the SSE transport, see [docs/sse-transport.md](do
  npm test

  # Run specific test suite
- npm test -- tests/transports/sse-transport.test.ts
+ npm test -- tests/transports/streamable-http-transport.test.ts
  ```

  ### Pre-commit Hooks
@@ -200,7 +280,7 @@ make precommit

  ## Documentation

- - [SSE Transport](docs/sse-transport.md) - Details on using the SSE transport
+ - [Streamable HTTP Transport](docs/streamable-http-transport.md) - Details on remote transport
  - [API Reference](docs/api-reference.md) - Complete API documentation
  - [Configuration](docs/configuration.md) - Advanced configuration options
  - [IDE Integration](docs/ide-integration.md) - Integration with Cursor IDE
package/build/cli.js CHANGED
@@ -8,25 +8,27 @@ const argv = yargs(hideBin(process.argv))
  .usage('$0 [args]')
  .option('transport', {
  alias: 't',
- description: 'Transport to use (stdio or sse)',
+ description: 'Transport to use (stdio or streamable-http)',
  type: 'string',
- default: 'stdio',
- choices: ['stdio', 'sse'],
+ choices: ['stdio', 'streamable-http'],
  })
  .option('port', {
  alias: 'p',
- description: 'Port to listen on (for sse transport)',
+ description: 'Port to listen on (for streamable-http transport)',
  type: 'number',
- default: 3001,
  })
  .help()
  .alias('help', 'h')
  .strict(false) // Allow unknown options
  .parseSync();
  // Update config based on CLI arguments
- configImport.transport = argv.transport;
- if (argv.transport === 'sse') {
- configImport.ssePort = argv.port;
+ if (argv.transport) {
+ configImport.transport = argv.transport;
+ process.env.TRANSPORT = argv.transport;
+ }
+ if (argv.transport === 'streamable-http' && typeof argv.port === 'number') {
+ configImport.streamableHttpPort = argv.port;
+ process.env.STREAMABLE_HTTP_PORT = String(argv.port);
  }
  // Import and start the server (index.js will handle the main() call)
  import './index.js';
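The change above drops the yargs defaults so the CLI only overrides configuration when a flag is explicitly passed. The resulting precedence can be sketched with a hypothetical helper (not part of the package):

```javascript
// Explicit flag > environment variable > built-in default, mirroring how
// cli.js now defers to config.js when no --transport flag is given.
function resolveTransport(argvTransport, env) {
  return argvTransport ?? env.TRANSPORT ?? 'stdio';
}

console.log(resolveTransport(undefined, { TRANSPORT: 'streamable-http' }));
```

With the old `default: 'stdio'`, `argv.transport` was always set and silently clobbered `TRANSPORT` from the environment; removing the default is what makes the env fallback reachable.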
package/build/config.js CHANGED
@@ -31,7 +31,74 @@ function writeToLogFile(message, forceWrite = false) {
  // Silently fail if we can't write to the log file
  }
  }
- export const DEFAULT_TOOLSETS = ['integration', 'prompts', 'projects', 'traces'];
+ export const DEFAULT_TOOLSETS = ['core'];
+ export const ALL_TOOLSETS = [
+ 'core',
+ 'expert-prompts',
+ 'expert-datasets',
+ 'expert-trace-actions',
+ 'expert-project-actions',
+ 'integration',
+ 'metrics',
+ ];
+ const ALL_TOOLSET_CHOICES = [
+ 'all',
+ 'core',
+ 'expert-prompts',
+ 'expert-datasets',
+ 'expert-trace-actions',
+ 'expert-project-actions',
+ 'integration',
+ 'metrics',
+ 'capabilities',
+ 'prompts',
+ 'datasets',
+ 'projects',
+ 'traces',
+ ];
+ export function normalizeToolsets(values) {
+ const normalized = new Set();
+ for (const value of values.flatMap(v => v.split(',')).map(v => v.trim())) {
+ const toolset = value;
+ switch (toolset) {
+ case 'all':
+ for (const item of ALL_TOOLSETS) {
+ normalized.add(item);
+ }
+ break;
+ case 'core':
+ case 'expert-prompts':
+ case 'expert-datasets':
+ case 'expert-trace-actions':
+ case 'expert-project-actions':
+ case 'integration':
+ case 'metrics':
+ normalized.add(toolset);
+ break;
+ // Legacy aliases
+ case 'capabilities':
+ normalized.add('core');
+ break;
+ case 'prompts':
+ normalized.add('expert-prompts');
+ break;
+ case 'datasets':
+ normalized.add('expert-datasets');
+ break;
+ case 'projects':
+ normalized.add('core');
+ normalized.add('expert-project-actions');
+ break;
+ case 'traces':
+ normalized.add('core');
+ normalized.add('expert-trace-actions');
+ break;
+ default:
+ break;
+ }
+ }
+ return Array.from(normalized);
+ }
  /**
  * Load configuration from ~/.opik.config file
  */
@@ -78,7 +145,7 @@ function loadOpikConfigFile() {
  }
  }
  }
- writeToLogFile(`Loaded config from ~/.opik.config: ${JSON.stringify(config)}`);
+ writeToLogFile(`Loaded config from ~/.opik.config with keys: ${Object.keys(config).join(', ') || '(none)'}`);
  return config;
  }
  catch (error) {
@@ -118,24 +185,20 @@ function parseCommandLineArgs() {
  // Transport Configuration
  .option('transport', {
  type: 'string',
- description: 'Transport type (stdio or sse)',
- choices: ['stdio', 'sse'],
- default: 'stdio',
+ description: 'Transport type (stdio or streamable-http)',
+ choices: ['stdio', 'streamable-http'],
  })
- .option('ssePort', {
+ .option('streamableHttpPort', {
  type: 'number',
- description: 'Port for SSE transport',
- default: 3001,
+ description: 'Port for streamable-http transport',
  })
- .option('sseHost', {
+ .option('streamableHttpHost', {
  type: 'string',
- description: 'Host for SSE transport',
- default: 'localhost',
+ description: 'Host for streamable-http transport',
  })
- .option('sseLogPath', {
+ .option('streamableHttpLogPath', {
  type: 'string',
- description: 'Log file path for SSE transport',
- default: '/tmp/opik-mcp-sse.log',
+ description: 'Log file path for streamable-http transport',
  })
  // MCP Configuration
  .option('mcpName', {
@@ -162,7 +225,7 @@ function parseCommandLineArgs() {
  .option('toolsets', {
  type: 'array',
  description: 'Comma-separated list of toolsets to enable',
- choices: ['capabilities', 'integration', 'prompts', 'projects', 'traces', 'metrics'],
+ choices: ALL_TOOLSET_CHOICES,
  })
  .help()
  .parse();
@@ -192,13 +255,15 @@ export function loadConfig() {
  : process.env.OPIK_SELF_HOSTED === 'true' || false,
  debugMode: args.debug !== undefined ? args.debug : process.env.DEBUG_MODE === 'true' || false,
  // Transport configuration
- transport: (args.transport || process.env.TRANSPORT || 'stdio'),
- ssePort: args.ssePort || (process.env.SSE_PORT ? parseInt(process.env.SSE_PORT, 10) : 3001),
- sseHost: args.sseHost || process.env.SSE_HOST || 'localhost',
- sseLogPath: args.sseLogPath || process.env.SSE_LOG_PATH || '/tmp/opik-mcp-sse.log',
+ transport: (args.transport ?? process.env.TRANSPORT ?? 'stdio'),
+ streamableHttpPort: args.streamableHttpPort ??
+ (process.env.STREAMABLE_HTTP_PORT ? parseInt(process.env.STREAMABLE_HTTP_PORT, 10) : 3001),
+ streamableHttpHost: args.streamableHttpHost ?? process.env.STREAMABLE_HTTP_HOST ?? '127.0.0.1',
+ streamableHttpLogPath: args.streamableHttpLogPath ??
+ (process.env.STREAMABLE_HTTP_LOG_PATH || '/tmp/opik-mcp-streamable-http.log'),
  // MCP configuration with fallbacks
  mcpName: args.mcpName || process.env.MCP_NAME || 'opik-manager',
- mcpVersion: args.mcpVersion || process.env.MCP_VERSION || '1.0.0',
+ mcpVersion: args.mcpVersion || process.env.MCP_VERSION || '2.0.0',
  mcpPort: args.mcpPort || (process.env.MCP_PORT ? parseInt(process.env.MCP_PORT, 10) : undefined),
  mcpLogging: args.mcpLogging !== undefined ? args.mcpLogging : process.env.MCP_LOGGING === 'true' || false,
  mcpDefaultWorkspace: args.mcpDefaultWorkspace || process.env.MCP_DEFAULT_WORKSPACE || 'default',
@@ -206,12 +271,11 @@ export function loadConfig() {
  enabledToolsets: (() => {
  // Command line takes precedence
  if (args.toolsets && args.toolsets.length > 0) {
- return args.toolsets.filter((t) => ['integration', 'prompts', 'projects', 'traces', 'metrics'].includes(t));
+ return normalizeToolsets(args.toolsets);
  }
  // Environment variable fallback
  if (process.env.OPIK_TOOLSETS) {
- const envToolsets = process.env.OPIK_TOOLSETS.split(',').map(t => t.trim());
- return envToolsets.filter((t) => ['integration', 'prompts', 'projects', 'traces', 'metrics'].includes(t));
+ return normalizeToolsets(process.env.OPIK_TOOLSETS.split(','));
  }
  // Default toolsets
  return DEFAULT_TOOLSETS;
@@ -244,10 +308,10 @@ export function loadConfig() {
  // Log transport configuration
  writeToLogFile('\nTransport Configuration:');
  writeToLogFile(`- Transport: ${config.transport}`);
- if (config.transport === 'sse') {
- writeToLogFile(`- SSE Port: ${config.ssePort}`);
- writeToLogFile(`- SSE Host: ${config.sseHost}`);
- writeToLogFile(`- SSE Log Path: ${config.sseLogPath}`);
+ if (config.transport === 'streamable-http') {
+ writeToLogFile(`- Streamable HTTP Port: ${config.streamableHttpPort}`);
+ writeToLogFile(`- Streamable HTTP Host: ${config.streamableHttpHost}`);
+ writeToLogFile(`- Streamable HTTP Log Path: ${config.streamableHttpLogPath}`);
  }
  // Log MCP configuration
  writeToLogFile('\nMCP Configuration:');
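The legacy-alias handling that `normalizeToolsets` introduces in this file can be exercised standalone. The sketch below condenses the switch statement from the diff into a table-driven form of my own; for the inputs shown it matches the diff's mapping (pass-through for canonical names, `'all'` expansion, alias rewriting, unknown values dropped):

```javascript
const ALL_TOOLSETS = [
  'core', 'expert-prompts', 'expert-datasets', 'expert-trace-actions',
  'expert-project-actions', 'integration', 'metrics',
];

// Condensed, behavior-equivalent rendering of normalizeToolsets from
// build/config.js for the documented inputs.
function normalizeToolsets(values) {
  const aliases = {
    capabilities: ['core'],
    prompts: ['expert-prompts'],
    datasets: ['expert-datasets'],
    projects: ['core', 'expert-project-actions'],
    traces: ['core', 'expert-trace-actions'],
  };
  const normalized = new Set();
  for (const value of values.flatMap(v => v.split(',')).map(v => v.trim())) {
    if (value === 'all') ALL_TOOLSETS.forEach(t => normalized.add(t));
    else if (ALL_TOOLSETS.includes(value)) normalized.add(value);
    else (aliases[value] || []).forEach(t => normalized.add(t)); // unknown -> dropped
  }
  return Array.from(normalized);
}

console.log(normalizeToolsets(['projects,traces']));
// -> ['core', 'expert-project-actions', 'expert-trace-actions']
```

This is why a 0.1.x invocation like `--toolsets projects,traces` keeps working against 2.0.0: the old names fan out into the new `core` plus expert-action toolsets instead of being rejected.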
package/build/index.js CHANGED
@@ -2,9 +2,8 @@ import fs from 'fs';
  // Import other modules
  import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
  import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
- import { makeApiRequest } from './utils/api.js';
  // Import custom transports
- import { SSEServerTransport } from './transports/sse-transport.js';
+ import { StreamableHttpTransport } from './transports/streamable-http-transport.js';
  // Import environment variables loader - no console output
  import './utils/env.js';
  import { logToFile, logFile } from './utils/logging.js';
@@ -14,9 +13,19 @@ import { loadPromptTools } from './tools/prompt.js';
  import { loadProjectTools } from './tools/project.js';
  import { loadMetricTools } from './tools/metrics.js';
  import { loadIntegrationTools } from './tools/integration.js';
+ import { loadCapabilitiesTools } from './tools/capabilities.js';
+ import { loadDatasetTools } from './tools/dataset.js';
+ import { loadCorePrompts } from './prompts/core-prompts.js';
+ import { loadOpikResources } from './resources/opik-resources.js';
  // Import configuration
  import { loadConfig } from './config.js';
  const config = loadConfig();
+ function toErrorMessage(error) {
+ if (error instanceof Error) {
+ return error.message;
+ }
+ return String(error);
+ }
  // Only initialize log file if debug mode is enabled
  if (config.debugMode) {
  try {
@@ -53,90 +62,57 @@ export let server = new McpServer({
  });
  // Load tools based on enabled toolsets
  logToFile(`Loading toolsets: ${config.enabledToolsets.join(', ')}`);
- if (config.enabledToolsets.includes('integration')) {
+ const enabledToolsets = new Set(config.enabledToolsets);
+ if (enabledToolsets.has('integration')) {
  server = loadIntegrationTools(server);
  logToFile('Loaded integration toolset');
  }
- if (config.enabledToolsets.includes('prompts')) {
+ if (enabledToolsets.has('core')) {
+ server = loadCapabilitiesTools(server, config);
+ logToFile('Loaded core capabilities tools');
+ server = loadCorePrompts(server);
+ logToFile('Loaded core prompts');
+ server = loadProjectTools(server, { includeReadOps: true, includeMutations: false });
+ logToFile('Loaded core project read tools');
+ server = loadTraceTools(server, { includeCoreTools: true, includeExpertActions: false });
+ logToFile('Loaded core trace tools');
+ }
+ if (enabledToolsets.has('expert-prompts')) {
  server = loadPromptTools(server);
- logToFile('Loaded prompts toolset');
+ logToFile('Loaded expert prompts toolset');
  }
- if (config.enabledToolsets.includes('projects')) {
- server = loadProjectTools(server);
- logToFile('Loaded projects toolset');
+ if (enabledToolsets.has('expert-datasets')) {
+ server = loadDatasetTools(server);
+ logToFile('Loaded expert datasets toolset');
  }
- if (config.enabledToolsets.includes('traces')) {
- server = loadTraceTools(server);
- logToFile('Loaded traces toolset');
+ if (enabledToolsets.has('expert-project-actions')) {
+ server = loadProjectTools(server, { includeReadOps: false, includeMutations: true });
+ logToFile('Loaded expert project actions toolset');
  }
- if (config.enabledToolsets.includes('metrics')) {
+ if (enabledToolsets.has('expert-trace-actions')) {
+ server = loadTraceTools(server, { includeCoreTools: false, includeExpertActions: true });
+ logToFile('Loaded expert trace actions toolset');
+ }
+ if (enabledToolsets.has('metrics')) {
  server = loadMetricTools(server);
  logToFile('Loaded metrics toolset');
  }
  // Add resources to the MCP server
- if (config.workspaceName) {
- // Define a workspace info resource
- server.resource('workspace-info', 'opik://workspace-info', async () => ({
- contents: [
- {
- uri: 'opik://workspace-info',
- text: JSON.stringify({
- name: config.workspaceName,
- apiUrl: config.apiBaseUrl,
- selfHosted: config.isSelfHosted,
- }, null, 2),
- },
- ],
- }));
- // Define a projects resource that provides the list of projects in the workspace
- server.resource('projects-list', 'opik://projects-list', async () => {
- try {
- const response = await makeApiRequest('/v1/private/projects');
- if (!response.data) {
- return {
- contents: [
- {
- uri: 'opik://projects-list',
- text: `Error: ${response.error || 'Unknown error fetching projects'}`,
- },
- ],
- };
- }
- return {
- contents: [
- {
- uri: 'opik://projects-list',
- text: JSON.stringify(response.data, null, 2),
- },
- ],
- };
- }
- catch (error) {
- logToFile(`Error fetching projects resource: ${error}`);
- return {
- contents: [
- {
- uri: 'opik://projects-list',
- text: `Error: Failed to fetch projects data`,
- },
- ],
- };
- }
- });
- }
+ server = loadOpikResources(server, config);
  // ----------- SERVER CONFIGURATION TOOLS -----------
  // Main function to start the server
  export async function main() {
  logToFile('Starting main function');
  // Create the appropriate transport based on configuration
  let transport;
- if (config.transport === 'sse') {
- logToFile(`Creating SSEServerTransport on port ${config.ssePort}`);
- transport = new SSEServerTransport({
- port: config.ssePort || 3001,
+ if (config.transport === 'streamable-http') {
+ logToFile(`Creating Streamable HTTP transport on port ${config.streamableHttpPort}`);
+ transport = new StreamableHttpTransport({
+ port: config.streamableHttpPort || 3001,
+ host: config.streamableHttpHost || '127.0.0.1',
  });
- // Explicitly start the SSE transport
- logToFile('Starting SSE transport');
+ // Explicitly start the remote transport host
+ logToFile('Starting remote transport');
  await transport.start();
  }
  else {
@@ -145,23 +121,27 @@ export async function main() {
  }
  // Connect the server to the transport
  logToFile('Connecting server to transport');
- server.connect(transport);
+ await server.connect(transport);
  logToFile('Transport connection established');
  // Log server status
- if (config.transport === 'sse') {
- logToFile(`Opik MCP Server running on SSE (port ${config.ssePort})`);
+ if (config.transport === 'streamable-http') {
+ logToFile(`Opik MCP Server running on Streamable HTTP (port ${config.streamableHttpPort})`);
  }
  else {
  logToFile('Opik MCP Server running on stdio');
  }
  logToFile('Main function completed successfully');
- // Start heartbeat for keeping the process alive
- setInterval(() => {
- logToFile('Heartbeat ping');
- }, 5000);
  }
  // Start the server
  main().catch(error => {
- logToFile(`Error starting server: ${error}`);
+ const message = toErrorMessage(error);
+ logToFile(`Error starting server: ${message}`);
+ console.error(`Failed to start Opik MCP server: ${message}`);
+ if (error instanceof Error && error.stack) {
+ logToFile(error.stack);
+ if (config.debugMode) {
+ console.error(error.stack);
+ }
+ }
  process.exit(1);
  });
package/build/prompts/core-prompts.js ADDED
@@ -0,0 +1,52 @@
+ import { z } from 'zod';
+ import { registerPrompt } from '../tools/registration.js';
+ export function loadCorePrompts(server) {
+ registerPrompt(server, 'opik-triage-workflow', 'Guide for selecting the right Opik MCP toolset and first actions.', {
+ goal: z.string().describe('User goal to accomplish with Opik.'),
+ scope: z
+ .enum(['core', 'prompts', 'datasets', 'traces', 'projects', 'metrics'])
+ .default('core')
+ .describe('Primary domain for this task.'),
+ }, async ({ goal, scope }) => ({
+ messages: [
+ {
+ role: 'user',
+ content: {
+ type: 'text',
+ text: [
+ 'You are operating the Opik MCP server.',
+ `Goal: ${goal}`,
+ `Scope: ${scope}`,
+ 'First, call get-server-info.',
+ 'Then propose a short 3-step action plan and execute only read operations before mutations.',
+ 'When mutating, confirm target IDs from read results.',
+ ].join('\n'),
+ },
+ },
+ ],
+ }), { title: 'Opik Workflow Triage' });
+ registerPrompt(server, 'opik-dataset-maintenance', 'Workflow template for dataset curation and quality checks.', {
+ datasetName: z.string().describe('Dataset name to inspect or curate.'),
+ objective: z
+ .string()
+ .default('Find low-quality items and produce cleanup actions.')
+ .describe('Dataset maintenance objective.'),
+ }, async ({ datasetName, objective }) => ({
+ messages: [
+ {
+ role: 'user',
+ content: {
+ type: 'text',
+ text: [
+ `Dataset: ${datasetName}`,
+ `Objective: ${objective}`,
+ 'Use list-datasets to locate the dataset ID.',
+ 'Use list-dataset-items with pagination to inspect records.',
+ 'Return a concise cleanup plan with explicit item IDs before delete-dataset-item.',
+ ].join('\n'),
+ },
+ },
+ ],
+ }), { title: 'Dataset Maintenance' });
+ return server;
+ }
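A client would retrieve the `opik-triage-workflow` prompt registered above with a standard MCP `prompts/get` call; an illustrative JSON-RPC request (the argument values are examples):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "prompts/get",
  "params": {
    "name": "opik-triage-workflow",
    "arguments": {
      "goal": "Find failing traces in my project",
      "scope": "traces"
    }
  }
}
```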