@sghanavati/relay-mcp 0.1.4 → 0.1.6

package/README.md CHANGED
@@ -3,9 +3,9 @@
  [![Node.js](https://img.shields.io/node/v/@sghanavati/relay-mcp.svg)](https://nodejs.org)
  [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](LICENSE)

- MCP server that lets Claude Code delegate coding tasks to Codex.
+ MCP server that lets Claude Code delegate work to Codex, OpenRouter, and LM Studio through one MCP tool.

- > **Prerequisites:** Codex CLI >= 0.39.0 must be installed (`npm install -g @openai/codex`) and authenticated (`codex login`).
+ > **Prerequisites:** For `provider: "codex"`, install Codex CLI >= 0.39.0 (`npm install -g @openai/codex`) and authenticate (`codex login`). For `provider: "openrouter"`, set `OPENROUTER_API_KEY`. For `provider: "lmstudio"`, run LM Studio and set `LMSTUDIO_ENDPOINT` if needed.

  ## Quick Start
  Add this to your Claude Code `.mcp.json`, then restart Claude Code:
@@ -27,7 +27,7 @@ Add this to your Claude Code `.mcp.json`, then restart Claude Code:

  - No global install of `relay-mcp` is required.
  - `npx -y @sghanavati/relay-mcp` downloads and runs the published package directly.
- - You only need to keep Codex CLI installed and authenticated locally.
+ - You need provider-specific runtime setup (Codex CLI for `codex`, an API key for `openrouter`, a local endpoint for `lmstudio`).

  ## Claude Code Registration
  ### npx (recommended)
@@ -133,7 +133,9 @@ Typical response payload:
  "warnings": [],
  "model": null,
  "token_estimate": 1200,
- "exit_code": 0
+ "exit_code": 0,
+ "provider": "codex",
+ "log_file": "~/.relay-mcp/run-<hash>.log"
  }
  }
  ```
@@ -226,22 +228,45 @@ tmux window

  **The loop:**
  1. CC writes a plan (GSD), then fires `delegate` calls — one per task
- 2. relay-mcp spawns Codex for each and returns `files_changed` + output when done
- 3. *(Phase 3)* Each task streams output to a log file; adjacent panes `tail -f` those logs live
- 4. CC reads `files_changed`, verifies output, moves to the next task
- 5. You supervise only CC; Codex runs unattended
-
- > **Note:** Live log streaming (`meta.log_file`) lands in Phase 3. Steps 1, 2, 4, and 5 work today.
+ 2. relay-mcp executes each call with the selected provider and returns `files_changed` + output
+ 3. Each task writes to a provider run log file (`meta.log_file`); adjacent panes can `tail -f` logs live
+ 4. CC reads `files_changed` and output, verifies results, then moves to the next task
+ 5. You supervise only CC while providers run unattended

  ## Configuration
  | Variable | Default | Description |
  | --- | --- | --- |
  | `RELAY_CODEX_PATH` | `codex` from `PATH` | Full path to the Codex binary for PATH-limited environments (especially Claude Desktop). |
  | `RELAY_LOG_LEVEL` | `info` | Startup log verbosity. `error` suppresses the ready banner; other values show it. |
+ | `RELAY_ALLOWED_ROOTS` | unset | Optional allowlist for delegate `workdir`. Path-delimited list (`:` on macOS/Linux, `;` on Windows). If set, workdirs outside these roots are rejected. |
+ | `RELAY_SKIP_GIT_CHECK` | `0` | Set to `1` to add `--skip-git-repo-check` to Codex runs. The default keeps the repo check enabled. |
+ | `OPENROUTER_API_KEY` | unset | Required for `provider: "openrouter"`. |
+ | `LMSTUDIO_ENDPOINT` | `http://localhost:1234` | Base URL of the LM Studio OpenAI-compatible server. |
+ | `LMSTUDIO_API_KEY` | unset | Optional bearer token for the LM Studio endpoint. |

  ### Provider model requirement
  - `provider: "codex"`: `model` is optional.
  - `provider: "openrouter"` or `provider: "lmstudio"`: `model` is required.
+ - `provider: "openrouter"` and `provider: "lmstudio"` are generation-only and always return `files_changed: []`.
+
+ ### Provider examples
+ ```javascript
+ delegate({
+   provider: "openrouter",
+   model: "google/gemini-2.5-pro",
+   task: "Review this implementation and return a concise risk list",
+   workdir: "/path/to/repo"
+ });
+ ```
+
+ ```javascript
+ delegate({
+   provider: "lmstudio",
+   model: "qwen2.5-coder-32b-instruct",
+   task: "Draft unit tests for src/auth.ts",
+   workdir: "/path/to/repo"
+ });
+ ```

  ## Troubleshooting
  ### 1) PATH issues (Codex not found)
@@ -278,7 +303,15 @@ Fix:
  - Confirm network/DNS connectivity and any proxy requirements.
  - Re-run `codex login` if your session may have expired.

- ### 4) `npx` from this package's own repo root
+ ### 4) Missing model for OpenRouter or LM Studio
+ Symptom:
+ `status: "error"` with `error.code: "INVALID_ARGS"` and a message that `model` is required.
+
+ Fix:
+ - Add `model` whenever `provider` is `openrouter` or `lmstudio`.
+ - Example: `model: "google/gemini-2.5-pro"` for OpenRouter.
+
+ ### 5) `npx` from this package's own repo root
  Symptom:
  `sh: relay-mcp: command not found` when running `npx -y @sghanavati/relay-mcp` from inside the `relay-mcp` source checkout.

@@ -288,9 +321,11 @@ Fix:
  - For published-package verification, run from any other directory (or use `npm --prefix /tmp exec --yes --package=@sghanavati/relay-mcp -- relay-mcp --version`).

  ## Security
- - `relay-mcp` only delegates work inside the `workdir` you pass to `delegate`.
  - `workdir` values are canonicalized with `path.resolve()` before subprocess execution.
- - Codex runs with `--full-auto` and can read/write files in that delegated directory.
+ - Optional `RELAY_ALLOWED_ROOTS` enforces an explicit allowlist for delegate `workdir`.
+ - Codex runs with `--full-auto` and can read/write files in the delegated directory.
+ - `--skip-git-repo-check` is disabled by default and only enabled when `RELAY_SKIP_GIT_CHECK=1`.
+ - Run/event logs are written under `~/.relay-mcp` (directory mode `700`, file mode `600`, best-effort).
  - `relay-mcp` does not store credentials, API keys, or session tokens; authentication remains in Codex CLI.

  ## License
@@ -1,3 +1,11 @@
+ export const SUPPORTED_PROVIDERS = ["codex", "openrouter", "lmstudio"];
+ export const DEFAULT_PROVIDER = "codex";
+ export function isSupportedProvider(provider) {
+   return SUPPORTED_PROVIDERS.includes(provider);
+ }
+ export function isAgenticProvider(provider) {
+   return provider === "codex";
+ }
  export function getOpenRouterApiKey() {
    return process.env["OPENROUTER_API_KEY"]?.trim() || null;
  }
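The helpers added above centralize provider gating for both the schema and the delegate handler. A small usage sketch, re-declaring them locally for illustration:

```javascript
// Local re-declaration of the helpers added above, for illustration only.
const SUPPORTED_PROVIDERS = ["codex", "openrouter", "lmstudio"];

function isSupportedProvider(provider) {
  return SUPPORTED_PROVIDERS.includes(provider);
}

// Only codex may modify files; the HTTP providers are generation-only.
function isAgenticProvider(provider) {
  return provider === "codex";
}

// A caller can derive both flags the way handleDelegate does below:
const provider = "openrouter";
const known = isSupportedProvider(provider);
const agentic = known && isAgenticProvider(provider);
```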
@@ -1,11 +1,12 @@
  import { z } from "zod";
+ import { DEFAULT_PROVIDER, SUPPORTED_PROVIDERS } from "../config/providers.js";
  const delegateSchemaShape = {
    task: z.string().min(1).describe("The coding task to delegate to the worker"),
    workdir: z.string().describe("Absolute path to the working directory"),
    provider: z
-     .enum(["codex", "openrouter", "lmstudio"])
+     .enum(SUPPORTED_PROVIDERS)
      .optional()
-     .default("codex")
+     .default(DEFAULT_PROVIDER)
      .describe("Worker provider. 'codex' = agentic execution (modifies files). " +
        "'openrouter' and 'lmstudio' = generation-only (text response, no file changes)."),
    context: z
@@ -1,8 +1,28 @@
- import { appendFileSync } from "node:fs";
- export const RELAY_EVENTS_LOG = "/tmp/relay-mcp-events.jsonl";
+ import { appendFileSync, chmodSync, mkdirSync } from "node:fs";
+ import { homedir } from "node:os";
+ import { join } from "node:path";
+ function relayDir() {
+   const dir = join(homedir(), ".relay-mcp");
+   mkdirSync(dir, { recursive: true, mode: 0o700 });
+   try {
+     chmodSync(dir, 0o700);
+   }
+   catch {
+     /* best-effort */
+   }
+   return dir;
+ }
+ export const RELAY_EVENTS_LOG = join(homedir(), ".relay-mcp", "events.jsonl");
  export function writeRunEvent(event, logPath = RELAY_EVENTS_LOG) {
    try {
+     relayDir();
      appendFileSync(logPath, JSON.stringify(event) + "\n");
+     try {
+       chmodSync(logPath, 0o600);
+     }
+     catch {
+       /* best-effort */
+     }
    }
    catch {
      // Best-effort — never crash the server over a log write.
@@ -1,15 +1,25 @@
- import { diagnoseCodexWorker } from "../workers/diagnostics.js";
+ import { diagnoseCodexWorker, diagnoseLmStudioWorker, diagnoseOpenRouterWorker, } from "../workers/diagnostics.js";
  export async function handleCheckWorkers() {
-   const codexDiagRaw = await diagnoseCodexWorker();
-   const codexDiag = {
-     worker: codexDiagRaw.worker,
-     available: codexDiagRaw.available,
-     reason: codexDiagRaw.reason,
-     remediation: codexDiagRaw.remediation,
-   };
+   const [codexDiagRaw, openRouterDiagRaw, lmStudioDiagRaw] = await Promise.all([
+     diagnoseCodexWorker(),
+     diagnoseOpenRouterWorker(),
+     diagnoseLmStudioWorker(),
+   ]);
+   const codexDiag = toWorkerDiagnostic(codexDiagRaw);
+   const openRouterDiag = toWorkerDiagnostic(openRouterDiagRaw);
+   const lmStudioDiag = toWorkerDiagnostic(lmStudioDiagRaw);
+   const workers = [codexDiag, openRouterDiag, lmStudioDiag];
    const result = {
-     workers: [codexDiag],
-     all_available: codexDiag.available,
+     workers,
+     all_available: workers.every((worker) => worker.available),
    };
    return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
  }
+ function toWorkerDiagnostic(diagnostic) {
+   return {
+     worker: diagnostic.worker,
+     available: diagnostic.available,
+     reason: diagnostic.reason,
+     remediation: diagnostic.remediation,
+   };
+ }
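The reworked `handleCheckWorkers` derives `all_available` from every diagnostic rather than from Codex alone. A sketch with hand-written stand-in diagnostics (not real probe results):

```javascript
// Same normalization shape as toWorkerDiagnostic above.
function toWorkerDiagnostic(diagnostic) {
  return {
    worker: diagnostic.worker,
    available: diagnostic.available,
    reason: diagnostic.reason,
    remediation: diagnostic.remediation,
  };
}

// Stand-in diagnostics: one provider is deliberately unavailable.
const workers = [
  { worker: "codex", available: true, reason: "", remediation: "" },
  {
    worker: "openrouter",
    available: false,
    reason: "OPENROUTER_API_KEY is not set",
    remediation: "Set OPENROUTER_API_KEY in your MCP config env",
  },
  { worker: "lmstudio", available: true, reason: "", remediation: "" },
].map(toWorkerDiagnostic);

// all_available is true only when every provider passes its probe.
const all_available = workers.every((worker) => worker.available);
```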
@@ -1,7 +1,10 @@
  import * as path from "node:path";
+ import { execFileSync } from "node:child_process";
  import { createHash, randomUUID } from "node:crypto";
- import { createWriteStream, readFileSync } from "node:fs";
+ import { chmodSync, createWriteStream, mkdirSync, readFileSync } from "node:fs";
+ import { homedir } from "node:os";
  import { join } from "node:path";
+ import { DEFAULT_PROVIDER, isAgenticProvider as providerIsAgentic, isSupportedProvider, } from "../config/providers.js";
  import { DEFAULT_TIMEOUT_MS, TOKEN_HARD_CAP, TOKEN_WARN_THRESHOLD } from "../config/constants.js";
  import { makeError } from "../errors.js";
  import { getWorkdirMutex } from "../concurrency/index.js";
@@ -14,13 +17,32 @@ function normalizeWorkdir(rawWorkdir) {
  }
  function computeLogPath(workdir) {
    const hash = createHash("sha1").update(workdir).digest("hex").slice(0, 8);
-   return `/tmp/relay-mcp-${hash}.log`;
+   return path.join(homedir(), ".relay-mcp", `run-${hash}.log`);
+ }
+ function parseAllowedRoots(raw) {
+   return raw
+     .split(path.delimiter)
+     .map((segment) => segment.trim())
+     .filter((segment) => segment.length > 0)
+     .map((segment) => path.resolve(segment));
  }
  /**
-  * Resolve AGENTS.md from workdir upward to filesystem root.
+  * Resolve AGENTS.md from workdir upward to git root (when inside a repo),
+  * otherwise to filesystem root.
   * Nearest non-empty file wins.
   */
  export function readNearestAgentsMd(workdir) {
+   let gitRoot = null;
+   try {
+     gitRoot = execFileSync("git", ["rev-parse", "--show-toplevel"], {
+       cwd: workdir,
+       stdio: ["ignore", "pipe", "ignore"],
+       encoding: "utf8",
+     }).trim();
+   }
+   catch {
+     gitRoot = null;
+   }
    let current = path.resolve(workdir);
    while (true) {
      try {
@@ -35,6 +57,8 @@ export function readNearestAgentsMd(workdir) {
      if (parent === current)
        return null;
      current = parent;
+     if (gitRoot !== null && !current.startsWith(gitRoot))
+       return null;
    }
  }
  function buildDelegateMeta(params) {
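The `readNearestAgentsMd` change above bounds the upward walk at the git root instead of the filesystem root. A minimal sketch of just the traversal (no file reads; `candidateDirs` is an illustrative name, the git root is passed in rather than detected, and POSIX paths are assumed):

```javascript
import path from "node:path";

// Collect the directories the upward search would visit, stopping once
// the next parent leaves the git root -- the same bound added above.
// The real function reads AGENTS.md at each step instead of collecting.
function candidateDirs(workdir, gitRoot) {
  const dirs = [];
  let current = path.resolve(workdir);
  while (true) {
    dirs.push(current);
    const parent = path.dirname(current);
    if (parent === current) break; // filesystem root reached
    current = parent;
    if (gitRoot !== null && !current.startsWith(gitRoot)) break; // left the repo
  }
  return dirs;
}
```

With `gitRoot === null` (not inside a repo) the walk still runs all the way to the filesystem root, preserving the old behavior.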
@@ -84,7 +108,7 @@ function normalizeThrownError(err) {
    return makeError("UNKNOWN", `Unexpected worker error: ${String(err)}`, false);
  }
  export async function handleDelegate(args) {
-   const { task, workdir: rawWorkdir, context, timeout_ms, model, provider = "codex" } = args;
+   const { task, workdir: rawWorkdir, context, timeout_ms, model, provider = DEFAULT_PROVIDER } = args;
    const modelOverride = model?.trim() ? model.trim() : undefined;
    const run_id = randomUUID();
    const queued_at = Date.now();
@@ -97,10 +121,36 @@ export async function handleDelegate(args) {
      parts.push(context);
    parts.push(task);
    const finalTask = parts.join("\n\n");
-   const knownProviders = new Set(["codex", "openrouter", "lmstudio"]);
-   const isKnownProvider = knownProviders.has(provider);
-   const isAgenticProvider = provider === "codex";
+   const isKnownProvider = isSupportedProvider(provider);
+   const isAgenticProvider = isKnownProvider && providerIsAgentic(provider);
    const logPath = computeLogPath(workdir);
+   const allowedRootsRaw = process.env["RELAY_ALLOWED_ROOTS"]?.trim();
+   if (allowedRootsRaw) {
+     const roots = parseAllowedRoots(allowedRootsRaw);
+     const allowed = roots.some((root) => workdir === root || workdir.startsWith(root + path.sep));
+     if (!allowed) {
+       const response = {
+         status: "error",
+         output: "",
+         files_changed: [],
+         meta: buildDelegateMeta({
+           duration_ms: 0,
+           truncated: false,
+           warnings: [],
+           model: modelOverride,
+           token_estimate: 0,
+           exit_code: null,
+           log_file: logPath,
+           run_id,
+           provider,
+           spawn_time_ms: 0,
+           token_usage: null,
+         }),
+         error: makeError("WORKDIR_NOT_FOUND", `workdir "${workdir}" is not under any allowed root. Set RELAY_ALLOWED_ROOTS to permit it.`, false),
+       };
+       return toMcpResult(response);
+     }
+   }
    if (!isKnownProvider) {
      const response = {
        status: "error",
@@ -152,7 +202,21 @@ export async function handleDelegate(args) {
      release = await getWorkdirMutex(workdir).acquire();
      preSnapshot = await takeSnapshot(workdir);
    }
+   const logDir = path.dirname(logPath);
+   mkdirSync(logDir, { recursive: true, mode: 0o700 });
+   try {
+     chmodSync(logDir, 0o700);
+   }
+   catch {
+     /* best-effort */
+   }
    const logStream = createWriteStream(logPath, { flags: "a" });
+   try {
+     chmodSync(logPath, 0o600);
+   }
+   catch {
+     /* best-effort */
+   }
    logStream.on("error", () => {
      /* swallow write errors - log is best-effort */
    });
@@ -20,13 +20,20 @@ export function parseCodexLine(line) {
    }
    return null;
  }
- export async function runCodexWorker(task) {
-   const startTime = Date.now();
-   const codexBin = getCodexBin();
-   const args = ["exec", "--cd", task.workdir, "--json", "--full-auto", "--skip-git-repo-check"];
+ /** Pure function for argument construction - exported for testing */
+ export function buildCodexArgs(task, env = process.env) {
+   const args = ["exec", "--cd", task.workdir, "--json", "--full-auto"];
+   if (env["RELAY_SKIP_GIT_CHECK"] === "1")
+     args.push("--skip-git-repo-check");
    if (task.model)
      args.push("--model", task.model);
    args.push(task.task);
+   return args;
+ }
+ export async function runCodexWorker(task) {
+   const startTime = Date.now();
+   const codexBin = getCodexBin();
+   const args = buildCodexArgs(task);
    const child = spawn(codexBin, args, {
      stdio: ["pipe", "pipe", "pipe"],
      env: { ...process.env },
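The extracted `buildCodexArgs` above makes the flag behavior directly testable. A re-sketch with an empty default env instead of `process.env`, so the example is deterministic (the model name and task text are illustrative):

```javascript
// Same construction as buildCodexArgs above, but defaulting env to {}
// rather than process.env so results do not depend on the host shell.
function buildCodexArgs(task, env = {}) {
  const args = ["exec", "--cd", task.workdir, "--json", "--full-auto"];
  if (env["RELAY_SKIP_GIT_CHECK"] === "1") args.push("--skip-git-repo-check");
  if (task.model) args.push("--model", task.model);
  args.push(task.task); // the prompt is always the final positional argument
  return args;
}

const task = { workdir: "/repo", model: "some-model", task: "fix the failing test" };
const strict = buildCodexArgs(task); // default: git repo check stays on
const relaxed = buildCodexArgs(task, { RELAY_SKIP_GIT_CHECK: "1" });
```

This is the security default described in the README diff: `--skip-git-repo-check` only appears when explicitly opted into via `RELAY_SKIP_GIT_CHECK=1`.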
@@ -1,9 +1,11 @@
  import { execFile } from "node:child_process";
  import { promisify } from "node:util";
  import { MIN_CODEX_VERSION } from "../config/constants.js";
+ import { getLmStudioEndpoint, getOpenRouterApiKey, } from "../config/providers.js";
  import { getCodexBin } from "../config/runtime.js";
  import { formatVersionTriplet, isVersionBelow } from "../utils/version.js";
  const execFileAsync = promisify(execFile);
+ const LMSTUDIO_TIMEOUT_MS = 3_000;
  export async function diagnoseCodexWorker() {
    const codexBin = getCodexBin();
    const configuredPath = process.env.RELAY_CODEX_PATH?.trim();
@@ -54,3 +56,79 @@ export async function diagnoseCodexWorker() {
      remediation: "",
    };
  }
+ export async function diagnoseOpenRouterWorker() {
+   const apiKey = getOpenRouterApiKey();
+   if (!apiKey) {
+     return {
+       worker: "openrouter",
+       available: false,
+       reason: "OPENROUTER_API_KEY is not set",
+       remediation: "Set OPENROUTER_API_KEY in your MCP config env",
+     };
+   }
+   return {
+     worker: "openrouter",
+     available: true,
+     reason: "OPENROUTER_API_KEY is set",
+     remediation: "",
+   };
+ }
+ export async function diagnoseLmStudioWorker() {
+   const endpoint = getLmStudioEndpoint().replace(/\/+$/, "");
+   const modelsUrl = `${endpoint}/v1/models`;
+   const controller = new AbortController();
+   const timeout = setTimeout(() => controller.abort(), LMSTUDIO_TIMEOUT_MS);
+   try {
+     const response = await fetch(modelsUrl, {
+       method: "GET",
+       signal: controller.signal,
+     });
+     if (response.status === 200) {
+       return {
+         worker: "lmstudio",
+         available: true,
+         reason: `LM Studio reachable at ${modelsUrl}`,
+         remediation: "",
+       };
+     }
+     return {
+       worker: "lmstudio",
+       available: false,
+       reason: `LM Studio responded with HTTP ${response.status} at ${modelsUrl}`,
+       remediation: "Start LM Studio's local server or set LMSTUDIO_ENDPOINT in your MCP config env",
+     };
+   }
+   catch (error) {
+     const errorCode = getErrorCode(error);
+     let reason = `LM Studio request failed at ${modelsUrl}`;
+     if (error instanceof Error && error.name === "AbortError") {
+       reason = `LM Studio request to ${modelsUrl} timed out after ${LMSTUDIO_TIMEOUT_MS}ms`;
+     }
+     else if (errorCode === "ECONNREFUSED") {
+       reason = `LM Studio connection refused at ${modelsUrl}`;
+     }
+     else if (error instanceof Error && error.message) {
+       reason = `LM Studio request failed at ${modelsUrl}: ${error.message}`;
+     }
+     return {
+       worker: "lmstudio",
+       available: false,
+       reason,
+       remediation: "Start LM Studio's local server or set LMSTUDIO_ENDPOINT in your MCP config env",
+     };
+   }
+   finally {
+     clearTimeout(timeout);
+   }
+ }
+ function getErrorCode(error) {
+   if (!error || typeof error !== "object" || !("cause" in error)) {
+     return null;
+   }
+   const cause = error.cause;
+   if (!cause || typeof cause !== "object" || !("code" in cause)) {
+     return null;
+   }
+   const code = cause.code;
+   return typeof code === "string" ? code : null;
+ }
@@ -0,0 +1,55 @@
+ import { makeError } from "../errors.js";
+ export async function dispatchHttpRunner(task, opts) {
+   const startTime = Date.now();
+   const controller = new AbortController();
+   const timeout = setTimeout(() => controller.abort(), task.timeout_ms);
+   let res;
+   try {
+     res = await fetch(opts.url, {
+       method: "POST",
+       headers: opts.headers,
+       body: JSON.stringify(opts.body),
+       signal: controller.signal,
+     });
+   }
+   catch (err) {
+     const duration_ms = Date.now() - startTime;
+     if (err.name === "AbortError") {
+       return {
+         status: "timeout",
+         output: "",
+         duration_ms,
+         exit_code: null,
+         error: makeError("TIMEOUT", opts.timeoutMessage?.(task.timeout_ms) ??
+           `${opts.providerName} timed out after ${task.timeout_ms}ms`, true),
+       };
+     }
+     return {
+       status: "error",
+       output: "",
+       duration_ms,
+       exit_code: null,
+       error: makeError("PROVIDER_ERROR", opts.fetchFailureMessage?.(err) ??
+         `${opts.providerName} fetch failed: ${String(err)}`, true),
+     };
+   }
+   finally {
+     clearTimeout(timeout);
+   }
+   const duration_ms = Date.now() - startTime;
+   if (!res.ok) {
+     const body = await res.text();
+     return {
+       status: "error",
+       output: "",
+       duration_ms,
+       exit_code: res.status,
+       error: makeError("PROVIDER_ERROR", opts.responseErrorMessage?.(res.status, body) ??
+         `${opts.providerName} returned ${res.status}: ${body}`, res.status >= 500),
+     };
+   }
+   const body = (await res.json());
+   const { output, token_usage } = opts.parseResponse(body);
+   task.logStream?.write(output + "\n");
+   return { status: "success", output, duration_ms, exit_code: 0, token_usage };
+ }
@@ -1,5 +1,6 @@
  import { makeError } from "../errors.js";
  import { getLmStudioEndpoint, getLmStudioApiKey } from "../config/providers.js";
+ import { dispatchHttpRunner } from "./http-runner.js";
  export function parseLmStudioResponse(body) {
    const choices = body["choices"] ?? [];
    const output = choices[0]?.message?.content ?? "";
@@ -20,9 +21,6 @@ export class LmStudioRunner {
    }
    const endpoint = getLmStudioEndpoint();
    const apiKey = getLmStudioApiKey();
-   const startTime = Date.now();
-   const controller = new AbortController();
-   const timeout = setTimeout(() => controller.abort(), task.timeout_ms);
    const headers = {
      "Content-Type": "application/json",
    };
@@ -33,52 +31,13 @@ export class LmStudioRunner {
      model,
      messages: [{ role: "user", content: task.task }],
    };
-   let res;
-   try {
-     res = await fetch(`${endpoint}/v1/chat/completions`, {
-       method: "POST",
-       headers,
-       body: JSON.stringify(requestBody),
-       signal: controller.signal,
-     });
-   }
-   catch (err) {
-     clearTimeout(timeout);
-     const duration_ms = Date.now() - startTime;
-     if (err.name === "AbortError") {
-       return {
-         status: "timeout",
-         output: "",
-         duration_ms,
-         exit_code: null,
-         error: makeError("TIMEOUT", `LM Studio timed out after ${task.timeout_ms}ms`, true),
-       };
-     }
-     return {
-       status: "error",
-       output: "",
-       duration_ms,
-       exit_code: null,
-       error: makeError("PROVIDER_ERROR", `LM Studio fetch failed: ${String(err)}. Is it running at ${endpoint}?`, true),
-     };
-   }
-   finally {
-     clearTimeout(timeout);
-   }
-   const duration_ms = Date.now() - startTime;
-   if (!res.ok) {
-     const body = await res.text();
-     return {
-       status: "error",
-       output: "",
-       duration_ms,
-       exit_code: res.status,
-       error: makeError("PROVIDER_ERROR", `LM Studio returned ${res.status}: ${body}`, res.status >= 500),
-     };
-   }
-   const body = (await res.json());
-   const { output, token_usage } = parseLmStudioResponse(body);
-   task.logStream?.write(output + "\n");
-   return { status: "success", output, duration_ms, exit_code: 0, token_usage };
+   return dispatchHttpRunner(task, {
+     url: `${endpoint}/v1/chat/completions`,
+     headers,
+     body: requestBody,
+     parseResponse: parseLmStudioResponse,
+     providerName: "LM Studio",
+     fetchFailureMessage: (err) => `LM Studio fetch failed: ${String(err)}. Is it running at ${endpoint}?`,
+   });
  }
  }
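Both runners hand the same OpenAI-style response shape to `dispatchHttpRunner`. A sketch of the first-choice extraction shown in `parseLmStudioResponse` above; the `token_usage` mapping from the `usage` object is an assumption for illustration, since the diff truncates that part of the parser:

```javascript
// First-choice extraction as in the parsers above; usage mapping is an
// assumed OpenAI-style shape, not taken verbatim from the diff.
function parseChatResponse(body) {
  const choices = body["choices"] ?? [];
  const output = choices[0]?.message?.content ?? "";
  const usage = body["usage"];
  const token_usage = usage
    ? { prompt: usage.prompt_tokens ?? null, completion: usage.completion_tokens ?? null }
    : null;
  return { output, token_usage };
}

// A representative chat-completions payload.
const sample = {
  choices: [{ message: { role: "assistant", content: "All tests pass." } }],
  usage: { prompt_tokens: 12, completion_tokens: 5 },
};
const parsed = parseChatResponse(sample);
```

The `?? []` and `?? ""` defaults mean an empty or malformed body degrades to an empty string rather than a thrown error, which keeps provider errors flowing through the structured `PROVIDER_ERROR` path instead.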
@@ -1,5 +1,6 @@
1
1
  import { makeError } from "../errors.js";
2
2
  import { getOpenRouterApiKey } from "../config/providers.js";
3
+ import { dispatchHttpRunner } from "./http-runner.js";
3
4
  export function parseOpenRouterResponse(body) {
4
5
  const choices = body["choices"] ?? [];
5
6
  const output = choices[0]?.message?.content ?? "";
@@ -28,61 +29,18 @@ export class OpenRouterRunner {
28
29
  error: makeError("PROVIDER_NOT_CONFIGURED", "OPENROUTER_API_KEY is not set. Add it to your MCP config env.", false),
29
30
  };
30
31
  }
31
- const startTime = Date.now();
32
- const controller = new AbortController();
33
- const timeout = setTimeout(() => controller.abort(), task.timeout_ms);
34
- let res;
35
- try {
36
- res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
37
- method: "POST",
38
- headers: {
39
- Authorization: `Bearer ${apiKey}`,
40
- "Content-Type": "application/json",
41
- },
42
- body: JSON.stringify({
43
- model,
44
- messages: [{ role: "user", content: task.task }],
45
- }),
46
- signal: controller.signal,
47
- });
48
- }
49
- catch (err) {
50
- clearTimeout(timeout);
51
- const duration_ms = Date.now() - startTime;
52
- if (err.name === "AbortError") {
53
- return {
54
- status: "timeout",
55
- output: "",
56
- duration_ms,
57
- exit_code: null,
58
- error: makeError("TIMEOUT", `OpenRouter timed out after ${task.timeout_ms}ms`, true),
59
- };
60
- }
61
- return {
62
- status: "error",
63
- output: "",
64
- duration_ms,
65
- exit_code: null,
66
- error: makeError("PROVIDER_ERROR", `OpenRouter fetch failed: ${String(err)}`, true),
67
- };
68
- }
69
- finally {
70
- clearTimeout(timeout);
71
- }
72
- const duration_ms = Date.now() - startTime;
73
- if (!res.ok) {
74
- const body = await res.text();
75
- return {
76
- status: "error",
77
- output: "",
78
- duration_ms,
79
- exit_code: res.status,
80
- error: makeError("PROVIDER_ERROR", `OpenRouter returned ${res.status}: ${body}`, res.status >= 500),
81
- };
82
- }
83
- const body = (await res.json());
84
- const { output, token_usage } = parseOpenRouterResponse(body);
85
- task.logStream?.write(output + "\n");
86
- return { status: "success", output, duration_ms, exit_code: 0, token_usage };
32
+ return dispatchHttpRunner(task, {
33
+ url: "https://openrouter.ai/api/v1/chat/completions",
34
+ headers: {
35
+ Authorization: `Bearer ${apiKey}`,
36
+ "Content-Type": "application/json",
37
+ },
38
+ body: {
39
+ model,
40
+ messages: [{ role: "user", content: task.task }],
41
+ },
42
+ parseResponse: parseOpenRouterResponse,
43
+ providerName: "OpenRouter",
44
+ });
87
45
  }
88
46
  }
@@ -1,15 +1,15 @@
  import { CodexRunner } from "./codex.js";
  import { OpenRouterRunner } from "./openrouter.js";
  import { LmStudioRunner } from "./lmstudio.js";
+ import { isSupportedProvider } from "../config/providers.js";
  const runners = {
    codex: new CodexRunner(),
    openrouter: new OpenRouterRunner(),
    lmstudio: new LmStudioRunner(),
  };
  export function getRunner(provider) {
-   const runner = runners[provider];
-   if (!runner) {
+   if (!isSupportedProvider(provider)) {
      throw new Error(`PROVIDER_NOT_CONFIGURED: unknown provider "${provider}"`);
    }
-   return runner;
+   return runners[provider];
  }
package/package.json CHANGED
@@ -1,8 +1,24 @@
  {
    "name": "@sghanavati/relay-mcp",
-   "version": "0.1.4",
-   "description": "MCP server that lets Claude Code delegate coding tasks to Codex",
+   "version": "0.1.6",
+   "description": "MCP server for Claude Code delegation across Codex, OpenRouter, and LM Studio",
    "type": "module",
+   "repository": {
+     "type": "git",
+     "url": "git+https://github.com/ghanavati/relay-mcp.git"
+   },
+   "homepage": "https://github.com/ghanavati/relay-mcp#readme",
+   "bugs": {
+     "url": "https://github.com/ghanavati/relay-mcp/issues"
+   },
+   "keywords": [
+     "mcp",
+     "claude-code",
+     "codex",
+     "openrouter",
+     "lmstudio",
+     "orchestration"
+   ],
    "bin": {
      "relay-mcp": "dist/index.js"
    },
@@ -12,6 +28,7 @@
    "files": [
      "dist/",
      "LICENSE",
+     "README.md",
      "!dist/**/*.test.js"
    ],
    "scripts": {
@@ -25,12 +42,13 @@
    },
    "dependencies": {
      "@modelcontextprotocol/sdk": "^1.20.2",
-     "zod": "^3.25.0",
+     "async-mutex": "^0.5.0",
      "execa": "^9.6.0",
-     "async-mutex": "^0.5.0"
+     "zod": "^3.25.0"
    },
    "devDependencies": {
-     "typescript": "^5.0.0",
-     "@types/node": "^22.0.0"
-   }
+     "@types/node": "^22.0.0",
+     "typescript": "^5.0.0"
+   },
+   "license": "MIT"
  }