docdex 0.2.24 → 0.2.26

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -3,7 +3,6 @@
  ![GitHub License](https://img.shields.io/github/license/bekirdag/docdex)
  ![GitHub Release](https://img.shields.io/github/v/release/bekirdag/docdex)
  ![Made with Rust](https://img.shields.io/badge/Made%20with-Rust-orange?logo=rust)
- [![MCP Badge](https://lobehub.com/badge/mcp/bekirdag-docdex)](https://lobehub.com/mcp/bekirdag-docdex)

  <a href="https://glama.ai/mcp/servers/@bekirdag/docdex">
  <img width="380" height="200" src="https://glama.ai/mcp/servers/@bekirdag/docdex/badge" />
@@ -108,7 +107,23 @@ flowchart LR

  ```

- Use the daemon HTTP/SSE endpoint.
+ Use the daemon HTTP/SSE endpoint. For sandboxed clients, Docdex can also serve MCP over local IPC
+ (Unix socket or Windows named pipe), while HTTP/SSE remains the default for most MCP clients.
+
+ ### Stdio MCP (Smithery/stdio-only clients)
+
+ If your client requires a stdio MCP server (e.g., Smithery), use the bundled stdio bridge:
+
+ ```bash
+ npx docdex docdex-mcp-stdio
+ ```
+
+ The bridge reads newline-delimited JSON-RPC from stdin and forwards to Docdex. You can force IPC or
+ override endpoints with:
+
+ ```bash
+ DOCDEX_MCP_TRANSPORT=ipc DOCDEX_MCP_SOCKET_PATH=/path/to/mcp.sock npx docdex docdex-mcp-stdio
+ ```

  ### Manual Configuration

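The stdio bridge described in the README diff above frames each MCP message as one JSON-RPC object per line. A minimal sketch of that newline-delimited framing (the `initialize`/`rootUri` names mirror the package's own docs; the helper functions here are illustrative, not part of the package):

```javascript
// Frame a JSON-RPC request as a single newline-terminated line, the shape a
// newline-delimited JSON-RPC bridge reads from stdin.
function frameRequest(id, method, params) {
  return JSON.stringify({ jsonrpc: "2.0", id, method, params }) + "\n";
}

// Parse one line of bridge output back into an object.
function parseResponseLine(line) {
  return JSON.parse(line.trim());
}

const line = frameRequest(1, "initialize", { rootUri: "file:///abs/path/to/repo" });
const echoed = parseResponseLine(line); // round-trips through JSON intact
```

Framing one message per line is what lets the bridge use a plain `readline` loop instead of a length-prefixed protocol.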
package/assets/agents.md CHANGED
@@ -1,10 +1,16 @@
+ ---- START OF DOCDEX INFO V0.2.26 ----
+ Docdex URL: http://127.0.0.1:28491
+ Use this base URL for Docdex HTTP endpoints.
+ Health check endpoint: `GET /healthz` (not `/v1/health`).
+
  ## Priority Directives (Mandatory)

  - Always use both memory lobes early: docdex_get_profile + docdex_memory_recall; save new facts via docdex_memory_save and new preferences via docdex_save_preference.
  - Maintain agent-specific profile memory: use your agent_id with docdex_get_profile/docdex_save_preference; store generic self-knowledge (role, tooling, preferences).
- - Use impact analysis for every code change: call HTTP /v1/graph/impact?file=... and account for inbound/outbound deps; if unavailable, state it and proceed cautiously.
+ - Use impact analysis for every code change: prefer MCP tools `docdex_impact_graph` / `docdex_dag_export` (IPC/HTTP). If shell networking is blocked, do not use curl; use MCP/IPC instead. If unavailable, state it and proceed cautiously.
  - Apply DAG reasoning for planning: prefer dependency graph facts (impact results and /v1/dag/export) to choose the right change order and scope.
  - Use Docdex tools intentionally: docdex_search/symbols/ast for repo truth; docdex_stats/files/repo_inspect/index for index health.
+ - For folder structure, use docdex_tree instead of raw `rg --files`/`find` to avoid noisy folders.
  - When you do not know something, run docdex_web_research (force_web=true). Web research is encouraged by default for non-repo facts and external APIs.

  # Docdex Agent Usage Instructions
@@ -65,7 +71,8 @@ Precision tools for structural analysis. Do not rely on text search for definiti
  | docdex_symbols | Get exact definitions/signatures for a file. |
  | docdex_ast | Specific AST nodes (e.g., "Find all class definitions"). |
  | docdex_impact_diagnostics | Check for broken/dynamic imports. |
- | HTTP /v1/graph/impact | Impact Analysis: "What breaks if I change this?" Returns inbound/outbound dependencies. |
+ | docdex_impact_graph | Impact Analysis: "What breaks if I change this?" Returns inbound/outbound dependencies. |
+ | docdex_dag_export | Export the dependency DAG for change ordering and scope. |

  ### C. Memory Operations

@@ -76,6 +83,31 @@ Precision tools for structural analysis. Do not rely on text search for definiti
  | docdex_save_preference | Store a global user preference (Style, Tooling, Constraint). |
  | docdex_get_profile | Retrieve global preferences. |

+ ### D. Local Delegation (Cheap Models)
+
+ Use local delegation for low-complexity, code-generation-oriented tasks to reduce paid-model usage.
+
+ | MCP Tool / HTTP | Purpose |
+ | --- | --- |
+ | docdex_local_completion | Delegate small tasks to a local model with strict output formats. |
+ | HTTP /v1/delegate | HTTP endpoint for delegated completions with structured responses. |
+
+ Required fields: `task_type`, `instruction`, `context`. Optional: `max_tokens`, `timeout_ms`, `mode` (`draft_only` or `draft_then_refine`), `agent` (local agent id/slug).
+ Expensive model library: `docs/expensive_models.json` (match by `agent_id`, `agent_slug`, `model`, or adapter type; case-insensitive).
+
+ ### E. Index Health + File Access
+
+ Use these to verify index coverage, repo binding, and to read precise file slices.
+
+ | MCP Tool | Purpose |
+ | --- | --- |
+ | docdex_repo_inspect | Confirm normalized repo root/identity (resolve missing_repo). |
+ | docdex_stats | Index size/last update; detect stale indexes. |
+ | docdex_files | Indexed file coverage; confirm a file is in the index. |
+ | docdex_index | Reindex full repo or ingest specific files when stale/missing. |
+ | docdex_open | Read exact file slices after you identify targets. |
+ | docdex_tree | Render a repo folder tree with standard excludes (avoid noisy folders). |
+
  ## Quick Tool Map (Often Missed)

  - docdex_files: List indexed docs with rel_path/doc_id/token_estimate; use to verify indexing coverage.
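The delegation entry added above lists required fields (`task_type`, `instruction`, `context`) and optional ones. A hypothetical pre-flight check before calling `docdex_local_completion` or `/v1/delegate` (field names taken from the doc; the validator itself is not part of the package):

```javascript
// Required/optional field names as documented for /v1/delegate.
const REQUIRED = ["task_type", "instruction", "context"];
const OPTIONAL = ["max_tokens", "timeout_ms", "mode", "agent"];

// Report missing required fields and unrecognized keys before sending.
function validateDelegatePayload(payload) {
  const missing = REQUIRED.filter((k) => !(k in payload));
  const unknown = Object.keys(payload).filter(
    (k) => !REQUIRED.includes(k) && !OPTIONAL.includes(k)
  );
  return { ok: missing.length === 0, missing, unknown };
}

const check = validateDelegatePayload({
  task_type: "WRITE_DOCSTRING",
  instruction: "Add a docstring to the config parser.",
  context: "function parseConfig(raw) { /* ... */ }"
});
```

Validating locally avoids a round trip that would fail on a guessed field name, which the agents doc repeatedly warns against.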
@@ -84,8 +116,145 @@ Precision tools for structural analysis. Do not rely on text search for definiti
  - docdex_index: Reindex the full repo or ingest specific files when stale.
  - docdex_search diff: Limit search to working tree, staged, or ref ranges; filter by paths.
  - docdex_web_research knobs: force_web, skip_local_search, repo_only, no_cache, web_limit, llm_filter_local_results, llm_model.
- - HTTP /v1/initialize: Bind a default repo root for MCP when clients omit project_root.
- - HTTP /v1/dag/export: Export the dependency graph for external analysis.
+ - docdex_open: Read narrow file slices after targets are identified.
+ - docdex_tree: Render a filtered folder tree (prefer this over `rg --files` / `find`).
+ - docdex_impact_diagnostics: Scan dynamic imports when imports are unclear or failing.
+ - docdex_local_completion: Delegate low-complexity codegen tasks (tests, docstrings, boilerplate, simple refactors).
+ - docdex_ast: Use AST queries for precise structure (class/function definitions, call sites, imports).
+ - docdex_symbols: Use symbols to confirm exact signatures/locations before edits.
+ - docdex_impact_graph: Mandatory before code changes to review inbound/outbound deps (use MCP/IPC if shell networking is blocked).
+ - docdex_dag_export: Export dependency graph to plan change order.
+ - HTTP /v1/initialize: Mount/bind a repo for HTTP daemon mode. Request JSON uses rootUri/root_uri (NOT repo_root).
+
+ ## CLI Fallbacks (when MCP/IPC is unavailable)
+ Use these only when MCP tools cannot be called (e.g., blocked sandbox networking). Prefer MCP/IPC otherwise.
+
+ - `docdexd repo init --repo <path>`: initialize repo in daemon and return repo_id JSON.
+ - `docdexd repo id --repo <path>`: compute repo fingerprint locally.
+ - `docdexd repo status --repo <path>` / `docdexd repo dirty --exit-code`: git working tree status.
+ - `docdexd impact-graph --repo <path> --file <rel>`: impact graph (HTTP/local).
+ - `docdexd dag export --repo <path> <session_id>`: DAG export alias.
+ - `docdexd search --repo <path> --query "<q>"`: /search equivalent (HTTP/local).
+ - `docdexd open --repo <path> --file <rel>`: safe file slice read (head/start/end/clamp).
+ - `docdexd file ensure-newline|write --repo <path> --file <rel>`: minimal file edits.
+ - `docdexd test run-node --repo <path> --file <rel> --args "..."`: run Node scripts.
+
+ ## Docdex Usage Cookbook (Mandatory, Exact Schemas)
+
+ This section is the authoritative source for how to call Docdex. Do not guess field names or payloads.
+
+ ### 0) Base URL + daemon modes
+
+ - Default HTTP base URL: http://127.0.0.1:28491 (override with DOCDEX_HTTP_BASE_URL).
+ - Single-repo HTTP daemon: `docdexd serve --repo /abs/path`. /v1/initialize is NOT used. repo_id is optional, but must match the serving repo if provided.
+ - Multi-repo HTTP daemon: `docdexd daemon`. You MUST call /v1/initialize before repo-scoped HTTP endpoints. When multiple repos are mounted, repo_id is required on every repo-scoped request.
+
+ ### 1) Initialize (HTTP) - exact request payload
+
+ POST /v1/initialize
+
+ Request JSON (exact field names):
+
+ ```json
+ { "rootUri": "file:///abs/path/to/repo" }
+ ```
+
+ Alias accepted:
+
+ ```json
+ { "root_uri": "/abs/path/to/repo" }
+ ```
+
+ Rules:
+ - Do NOT send `repo_root` in the request. `repo_root` is a response field.
+ - Use file:// URIs when possible; plain absolute paths are also accepted.
+ - Response returns `repo_id`, `status`, and `repo_root`. Use that repo_id for subsequent HTTP calls.
+
+ ### 2) Repo scoping (HTTP)
+
+ - Send repo_id via header `x-docdex-repo-id: <repo_id>` or query param `repo_id=<repo_id>`.
+ - If the daemon is single-repo, do not send a repo_id for a different repo (you will get `unknown_repo`).
+ - If the daemon is multi-repo and more than one repo is mounted, repo_id is required.
+
+ ### 3) Search (HTTP)
+
+ `GET /search`
+
+ Required:
+ - `q` (query string).
+
+ Common params:
+ - `limit`, `snippets`, `max_tokens`, `include_libs`, `force_web`, `skip_local_search`, `no_cache`,
+ `max_web_results`, `llm_filter_local_results`, `diff_mode`, `diff_base`, `diff_head`, `diff_path`,
+ `dag_session_id`, `repo_id`.
+
+ Notes:
+ - `skip_local_search=true` effectively forces web discovery (Tier 2).
+ - If DOCDEX_WEB_ENABLED=1, web discovery can be slow; plan timeouts accordingly.
+
+ ### 4) Snippet (HTTP)
+
+ `GET /snippet/:doc_id`
+
+ Common params:
+ - `window`, `q`, `text_only`, `max_tokens`, `repo_id`.
+
+ ### 5) Impact graph (HTTP)
+
+ `GET /v1/graph/impact?file=<repo-relative-path>`
+
+ Rules:
+ - `file` must be a path relative to the repo root (not an absolute path).
+ - Include repo_id header/query when required by daemon mode.
+
+ ### 6) DAG export (HTTP)
+
+ `GET /v1/dag/export?session_id=<id>`
+
+ Query params:
+ - `session_id` (required)
+ - `format` (optional: json/text/dot; default json)
+ - `max_nodes` (optional)
+ - `repo_id` (required when multiple repos are mounted)
+
+ ### 7) MCP over HTTP/SSE
+
+ - SSE: `/v1/mcp/sse` + `/v1/mcp/message`. When multiple repos are mounted, initialize with `rootUri` first.
+ - HTTP: `/v1/mcp` accepts repo context in the payload or via prior initialize.
+ - If HTTP/SSE is unreachable (sandboxed clients), fall back to local IPC: configure `transport = "ipc"` with `socket_path` (Unix) or `pipe_name` (Windows) and send MCP JSON-RPC to `/v1/mcp` over IPC.
+ - For stdio-only clients (e.g., Smithery), use the `docdex-mcp-stdio` entrypoint to bridge stdio JSON-RPC to Docdex MCP.
+ - For impact/DAG in sandboxed shells, prefer MCP/IPC tools over `curl` to `/v1/graph/impact` or `/v1/dag/export`.
+ - MCP tools: `docdex_impact_graph` (impact traversal) and `docdex_dag_export` (DAG export).
+
+ ### 8) MCP tools (local) - required fields
+
+ Do not guess fields; use these canonical shapes.
+
+ - `docdex_search`: `{ project_root, query, limit?, diff?, repo_only?, force_web? }`
+ - `docdex_open`: `{ project_root, path, start_line?, end_line?, head?, clamp? }` (range must be valid unless clamp/head used)
+ - `docdex_files`: `{ project_root, limit?, offset? }`
+ - `docdex_stats`: `{ project_root }`
+ - `docdex_repo_inspect`: `{ project_root }`
+ - `docdex_index`: `{ project_root, paths? }` (paths empty => full reindex)
+ - `docdex_symbols`: `{ project_root, path }`
+ - `docdex_ast`: `{ project_root, path, max_nodes? }`
+ - `docdex_impact_diagnostics`: `{ project_root, file? }`
+ - `docdex_impact_graph`: `{ project_root, file, max_edges?, max_depth?, edge_types? }`
+ - `docdex_dag_export`: `{ project_root, session_id, format?, max_nodes? }`
+ - `docdex_memory_save`: `{ project_root, text }`
+ - `docdex_memory_recall`: `{ project_root, query, top_k? }`
+ - `docdex_get_profile`: `{ agent_id }`
+ - `docdex_save_preference`: `{ agent_id, category, content }`
+ - `docdex_local_completion`: `{ task_type, instruction, context, max_tokens?, timeout_ms?, mode?, max_context_chars?, agent? }`
+ - `docdex_web_research`: `{ project_root, query, force_web, skip_local_search?, web_limit?, no_cache? }`
+
+ ### 9) Common error fixes (do not guess)
+
+ - `unknown_repo`: You are talking to a daemon that does not know that repo. Fix by:
+   - Starting a single-repo server for that repo (`docdexd serve --repo /abs/path`), OR
+   - Calling `/v1/initialize` on the multi-repo daemon with `rootUri`, then using the returned repo_id.
+ - `missing_repo`: Supply repo_id (HTTP) or project_root (MCP), or call /v1/initialize.
+ - `invalid_range` (docdex_open): Adjust start/end line to fit total_lines.

  ## Interaction Patterns

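The cookbook hunk above pins the exact `/v1/initialize` payload: `rootUri` (or the `root_uri` alias), never `repo_root`. An illustrative helper that builds only the documented request shape (the function itself is not part of the package):

```javascript
// Build the documented /v1/initialize request body: a file:// rootUri derived
// from an absolute repo path. repo_root is response-only and is never sent.
function buildInitializeBody(absPath) {
  if (!absPath.startsWith("/")) {
    throw new Error("absolute path required");
  }
  return JSON.stringify({ rootUri: `file://${absPath}` });
}

const body = buildInitializeBody("/abs/path/to/repo");
```

Centralizing the body in one helper makes it hard to accidentally send `repo_root`, which the cookbook explicitly forbids.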
@@ -96,9 +265,76 @@ When answering a complex coding query, follow this "Reasoning Trace":
  1. Retrieve Profile: Call docdex_get_profile to load user style/constraints (e.g., "Use functional components").
  2. Search Code: Call docdex_search or docdex_symbols to find the relevant code.
  3. Check Memory: Call docdex_memory_recall for project-specific caveats (e.g., "Auth logic was refactored last week").
- 4. Synthesize: Generate code that matches the Repo Truth while adhering to the Profile Style.
+ 4. Validate structure: Use docdex_ast/docdex_symbols to confirm targets before editing.
+ 5. Read context: Use docdex_open to fetch minimal file slices after locating targets.
+ 6. Plan with DAG: Use /v1/dag/export or /v1/graph/impact to order changes by dependencies.
+ 7. Synthesize: Generate code that matches the Repo Truth while adhering to the Profile Style.
+
+ ### 2. Memory Capture (Mandatory)
+
+ Save more memories for both lobes during the task, not just at the end.
+
+ 1. Repo memory: After each meaningful discovery or code change, save at least one durable fact (file location, behavior, config, gotcha) via `docdex_memory_save`.
+ 2. Profile memory: When the user expresses a preference, constraint, or workflow correction, call `docdex_save_preference` immediately with the right category.
+ 3. Keep it crisp: 1-3 short sentences, include file paths when relevant, avoid raw code blobs.
+ 4. Safety: Never store secrets, tokens, or sensitive user data. Skip transient or speculative info.
+
+ ### 3. Index Health + Diff-Aware Search (Mandatory)
+
+ Use these when results look incomplete or when the task is about recent changes.
+
+ 1. Confirm repo binding: Use docdex_repo_inspect or /v1/initialize when repo_id is missing/ambiguous.
+ 2. Check index coverage: Use docdex_stats + docdex_files before assuming code is missing.
+ 3. Reindex if needed: Run docdex_index (or advise it) when stale_index/missing files appear.
+ 4. Use diff search: For change-specific tasks, use docdex_search with diff mode (working tree/staged/range).
+
+ ### 4. Local Delegation Workflow (Cheap Models)
+
+ When you are an expensive/paid model, delegate eligible low-complexity tasks first unless the user opts out.
+ Delegation is mandatory for eligible tasks. If you are expensive and delegation is enabled (explicitly or auto-enabled with local candidates), you MUST attempt local delegation before doing the work yourself. Only skip when delegation is disabled, no local candidate exists, or local output fails validation after a retry/fallback.
+
+ #### Mandatory Delegation Triggers
+
+ - The work fits a delegation task type or can be split into sub-tasks that do.
+ - The change is small, localized, or mechanical (boilerplate, formatting, docstrings, small refactors).
+ - The task can be fully expressed in a capsule with explicit context and output rules.
+
+ #### Reasons to Delegate (Non-Optional)
+
+ - Cuts paid-token usage and cost.
+ - Lowers latency and avoids rate-limit pressure on premium models.
+ - Keeps expensive models focused on complex reasoning and multi-step planning.
+ - Uses tuned prompts that reduce drift for repetitive edits.
+ - Enforces the product's cost-control policy; skipping delegation is a regression.
+
+ 1. Determine if you are expensive: load `docs/expensive_models.json` and match your `agent_id`, `agent_slug`, `model`, or adapter type. If you match, continue; if not and you are clearly local/cheap, skip delegation. If unsure, default to expensive.
+ 2. Check config: Only delegate when `[llm.delegation].enabled` is true or `auto_enable` is true with an eligible local model/agent (and `task_type` is allowed). If uncertain, attempt delegation and handle the error.
+ 3. Choose task type: Use one of `GENERATE_TESTS`, `WRITE_DOCSTRING`, `SCAFFOLD_BOILERPLATE`, `REFACTOR_SIMPLE`, `FORMAT_CODE`.
+ 4. Call the tool: `docdex_local_completion` with `task_type`, `instruction`, and minimal `context` (smallest necessary snippet).
+ 5. Validate output: If the local output is invalid or empty, fall back to the primary agent or handle with the paid model.
+ 6. Optional refine: If mode is `draft_then_refine`, refine the draft with the primary agent and return a final result.
+
+ #### Delegation Handoff Package (Required)
+
+ Local models cannot call tools. The leading agent must provide a complete, minimal capsule.
+
+ 1. Task capsule: `task_type`, goal, success criteria, output format, and constraints (tests to update, style rules).
+ 2. Context payload: file paths plus the exact snippets from docdex_open; include symbol signatures/AST findings.
+ 3. Dependency notes: summarize impact analysis and any DAG ordering that affects the change.
+ 4. Boundaries: explicit files allowed to edit vs read-only; no new dependencies unless allowed.
+ 5. Guardrails: ask for clarification if context is insufficient; do not invent missing APIs; return only the requested format.
+
+ ### 5. Graph + AST Usage (Mandatory for Code Changes)
+
+ For any code change, use both AST and graph tools to reduce drift and hidden coupling.
+
+ 1. Use `docdex_ast` or `docdex_symbols` to locate exact definitions and call sites.
+ 2. Call HTTP `/v1/graph/impact?file=...` before edits and summarize inbound/outbound deps.
+ 3. For multi-file changes, export the DAG (`/v1/dag/export`) and order edits by dependency direction.
+ 4. Use docdex_impact_diagnostics when imports are dynamic or unresolved.
+ 5. If graph endpoints are unavailable, state it and proceed cautiously with extra local search.

- ### 2. Handling Corrections (Learning)
+ ### 6. Handling Corrections (Learning)

  If the user says: "I told you, we do not use Moment.js here, use date-fns!"

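The "Delegation Handoff Package" added above enumerates what a local model must receive, since it cannot call tools itself. A hypothetical capsule builder (only `task_type`/`instruction`/`context` mirror documented tool fields; the rest of the shape is illustrative):

```javascript
// Assemble a minimal handoff capsule: documented delegation fields plus
// illustrative boundary metadata for the leading agent's own bookkeeping.
function buildHandoffCapsule({ taskType, goal, snippets, allowedFiles }) {
  return {
    task_type: taskType,              // one of the documented delegation task types
    instruction: goal,                // goal + success criteria + output format
    context: snippets.join("\n\n"),   // exact slices gathered via docdex_open
    boundaries: {
      editable: allowedFiles,         // files the local model may change
      new_dependencies: false         // per the "no new dependencies" guardrail
    }
  };
}

const capsule = buildHandoffCapsule({
  taskType: "REFACTOR_SIMPLE",
  goal: "Rename the helper and update its two call sites; keep behavior identical.",
  snippets: ["function oldName() { /* ... */ }"],
  allowedFiles: ["src/helpers.js"]
});
```

Building the capsule as one object makes it easy to verify it is complete before handing off, rather than discovering a missing field from a garbled local-model reply.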
@@ -107,21 +343,21 @@ If the user says: "I told you, we do not use Moment.js here, use date-fns!"
  - content: "Do not use Moment.js; prefer date-fns."
  - agent_id: "default" (or active agent ID)

- ### 3. Impact Analysis
+ ### 7. Impact Analysis

  If the user asks: "Safe to delete getUser?"

  - Action: Call GET /v1/graph/impact?file=src/user.ts
  - Output: Analyze the inbound edges. If the list is not empty, it is unsafe.

- ### 4. Non-Repo Real-World Queries (Web First)
+ ### 8. Non-Repo Real-World Queries (Web First)

  If the user asks a non-repo, real-world question (weather, news, general facts), immediately call docdex_web_research with force_web=true.
  - Resolve relative dates ("yesterday", "last week") using system time by default.
  - Do not run docdex_search unless the user explicitly wants repo-local context.
  - Assume web access is allowed unless the user forbids it; if the web call fails, report the failure and ask for a source or permission.

- ### 5. Failure Handling (Missing Results or Errors)
+ ### 9. Failure Handling (Missing Results or Errors)

  - Ensure project_root or repo_path is set, or call /v1/initialize to bind a default root.
  - Use docdex_repo_inspect to confirm repo identity and normalized root.
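The impact-analysis pattern above issues `GET /v1/graph/impact?file=...` with a repo-relative path and, when the daemon requires it, a `repo_id`. An illustrative URL builder (not part of the package) that keeps those two rules in one place:

```javascript
// Build an impact-graph request URL. `file` must be repo-relative per the
// cookbook; repo_id is appended only when the daemon mode requires it.
function impactGraphUrl(base, file, repoId) {
  if (file.startsWith("/")) {
    throw new Error("file must be repo-relative, not absolute");
  }
  const url = new URL("/v1/graph/impact", base);
  url.searchParams.set("file", file);
  if (repoId) url.searchParams.set("repo_id", repoId);
  return url.toString();
}

const u = impactGraphUrl("http://127.0.0.1:28491", "src/user.ts", "abc123");
```

Rejecting absolute paths up front surfaces the most common mistake the cookbook calls out before the daemon returns an error.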
@@ -151,3 +387,4 @@ Docdex adapts to the host.

  - Project Mapping: On constrained hardware, docdex uses a "Spotlight Heuristic" to show you only a skeletal file tree based on your role keywords, rather than the full file system.
  - LLM: It may be running a quantized model (e.g., phi3.5) or a heavy model (llama3.1:70b) depending on VRAM. Trust the daemon's token limits; it handles truncation.
+ ---- END OF DOCDEX INFO -----
@@ -0,0 +1,19 @@
+ #!/usr/bin/env node
+ "use strict";
+
+ const { runBridge } = require("../lib/mcp_stdio_bridge");
+
+ async function main() {
+   try {
+     await runBridge({
+       stdin: process.stdin,
+       stdout: process.stdout,
+       stderr: process.stderr
+     });
+   } catch (err) {
+     process.stderr.write(`[docdex-mcp-stdio] fatal: ${err}\n`);
+     process.exit(1);
+   }
+ }
+
+ main();
@@ -0,0 +1,236 @@
+ "use strict";
+
+ const http = require("node:http");
+ const https = require("node:https");
+ const readline = require("node:readline");
+ const { URL } = require("node:url");
+
+ const DEFAULT_HTTP_BASE = "http://127.0.0.1:28491";
+
+ function trimEnv(value) {
+   if (!value) return "";
+   const trimmed = String(value).trim();
+   return trimmed;
+ }
+
+ function jsonRpcError(message, id, code = -32000, data) {
+   const payload = {
+     jsonrpc: "2.0",
+     id: typeof id === "undefined" ? null : id,
+     error: {
+       code,
+       message
+     }
+   };
+   if (data !== undefined) {
+     payload.error.data = data;
+   }
+   return payload;
+ }
+
+ function extractIds(payload) {
+   if (Array.isArray(payload)) {
+     return payload
+       .map((item) => (item && Object.prototype.hasOwnProperty.call(item, "id") ? item.id : undefined))
+       .filter((value) => value !== undefined);
+   }
+   if (payload && Object.prototype.hasOwnProperty.call(payload, "id")) {
+     return [payload.id];
+   }
+   return [];
+ }
+
+ function normalizePipeName(name) {
+   if (!name) return "";
+   if (name.startsWith("\\\\.\\pipe\\")) return name;
+   return `\\\\.\\pipe\\${name}`;
+ }
+
+ function defaultUnixSocketPath() {
+   const runtime = trimEnv(process.env.XDG_RUNTIME_DIR);
+   if (runtime) {
+     return `${runtime.replace(/\/$/, "")}/docdex/mcp.sock`;
+   }
+   const home = trimEnv(process.env.HOME);
+   if (!home) return "";
+   return `${home.replace(/\/$/, "")}/.docdex/run/mcp.sock`;
+ }
+
+ function readTransportEnv() {
+   const transport = trimEnv(process.env.DOCDEX_MCP_TRANSPORT).toLowerCase();
+   if (transport && transport !== "http" && transport !== "ipc") {
+     throw new Error(`invalid DOCDEX_MCP_TRANSPORT: ${transport}`);
+   }
+   return transport;
+ }
+
+ function resolveIpcConfig(transport) {
+   const socketPathEnv = trimEnv(process.env.DOCDEX_MCP_SOCKET_PATH);
+   const pipeNameEnv = trimEnv(process.env.DOCDEX_MCP_PIPE_NAME);
+   const explicitIpc = transport === "ipc" || socketPathEnv || pipeNameEnv;
+   if (!explicitIpc) return null;
+
+   if (process.platform === "win32") {
+     const pipeName = normalizePipeName(pipeNameEnv || "docdex-mcp");
+     return { type: "pipe", pipeName };
+   }
+
+   const socketPath = socketPathEnv || defaultUnixSocketPath();
+   if (!socketPath) {
+     throw new Error("DOCDEX_MCP_SOCKET_PATH not set and HOME/XDG_RUNTIME_DIR unavailable");
+   }
+   return { type: "unix", socketPath };
+ }
+
+ function resolveHttpBaseUrl() {
+   const base = trimEnv(process.env.DOCDEX_HTTP_BASE_URL) || DEFAULT_HTTP_BASE;
+   return base;
+ }
+
+ function buildHttpEndpoint(baseUrl) {
+   const parsed = new URL(baseUrl);
+   const endpoint = new URL("/v1/mcp", parsed);
+   return endpoint;
+ }
+
+ function requestJson({ url, payload, socketPath }) {
+   const data = JSON.stringify(payload);
+   const isHttps = url.protocol === "https:";
+   const requestFn = isHttps ? https.request : http.request;
+   const options = {
+     method: "POST",
+     headers: {
+       "content-type": "application/json",
+       "content-length": Buffer.byteLength(data)
+     }
+   };
+   if (socketPath) {
+     options.socketPath = socketPath;
+     options.path = url.pathname;
+   } else {
+     options.hostname = url.hostname;
+     options.port = url.port || (isHttps ? 443 : 80);
+     options.path = url.pathname;
+   }
+
+   return new Promise((resolve, reject) => {
+     const req = requestFn(options, (res) => {
+       let body = "";
+       res.setEncoding("utf8");
+       res.on("data", (chunk) => {
+         body += chunk;
+       });
+       res.on("end", () => {
+         resolve({
+           status: res.statusCode || 0,
+           body
+         });
+       });
+     });
+     req.on("error", reject);
+     req.write(data);
+     req.end();
+   });
+ }
+
+ async function forwardRequest(payload, stderr) {
+   const transport = readTransportEnv();
+   const httpBase = resolveHttpBaseUrl();
+   const endpoint = buildHttpEndpoint(httpBase);
+   const ipcConfig = resolveIpcConfig(transport);
+
+   const tryHttp = async () => requestJson({ url: endpoint, payload });
+   const tryIpc = async () => {
+     if (!ipcConfig) {
+       throw new Error("IPC transport not configured");
+     }
+     const ipcUrl = new URL("http://localhost/v1/mcp");
+     const socketPath = ipcConfig.type === "unix" ? ipcConfig.socketPath : ipcConfig.pipeName;
+     return requestJson({ url: ipcUrl, payload, socketPath });
+   };
+
+   if (transport === "ipc") {
+     return await tryIpc();
+   }
+   if (transport === "http") {
+     return await tryHttp();
+   }
+
+   try {
+     return await tryHttp();
+   } catch (err) {
+     if (ipcConfig) {
+       stderr.write(`[docdex-mcp-stdio] HTTP failed, falling back to IPC: ${err}\n`);
+       return await tryIpc();
+     }
+     throw err;
+   }
+ }
+
+ async function writeLine(stdout, line) {
+   if (!line.endsWith("\n")) {
+     line += "\n";
+   }
+   if (!stdout.write(line)) {
+     await new Promise((resolve) => stdout.once("drain", resolve));
+   }
+ }
+
+ async function handlePayload(payload, stdout, stderr) {
+   const ids = extractIds(payload);
+   let response;
+   try {
+     const result = await forwardRequest(payload, stderr);
+     if (result.body) {
+       try {
+         response = JSON.parse(result.body);
+       } catch (err) {
+         response = jsonRpcError("invalid json response from docdex", ids[0], -32001, {
+           status: result.status,
+           body: result.body
+         });
+       }
+     } else if (ids.length === 1) {
+       response = { jsonrpc: "2.0", id: ids[0], result: null };
+     } else if (ids.length > 1) {
+       response = ids.map((id) => ({ jsonrpc: "2.0", id, result: null }));
+     } else {
+       response = null;
+     }
+   } catch (err) {
+     response = jsonRpcError("transport error", ids[0], -32002, {
+       error: String(err)
+     });
+   }
+   if (response !== null) {
+     await writeLine(stdout, JSON.stringify(response));
+   }
+ }
+
+ async function runBridge({ stdin, stdout, stderr }) {
+   readTransportEnv();
+   const rl = readline.createInterface({
+     input: stdin,
+     crlfDelay: Infinity
+   });
+
+   for await (const line of rl) {
+     const trimmed = line.trim();
+     if (!trimmed) {
+       continue;
+     }
+     let payload;
+     try {
+       payload = JSON.parse(trimmed);
+     } catch (err) {
+       const response = jsonRpcError("parse error", null, -32700, { error: String(err) });
+       await writeLine(stdout, JSON.stringify(response));
+       continue;
+     }
+     await handlePayload(payload, stdout, stderr);
+   }
+ }
+
+ module.exports = {
+   runBridge
+ };
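The bridge's `extractIds` helper above decides whether a forwarded message gets a synthesized reply: notifications (no `id`) yield none, while batches yield one per id. Its behavior can be checked in isolation with the same logic:

```javascript
// Same logic as the bridge's extractIds: collect JSON-RPC ids from a single
// message or a batch, skipping notifications (messages without an id).
function extractIds(payload) {
  if (Array.isArray(payload)) {
    return payload
      .map((item) => (item && Object.prototype.hasOwnProperty.call(item, "id") ? item.id : undefined))
      .filter((value) => value !== undefined);
  }
  if (payload && Object.prototype.hasOwnProperty.call(payload, "id")) {
    return [payload.id];
  }
  return [];
}

const single = extractIds({ jsonrpc: "2.0", id: 7, method: "tools/list" });
const batch = extractIds([{ id: 1 }, { method: "notify" }, { id: 2 }]);
const notification = extractIds({ jsonrpc: "2.0", method: "notify" });
```

Note that `hasOwnProperty` (rather than truthiness) is what keeps a legitimate `id: 0` or `id: null` from being mistaken for a notification.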
@@ -26,6 +26,7 @@ const SETUP_PENDING_MARKER = "setup_pending.json";
26
26
  const AGENTS_DOC_FILENAME = "agents.md";
27
27
  const DOCDEX_INFO_START_PREFIX = "---- START OF DOCDEX INFO V";
28
28
  const DOCDEX_INFO_END = "---- END OF DOCDEX INFO -----";
29
+ const DOCDEX_INFO_END_LEGACY = "---- END OF DOCDEX INFO ----";
29
30
 
30
31
  function defaultConfigPath() {
31
32
  return path.join(os.homedir(), ".docdex", "config.toml");
@@ -353,9 +354,14 @@ function docdexBlockStart(version) {
353
354
  return `${DOCDEX_INFO_START_PREFIX}${version} ----`;
354
355
  }
355
356
 
357
+ function docdexInfoEndPattern() {
358
+ return `(?:${escapeRegExp(DOCDEX_INFO_END)}|${escapeRegExp(DOCDEX_INFO_END_LEGACY)})`;
359
+ }
360
+
356
361
  function buildDocdexInstructionBlock(instructions) {
357
362
  const next = normalizeInstructionText(instructions);
358
363
  if (!next) return "";
364
+ if (hasDocdexBlock(next)) return next;
359
365
  const version = resolvePackageVersion();
360
366
  return `${docdexBlockStart(version)}\n${next}\n${DOCDEX_INFO_END}`;
361
367
  }
@@ -363,9 +369,7 @@ function buildDocdexInstructionBlock(instructions) {
363
369
  function extractDocdexBlockBody(text) {
364
370
  const match = String(text || "").match(
365
371
  new RegExp(
366
- `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\r\\n]* ----\\r?\\n([\\s\\S]*?)\\r?\\n${escapeRegExp(
367
- DOCDEX_INFO_END
368
- )}`
372
+ `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\r\\n]* ----\\r?\\n([\\s\\S]*?)\\r?\\n${docdexInfoEndPattern()}`
369
373
  )
370
374
  );
371
375
  return match ? normalizeInstructionText(match[1]) : "";
@@ -383,11 +387,17 @@ function hasDocdexBlockVersion(text, version) {
383
387
  return String(text || "").includes(docdexBlockStart(version));
384
388
  }
385
389
 
390
+ function hasDocdexBlock(text) {
391
+ const source = String(text || "");
392
+ return (
393
+ source.includes(DOCDEX_INFO_START_PREFIX) &&
394
+ (source.includes(DOCDEX_INFO_END) || source.includes(DOCDEX_INFO_END_LEGACY))
395
+ );
396
+ }
397
+
386
398
  function stripDocdexBlocks(text) {
387
399
  const re = new RegExp(
388
- `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\r\\n]* ----\\r?\\n[\\s\\S]*?\\r?\\n${escapeRegExp(
389
- DOCDEX_INFO_END
390
- )}\\r?\\n?`,
400
+ `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\r\\n]* ----\\r?\\n[\\s\\S]*?\\r?\\n${docdexInfoEndPattern()}\\r?\\n?`,
391
401
  "g"
392
402
  );
393
403
  return String(text || "").replace(re, "").trim();
@@ -397,9 +407,7 @@ function stripDocdexBlocksExcept(text, version) {
   if (!version) return stripDocdexBlocks(text);
   const source = String(text || "");
   const re = new RegExp(
-    `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\r\\n]* ----\\r?\\n[\\s\\S]*?\\r?\\n${escapeRegExp(
-      DOCDEX_INFO_END
-    )}\\r?\\n?`,
+    `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\r\\n]* ----\\r?\\n[\\s\\S]*?\\r?\\n${docdexInfoEndPattern()}\\r?\\n?`,
     "g"
   );
   let result = "";
@@ -432,7 +440,7 @@ function stripLegacyDocdexBody(text, body) {
   if (!body) return String(text || "");
   const source = String(text || "").replace(/\r\n/g, "\n");
   const re = new RegExp(
-    `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\n]* ----\\n[\\s\\S]*?\\n${escapeRegExp(DOCDEX_INFO_END)}\\n?`,
+    `${escapeRegExp(DOCDEX_INFO_START_PREFIX)}[^\\n]* ----\\n[\\s\\S]*?\\n${docdexInfoEndPattern()}\\n?`,
     "g"
   );
   let result = "";
@@ -493,6 +501,18 @@ function upsertPromptFile(pathname, instructions, { prepend = false } = {}) {
   return writeTextFile(pathname, merged);
 }
 
+function removePromptFile(pathname) {
+  if (!fs.existsSync(pathname)) return false;
+  const current = fs.readFileSync(pathname, "utf8");
+  const stripped = stripDocdexBlocks(current);
+  if (normalizeInstructionText(stripped) === normalizeInstructionText(current)) return false;
+  if (!stripped) {
+    fs.unlinkSync(pathname);
+    return true;
+  }
+  return writeTextFile(pathname, stripped);
+}
+
 function escapeRegExp(value) {
   return value.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
 }
@@ -554,13 +574,7 @@ function upsertYamlInstruction(pathname, key, instructions) {
 }
 
 function upsertClaudeInstructions(pathname, instructions) {
-  const { value } = readJson(pathname);
-  if (typeof value !== "object" || value == null || Array.isArray(value)) return false;
-  const merged = mergeInstructionText(value.instructions, instructions);
-  if (!merged || merged === value.instructions) return false;
-  value.instructions = merged;
-  writeJson(pathname, value);
-  return true;
+  return upsertPromptFile(pathname, instructions);
 }
 
 function upsertContinueJsonInstructions(pathname, instructions) {
@@ -731,7 +745,7 @@ function rewriteContinueYamlRules(source, instructions, addDocdex) {
 
   const keptItems = items.filter((item) => {
     const text = item.join("\n");
-    return !(text.includes(DOCDEX_INFO_START_PREFIX) && text.includes(DOCDEX_INFO_END));
+    return !hasDocdexBlock(text);
   });
 
   if (addDocdex) {
@@ -786,45 +800,43 @@ function upsertVsCodeInstructionKey(value, key, instructions) {
   return true;
 }
 
-function upsertVsCodeInstructionLocations(value, instructionsDir) {
-  const key = "chat.instructionsFilesLocations";
-  const location = String(instructionsDir);
-  if (value[key] && typeof value[key] === "object" && !Array.isArray(value[key])) {
-    if (value[key][location] === true) return false;
-    value[key][location] = true;
-    return true;
-  }
-  if (Array.isArray(value[key])) {
-    if (value[key].some((entry) => entry === location)) return false;
-    value[key].push(location);
+function removeVsCodeInstructionKey(value, key, instructions, { legacyPath } = {}) {
+  if (typeof value[key] !== "string") return false;
+  const current = value[key];
+  const stripped = stripDocdexBlocks(current);
+  if (normalizeInstructionText(stripped) !== normalizeInstructionText(current)) {
+    if (!stripped) {
+      delete value[key];
+    } else {
+      value[key] = stripped;
+    }
     return true;
   }
-  if (typeof value[key] === "string") {
-    if (value[key] === location) return false;
-    value[key] = [value[key], location];
+  const normalized = normalizeInstructionText(instructions);
+  if (current === normalized || (legacyPath && current === legacyPath)) {
+    delete value[key];
     return true;
   }
-  value[key] = { [location]: true };
-  return true;
+  return false;
 }
 
-function upsertVsCodeInstructions(pathname, instructions, instructionsDir) {
+function upsertVsCodeInstructions(pathname, instructions, legacyPath) {
   const { value } = readJson(pathname);
   if (typeof value !== "object" || value == null || Array.isArray(value)) return false;
   const normalized = normalizeInstructionText(instructions);
   if (!normalized) return false;
   let updated = false;
-  if (upsertVsCodeInstructionKey(value, "github.copilot.chat.codeGeneration.instructions", instructions)) {
-    updated = true;
-  }
-  if (upsertVsCodeInstructionKey(value, "copilot.chat.codeGeneration.instructions", instructions)) {
+  if (upsertVsCodeInstructionKey(value, "chat.instructions", instructions)) {
     updated = true;
   }
-  if (value["github.copilot.chat.codeGeneration.useInstructionFiles"] !== true) {
-    value["github.copilot.chat.codeGeneration.useInstructionFiles"] = true;
+  if (removeVsCodeInstructionKey(value, "github.copilot.chat.codeGeneration.instructions", instructions)) {
     updated = true;
   }
-  if (upsertVsCodeInstructionLocations(value, instructionsDir)) {
+  if (
+    removeVsCodeInstructionKey(value, "copilot.chat.codeGeneration.instructions", instructions, {
+      legacyPath
+    })
+  ) {
     updated = true;
   }
   if (!updated) return false;
@@ -1169,11 +1181,18 @@ function clientInstructionPaths() {
   const aiderConfig = path.join(home, ".aider.conf.yml");
   const gooseConfig = path.join(home, ".config", "goose", "config.yaml");
   const openInterpreterConfig = path.join(home, ".openinterpreter", "profiles", "default.yaml");
+  const geminiInstructions = path.join(userProfile, ".gemini", "GEMINI.md");
+  const claudeInstructions = path.join(userProfile, ".claude", "CLAUDE.md");
+  const cursorAgents = path.join(userProfile, ".cursor", "agents.md");
+  const cursorAgentsUpper = path.join(userProfile, ".cursor", "AGENTS.md");
   const codexAgents = path.join(userProfile, ".codex", "AGENTS.md");
   switch (process.platform) {
     case "win32":
       return {
-        claude: path.join(appData, "Claude", "claude_desktop_config.json"),
+        gemini: geminiInstructions,
+        claude: claudeInstructions,
+        cursorAgents,
+        cursorAgentsUpper,
         continue: continueJson,
         continueYaml,
         continueYml,
@@ -1192,7 +1211,10 @@ function clientInstructionPaths() {
       };
     case "darwin":
       return {
-        claude: path.join(home, "Library", "Application Support", "Claude", "claude_desktop_config.json"),
+        gemini: geminiInstructions,
+        claude: claudeInstructions,
+        cursorAgents,
+        cursorAgentsUpper,
         continue: continueJson,
         continueYaml,
         continueYml,
@@ -1211,7 +1233,10 @@ function clientInstructionPaths() {
       };
     default:
       return {
-        claude: path.join(home, ".config", "Claude", "claude_desktop_config.json"),
+        gemini: geminiInstructions,
+        claude: claudeInstructions,
+        cursorAgents,
+        cursorAgentsUpper,
         continue: continueJson,
         continueYaml,
         continueYml,
@@ -1295,18 +1320,22 @@ function applyAgentInstructions({ logger } = {}) {
   };
 
   if (paths.vscodeGlobalInstructions) {
-    safeApply("vscode-global", () =>
-      upsertPromptFile(paths.vscodeGlobalInstructions, instructions, { prepend: true })
-    );
+    safeApply("vscode-global-cleanup", () => removePromptFile(paths.vscodeGlobalInstructions));
   }
   if (paths.vscodeInstructionsFile) {
-    safeApply("vscode-instructions-file", () =>
-      upsertPromptFile(paths.vscodeInstructionsFile, instructions, { prepend: true })
+    safeApply("vscode-instructions-file-cleanup", () =>
+      removePromptFile(paths.vscodeInstructionsFile)
     );
   }
-  if (paths.vscodeSettings && paths.vscodeInstructionsDir) {
+  if (paths.cursorAgents) {
+    safeApply("cursor-legacy-cleanup", () => removePromptFile(paths.cursorAgents));
+  }
+  if (paths.cursorAgentsUpper) {
+    safeApply("cursor-legacy-upper-cleanup", () => removePromptFile(paths.cursorAgentsUpper));
+  }
+  if (paths.vscodeSettings) {
     safeApply("vscode-settings", () =>
-      upsertVsCodeInstructions(paths.vscodeSettings, instructions, paths.vscodeInstructionsDir)
+      upsertVsCodeInstructions(paths.vscodeSettings, instructions, paths.vscodeGlobalInstructions)
     );
   }
   if (paths.windsurfGlobalRules) {
@@ -1321,6 +1350,9 @@ function applyAgentInstructions({ logger } = {}) {
   if (paths.claude) {
     safeApply("claude", () => upsertClaudeInstructions(paths.claude, instructions));
   }
+  if (paths.gemini) {
+    safeApply("gemini", () => upsertPromptFile(paths.gemini, instructions));
+  }
   const continueYamlExists =
     (paths.continueYaml && fs.existsSync(paths.continueYaml)) ||
     (paths.continueYml && fs.existsSync(paths.continueYml));
package/package.json CHANGED
@@ -1,11 +1,12 @@
 {
   "name": "docdex",
-  "version": "0.2.24",
+  "version": "0.2.26",
   "mcpName": "io.github.bekirdag/docdex",
   "description": "Local-first documentation and code indexer with HTTP/MCP search, AST, and agent memory.",
   "bin": {
     "docdex": "bin/docdex.js",
-    "docdexd": "bin/docdex.js"
+    "docdexd": "bin/docdex.js",
+    "docdex-mcp-stdio": "bin/docdex-mcp-stdio.js"
   },
   "files": [
     "bin",