mcp-server-markdown 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 Ofer Shapira
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,113 @@
+ # mcp-server-markdown
+
+ [![npm version](https://img.shields.io/npm/v/mcp-server-markdown.svg)](https://www.npmjs.com/package/mcp-server-markdown)
+ [![npm downloads](https://img.shields.io/npm/dm/mcp-server-markdown.svg)](https://www.npmjs.com/package/mcp-server-markdown)
+ [![CI](https://github.com/ofershap/mcp-server-markdown/actions/workflows/ci.yml/badge.svg)](https://github.com/ofershap/mcp-server-markdown/actions/workflows/ci.yml)
+ [![TypeScript](https://img.shields.io/badge/TypeScript-strict-blue.svg)](https://www.typescriptlang.org/)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
+
+ Search, navigate, and extract content from local markdown files. Full-text search, section extraction, heading navigation, code block discovery, and frontmatter parsing.
+
+ ```bash
+ npx mcp-server-markdown
+ ```
+
+ > Works with Claude Desktop, Cursor, VS Code Copilot, and any MCP client. Reads local `.md` files, no auth needed.
+
+ ![MCP server for searching and navigating markdown documentation](assets/demo.gif)
+
+ <sub>Demo built with <a href="https://github.com/ofershap/remotion-readme-kit">remotion-readme-kit</a></sub>
+
+ ## Why
+
+ Tools like Context7 are great for looking up library docs from npm, but they don't help with your own documentation. Project wikis, internal knowledge bases, architecture decision records, onboarding guides: they all live as markdown files in your repo or on disk. The filesystem MCP server can read those files, but it treats them as raw text. It doesn't understand headings, sections, or code blocks. This server does. Point it at a directory and your assistant can search across all your docs, pull out a specific section by heading, list the table of contents, or find every TypeScript code example in your knowledge base.
+
+ ## Tools
+
+ | Tool | What it does |
+ | ------------------ | -------------------------------------------------------------------------- |
+ | `list_files` | List all .md files in a directory recursively (sorted alphabetically) |
+ | `search_docs` | Full-text search across all .md files (case-insensitive, up to 50 results) |
+ | `get_section` | Extract a section by heading until the next heading of same/higher level |
+ | `list_headings` | List all headings as a table of contents |
+ | `find_code_blocks` | Find fenced code blocks, optionally filter by language (e.g. typescript) |
+ | `get_frontmatter` | Parse YAML frontmatter metadata at the start of a file |
+
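+ All tools return plain text: `search_docs` results come back as `file:line matched text`, `list_headings` renders an indented outline with `(L<line>)` markers, and `find_code_blocks` prefixes each block with `--- <language> (L<line>) ---`. As an illustrative sketch (file name and line numbers made up), a `list_headings` call against a `docs/guide.md` might return:
+
+ ```text
+ - Getting Started (L1)
+   - Install (L5)
+   - Configuration (L12)
+ ```
+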
+ ## Quick Start
+
+ ### Cursor
+
+ Add to `.cursor/mcp.json`:
+
+ ```json
+ {
+   "mcpServers": {
+     "markdown": {
+       "command": "npx",
+       "args": ["-y", "mcp-server-markdown"]
+     }
+   }
+ }
+ ```
+
+ ### Claude Desktop
+
+ Add to `claude_desktop_config.json`:
+
+ ```json
+ {
+   "mcpServers": {
+     "markdown": {
+       "command": "npx",
+       "args": ["-y", "mcp-server-markdown"]
+     }
+   }
+ }
+ ```
+
+ ### VS Code
+
+ Add to user settings or `.vscode/mcp.json`:
+
+ ```json
+ {
+   "mcp": {
+     "servers": {
+       "markdown": {
+         "command": "npx",
+         "args": ["-y", "mcp-server-markdown"]
+       }
+     }
+   }
+ }
+ ```
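+
+ The same server entry works in all three clients. To pin the exact version shown in this diff instead of whatever `npx` resolves at launch, the package name can carry an explicit version in the args (illustrative fragment):
+
+ ```json
+ "args": ["-y", "mcp-server-markdown@1.0.0"]
+ ```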
+
+ ## Examples
+
+ - "Search all docs in ./docs for mentions of 'authentication'"
+ - "Show me the 'API Reference' section from README.md"
+ - "List all headings in CONTRIBUTING.md"
+ - "Find all TypeScript code blocks in the docs"
+ - "What's the frontmatter metadata in this file?"
+ - "Give me the table of contents for our architecture docs"
+
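+ Each of these prompts typically turns into a single MCP `tools/call` request from the client to this server. A minimal sketch of the JSON-RPC message behind the first prompt (field values are illustrative):
+
+ ```json
+ {
+   "jsonrpc": "2.0",
+   "id": 1,
+   "method": "tools/call",
+   "params": {
+     "name": "search_docs",
+     "arguments": { "directory": "./docs", "query": "authentication" }
+   }
+ }
+ ```
+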
+ ## Development
+
+ ```bash
+ git clone https://github.com/ofershap/mcp-server-markdown.git
+ cd mcp-server-markdown
+ npm install
+ npm test
+ npm run build
+ ```
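+
+ For exercising the server by hand outside an editor, one option (assuming the MCP Inspector is available via `npx`) is to point it at the built entry:
+
+ ```bash
+ npx @modelcontextprotocol/inspector node dist/index.js
+ ```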
+
+ ## Author
+
+ **Ofer Shapira**
+
+ [![LinkedIn](https://img.shields.io/badge/LinkedIn-ofershap-blue?logo=linkedin)](https://linkedin.com/in/ofershap)
+ [![GitHub](https://img.shields.io/badge/GitHub-ofershap-black?logo=github)](https://github.com/ofershap)
+
+ ## License
+
+ MIT © 2026 Ofer Shapira
package/dist/index.js ADDED
@@ -0,0 +1,283 @@
+ #!/usr/bin/env node
+
+ // src/index.ts
+ import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
+ import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
+ import path2 from "path";
+ import { z } from "zod";
+
+ // src/markdown.ts
+ import fs from "fs/promises";
+ import path from "path";
+ async function walkDir(dir, baseDir) {
+   const entries = await fs.readdir(dir, { withFileTypes: true });
+   const files = [];
+   for (const entry of entries) {
+     const fullPath = path.join(dir, entry.name);
+     const relativePath = path.relative(baseDir, fullPath);
+     if (entry.isDirectory()) {
+       const subFiles = await walkDir(fullPath, baseDir);
+       files.push(...subFiles);
+     } else if (entry.isFile() && entry.name.endsWith(".md")) {
+       files.push(relativePath);
+     }
+   }
+   return files;
+ }
+ async function listMarkdownFiles(directory) {
+   const absDir = path.resolve(directory);
+   const stat = await fs.stat(absDir);
+   if (!stat.isDirectory()) {
+     throw new Error(`Not a directory: ${directory}`);
+   }
+   const files = await walkDir(absDir, absDir);
+   return files.sort();
+ }
+ async function searchDocs(directory, query) {
+   const absDir = path.resolve(directory);
+   const files = await listMarkdownFiles(absDir);
+   const results = [];
+   const lowerQuery = query.toLowerCase();
+   const limit = 50;
+   for (const file of files) {
+     if (results.length >= limit) break;
+     const fullPath = path.join(absDir, file);
+     const content = await fs.readFile(fullPath, "utf-8");
+     const lines = content.split("\n");
+     for (let i = 0; i < lines.length && results.length < limit; i++) {
+       const line = lines[i];
+       if (line !== void 0 && line.toLowerCase().includes(lowerQuery)) {
+         results.push({ file, line: i + 1, content: line });
+       }
+     }
+   }
+   return results;
+ }
+ async function getSection(filePath, heading) {
+   const absPath = path.resolve(filePath);
+   const content = await fs.readFile(absPath, "utf-8");
+   const lines = content.split("\n");
+   const lowerHeading = heading.toLowerCase().trim();
+   const headingRe = /^(#{1,6})\s+(.+)$/;
+   let targetLevel = 0;
+   let startLine = -1;
+   let endLine = lines.length;
+   for (let i = 0; i < lines.length; i++) {
+     const line = lines[i];
+     const match = line !== void 0 ? line.match(headingRe) : null;
+     if (match && match[1] !== void 0 && match[2] !== void 0) {
+       const level = match[1].length;
+       const text = match[2].trim().toLowerCase();
+       if (text === lowerHeading) {
+         targetLevel = level;
+         startLine = i;
+         break;
+       }
+     }
+   }
+   if (startLine < 0) {
+     return "";
+   }
+   for (let i = startLine + 1; i < lines.length; i++) {
+     const line = lines[i];
+     const match = line !== void 0 ? line.match(headingRe) : null;
+     if (match && match[1] !== void 0) {
+       const level = match[1].length;
+       if (level <= targetLevel) {
+         endLine = i;
+         break;
+       }
+     }
+   }
+   return lines.slice(startLine, endLine).join("\n").trim();
+ }
+ async function listHeadings(filePath) {
+   const absPath = path.resolve(filePath);
+   const content = await fs.readFile(absPath, "utf-8");
+   const lines = content.split("\n");
+   const headings = [];
+   const headingRe = /^(#{1,6})\s+(.+)$/;
+   for (let i = 0; i < lines.length; i++) {
+     const line = lines[i];
+     const match = line !== void 0 ? line.match(headingRe) : null;
+     if (match && match[1] !== void 0 && match[2] !== void 0) {
+       headings.push({
+         level: match[1].length,
+         text: match[2].trim(),
+         line: i + 1
+       });
+     }
+   }
+   return headings;
+ }
+ async function findCodeBlocks(filePath, language) {
+   const absPath = path.resolve(filePath);
+   const content = await fs.readFile(absPath, "utf-8");
+   const lines = content.split("\n");
+   const blocks = [];
+   const fenceRe = /^```(\w*)\s*$/;
+   let inBlock = false;
+   let blockLang = "";
+   let blockStart = 0;
+   const blockLines = [];
+   for (let i = 0; i < lines.length; i++) {
+     const line = lines[i];
+     if (line === void 0) continue;
+     const fence = line.match(fenceRe);
+     if (fence) {
+       if (!inBlock) {
+         inBlock = true;
+         blockLang = fence[1] ?? "";
+         blockStart = i + 1;
+         blockLines.length = 0;
+       } else {
+         inBlock = false;
+         if (language === void 0 || blockLang.toLowerCase() === language.toLowerCase()) {
+           blocks.push({
+             language: blockLang,
+             code: blockLines.join("\n"),
+             line: blockStart
+           });
+         }
+       }
+     } else if (inBlock) {
+       blockLines.push(line);
+     }
+   }
+   return blocks;
+ }
+ async function getFrontmatter(filePath) {
+   const absPath = path.resolve(filePath);
+   const content = await fs.readFile(absPath, "utf-8");
+   const match = content.match(/^---\r?\n([\s\S]*?)\r?\n---/);
+   if (!match || match[1] === void 0) return null;
+   const yaml = match[1];
+   const result = {};
+   const keyValueRe = /^([^:#\s]+):\s*(.*)$/m;
+   for (const line of yaml.split("\n")) {
+     const kv = line.trim().match(keyValueRe);
+     const key = kv?.[1];
+     const valRaw = kv?.[2];
+     if (key !== void 0 && valRaw !== void 0) {
+       const keyTrim = key.trim();
+       let val = valRaw.trim();
+       if (val.startsWith('"') && val.endsWith('"') || val.startsWith("'") && val.endsWith("'")) {
+         val = val.slice(1, -1);
+       }
+       result[keyTrim] = val;
+     }
+   }
+   return Object.keys(result).length > 0 ? result : null;
+ }
+
+ // src/index.ts
+ var server = new McpServer({
+   name: "mcp-server-markdown",
+   version: "1.0.0"
+ });
+ server.tool(
+   "list_files",
+   "List all .md files in a directory recursively. Returns relative paths sorted alphabetically.",
+   {
+     directory: z.string().describe("Path to the directory to scan (e.g. ./docs)")
+   },
+   async ({ directory }) => {
+     const absDir = path2.resolve(directory);
+     const files = await listMarkdownFiles(absDir);
+     const text = files.length > 0 ? files.join("\n") : "(no markdown files found)";
+     return { content: [{ type: "text", text }] };
+   }
+ );
+ server.tool(
+   "search_docs",
+   "Full-text search across all .md files in a directory. Returns file, line number, and matching line. Limited to 50 results.",
+   {
+     directory: z.string().describe("Path to the directory to search (e.g. ./docs)"),
+     query: z.string().describe("Search string (case-insensitive)")
+   },
+   async ({ directory, query }) => {
+     const absDir = path2.resolve(directory);
+     const results = await searchDocs(absDir, query);
+     const lines = results.map((r) => `${r.file}:${r.line} ${r.content}`);
+     const text = lines.length > 0 ? lines.join("\n") : "(no matches)";
+     return { content: [{ type: "text", text }] };
+   }
+ );
+ server.tool(
+   "get_section",
+   "Extract a section by heading from a markdown file. Returns content from that heading until the next heading of same or higher level.",
+   {
+     file: z.string().describe("Path to the markdown file"),
+     heading: z.string().describe("Heading text to find (case-insensitive)")
+   },
+   async ({ file, heading }) => {
+     const absPath = path2.resolve(file);
+     const content = await getSection(absPath, heading);
+     const text = content || `(heading "${heading}" not found)`;
+     return { content: [{ type: "text", text }] };
+   }
+ );
+ server.tool(
+   "list_headings",
+   "List all headings (# through ######) in a markdown file as a table of contents.",
+   {
+     file: z.string().describe("Path to the markdown file")
+   },
+   async ({ file }) => {
+     const absPath = path2.resolve(file);
+     const headings = await listHeadings(absPath);
+     const lines = headings.map(
+       (h) => `${" ".repeat(h.level - 1)}- ${h.text} (L${h.line})`
+     );
+     const text = lines.length > 0 ? lines.join("\n") : "(no headings found)";
+     return { content: [{ type: "text", text }] };
+   }
+ );
+ server.tool(
+   "find_code_blocks",
+   "Find all fenced code blocks in a markdown file. Optionally filter by language (e.g. typescript, python).",
+   {
+     file: z.string().describe("Path to the markdown file"),
+     language: z.string().optional().describe("Optional: filter by language (e.g. typescript, python)")
+   },
+   async ({ file, language }) => {
+     const absPath = path2.resolve(file);
+     const blocks = await findCodeBlocks(absPath, language);
+     const parts = blocks.map((b) => {
+       const lang = b.language || "(no lang)";
+       return `--- ${lang} (L${b.line}) ---
+ ${b.code}`;
+     });
+     const text = parts.length > 0 ? parts.join("\n\n") : "(no code blocks found)";
+     return { content: [{ type: "text", text }] };
+   }
+ );
+ server.tool(
+   "get_frontmatter",
+   "Parse YAML frontmatter (between --- delimiters) at the start of a markdown file. Returns key-value metadata.",
+   {
+     file: z.string().describe("Path to the markdown file")
+   },
+   async ({ file }) => {
+     const absPath = path2.resolve(file);
+     const frontmatter = await getFrontmatter(absPath);
+     if (!frontmatter) {
+       return {
+         content: [{ type: "text", text: "(no frontmatter)" }]
+       };
+     }
+     const lines = Object.entries(frontmatter).map(([k, v]) => `${k}: ${v}`);
+     return {
+       content: [{ type: "text", text: lines.join("\n") }]
+     };
+   }
+ );
+ async function main() {
+   const transport = new StdioServerTransport();
+   await server.connect(transport);
+ }
+ main().catch((error) => {
+   console.error("Fatal error:", error);
+   process.exit(1);
+ });
+ //# sourceMappingURL=index.js.map
package/dist/index.js.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/index.ts","../src/markdown.ts"],"sourcesContent":["import { McpServer } from \"@modelcontextprotocol/sdk/server/mcp.js\";\nimport { StdioServerTransport } from \"@modelcontextprotocol/sdk/server/stdio.js\";\nimport path from \"node:path\";\nimport { z } from \"zod\";\nimport {\n listMarkdownFiles,\n searchDocs,\n getSection,\n listHeadings,\n findCodeBlocks,\n getFrontmatter,\n} from \"./markdown.js\";\n\nconst server = new McpServer({\n name: \"mcp-server-markdown\",\n version: \"1.0.0\",\n});\n\nserver.tool(\n \"list_files\",\n \"List all .md files in a directory recursively. Returns relative paths sorted alphabetically.\",\n {\n directory: z\n .string()\n .describe(\"Path to the directory to scan (e.g. ./docs)\"),\n },\n async ({ directory }) => {\n const absDir = path.resolve(directory);\n const files = await listMarkdownFiles(absDir);\n const text =\n files.length > 0 ? files.join(\"\\n\") : \"(no markdown files found)\";\n return { content: [{ type: \"text\" as const, text }] };\n },\n);\n\nserver.tool(\n \"search_docs\",\n \"Full-text search across all .md files in a directory. Returns file, line number, and matching line. Limited to 50 results.\",\n {\n directory: z\n .string()\n .describe(\"Path to the directory to search (e.g. ./docs)\"),\n query: z.string().describe(\"Search string (case-insensitive)\"),\n },\n async ({ directory, query }) => {\n const absDir = path.resolve(directory);\n const results = await searchDocs(absDir, query);\n const lines = results.map((r) => `${r.file}:${r.line} ${r.content}`);\n const text = lines.length > 0 ? lines.join(\"\\n\") : \"(no matches)\";\n return { content: [{ type: \"text\" as const, text }] };\n },\n);\n\nserver.tool(\n \"get_section\",\n \"Extract a section by heading from a markdown file. Returns content from that heading until the next heading of same or higher level.\",\n {\n file: z.string().describe(\"Path to the markdown file\"),\n heading: z.string().describe(\"Heading text to find (case-insensitive)\"),\n },\n async ({ file, heading }) => {\n const absPath = path.resolve(file);\n const content = await getSection(absPath, heading);\n const text = content || `(heading \"${heading}\" not found)`;\n return { content: [{ type: \"text\" as const, text }] };\n },\n);\n\nserver.tool(\n \"list_headings\",\n \"List all headings (# through ######) in a markdown file as a table of contents.\",\n {\n file: z.string().describe(\"Path to the markdown file\"),\n },\n async ({ file }) => {\n const absPath = path.resolve(file);\n const headings = await listHeadings(absPath);\n const lines = headings.map(\n (h) => `${\" \".repeat(h.level - 1)}- ${h.text} (L${h.line})`,\n );\n const text = lines.length > 0 ? lines.join(\"\\n\") : \"(no headings found)\";\n return { content: [{ type: \"text\" as const, text }] };\n },\n);\n\nserver.tool(\n \"find_code_blocks\",\n \"Find all fenced code blocks in a markdown file. Optionally filter by language (e.g. typescript, python).\",\n {\n file: z.string().describe(\"Path to the markdown file\"),\n language: z\n .string()\n .optional()\n .describe(\"Optional: filter by language (e.g. typescript, python)\"),\n },\n async ({ file, language }) => {\n const absPath = path.resolve(file);\n const blocks = await findCodeBlocks(absPath, language);\n const parts = blocks.map((b) => {\n const lang = b.language || \"(no lang)\";\n return `--- ${lang} (L${b.line}) ---\\n${b.code}`;\n });\n const text =\n parts.length > 0 ? 
parts.join(\"\\n\\n\") : \"(no code blocks found)\";\n return { content: [{ type: \"text\" as const, text }] };\n },\n);\n\nserver.tool(\n \"get_frontmatter\",\n \"Parse YAML frontmatter (between --- delimiters) at the start of a markdown file. Returns key-value metadata.\",\n {\n file: z.string().describe(\"Path to the markdown file\"),\n },\n async ({ file }) => {\n const absPath = path.resolve(file);\n const frontmatter = await getFrontmatter(absPath);\n if (!frontmatter) {\n return {\n content: [{ type: \"text\" as const, text: \"(no frontmatter)\" }],\n };\n }\n const lines = Object.entries(frontmatter).map(([k, v]) => `${k}: ${v}`);\n return {\n content: [{ type: \"text\" as const, text: lines.join(\"\\n\") }],\n };\n },\n);\n\nasync function main() {\n const transport = new StdioServerTransport();\n await server.connect(transport);\n}\n\nmain().catch((error) => {\n console.error(\"Fatal error:\", error);\n process.exit(1);\n});\n","import fs from \"node:fs/promises\";\nimport path from \"node:path\";\n\nexport interface SearchResult {\n file: string;\n line: number;\n content: string;\n}\n\nexport interface Heading {\n level: number;\n text: string;\n line: number;\n}\n\nexport interface CodeBlock {\n language: string;\n code: string;\n line: number;\n}\n\nexport type Frontmatter = Record<string, string>;\n\nasync function walkDir(dir: string, baseDir: string): Promise<string[]> {\n const entries = await fs.readdir(dir, { withFileTypes: true });\n const files: string[] = [];\n for (const entry of entries) {\n const fullPath = path.join(dir, entry.name);\n const relativePath = path.relative(baseDir, fullPath);\n if (entry.isDirectory()) {\n const subFiles = await walkDir(fullPath, baseDir);\n files.push(...subFiles);\n } else if (entry.isFile() && entry.name.endsWith(\".md\")) {\n files.push(relativePath);\n }\n }\n return files;\n}\n\nexport async function listMarkdownFiles(directory: string): Promise<string[]> {\n const absDir = path.resolve(directory);\n const stat = await fs.stat(absDir);\n if (!stat.isDirectory()) {\n throw new Error(`Not a directory: ${directory}`);\n }\n const files = await walkDir(absDir, absDir);\n return files.sort();\n}\n\nexport async function searchDocs(\n directory: string,\n query: string,\n): Promise<SearchResult[]> {\n const absDir = path.resolve(directory);\n const files = await listMarkdownFiles(absDir);\n const results: SearchResult[] = [];\n const lowerQuery = query.toLowerCase();\n const limit = 50;\n\n for (const file of files) {\n if (results.length >= limit) break;\n const fullPath = path.join(absDir, file);\n const content = await fs.readFile(fullPath, \"utf-8\");\n const lines = content.split(\"\\n\");\n for (let i = 0; i < lines.length && results.length < limit; i++) {\n const line = lines[i];\n if (line !== undefined && line.toLowerCase().includes(lowerQuery)) {\n results.push({ file, line: i + 1, content: line });\n }\n }\n }\n return results;\n}\n\nexport async function getSection(\n filePath: string,\n heading: string,\n): Promise<string> {\n const absPath = path.resolve(filePath);\n const content = await fs.readFile(absPath, \"utf-8\");\n const lines = content.split(\"\\n\");\n const lowerHeading = heading.toLowerCase().trim();\n const headingRe = /^(#{1,6})\\s+(.+)$/;\n let targetLevel = 0;\n let startLine = -1;\n let endLine = lines.length;\n\n for (let i = 0; i < lines.length; i++) {\n const line = lines[i];\n const match = line !== undefined ? 
line.match(headingRe) : null;\n if (match && match[1] !== undefined && match[2] !== undefined) {\n const level = match[1].length;\n const text = match[2].trim().toLowerCase();\n if (text === lowerHeading) {\n targetLevel = level;\n startLine = i;\n break;\n }\n }\n }\n\n if (startLine < 0) {\n return \"\";\n }\n\n for (let i = startLine + 1; i < lines.length; i++) {\n const line = lines[i];\n const match = line !== undefined ? line.match(headingRe) : null;\n if (match && match[1] !== undefined) {\n const level = match[1].length;\n if (level <= targetLevel) {\n endLine = i;\n break;\n }\n }\n }\n\n return lines.slice(startLine, endLine).join(\"\\n\").trim();\n}\n\nexport async function listHeadings(filePath: string): Promise<Heading[]> {\n const absPath = path.resolve(filePath);\n const content = await fs.readFile(absPath, \"utf-8\");\n const lines = content.split(\"\\n\");\n const headings: Heading[] = [];\n const headingRe = /^(#{1,6})\\s+(.+)$/;\n\n for (let i = 0; i < lines.length; i++) {\n const line = lines[i];\n const match = line !== undefined ? line.match(headingRe) : null;\n if (match && match[1] !== undefined && match[2] !== undefined) {\n headings.push({\n level: match[1].length,\n text: match[2].trim(),\n line: i + 1,\n });\n }\n }\n return headings;\n}\n\nexport async function findCodeBlocks(\n filePath: string,\n language?: string,\n): Promise<CodeBlock[]> {\n const absPath = path.resolve(filePath);\n const content = await fs.readFile(absPath, \"utf-8\");\n const lines = content.split(\"\\n\");\n const blocks: CodeBlock[] = [];\n const fenceRe = /^```(\\w*)\\s*$/;\n let inBlock = false;\n let blockLang = \"\";\n let blockStart = 0;\n const blockLines: string[] = [];\n\n for (let i = 0; i < lines.length; i++) {\n const line = lines[i];\n if (line === undefined) continue;\n const fence = line.match(fenceRe);\n if (fence) {\n if (!inBlock) {\n inBlock = true;\n blockLang = fence[1] ?? \"\";\n blockStart = i + 1;\n blockLines.length = 0;\n } else {\n inBlock = false;\n if (\n language === undefined ||\n blockLang.toLowerCase() === language.toLowerCase()\n ) {\n blocks.push({\n language: blockLang,\n code: blockLines.join(\"\\n\"),\n line: blockStart,\n });\n }\n }\n } else if (inBlock) {\n blockLines.push(line);\n }\n }\n return blocks;\n}\n\nexport async function getFrontmatter(\n filePath: string,\n): Promise<Frontmatter | null> {\n const absPath = path.resolve(filePath);\n const content = await fs.readFile(absPath, \"utf-8\");\n const match = content.match(/^---\\r?\\n([\\s\\S]*?)\\r?\\n---/);\n if (!match || match[1] === undefined) return null;\n const yaml = match[1];\n const result: Frontmatter = {};\n const keyValueRe = /^([^:#\\s]+):\\s*(.*)$/m;\n for (const line of yaml.split(\"\\n\")) {\n const kv = line.trim().match(keyValueRe);\n const key = kv?.[1];\n const valRaw = kv?.[2];\n if (key !== undefined && valRaw !== undefined) {\n const keyTrim = key.trim();\n let val = valRaw.trim();\n if (\n (val.startsWith('\"') && val.endsWith('\"')) ||\n (val.startsWith(\"'\") && val.endsWith(\"'\"))\n ) {\n val = val.slice(1, -1);\n }\n result[keyTrim] = val;\n }\n }\n return Object.keys(result).length > 0 ? 
result : null;\n}\n"],"mappings":";;;AAAA,SAAS,iBAAiB;AAC1B,SAAS,4BAA4B;AACrC,OAAOA,WAAU;AACjB,SAAS,SAAS;;;ACHlB,OAAO,QAAQ;AACf,OAAO,UAAU;AAsBjB,eAAe,QAAQ,KAAa,SAAoC;AACtE,QAAM,UAAU,MAAM,GAAG,QAAQ,KAAK,EAAE,eAAe,KAAK,CAAC;AAC7D,QAAM,QAAkB,CAAC;AACzB,aAAW,SAAS,SAAS;AAC3B,UAAM,WAAW,KAAK,KAAK,KAAK,MAAM,IAAI;AAC1C,UAAM,eAAe,KAAK,SAAS,SAAS,QAAQ;AACpD,QAAI,MAAM,YAAY,GAAG;AACvB,YAAM,WAAW,MAAM,QAAQ,UAAU,OAAO;AAChD,YAAM,KAAK,GAAG,QAAQ;AAAA,IACxB,WAAW,MAAM,OAAO,KAAK,MAAM,KAAK,SAAS,KAAK,GAAG;AACvD,YAAM,KAAK,YAAY;AAAA,IACzB;AAAA,EACF;AACA,SAAO;AACT;AAEA,eAAsB,kBAAkB,WAAsC;AAC5E,QAAM,SAAS,KAAK,QAAQ,SAAS;AACrC,QAAM,OAAO,MAAM,GAAG,KAAK,MAAM;AACjC,MAAI,CAAC,KAAK,YAAY,GAAG;AACvB,UAAM,IAAI,MAAM,oBAAoB,SAAS,EAAE;AAAA,EACjD;AACA,QAAM,QAAQ,MAAM,QAAQ,QAAQ,MAAM;AAC1C,SAAO,MAAM,KAAK;AACpB;AAEA,eAAsB,WACpB,WACA,OACyB;AACzB,QAAM,SAAS,KAAK,QAAQ,SAAS;AACrC,QAAM,QAAQ,MAAM,kBAAkB,MAAM;AAC5C,QAAM,UAA0B,CAAC;AACjC,QAAM,aAAa,MAAM,YAAY;AACrC,QAAM,QAAQ;AAEd,aAAW,QAAQ,OAAO;AACxB,QAAI,QAAQ,UAAU,MAAO;AAC7B,UAAM,WAAW,KAAK,KAAK,QAAQ,IAAI;AACvC,UAAM,UAAU,MAAM,GAAG,SAAS,UAAU,OAAO;AACnD,UAAM,QAAQ,QAAQ,MAAM,IAAI;AAChC,aAAS,IAAI,GAAG,IAAI,MAAM,UAAU,QAAQ,SAAS,OAAO,KAAK;AAC/D,YAAM,OAAO,MAAM,CAAC;AACpB,UAAI,SAAS,UAAa,KAAK,YAAY,EAAE,SAAS,UAAU,GAAG;AACjE,gBAAQ,KAAK,EAAE,MAAM,MAAM,IAAI,GAAG,SAAS,KAAK,CAAC;AAAA,MACnD;AAAA,IACF;AAAA,EACF;AACA,SAAO;AACT;AAEA,eAAsB,WACpB,UACA,SACiB;AACjB,QAAM,UAAU,KAAK,QAAQ,QAAQ;AACrC,QAAM,UAAU,MAAM,GAAG,SAAS,SAAS,OAAO;AAClD,QAAM,QAAQ,QAAQ,MAAM,IAAI;AAChC,QAAM,eAAe,QAAQ,YAAY,EAAE,KAAK;AAChD,QAAM,YAAY;AAClB,MAAI,cAAc;AAClB,MAAI,YAAY;AAChB,MAAI,UAAU,MAAM;AAEpB,WAAS,IAAI,GAAG,IAAI,MAAM,QAAQ,KAAK;AACrC,UAAM,OAAO,MAAM,CAAC;AACpB,UAAM,QAAQ,SAAS,SAAY,KAAK,MAAM,SAAS,IAAI;AAC3D,QAAI,SAAS,MAAM,CAAC,MAAM,UAAa,MAAM,CAAC,MAAM,QAAW;AAC7D,YAAM,QAAQ,MAAM,CAAC,EAAE;AACvB,YAAM,OAAO,MAAM,CAAC,EAAE,KAAK,EAAE,YAAY;AACzC,UAAI,SAAS,cAAc;AACzB,sBAAc;AACd,oBAAY;AACZ;AAAA,MACF;AAAA,IACF;AAAA,EACF;AAEA,MAAI,YAAY,GAAG;AACjB,WAAO;AAAA,EACT;AAEA,WAAS,IAAI,YAAY,GAAG,IAAI,MAAM,QAAQ,KAAK;AACjD,UAAM,OAAO,MAAM,CAAC;AACpB,UAAM,QAAQ,SAAS,SAAY,KAAK,MAAM,SAAS,IAAI;AAC3D,QAAI,SAAS,MAAM,CAAC,MAAM,QAAW;AACnC,YAAM,QAAQ,MAAM,CAAC,EAAE;AACvB,UAAI,SAAS,aAAa;AACxB,kBAAU;AACV;AAAA,MACF;AAAA,IACF;AAAA,EACF;AAEA,SAAO,MAAM,MAAM,WAAW,OAAO,EAAE,KAAK,IAAI,EAAE,KAAK;AACzD;AAEA,eAAsB,aAAa,UAAsC;AACvE,QAAM,UAAU,KAAK,QAAQ,QAAQ;AACrC,QAAM,UAAU,MAAM,GAAG,SAAS,SAAS,OAAO;AAClD,QAAM,QAAQ,QAAQ,MAAM,IAAI;AAChC,QAAM,WAAsB,CAAC;AAC7B,QAAM,YAAY;AAElB,WAAS,IAAI,GAAG,IAAI,MAAM,QAAQ,KAAK;AACrC,UAAM,OAAO,MAAM,CAAC;AACpB,UAAM,QAAQ,SAAS,SAAY,KAAK,MAAM,SAAS,IAAI;AAC3D,QAAI,SAAS,MAAM,CAAC,MAAM,UAAa,MAAM,CAAC,MAAM,QAAW;AAC7D,eAAS,KAAK;AAAA,QACZ,OAAO,MAAM,CAAC,EAAE;AAAA,QAChB,MAAM,MAAM,CAAC,EAAE,KAAK;AAAA,QACpB,MAAM,IAAI;AAAA,MACZ,CAAC;AAAA,IACH;AAAA,EACF;AACA,SAAO;AACT;AAEA,eAAsB,eACpB,UACA,UACsB;AACtB,QAAM,UAAU,KAAK,QAAQ,QAAQ;AACrC,QAAM,UAAU,MAAM,GAAG,SAAS,SAAS,OAAO;AAClD,QAAM,QAAQ,QAAQ,MAAM,IAAI;AAChC,QAAM,SAAsB,CAAC;AAC7B,QAAM,UAAU;AAChB,MAAI,UAAU;AACd,MAAI,YAAY;AAChB,MAAI,aAAa;AACjB,QAAM,aAAuB,CAAC;AAE9B,WAAS,IAAI,GAAG,IAAI,MAAM,QAAQ,KAAK;AACrC,UAAM,OAAO,MAAM,CAAC;AACpB,QAAI,SAAS,OAAW;AACxB,UAAM,QAAQ,KAAK,MAAM,OAAO;AAChC,QAAI,OAAO;AACT,UAAI,CAAC,SAAS;AACZ,kBAAU;AACV,oBAAY,MAAM,CAAC,KAAK;AACxB,qBAAa,IAAI;AACjB,mBAAW,SAAS;AAAA,MACtB,OAAO;AACL,kBAAU;AACV,YACE,aAAa,UACb,UAAU,YAAY,MAAM,SAAS,YAAY,GACjD;AACA,iBAAO,KAAK;AAAA,YACV,UAAU;AAAA,YACV,MAAM,WAAW,KAAK,IAAI;AAAA,YAC1B,MAAM;AAAA,UACR,CAAC;AAAA,QACH;AAAA,MACF;AAAA,IACF,WAAW,SAAS;AAClB,iBAAW,KAAK,IAAI;AAAA,IACtB;AAAA,EACF;AACA,SAAO;AACT;AAEA,eAAsB,eACpB,UAC6B;AAC7B,QAAM,UAAU,KAAK,QAAQ,QAAQ;AACrC,QAAM,UAAU,MAAM,GAAG,SAAS,SAAS,OAAO;AAClD,
QAAM,QAAQ,QAAQ,MAAM,6BAA6B;AACzD,MAAI,CAAC,SAAS,MAAM,CAAC,MAAM,OAAW,QAAO;AAC7C,QAAM,OAAO,MAAM,CAAC;AACpB,QAAM,SAAsB,CAAC;AAC7B,QAAM,aAAa;AACnB,aAAW,QAAQ,KAAK,MAAM,IAAI,GAAG;AACnC,UAAM,KAAK,KAAK,KAAK,EAAE,MAAM,UAAU;AACvC,UAAM,MAAM,KAAK,CAAC;AAClB,UAAM,SAAS,KAAK,CAAC;AACrB,QAAI,QAAQ,UAAa,WAAW,QAAW;AAC7C,YAAM,UAAU,IAAI,KAAK;AACzB,UAAI,MAAM,OAAO,KAAK;AACtB,UACG,IAAI,WAAW,GAAG,KAAK,IAAI,SAAS,GAAG,KACvC,IAAI,WAAW,GAAG,KAAK,IAAI,SAAS,GAAG,GACxC;AACA,cAAM,IAAI,MAAM,GAAG,EAAE;AAAA,MACvB;AACA,aAAO,OAAO,IAAI;AAAA,IACpB;AAAA,EACF;AACA,SAAO,OAAO,KAAK,MAAM,EAAE,SAAS,IAAI,SAAS;AACnD;;;ADvMA,IAAM,SAAS,IAAI,UAAU;AAAA,EAC3B,MAAM;AAAA,EACN,SAAS;AACX,CAAC;AAED,OAAO;AAAA,EACL;AAAA,EACA;AAAA,EACA;AAAA,IACE,WAAW,EACR,OAAO,EACP,SAAS,6CAA6C;AAAA,EAC3D;AAAA,EACA,OAAO,EAAE,UAAU,MAAM;AACvB,UAAM,SAASC,MAAK,QAAQ,SAAS;AACrC,UAAM,QAAQ,MAAM,kBAAkB,MAAM;AAC5C,UAAM,OACJ,MAAM,SAAS,IAAI,MAAM,KAAK,IAAI,IAAI;AACxC,WAAO,EAAE,SAAS,CAAC,EAAE,MAAM,QAAiB,KAAK,CAAC,EAAE;AAAA,EACtD;AACF;AAEA,OAAO;AAAA,EACL;AAAA,EACA;AAAA,EACA;AAAA,IACE,WAAW,EACR,OAAO,EACP,SAAS,+CAA+C;AAAA,IAC3D,OAAO,EAAE,OAAO,EAAE,SAAS,kCAAkC;AAAA,EAC/D;AAAA,EACA,OAAO,EAAE,WAAW,MAAM,MAAM;AAC9B,UAAM,SAASA,MAAK,QAAQ,SAAS;AACrC,UAAM,UAAU,MAAM,WAAW,QAAQ,KAAK;AAC9C,UAAM,QAAQ,QAAQ,IAAI,CAAC,MAAM,GAAG,EAAE,IAAI,IAAI,EAAE,IAAI,IAAI,EAAE,OAAO,EAAE;AACnE,UAAM,OAAO,MAAM,SAAS,IAAI,MAAM,KAAK,IAAI,IAAI;AACnD,WAAO,EAAE,SAAS,CAAC,EAAE,MAAM,QAAiB,KAAK,CAAC,EAAE;AAAA,EACtD;AACF;AAEA,OAAO;AAAA,EACL;AAAA,EACA;AAAA,EACA;AAAA,IACE,MAAM,EAAE,OAAO,EAAE,SAAS,2BAA2B;AAAA,IACrD,SAAS,EAAE,OAAO,EAAE,SAAS,yCAAyC;AAAA,EACxE;AAAA,EACA,OAAO,EAAE,MAAM,QAAQ,MAAM;AAC3B,UAAM,UAAUA,MAAK,QAAQ,IAAI;AACjC,UAAM,UAAU,MAAM,WAAW,SAAS,OAAO;AACjD,UAAM,OAAO,WAAW,aAAa,OAAO;AAC5C,WAAO,EAAE,SAAS,CAAC,EAAE,MAAM,QAAiB,KAAK,CAAC,EAAE;AAAA,EACtD;AACF;AAEA,OAAO;AAAA,EACL;AAAA,EACA;AAAA,EACA;AAAA,IACE,MAAM,EAAE,OAAO,EAAE,SAAS,2BAA2B;AAAA,EACvD;AAAA,EACA,OAAO,EAAE,KAAK,MAAM;AAClB,UAAM,UAAUA,MAAK,QAAQ,IAAI;AACjC,UAAM,WAAW,MAAM,aAAa,OAAO;AAC3C,UAAM,QAAQ,SAAS;AAAA,MACrB,CAAC,MAAM,GAAG,KAAK,OAAO,EAAE,QAAQ,CAAC,CAAC,KAAK,EAAE,IAAI,MAAM,EAAE,IAAI;AAAA,IAC3D;AACA,UAAM,OAAO,MAAM,SAAS,IAAI,MAAM,KAAK,IAAI,IAAI;AACnD,WAAO,EAAE,SAAS,CAAC,EAAE,MAAM,QAAiB,KAAK,CAAC,EAAE;AAAA,EACtD;AACF;AAEA,OAAO;AAAA,EACL;AAAA,EACA;AAAA,EACA;AAAA,IACE,MAAM,EAAE,OAAO,EAAE,SAAS,2BAA2B;AAAA,IACrD,UAAU,EACP,OAAO,EACP,SAAS,EACT,SAAS,wDAAwD;AAAA,EACtE;AAAA,EACA,OAAO,EAAE,MAAM,SAAS,MAAM;AAC5B,UAAM,UAAUA,MAAK,QAAQ,IAAI;AACjC,UAAM,SAAS,MAAM,eAAe,SAAS,QAAQ;AACrD,UAAM,QAAQ,OAAO,IAAI,CAAC,MAAM;AAC9B,YAAM,OAAO,EAAE,YAAY;AAC3B,aAAO,OAAO,IAAI,MAAM,EAAE,IAAI;AAAA,EAAU,EAAE,IAAI;AAAA,IAChD,CAAC;AACD,UAAM,OACJ,MAAM,SAAS,IAAI,MAAM,KAAK,MAAM,IAAI;AAC1C,WAAO,EAAE,SAAS,CAAC,EAAE,MAAM,QAAiB,KAAK,CAAC,EAAE;AAAA,EACtD;AACF;AAEA,OAAO;AAAA,EACL;AAAA,EACA;AAAA,EACA;AAAA,IACE,MAAM,EAAE,OAAO,EAAE,SAAS,2BAA2B;AAAA,EACvD;AAAA,EACA,OAAO,EAAE,KAAK,MAAM;AAClB,UAAM,UAAUA,MAAK,QAAQ,IAAI;AACjC,UAAM,cAAc,MAAM,eAAe,OAAO;AAChD,QAAI,CAAC,aAAa;AAChB,aAAO;AAAA,QACL,SAAS,CAAC,EAAE,MAAM,QAAiB,MAAM,mBAAmB,CAAC;AAAA,MAC/D;AAAA,IACF;AACA,UAAM,QAAQ,OAAO,QAAQ,WAAW,EAAE,IAAI,CAAC,CAAC,GAAG,CAAC,MAAM,GAAG,CAAC,KAAK,CAAC,EAAE;AACtE,WAAO;AAAA,MACL,SAAS,CAAC,EAAE,MAAM,QAAiB,MAAM,MAAM,KAAK,IAAI,EAAE,CAAC;AAAA,IAC7D;AAAA,EACF;AACF;AAEA,eAAe,OAAO;AACpB,QAAM,YAAY,IAAI,qBAAqB;AAC3C,QAAM,OAAO,QAAQ,SAAS;AAChC;AAEA,KAAK,EAAE,MAAM,CAAC,UAAU;AACtB,UAAQ,MAAM,gBAAgB,KAAK;AACnC,UAAQ,KAAK,CAAC;AAChB,CAAC;","names":["path","path"]}
package/package.json ADDED
@@ -0,0 +1,69 @@
+ {
+   "name": "mcp-server-markdown",
+   "version": "1.0.0",
+   "description": "MCP server for markdown files — search, extract sections, list headings, find code blocks across your docs directory",
+   "type": "module",
+   "bin": {
+     "mcp-server-markdown": "./dist/index.js"
+   },
+   "files": [
+     "dist"
+   ],
+   "scripts": {
+     "build": "tsup",
+     "typecheck": "tsc --noEmit",
+     "test": "vitest run",
+     "test:watch": "vitest",
+     "test:coverage": "vitest run --coverage",
+     "lint": "eslint . && prettier --check .",
+     "format": "prettier --write .",
+     "prepare": "husky"
+   },
+   "keywords": [
+     "mcp",
+     "mcp-server",
+     "model-context-protocol",
+     "markdown",
+     "documentation",
+     "docs",
+     "knowledge-base",
+     "search",
+     "headings",
+     "code-blocks",
+     "ai",
+     "llm",
+     "claude",
+     "cursor"
+   ],
+   "author": "Ofer Shapira",
+   "license": "MIT",
+   "repository": {
+     "type": "git",
+     "url": "https://github.com/ofershap/mcp-server-markdown.git"
+   },
+   "bugs": {
+     "url": "https://github.com/ofershap/mcp-server-markdown/issues"
+   },
+   "homepage": "https://github.com/ofershap/mcp-server-markdown#readme",
+   "dependencies": {
+     "@modelcontextprotocol/sdk": "^1.26.0",
+     "zod": "^3.25.0"
+   },
+   "devDependencies": {
+     "@eslint/js": "^9.0.0",
+     "@types/node": "^22.0.0",
+     "eslint": "^9.0.0",
+     "eslint-config-prettier": "^10.0.0",
+     "husky": "^9.0.0",
+     "lint-staged": "^15.0.0",
+     "prettier": "^3.0.0",
+     "tsup": "^8.0.0",
+     "typescript": "^5.7.0",
+     "typescript-eslint": "^8.0.0",
+     "vitest": "^3.2.0"
+   },
+   "lint-staged": {
+     "*.{ts,tsx,js}": "eslint --fix",
+     "*.{json,md,yml,yaml}": "prettier --write"
+   }
+ }