@mastra/mcp-docs-server 0.13.3 → 0.13.4

@@ -10,180 +10,11 @@ import { Callout, FileTree, Steps } from 'nextra/components'
  [Assistant UI](https://assistant-ui.com) is the TypeScript/React library for AI Chat.
  Built on shadcn/ui and Tailwind CSS, it enables developers to create beautiful, enterprise-grade chat experiences in minutes.

- ## Integrating with Next.js and Assistant UI
-
- There are two primary ways to integrate Mastra into your Next.js project when using Assistant UI:
-
- 1. **Full-Stack Integration**: Integrate Mastra directly into your Next.js application's API routes. This approach keeps your backend and frontend code within the same project. [Learn how to set up Full-Stack Integration](#full-stack-integration)
- 2. **Separate Backend Integration**: Run Mastra as a standalone server and connect your Next.js frontend to its API endpoints. This approach separates concerns and allows for independent scaling. [Learn how to set up Separate Backend Integration](#separate-backend-integration)
-
- ## Full-Stack Integration
-
- <Steps>
- ### Initialize Assistant UI
-
- There are two options when setting up Assistant UI using the `assistant-ui` CLI:
-
- 1. **New Project**: Create a new Next.js project with Assistant UI.
- 2. **Existing Project**: Initialize Assistant UI into an existing React project.
-
- #### New Project
-
- ```bash copy
- npx assistant-ui@latest create
- ```
-
- #### Existing Project
-
- ```bash copy
- npx assistant-ui@latest init
- ```
-
- <Callout>For detailed setup instructions, including adding API keys, basic configuration, and manual setup steps, please refer to [assistant-ui's official documentation](https://assistant-ui.com/docs).</Callout>
-
-
- ### Install Mastra Packages
-
- Install the required Mastra packages:
-
- ```bash copy
- npm install @mastra/core@latest @mastra/memory@latest @mastra/libsql@latest
- ```
-
- ### Configure Next.js
-
- To ensure Next.js correctly bundles your application when using Mastra directly in API routes, you need to configure `serverExternalPackages`.
-
- Update your `next.config.js` file to include `@mastra/`:
-
- ```js showLineNumbers copy {3}
- /** @type {import('next').NextConfig} */
- const nextConfig = {
-   serverExternalPackages: ["@mastra/*"],
-   // ... your other Next.js config
- };
-
- module.exports = nextConfig;
- ```
-
- ### Configure Mastra Memory and Storage
-
- ```typescript showLineNumbers copy filename="mastra/memory.ts" {4,8}
- import { LibSQLStore } from "@mastra/libsql";
- import { Memory } from "@mastra/memory";
-
- export const storage = new LibSQLStore({
-   url: 'file:./memory.db',
- })
-
- export const memory = new Memory({
-   storage,
- })
- ```
-
- <Callout>
- If deploying to the edge, you should use a compatible storage solution and not a file-based storage.
+ <Callout type="info">
+ For a full-stack integration approach where Mastra runs directly in your Next.js API routes, see the [Full-Stack Integration Guide](https://www.assistant-ui.com/docs/runtimes/mastra/full-stack-integration) on Assistant UI's documentation site.
  </Callout>

- ### Define Mastra Agent
-
- ```typescript showLineNumbers copy filename="mastra/agents/chef-agent.ts" {5-12}
- import { openai } from "@ai-sdk/openai";
- import { Agent } from "@mastra/core/agent";
- import { memory} from "../memory";
-
- export const chefAgent = new Agent({
-   name: "chefAgent",
-   instructions:
-     "You are Michel, a practical and experienced home chef. " +
-     "You help people cook with whatever ingredients they have available.",
-   model: openai("gpt-4o-mini"),
-   memory,
- });
- ```
-
- ### Register Agent to Mastra Instance
-
- ```typescript showLineNumbers copy filename="mastra/index.ts" {4-5}
- import { Mastra } from "@mastra/core";
- import { chefAgent } from "./agents/chef-agent";
-
- export const mastra = new Mastra({
-   agents: { chefAgent },
-   // ... other config
- });
- ```
-
- This initializes Mastra and makes the `chefAgent` available for use.
-
- ### Modify the Chat API endpoints
-
- The initial bootstrapped Next.js project has a `app/api/chat/route.ts` file that exports a `POST` handler. The initial implementation may look like this:
-
- ```typescript showLineNumbers copy filename="app/api/chat/route.ts" {11-21}
- import { openai } from "@ai-sdk/openai";
- import { frontendTools } from "@assistant-ui/react-ai-sdk";
- import { streamText } from "ai";
-
- export const runtime = "edge";
- export const maxDuration = 30;
-
- export async function POST(req: Request) {
-   const { messages, system, tools } = await req.json();
-
-   const result = streamText({
-     model: openai("gpt-4o"),
-     messages,
-     // forward system prompt and tools from the frontend
-     toolCallStreaming: true,
-     system,
-     tools: {
-       ...frontendTools(tools),
-     },
-     onError: console.log,
-   });
-
-   return result.toDataStreamResponse();
- }
- ```
-
- Now we need to modify the `POST` handler to use the `chefAgent` instead of this implementation.
-
- ```typescript showLineNumbers copy filename="app/api/chat/route.ts" {1,6,8}
- import { mastra } from "@/mastra";
-
- export async function POST(req: Request) {
-   const { messages } = await req.json();
-
-   const agent = mastra.getAgent("chefAgent");
-
-   const stream = await agent.stream(messages);
-
-   return stream.toDataStreamResponse();
- }
- ```
-
- Key changes
- - We import the `mastra` instance we created.
- - We use the `mastra.getAgent("chefAgent")` to get the agent we want to use.
- - We use the `agent.stream(messages)` to get the stream of messages from the agent.
- - We return the stream as a data stream response which is compatible with `assistant-ui`.
-
- ### Run the application
-
- You're all set! Start your Next.js development server:
-
- ```bash copy
- npm run dev
- ```
-
- You should now be able to chat with your agent in the browser.
-
- </Steps>
-
- Congratulations! You have successfully integrated Mastra into your Next.js application using the full-stack approach. Your Assistant UI frontend now communicates with a Mastra agent running in your Next.js backend API route.
-
- ## Separate Backend Integration
+ ## Integration Guide

  Run Mastra as a standalone server and connect your Next.js frontend (with Assistant UI) to its API endpoints.

@@ -215,41 +46,29 @@ npx create-mastra@latest
  This command will launch an interactive wizard to help you scaffold a new Mastra project, including prompting you for a project name and setting up basic configurations.
  Follow the prompts to create your server project.

- You now have a basic Mastra server project ready.
+ You now have a basic Mastra server project ready. You should have the following files and folders:
+
+ <FileTree>
+   <FileTree.Folder name="src" defaultOpen>
+     <FileTree.Folder name="mastra" defaultOpen>
+       <FileTree.File name="index.ts" />
+       <FileTree.Folder name="agents" defaultOpen>
+         <FileTree.File name="weather-agent.ts" />
+       </FileTree.Folder>
+       <FileTree.Folder name="tools" defaultOpen>
+         <FileTree.File name="weather-tool.ts" />
+       </FileTree.Folder>
+       <FileTree.Folder name="workflows" defaultOpen>
+         <FileTree.File name="weather-workflow.ts" />
+       </FileTree.Folder>
+     </FileTree.Folder>
+   </FileTree.Folder>
+ </FileTree>

  <Callout>
  Ensure that you have set the appropriate environment variables for your LLM provider in the `.env` file.
  </Callout>

- ### Define Mastra Agent
-
- ```typescript showLineNumbers copy filename="mastra/agents/chef-agent.ts" {5-12}
- import { openai } from "@ai-sdk/openai";
- import { Agent } from "@mastra/core/agent";
- import { memory} from "../memory";
-
- export const chefAgent = new Agent({
-   name: "chefAgent",
-   instructions:
-     "You are Michel, a practical and experienced home chef. " +
-     "You help people cook with whatever ingredients they have available.",
-   model: openai("gpt-4o-mini"),
-   memory,
- });
- ```
-
- ### Register Agent to Mastra Instance
-
- ```typescript copy filename="mastra/index.ts" showLineNumbers
- import { Mastra } from "@mastra/core";
- import { chefAgent } from "./agents/chef-agent";
-
- export const mastra = new Mastra({
-   agents: { chefAgent },
-   // ... other config
- });
- ```
-
  ### Run the Mastra Server

  Run the Mastra server using the following command:
@@ -258,38 +77,27 @@ Run the Mastra server using the following command:
  npm run dev
  ```

- By default, the Mastra server will run on http://localhost:4111. Your chefAgent should now be accessible via a POST request endpoint, typically http://localhost:4111/api/agents/chefAgent/stream. Keep this server running for the next steps where we'll set up the Assistant UI frontend to connect to it.
+ By default, the Mastra server will run on `http://localhost:4111`. Your `weatherAgent` should now be accessible via a POST request endpoint, typically `http://localhost:4111/api/agents/weatherAgent/stream`. Keep this server running for the next steps where we'll set up the Assistant UI frontend to connect to it.

  ### Initialize Assistant UI

- There are two options when setting up Assistant UI using the `assistant-ui` CLI:
-
- 1. **New Project**: Create a new Next.js project with Assistant UI.
- 2. **Existing Project**: Initialize Assistant UI into an existing React project.
-
- #### New Project
+ Create a new `assistant-ui` project with the following command.

  ```bash copy
  npx assistant-ui@latest create
  ```

- #### Existing Project
-
- ```bash copy
- npx assistant-ui@latest init
- ```
-
  <Callout>For detailed setup instructions, including adding API keys, basic configuration, and manual setup steps, please refer to [assistant-ui's official documentation](https://assistant-ui.com/docs).</Callout>

  ### Configure Frontend API Endpoint

  The default Assistant UI setup configures the chat runtime to use a local API route (`/api/chat`) within the Next.js project. Since our Mastra agent is running on a separate server, we need to update the frontend to point to that server's endpoint.

- Open the main page file in your Assistant UI frontend project (usually `app/page.tsx` or `src/app/page.tsx`). Find the `useChatRuntime` hook and change the `api` property to the full URL of your Mastra agent's stream endpoint:
+ Find the `useChatRuntime` hook in the `assistant-ui` project, typically at `app/assistant.tsx` and change the `api` property to the full URL of your Mastra agent's stream endpoint:

- ```typescript showLineNumbers copy filename="app/page.tsx" {2}
+ ```typescript showLineNumbers copy filename="app/assistant.tsx" {2}
  const runtime = useChatRuntime({
-   api: "http://localhost:4111/api/agents/chefAgent/stream",
+   api: "http://localhost:4111/api/agents/weatherAgent/stream",
  });
  ```

@@ -307,4 +115,4 @@ You should now be able to chat with your agent in the browser.

  </Steps>

- Congratulations! You have successfully integrated Mastra with Assistant UI using a separate server approach. Your Assistant UI frontend now communicates with a standalone Mastra agent server.
+ Congratulations! You have successfully integrated Mastra with Assistant UI using a separate server approach. Your Assistant UI frontend now communicates with a standalone Mastra agent server.
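The updated docs reference a `weatherAgent` exposed at `http://localhost:4111/api/agents/weatherAgent/stream`. As a point of reference, here is a minimal sketch of the registration that makes an agent reachable at such an endpoint, following the `new Mastra({ agents: ... })` pattern from the removed snippets; the exact contents of the `create-mastra` scaffold's `src/mastra/index.ts` and `agents/weather-agent.ts` are assumptions here.

```ts
// src/mastra/index.ts — sketch only; the create-mastra template may differ
import { Mastra } from "@mastra/core";
import { weatherAgent } from "./agents/weather-agent";

// Registering the agent under the key "weatherAgent" is what makes it
// reachable at /api/agents/weatherAgent/stream on the local Mastra server.
export const mastra = new Mastra({
  agents: { weatherAgent },
  // ... other config
});
```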
@@ -1,7 +1,7 @@
  import { MCPClient } from '@mastra/mcp';
  import { MCPServer } from '@mastra/mcp';
  import { ServerType } from '@hono/node-server/.';
- import { z } from 'zod';
+ import z from 'zod';

  export declare type BlogInput = z.infer<typeof blogInputSchema>;

@@ -13,6 +13,60 @@ export declare const blogInputSchema: z.ZodObject<{
      url: string;
  }>;

+ export declare const blogPostSchema: z.ZodObject<{
+     slug: z.ZodString;
+     content: z.ZodString;
+     metadata: z.ZodObject<{
+         title: z.ZodString;
+         publishedAt: z.ZodString;
+         summary: z.ZodString;
+         image: z.ZodOptional<z.ZodString>;
+         author: z.ZodOptional<z.ZodString>;
+         draft: z.ZodDefault<z.ZodOptional<z.ZodBoolean>>;
+         categories: z.ZodUnion<[z.ZodArray<z.ZodString, "many">, z.ZodString]>;
+     }, "strip", z.ZodTypeAny, {
+         title: string;
+         publishedAt: string;
+         summary: string;
+         draft: boolean;
+         categories: string | string[];
+         image?: string | undefined;
+         author?: string | undefined;
+     }, {
+         title: string;
+         publishedAt: string;
+         summary: string;
+         categories: string | string[];
+         image?: string | undefined;
+         author?: string | undefined;
+         draft?: boolean | undefined;
+     }>;
+ }, "strip", z.ZodTypeAny, {
+     content: string;
+     slug: string;
+     metadata: {
+         title: string;
+         publishedAt: string;
+         summary: string;
+         draft: boolean;
+         categories: string | string[];
+         image?: string | undefined;
+         author?: string | undefined;
+     };
+ }, {
+     content: string;
+     slug: string;
+     metadata: {
+         title: string;
+         publishedAt: string;
+         summary: string;
+         categories: string | string[];
+         image?: string | undefined;
+         author?: string | undefined;
+         draft?: boolean | undefined;
+     };
+ }>;
+
  export declare const blogTool: {
      name: string;
      description: string;
@@ -1,6 +1,7 @@
  import fs2 from 'fs/promises';
  import path4, { dirname } from 'path';
  import { fileURLToPath } from 'url';
+ import z from 'zod';

  // src/utils.ts
  var mdxFileCache = /* @__PURE__ */ new Map();
@@ -117,6 +118,19 @@ async function getMatchingPaths(path5, queryKeywords, baseDir) {

  ${pathList}`;
  }
+ var blogPostSchema = z.object({
+   slug: z.string(),
+   content: z.string(),
+   metadata: z.object({
+     title: z.string(),
+     publishedAt: z.string(),
+     summary: z.string(),
+     image: z.string().optional(),
+     author: z.string().optional(),
+     draft: z.boolean().optional().default(false),
+     categories: z.array(z.string()).or(z.string())
+   })
+ });
  var EXAMPLES_SOURCE = fromRepoRoot("examples");
  var OUTPUT_DIR = fromPackageRoot(".docs/organized/code-examples");
  async function loadExampleConfig(examplePath) {
@@ -339,4 +353,4 @@ if (process.env.PREPARE === `true`) {
    }
  }

- export { fromPackageRoot, getMatchingPaths, prepare };
+ export { blogPostSchema, fromPackageRoot, getMatchingPaths, prepare };
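For readers who want to see the newly exported `blogPostSchema` in action outside the bundle, here is a hedged, self-contained sketch; the schema body is copied from the diff above, and the sample post object is made up.

```ts
import { z } from "zod";

// Copied from the diff above so this sketch stands alone.
const blogPostSchema = z.object({
  slug: z.string(),
  content: z.string(),
  metadata: z.object({
    title: z.string(),
    publishedAt: z.string(),
    summary: z.string(),
    image: z.string().optional(),
    author: z.string().optional(),
    draft: z.boolean().optional().default(false),
    categories: z.array(z.string()).or(z.string()),
  }),
});

type BlogPost = z.infer<typeof blogPostSchema>;

// safeParse never throws; failures surface as { success: false, error }.
const result = blogPostSchema.safeParse({
  slug: "hello-world", // hypothetical sample data
  content: "# Hello",
  metadata: {
    title: "Hello World",
    publishedAt: "2024-01-01",
    summary: "An example post",
    categories: ["news"],
  },
});

if (result.success) {
  const post: BlogPost = result.data; // draft defaults to false on output
  console.log(post.metadata.title);
} else {
  console.error(result.error.issues);
}
```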
@@ -1 +1 @@
- export { prepare } from '../chunk-7QXT2IEP.js';
+ export { prepare } from '../chunk-TUAHUTTB.js';
package/dist/stdio.js CHANGED
@@ -1,5 +1,5 @@
  #!/usr/bin/env node
- import { fromPackageRoot, prepare, getMatchingPaths } from './chunk-7QXT2IEP.js';
+ import { fromPackageRoot, prepare, getMatchingPaths, blogPostSchema } from './chunk-TUAHUTTB.js';
  import * as fs from 'fs';
  import { existsSync, mkdirSync } from 'fs';
  import * as os2 from 'os';
@@ -8,7 +8,6 @@ import * as path3 from 'path';
  import path3__default from 'path';
  import fs3 from 'fs/promises';
  import { MCPServer } from '@mastra/mcp';
- import { JSDOM } from 'jsdom';
  import { z } from 'zod';

  var writeErrorLog = (message, data) => {
@@ -76,22 +75,20 @@ var logger = createLogger();
  var BLOG_BASE_URL = process.env.BLOG_URL || "https://mastra.ai";
  async function fetchBlogPosts() {
    void logger.debug("Fetching list of blog posts");
-   const response = await fetch(`${BLOG_BASE_URL}/blog`);
+   const response = await fetch(`${BLOG_BASE_URL}/api/blog`);
    if (!response.ok) {
      throw new Error("Failed to fetch blog posts");
    }
-   const html = await response.text();
-   const dom = new JSDOM(html);
-   const document = dom.window.document;
-   const blogLinks = Array.from(document.querySelectorAll('a[href^="/blog/"]')).filter((link) => {
-     const href = link.getAttribute("href");
-     return href !== "/blog" && !href?.includes("authors");
-   }).map((link) => {
-     const h2 = link.querySelector("h2");
-     const title = h2?.textContent?.trim();
-     const href = link.getAttribute("href");
+   const blogData = await response.json();
+   const blogPosts = blogPostSchema.array().safeParse(blogData);
+   if (!blogPosts.success) {
+     return "Failed to parse blog posts";
+   }
+   const blogLinks = blogPosts.data.map((post) => {
+     const title = post.metadata.title;
+     const href = post.slug;
      if (title && href) {
-       return `[${title}](${href})`;
+       return `[${title}](${BLOG_BASE_URL}/blog/${href}) | [Markdown URL](${BLOG_BASE_URL}/api/blog/${href})`;
      }
      return null;
    }).filter(Boolean);
@@ -121,12 +118,12 @@ ${blogPosts}`;

  ${blogList}`;
    }
-   const html = await response.text();
-   const dom = new JSDOM(html);
-   const document = dom.window.document;
-   const scripts = document.querySelectorAll("script");
-   scripts.forEach((script) => script.remove());
-   const content = document.body.textContent?.trim() || "";
+   const blogData = await response.json();
+   const blogPost = blogPostSchema.safeParse(blogData);
+   if (!blogPost.success) {
+     return "Failed to parse blog post";
+   }
+   const content = blogPost.data.content;
    if (!content) {
      throw new Error("No content found in blog post");
    }
@@ -134,7 +131,7 @@ ${blogList}`;
  }
  var blogInputSchema = z.object({
    url: z.string().describe(
-     "URL of a specific blog post to fetch. If the string /blog is passed as the url it returns a list of all blog posts."
+     "URL of a specific blog post to fetch. If the string /api/blog is passed as the url it returns a list of all blog posts. The markdownUrl is the URL of a single blog post which can be used to fetch the blog post content in markdown format."
    )
  });
  var blogTool = {
@@ -145,8 +142,8 @@ var blogTool = {
      void logger.debug("Executing mastraBlog tool", { url: args.url });
      try {
        let content;
-       if (args.url !== `/blog`) {
-         content = await fetchBlogPost(`${BLOG_BASE_URL}${args.url}`);
+       if (args.url.trim() !== `/api/blog`) {
+         content = await fetchBlogPost(args.url);
        } else {
          content = await fetchBlogPosts();
        }
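To summarize the behavioral change in `stdio.js`: the blog tool now fetches JSON from `/api/blog` and validates it with zod instead of scraping HTML with JSDOM. Below is a small, self-contained sketch of that listing flow; the narrowed schema is a hypothetical stand-in for the package's fuller `blogPostSchema`, validating only the fields the listing actually uses, and it assumes Node 18+ for the global `fetch`.

```ts
import { z } from "zod";

// Hypothetical, narrowed schema: only the fields needed to build the list.
// The package's real blogPostSchema (see the diff above) validates more.
const blogListSchema = z
  .object({
    slug: z.string(),
    metadata: z.object({ title: z.string() }),
  })
  .array();

const BLOG_BASE_URL = process.env.BLOG_URL || "https://mastra.ai";

// Mirrors the new fetchBlogPosts() flow: JSON in, markdown links out.
async function listBlogPosts(): Promise<string> {
  const response = await fetch(`${BLOG_BASE_URL}/api/blog`);
  if (!response.ok) {
    throw new Error("Failed to fetch blog posts");
  }

  const parsed = blogListSchema.safeParse(await response.json());
  if (!parsed.success) {
    return "Failed to parse blog posts";
  }

  // Each entry links to the rendered post and to its markdown endpoint,
  // matching the link format produced in the diff above.
  return parsed.data
    .map(
      (post) =>
        `[${post.metadata.title}](${BLOG_BASE_URL}/blog/${post.slug}) | ` +
        `[Markdown URL](${BLOG_BASE_URL}/api/blog/${post.slug})`,
    )
    .join("\n");
}

// Usage: listBlogPosts().then(console.log).catch(console.error);
```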
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@mastra/mcp-docs-server",
-   "version": "0.13.3",
+   "version": "0.13.4",
    "description": "MCP server for accessing Mastra.ai documentation, changelogs, and news.",
    "type": "module",
    "main": "dist/index.js",
@@ -47,8 +47,8 @@
      "tsx": "^4.19.4",
      "typescript": "^5.8.3",
      "vitest": "^3.2.4",
-     "@internal/lint": "0.0.16",
-     "@mastra/core": "0.10.9"
+     "@internal/lint": "0.0.17",
+     "@mastra/core": "0.10.10"
    },
    "peerDependencies": {
      "@mastra/core": "^0.10.0-alpha.0"