@mastra/mcp-docs-server 1.1.26-alpha.4 → 1.1.26-alpha.6
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.docs/docs/mastra-platform/overview.md +3 -1
- package/.docs/docs/observability/tracing/exporters/cloud.md +6 -0
- package/.docs/models/index.md +1 -1
- package/.docs/models/providers/hpc-ai.md +73 -0
- package/.docs/models/providers/nvidia.md +1 -1
- package/.docs/models/providers/opencode-go.md +4 -2
- package/.docs/models/providers/opencode.md +4 -2
- package/.docs/models/providers.md +1 -0
- package/.docs/reference/client-js/mastra-client.md +23 -0
- package/CHANGELOG.md +8 -0
- package/package.json +4 -4
package/.docs/docs/mastra-platform/overview.md
CHANGED

````diff
@@ -73,4 +73,6 @@ Develop your project locally with [`mastra dev`](https://mastra.ai/reference/cli
 
 Once you're ready to deploy your application to production, use [`mastra studio deploy`](https://mastra.ai/reference/cli/mastra) and [`mastra server deploy`](https://mastra.ai/reference/cli/mastra) to push your application to the cloud.
 
-Follow the [Studio deployment guide](https://mastra.ai/docs/studio/deployment) and [Server deployment guide](https://mastra.ai/guides/deployment/mastra-platform) for step-by-step instructions.
+Follow the [Studio deployment guide](https://mastra.ai/docs/studio/deployment) and [Server deployment guide](https://mastra.ai/guides/deployment/mastra-platform) for step-by-step instructions.
+
+If you host your Mastra application on your own infrastructure, you can still send observability data to Studio using the [CloudExporter](https://mastra.ai/docs/observability/tracing/exporters/cloud).
````
package/.docs/docs/observability/tracing/exporters/cloud.md
CHANGED

````diff
@@ -2,6 +2,12 @@
 
 The `CloudExporter` sends traces, logs, metrics, scores, and feedback to the Mastra platform. Use it to route observability data from any Mastra app to a hosted project in the Mastra platform.
 
+> **Self-hosted or standalone apps:** If you host your Mastra application on your own infrastructure (not on Mastra Platform), you still need a deployed Studio project to view traces, logs, and metrics. `CloudExporter` sends data to a Studio project, so one must exist before you can use it.
+>
+> 1. [Create a Mastra project](https://mastra.ai/guides/getting-started/quickstart) if you don't have one yet.
+> 2. [Deploy Studio](https://mastra.ai/docs/studio/deployment) to the Mastra platform with `mastra studio deploy`.
+> 3. Follow the [quickstart steps below](#quickstart) to create an access token and find your project ID.
+
 ## Version compatibility
 
 - Use `CloudExporter` with `@mastra/observability@1.8.0` or later.
````
package/.docs/models/index.md
CHANGED
````diff
@@ -1,6 +1,6 @@
 # Model Providers
 
-Mastra provides a unified interface for working with LLMs across multiple providers, giving you access to
+Mastra provides a unified interface for working with LLMs across multiple providers, giving you access to 3599 models from 100 providers through a single API.
 
 ## Features
 
````
package/.docs/models/providers/hpc-ai.md
ADDED

````diff
@@ -0,0 +1,73 @@
+# HPC-AI
+
+Access 3 HPC-AI models through Mastra's model router. Authentication is handled automatically using the `HPC_AI_API_KEY` environment variable.
+
+Learn more in the [HPC-AI documentation](https://www.hpc-ai.com/doc/docs/quickstart/).
+
+```bash
+HPC_AI_API_KEY=your-api-key
+```
+
+```typescript
+import { Agent } from "@mastra/core/agent";
+
+const agent = new Agent({
+  id: "my-agent",
+  name: "My Agent",
+  instructions: "You are a helpful assistant",
+  model: "hpc-ai/minimax/minimax-m2.5"
+});
+
+// Generate a response
+const response = await agent.generate("Hello!");
+
+// Stream a response
+const stream = await agent.stream("Tell me a story");
+for await (const chunk of stream) {
+  console.log(chunk);
+}
+```
+
+> **Info:** Mastra uses the OpenAI-compatible `/chat/completions` endpoint. Some provider-specific features may not be available. Check the [HPC-AI documentation](https://www.hpc-ai.com/doc/docs/quickstart/) for details.
+
+## Models
+
+| Model | Context | Tools | Reasoning | Image | Audio | Video | Input $/1M | Output $/1M |
+| ----------------------------- | ------- | ----- | --------- | ----- | ----- | ----- | ---------- | ----------- |
+| `hpc-ai/minimax/minimax-m2.5` | 1.0M | | | | | | $0.14 | $0.56 |
+| `hpc-ai/moonshotai/kimi-k2.5` | 262K | | | | | | $0.21 | $1 |
+| `hpc-ai/zai-org/glm-5.1` | 202K | | | | | | $0.66 | $2 |
+
+## Advanced configuration
+
+### Custom headers
+
+```typescript
+const agent = new Agent({
+  id: "custom-agent",
+  name: "custom-agent",
+  model: {
+    url: "https://api.hpc-ai.com/inference/v1",
+    id: "hpc-ai/minimax/minimax-m2.5",
+    apiKey: process.env.HPC_AI_API_KEY,
+    headers: {
+      "X-Custom-Header": "value"
+    }
+  }
+});
+```
+
+### Dynamic model selection
+
+```typescript
+const agent = new Agent({
+  id: "dynamic-agent",
+  name: "Dynamic Agent",
+  model: ({ requestContext }) => {
+    const useAdvanced = requestContext.task === "complex";
+    return useAdvanced
+      ? "hpc-ai/zai-org/glm-5.1"
+      : "hpc-ai/minimax/minimax-m2.5";
+  }
+});
```
````
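The per-token prices in the HPC-AI table above translate directly into cost estimates. A minimal sketch of that arithmetic — the helper function and hard-coded price map are illustrative, not part of Mastra's API; the numbers are taken from the table:

```typescript
// Illustrative price map copied from the HPC-AI models table (USD per 1M tokens).
const hpcAiPrices: Record<string, { input: number; output: number }> = {
  "hpc-ai/minimax/minimax-m2.5": { input: 0.14, output: 0.56 },
  "hpc-ai/moonshotai/kimi-k2.5": { input: 0.21, output: 1 },
  "hpc-ai/zai-org/glm-5.1": { input: 0.66, output: 2 },
};

// Estimate the cost of one call in USD from token counts.
function estimateCostUsd(model: string, inputTokens: number, outputTokens: number): number {
  const p = hpcAiPrices[model];
  if (!p) throw new Error(`unknown model: ${model}`);
  return (inputTokens / 1e6) * p.input + (outputTokens / 1e6) * p.output;
}
```

For example, 1M input plus 1M output tokens on `hpc-ai/minimax/minimax-m2.5` comes to roughly $0.70.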
package/.docs/models/providers/nvidia.md
CHANGED

````diff
@@ -71,7 +71,7 @@ for await (const chunk of stream) {
 | `nvidia/microsoft/phi-4-mini-instruct` | 131K | | | | | | — | — |
 | `nvidia/minimaxai/minimax-m2.1` | 205K | | | | | | — | — |
 | `nvidia/minimaxai/minimax-m2.5` | 205K | | | | | | — | — |
-| `nvidia/minimaxai/minimax-m2.7` | 205K | | | | | |
+| `nvidia/minimaxai/minimax-m2.7` | 205K | | | | | | — | — |
 | `nvidia/mistralai/codestral-22b-instruct-v0.1` | 128K | | | | | | — | — |
 | `nvidia/mistralai/devstral-2-123b-instruct-2512` | 262K | | | | | | — | — |
 | `nvidia/mistralai/mamba-codestral-7b-v0.1` | 128K | | | | | | — | — |
````
package/.docs/models/providers/opencode-go.md
CHANGED

````diff
@@ -1,6 +1,6 @@
 # OpenCode Go
 
-Access
+Access 9 OpenCode Go models through Mastra's model router. Authentication is handled automatically using the `OPENCODE_API_KEY` environment variable.
 
 Learn more in the [OpenCode Go documentation](https://opencode.ai/docs/zen).
 
@@ -41,6 +41,8 @@ for await (const chunk of stream) {
 | `opencode-go/mimo-v2-pro` | 1.0M | | | | | | $1 | $3 |
 | `opencode-go/minimax-m2.5` | 205K | | | | | | $0.30 | $1 |
 | `opencode-go/minimax-m2.7` | 205K | | | | | | $0.30 | $1 |
+| `opencode-go/qwen3.5-plus` | 262K | | | | | | $0.20 | $1 |
+| `opencode-go/qwen3.6-plus` | 262K | | | | | | $0.50 | $3 |
 
 ## Advanced configuration
 
@@ -70,7 +72,7 @@ const agent = new Agent({
   model: ({ requestContext }) => {
     const useAdvanced = requestContext.task === "complex";
     return useAdvanced
-      ? "opencode-go/
+      ? "opencode-go/qwen3.6-plus"
       : "opencode-go/glm-5";
   }
 });
````
package/.docs/models/providers/opencode.md
CHANGED

````diff
@@ -1,6 +1,6 @@
 # OpenCode Zen
 
-Access
+Access 34 OpenCode Zen models through Mastra's model router. Authentication is handled automatically using the `OPENCODE_API_KEY` environment variable.
 
 Learn more in the [OpenCode Zen documentation](https://opencode.ai/docs/zen).
 
@@ -66,6 +66,8 @@ for await (const chunk of stream) {
 | `opencode/minimax-m2.5` | 205K | | | | | | $0.30 | $1 |
 | `opencode/minimax-m2.5-free` | 205K | | | | | | — | — |
 | `opencode/nemotron-3-super-free` | 205K | | | | | | — | — |
+| `opencode/qwen3.5-plus` | 262K | | | | | | $0.20 | $1 |
+| `opencode/qwen3.6-plus` | 262K | | | | | | $0.50 | $3 |
 
 ## Advanced configuration
 
@@ -95,7 +97,7 @@ const agent = new Agent({
   model: ({ requestContext }) => {
     const useAdvanced = requestContext.task === "complex";
     return useAdvanced
-      ? "opencode/
+      ? "opencode/qwen3.6-plus"
       : "opencode/big-pickle";
   }
 });
````
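The dynamic model selection callback updated above is just a function from request context to a model ID. Restated as a self-contained sketch — the standalone function name is hypothetical, but the routing logic is taken verbatim from the diff:

```typescript
// Mirrors the dynamic-selection logic from the OpenCode Zen docs:
// route "complex" tasks to the larger model, everything else to the default.
function chooseModel(requestContext: { task?: string }): string {
  const useAdvanced = requestContext.task === "complex";
  return useAdvanced ? "opencode/qwen3.6-plus" : "opencode/big-pickle";
}
```

Because it is a pure function of the request context, the same call can return a different model per request without reconstructing the agent.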
package/.docs/models/providers.md
CHANGED

````diff
@@ -34,6 +34,7 @@ Direct access to individual AI model providers. Each provider offers unique mode
 - [Friendli](https://mastra.ai/models/providers/friendli)
 - [GitHub Models](https://mastra.ai/models/providers/github-models)
 - [Helicone](https://mastra.ai/models/providers/helicone)
+- [HPC-AI](https://mastra.ai/models/providers/hpc-ai)
 - [Hugging Face](https://mastra.ai/models/providers/huggingface)
 - [iFlow](https://mastra.ai/models/providers/iflowcn)
 - [Inception](https://mastra.ai/models/providers/inception)
````
package/.docs/reference/client-js/mastra-client.md
CHANGED

````diff
@@ -12,6 +12,29 @@ export const mastraClient = new MastraClient({
 })
 ```
 
+## `RequestContext`
+
+When you use `RequestContext` with the client SDK, import it from `@mastra/client-js`.
+
+```typescript
+import { MastraClient, RequestContext } from '@mastra/client-js'
+
+const client = new MastraClient({
+  baseUrl: 'http://localhost:4111/',
+})
+
+const requestContext = new RequestContext()
+requestContext.set('userId', 'user-123')
+
+const agent = client.getAgent('support-agent')
+
+const response = await agent.generate('Summarize this ticket', {
+  requestContext,
+})
+```
+
+You can also pass `requestContext` as a `Record<string, any>`.
+
 ## Parameters
 
 **baseUrl** (`string`): The base URL for the Mastra API. All requests will be sent relative to this URL.
````
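The `RequestContext` usage in the docs above assumes simple key-value semantics with a `set()` method, interchangeable with a plain `Record<string, any>`. A hypothetical stand-in — not the real `@mastra/client-js` class — illustrating those assumed semantics:

```typescript
// Hypothetical stand-in for RequestContext: a thin wrapper over Map
// showing the set()/plain-object duality described in the docs above.
class FakeRequestContext {
  private store = new Map<string, unknown>();

  set(key: string, value: unknown): void {
    this.store.set(key, value);
  }

  get(key: string): unknown {
    return this.store.get(key);
  }

  // Convert to the Record<string, any> form the client also accepts.
  toRecord(): Record<string, unknown> {
    return Object.fromEntries(this.store);
  }
}
```

Under this reading, passing `{ userId: 'user-123' }` directly should be equivalent to calling `set('userId', 'user-123')` on a context object.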
package/CHANGELOG.md
CHANGED
````diff
@@ -1,5 +1,13 @@
 # @mastra/mcp-docs-server
 
+## 1.1.26-alpha.5
+
+### Patch Changes
+
+- Updated dependencies [[`fdd54cf`](https://github.com/mastra-ai/mastra/commit/fdd54cf612a9af876e9fdd85e534454f6e7dd518), [`30456b6`](https://github.com/mastra-ai/mastra/commit/30456b6b08c8fd17e109dd093b73d93b65e83bc5), [`9d11a8c`](https://github.com/mastra-ai/mastra/commit/9d11a8c1c8924eb975a245a5884d40ca1b7e0491), [`d246696`](https://github.com/mastra-ai/mastra/commit/d246696139a3144a5b21b042d41c532688e957e1), [`354f9ce`](https://github.com/mastra-ai/mastra/commit/354f9ce1ca6af2074b6a196a23f8ec30012dccca), [`e9837b5`](https://github.com/mastra-ai/mastra/commit/e9837b53699e18711b09e0ca010a4106376f2653)]:
+  - @mastra/core@1.26.0-alpha.3
+  - @mastra/mcp@1.5.1-alpha.1
+
 ## 1.1.26-alpha.3
 
 ### Patch Changes
````
package/package.json
CHANGED
````diff
@@ -1,6 +1,6 @@
 {
   "name": "@mastra/mcp-docs-server",
-  "version": "1.1.26-alpha.
+  "version": "1.1.26-alpha.6",
   "description": "MCP server for accessing Mastra.ai documentation, changelogs, and news.",
   "type": "module",
   "main": "dist/index.js",
@@ -29,7 +29,7 @@
     "jsdom": "^26.1.0",
     "local-pkg": "^1.1.2",
     "zod": "^4.3.6",
-    "@mastra/core": "1.26.0-alpha.
+    "@mastra/core": "1.26.0-alpha.3",
     "@mastra/mcp": "^1.5.1-alpha.1"
   },
   "devDependencies": {
@@ -47,8 +47,8 @@
     "typescript": "^5.9.3",
     "vitest": "4.0.18",
     "@internal/lint": "0.0.83",
-    "@
-    "@
+    "@mastra/core": "1.26.0-alpha.3",
+    "@internal/types-builder": "0.0.58"
   },
   "homepage": "https://mastra.ai",
   "repository": {
````