@mastra/mcp-docs-server 1.1.20-alpha.1 → 1.1.20-alpha.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,35 +1,61 @@
  # Observability overview
 
- Mastra provides observability features for AI applications. Monitor LLM operations, trace agent decisions, and debug complex workflows with tools that understand AI-specific patterns.
+ Mastra's observability system gives you visibility into every agent run, workflow step, tool call, and model interaction. It captures three complementary signals that work together to help you understand what your application is doing and why.
 
- ## Key features
+ - [**Tracing**](https://mastra.ai/docs/observability/tracing/overview): Records every operation as a hierarchical timeline of spans, capturing inputs, outputs, token usage, and timing.
+ - [**Logging**](https://mastra.ai/docs/observability/logging): Forwards structured log entries from your application and Mastra internals to observability storage, correlated to traces automatically.
+ - [**Metrics**](https://mastra.ai/docs/observability/metrics/overview): Extracts duration, token usage, and cost data from traces automatically, with no additional instrumentation required.
 
- ### Tracing
+ ## When to use observability
 
- Specialized tracing for AI operations that captures:
+ - Debug unexpected agent behavior by inspecting the full decision path, tool calls, and model responses.
+ - Monitor latency across agents, workflows, and tools to identify bottlenecks.
+ - Track token consumption and estimated cost over time to control spending.
+ - Diagnose workflow failures by tracing execution through each step.
+ - Compare agent performance before and after prompt or model changes.
 
- - **Model interactions**: Token usage, latency, prompts, and completions
- - **Agent execution**: Decision paths, tool calls, and memory operations
- - **Workflow steps**: Branching logic, parallel execution, and step outputs
- - **Automatic instrumentation**: Tracing with decorators
+ ## How the pieces fit together
 
- ### Logging
+ Tracing is the foundation. When observability is configured, every agent run, workflow execution, tool call, and model interaction produces a [span](https://opentelemetry.io/docs/concepts/signals/traces/#spans). Spans are organized into traces that show the full request lifecycle as a hierarchical timeline.
 
- All logger calls from your application and Mastra's internal components are automatically forwarded to observability storage when observability is configured. You can [control the log level and disable forwarding](https://mastra.ai/docs/observability/logging) independently from your console logger.
+ Metrics are derived from traces automatically. When a span ends, Mastra extracts duration, token counts, and cost estimates without any extra code. These metrics power the dashboards in [Studio](https://mastra.ai/docs/studio/observability).
 
- ## Storage requirements
+ Logs are correlated to traces automatically. Every `logger.info()`, `logger.warn()`, or `logger.error()` call within a traced context is tagged with the current trace and span IDs. You can navigate from a log entry directly to the trace that produced it.
 
- The `DefaultExporter` persists traces to your configured storage backend. Not all storage providers support observability—for the full list, see [Storage Provider Support](https://mastra.ai/docs/observability/tracing/exporters/default).
+ All three signals share correlation IDs (trace ID, span ID, entity type, entity name), so you can jump between a metric spike, the traces behind it, and the logs within those traces.
 
- For production environments with high traffic, we recommend using **ClickHouse** for the observability domain via [composite storage](https://mastra.ai/reference/storage/composite). See [Production Recommendations](https://mastra.ai/docs/observability/tracing/exporters/default) for details.
+ ## Get started
 
- ## Quickstart
+ Install `@mastra/observability` and a storage backend:
 
- Configure Observability in your Mastra instance:
+ **npm**:
+
+ ```bash
+ npm install @mastra/observability @mastra/libsql @mastra/duckdb
+ ```
+
+ **pnpm**:
+
+ ```bash
+ pnpm add @mastra/observability @mastra/libsql @mastra/duckdb
+ ```
+
+ **Yarn**:
+
+ ```bash
+ yarn add @mastra/observability @mastra/libsql @mastra/duckdb
+ ```
+
+ **Bun**:
+
+ ```bash
+ bun add @mastra/observability @mastra/libsql @mastra/duckdb
+ ```
+
+ Then configure observability in your Mastra instance. The following example uses composite storage to route observability data to DuckDB (which supports metrics aggregation) while keeping everything else in LibSQL:
 
  ```ts
  import { Mastra } from '@mastra/core/mastra'
- import { PinoLogger } from '@mastra/loggers'
  import { LibSQLStore } from '@mastra/libsql'
  import { DuckDBStore } from '@mastra/duckdb'
  import { MastraCompositeStore } from '@mastra/core/storage'
@@ -41,7 +67,6 @@ import {
  } from '@mastra/observability'
 
  export const mastra = new Mastra({
- logger: new PinoLogger(),
  storage: new MastraCompositeStore({
  id: 'composite-storage',
  default: new LibSQLStore({
@@ -60,9 +85,6 @@ export const mastra = new Mastra({
  new DefaultExporter(), // Persists traces to storage for Mastra Studio
  new CloudExporter(), // Sends traces to Mastra Cloud (if MASTRA_CLOUD_ACCESS_TOKEN is set)
  ],
- logging: {
- level: 'info', // Minimum log level forwarded to storage (default: 'debug')
- },
  spanOutputProcessors: [
  new SensitiveDataFilter(), // Redacts sensitive data like passwords, tokens, keys
  ],
@@ -72,14 +94,18 @@ export const mastra = new Mastra({
  })
  ```
 
- > **Serverless environments:** The `file:./mastra.db` storage URL uses the local filesystem, which doesn't work in serverless environments like Vercel, AWS Lambda, or Cloudflare Workers. For serverless deployments, use external storage. See the [Vercel deployment guide](https://mastra.ai/guides/deployment/vercel) for a complete example.
+ This enables tracing, log forwarding, and metrics. Mastra also supports external tracing providers like Langfuse, Datadog, and any OpenTelemetry-compatible platform. See [Tracing](https://mastra.ai/docs/observability/tracing/overview) for configuration details.
+
+ ## Storage
 
- With this basic setup, you will see Traces and Logs in both Studio and in Mastra Cloud.
+ Not all storage backends support every signal. Traces and logs work with most backends, but metrics require an OLAP-capable store like DuckDB (development) or ClickHouse (production). For the full compatibility list, see [storage provider support](https://mastra.ai/docs/observability/tracing/exporters/default).
 
- We also support various external tracing providers like MLflow, Langfuse, Braintrust, and any OpenTelemetry-compatible platform (Datadog, New Relic, SigNoz, etc.). See more about this in the [Tracing](https://mastra.ai/docs/observability/tracing/overview) documentation.
+ For production environments with high traffic, use composite storage to route the observability domain to a dedicated backend. See [production recommendations](https://mastra.ai/docs/observability/tracing/exporters/default) for details.
 
- ## What's next?
+ ## Next steps
 
- - **[Set up Tracing](https://mastra.ai/docs/observability/tracing/overview)**: Configure tracing for your application
- - **[Configure Logging](https://mastra.ai/docs/observability/logging)**: Add structured logging
- - **[API Reference](https://mastra.ai/reference/observability/tracing/instances)**: Detailed configuration options
+ - [Tracing](https://mastra.ai/docs/observability/tracing/overview)
+ - [Logging](https://mastra.ai/docs/observability/logging)
+ - [Metrics](https://mastra.ai/docs/observability/metrics/overview)
+ - [Mastra Studio](https://mastra.ai/docs/studio/observability)
+ - [Automatic metrics reference](https://mastra.ai/reference/observability/metrics/automatic-metrics)
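
> Editor's note: the log-to-trace correlation described in the added docs above can be illustrated with a small, self-contained sketch. The `LogEntry` shape and `logsForTrace` helper below are hypothetical simplifications for illustration only, not Mastra's actual storage schema or API:

```typescript
// Hypothetical simplified log entry shape; Mastra's real schema may differ.
interface LogEntry {
  level: 'debug' | 'info' | 'warn' | 'error'
  message: string
  traceId?: string
  spanId?: string
}

// Given a trace ID (e.g. taken from a metric spike or a Studio timeline),
// collect the log entries emitted within that trace.
function logsForTrace(logs: LogEntry[], traceId: string): LogEntry[] {
  return logs.filter((log) => log.traceId === traceId)
}

const logs: LogEntry[] = [
  { level: 'info', message: 'agent run started', traceId: 'abc', spanId: 's1' },
  { level: 'warn', message: 'tool retry', traceId: 'abc', spanId: 's2' },
  { level: 'info', message: 'unrelated request', traceId: 'def', spanId: 's9' },
]

console.log(logsForTrace(logs, 'abc').length) // 2
```

> Because every log entry carries its trace and span IDs, jumping from a trace to its logs (or back) reduces to a lookup like this.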
@@ -4,6 +4,7 @@ Studio includes these observability views:
 
  - **Metrics** for aggregate performance data
  - **Traces** for individual request inspection
+ - **Logs** for browsing internal and application logs
 
  All require an [observability storage backend](#quickstart) to be configured.
 
@@ -21,6 +22,12 @@ When you run an agent or workflow, the Observability tab displays traces that hi
 
  Tracing filters out low-level framework details so your traces stay focused and readable. Visit the [tracing overview](https://mastra.ai/docs/observability/tracing/overview) for more details.
 
+ ## Logs
+
+ Browse internal Mastra logs forwarded to your observability storage. Logs provide full-text search (across message content, entity names, and trace IDs), date presets (last 24 hours to 30 days), and multi-select filters for level, entity type, and entity name. Selecting a log opens a detail panel showing the full message, structured data, and metadata. If the log is correlated with a trace, you can navigate directly to the trace and span timeline.
+
+ Log forwarding is enabled by default when you configure observability. See [logging](https://mastra.ai/docs/observability/logging) for level configuration, query examples, and customization details.
+
  ## Quickstart
 
  For detailed instructions, follow the [observability instructions](https://mastra.ai/docs/observability/overview). To get up and running quickly, add the `@mastra/observability` package to your project and configure it with [LibSQL](https://mastra.ai/reference/storage/libsql) and [DuckDB](https://mastra.ai/reference/vectors/duckdb) for a local development setup that supports both traces and metrics.
@@ -95,4 +102,5 @@ export const mastra = new Mastra({
 
  - [Observability overview](https://mastra.ai/docs/observability/overview)
  - [Metrics overview](https://mastra.ai/docs/observability/metrics/overview)
- - [Tracing overview](https://mastra.ai/docs/observability/tracing/overview)
+ - [Tracing overview](https://mastra.ai/docs/observability/tracing/overview)
+ - [Logging](https://mastra.ai/docs/observability/logging)
@@ -124,4 +124,4 @@ Mastra also supports HTTPS development through the [`--https`](https://mastra.ai
 
  - Learn how to [deploy Studio](https://mastra.ai/docs/studio/deployment) for production use.
  - Add [authentication](https://mastra.ai/docs/studio/auth) to control access to your deployed Studio.
- - Explore [Studio observability](https://mastra.ai/docs/studio/observability) to monitor agent performance and gain insights through metrics, logs, and traces.
+ - Explore [Studio observability](https://mastra.ai/docs/studio/observability) to monitor agent performance through metrics, traces, and logs.
@@ -137,7 +137,7 @@ Use `text.format` when you want JSON output.
  - `json_object` enables JSON mode.
  - `json_schema` enables schema-constrained structured output.
 
- Mastra routes both through the agent's structured output path and still returns the JSON in the normal assistant message text output.
+ Both formats return JSON in the assistant message content. Use `json_schema` when you need strict schema enforcement. Use `json_object` when you only need valid JSON output.
 
  ```typescript
  const response = await client.responses.create({
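
> Editor's note: the difference between the two formats can be sketched as request bodies. The field names below follow the OpenAI Responses API shape that `client.responses.create` in the hunk above suggests; the model name and schema are illustrative assumptions, not values from the diff:

```typescript
// `json_object`: the model must emit valid JSON, but no particular shape.
const jsonObjectRequest = {
  model: 'gpt-4o-mini', // illustrative model name
  input: 'List two primary colors as JSON.',
  text: { format: { type: 'json_object' } },
}

// `json_schema`: the model's output is constrained to match the schema.
const jsonSchemaRequest = {
  model: 'gpt-4o-mini',
  input: 'List two primary colors.',
  text: {
    format: {
      type: 'json_schema',
      name: 'colors',
      strict: true, // enforce the schema exactly
      schema: {
        type: 'object',
        properties: { colors: { type: 'array', items: { type: 'string' } } },
        required: ['colors'],
        additionalProperties: false,
      },
    },
  },
}

console.log(jsonObjectRequest.text.format.type, jsonSchemaRequest.text.format.type)
```

> With `strict: true`, a response that omits `colors` or adds extra keys is rejected at generation time rather than surfacing later as a parse error.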
package/CHANGELOG.md CHANGED
@@ -1,5 +1,12 @@
  # @mastra/mcp-docs-server
 
+ ## 1.1.20-alpha.2
+
+ ### Patch Changes
+
+ - Updated dependencies [[`13f4327`](https://github.com/mastra-ai/mastra/commit/13f4327f052faebe199cefbe906d33bf90238767)]:
+ - @mastra/core@1.21.0-alpha.1
+
  ## 1.1.20-alpha.1
 
  ### Patch Changes
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@mastra/mcp-docs-server",
- "version": "1.1.20-alpha.1",
+ "version": "1.1.20-alpha.3",
  "description": "MCP server for accessing Mastra.ai documentation, changelogs, and news.",
  "type": "module",
  "main": "dist/index.js",
@@ -29,8 +29,8 @@
  "jsdom": "^26.1.0",
  "local-pkg": "^1.1.2",
  "zod": "^4.3.6",
- "@mastra/mcp": "^1.4.1",
- "@mastra/core": "1.21.0-alpha.0"
+ "@mastra/core": "1.21.0-alpha.1",
+ "@mastra/mcp": "^1.4.1"
  },
  "devDependencies": {
  "@hono/node-server": "^1.19.11",
@@ -48,7 +48,7 @@
  "vitest": "4.0.18",
  "@internal/lint": "0.0.77",
  "@internal/types-builder": "0.0.52",
- "@mastra/core": "1.21.0-alpha.0"
+ "@mastra/core": "1.21.0-alpha.1"
  },
  "homepage": "https://mastra.ai",
  "repository": {