@mastra/mcp-docs-server 0.13.24 → 0.13.25-alpha.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.docs/organized/changelogs/%40internal%2Fstorage-test-utils.md +8 -8
- package/.docs/organized/changelogs/%40internal%2Ftypes-builder.md +2 -0
- package/.docs/organized/changelogs/%40mastra%2Fagent-builder.md +24 -1
- package/.docs/organized/changelogs/%40mastra%2Fai-sdk.md +22 -0
- package/.docs/organized/changelogs/%40mastra%2Fastra.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fchroma.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fclickhouse.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fclient-js.md +46 -46
- package/.docs/organized/changelogs/%40mastra%2Fcloud.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fcloudflare-d1.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fcloudflare.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fcore.md +104 -104
- package/.docs/organized/changelogs/%40mastra%2Fcouchbase.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fdeployer-cloud.md +40 -40
- package/.docs/organized/changelogs/%40mastra%2Fdeployer-cloudflare.md +33 -33
- package/.docs/organized/changelogs/%40mastra%2Fdeployer-netlify.md +34 -34
- package/.docs/organized/changelogs/%40mastra%2Fdeployer-vercel.md +34 -34
- package/.docs/organized/changelogs/%40mastra%2Fdeployer.md +59 -59
- package/.docs/organized/changelogs/%40mastra%2Fdynamodb.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fevals.md +24 -24
- package/.docs/organized/changelogs/%40mastra%2Flance.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Flibsql.md +29 -29
- package/.docs/organized/changelogs/%40mastra%2Floggers.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fmcp-docs-server.md +25 -25
- package/.docs/organized/changelogs/%40mastra%2Fmcp-registry-registry.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fmcp.md +22 -22
- package/.docs/organized/changelogs/%40mastra%2Fmemory.md +27 -27
- package/.docs/organized/changelogs/%40mastra%2Fmongodb.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fmssql.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fopensearch.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fpg.md +33 -33
- package/.docs/organized/changelogs/%40mastra%2Fpinecone.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fplayground-ui.md +73 -73
- package/.docs/organized/changelogs/%40mastra%2Fqdrant.md +20 -20
- package/.docs/organized/changelogs/%40mastra%2Frag.md +22 -22
- package/.docs/organized/changelogs/%40mastra%2Freact.md +17 -0
- package/.docs/organized/changelogs/%40mastra%2Fs3vectors.md +18 -0
- package/.docs/organized/changelogs/%40mastra%2Fschema-compat.md +6 -0
- package/.docs/organized/changelogs/%40mastra%2Fserver.md +51 -51
- package/.docs/organized/changelogs/%40mastra%2Fturbopuffer.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fupstash.md +24 -24
- package/.docs/organized/changelogs/%40mastra%2Fvectorize.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fvoice-azure.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fvoice-cloudflare.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fvoice-deepgram.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fvoice-elevenlabs.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fvoice-gladia.md +20 -3
- package/.docs/organized/changelogs/%40mastra%2Fvoice-google-gemini-live.md +18 -0
- package/.docs/organized/changelogs/%40mastra%2Fvoice-google.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fvoice-murf.md +20 -20
- package/.docs/organized/changelogs/%40mastra%2Fvoice-openai-realtime.md +20 -20
- package/.docs/organized/changelogs/%40mastra%2Fvoice-openai.md +21 -21
- package/.docs/organized/changelogs/%40mastra%2Fvoice-playai.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fvoice-sarvam.md +19 -19
- package/.docs/organized/changelogs/%40mastra%2Fvoice-speechify.md +19 -19
- package/.docs/organized/changelogs/create-mastra.md +32 -32
- package/.docs/organized/changelogs/mastra.md +64 -64
- package/.docs/organized/code-examples/agent.md +0 -4
- package/.docs/organized/code-examples/ai-elements.md +47 -0
- package/.docs/organized/code-examples/heads-up-game.md +5 -5
- package/.docs/raw/auth/clerk.mdx +3 -3
- package/.docs/raw/index.mdx +1 -0
- package/.docs/raw/observability/ai-tracing/exporters/braintrust.mdx +81 -0
- package/.docs/raw/observability/ai-tracing/exporters/cloud.mdx +120 -0
- package/.docs/raw/observability/ai-tracing/exporters/default.mdx +168 -0
- package/.docs/raw/observability/ai-tracing/exporters/langfuse.mdx +121 -0
- package/.docs/raw/observability/ai-tracing/exporters/langsmith.mdx +88 -0
- package/.docs/raw/observability/ai-tracing/exporters/otel.mdx +250 -0
- package/.docs/raw/observability/ai-tracing/overview.mdx +565 -0
- package/.docs/raw/observability/ai-tracing/processors/sensitive-data-filter.mdx +274 -0
- package/.docs/raw/observability/{tracing.mdx → otel-tracing.mdx} +2 -2
- package/.docs/raw/observability/overview.mdx +66 -0
- package/.docs/raw/reference/agents/generate.mdx +39 -1
- package/.docs/raw/reference/agents/generateVNext.mdx +34 -2
- package/.docs/raw/reference/agents/network.mdx +35 -3
- package/.docs/raw/reference/auth/clerk.mdx +1 -1
- package/.docs/raw/reference/client-js/agents.mdx +4 -13
- package/.docs/raw/reference/client-js/mastra-client.mdx +10 -0
- package/.docs/raw/reference/client-js/observability.mdx +76 -0
- package/.docs/raw/reference/core/getScorer.mdx +75 -0
- package/.docs/raw/reference/core/getScorerByName.mdx +75 -0
- package/.docs/raw/reference/core/getScorers.mdx +42 -0
- package/.docs/raw/reference/core/mastra-class.mdx +7 -0
- package/.docs/raw/reference/observability/ai-tracing/ai-tracing.mdx +182 -0
- package/.docs/raw/reference/observability/ai-tracing/configuration.mdx +234 -0
- package/.docs/raw/reference/observability/ai-tracing/exporters/braintrust.mdx +112 -0
- package/.docs/raw/reference/observability/ai-tracing/exporters/cloud-exporter.mdx +176 -0
- package/.docs/raw/reference/observability/ai-tracing/exporters/console-exporter.mdx +148 -0
- package/.docs/raw/reference/observability/ai-tracing/exporters/default-exporter.mdx +196 -0
- package/.docs/raw/reference/observability/ai-tracing/exporters/langfuse.mdx +116 -0
- package/.docs/raw/reference/observability/ai-tracing/exporters/langsmith.mdx +112 -0
- package/.docs/raw/reference/observability/ai-tracing/exporters/otel.mdx +355 -0
- package/.docs/raw/reference/observability/ai-tracing/interfaces.mdx +651 -0
- package/.docs/raw/reference/observability/ai-tracing/processors/sensitive-data-filter.mdx +178 -0
- package/.docs/raw/reference/observability/ai-tracing/span.mdx +371 -0
- package/.docs/raw/reference/observability/{logger.mdx → logging/pino-logger.mdx} +1 -1
- package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/index.mdx +6 -4
- package/.docs/raw/reference/scorers/create-scorer.mdx +59 -9
- package/.docs/raw/reference/scorers/mastra-scorer.mdx +6 -0
- package/.docs/raw/reference/scorers/run-experiment.mdx +216 -0
- package/.docs/raw/reference/streaming/ChunkType.mdx +3 -2
- package/.docs/raw/reference/streaming/agents/MastraModelOutput.mdx +1 -1
- package/.docs/raw/reference/streaming/agents/stream.mdx +41 -3
- package/.docs/raw/reference/streaming/agents/streamVNext.mdx +34 -2
- package/.docs/raw/reference/streaming/workflows/resumeStreamVNext.mdx +17 -1
- package/.docs/raw/reference/streaming/workflows/stream.mdx +39 -1
- package/.docs/raw/reference/streaming/workflows/streamVNext.mdx +39 -1
- package/.docs/raw/reference/tools/create-tool.mdx +34 -1
- package/.docs/raw/reference/workflows/run-methods/resume.mdx +38 -0
- package/.docs/raw/reference/workflows/run-methods/start.mdx +38 -0
- package/.docs/raw/scorers/custom-scorers.mdx +16 -1
- package/.docs/raw/scorers/overview.mdx +28 -0
- package/CHANGELOG.md +15 -0
- package/package.json +6 -6
- package/.docs/raw/observability/ai-tracing.mdx +0 -597
- /package/.docs/raw/reference/observability/{otel-config.mdx → otel-tracing/otel-config.mdx} +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/arize-ax.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/arize-phoenix.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/braintrust.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/dash0.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/keywordsai.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/laminar.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/langfuse.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/langsmith.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/langwatch.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/new-relic.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/signoz.mdx +0 -0
- /package/.docs/raw/reference/observability/{providers → otel-tracing/providers}/traceloop.mdx +0 -0
package/.docs/raw/observability/ai-tracing/exporters/langfuse.mdx
@@ -0,0 +1,121 @@
---
title: "Langfuse Exporter | AI Tracing | Observability | Mastra Docs"
description: "Send AI traces to Langfuse for LLM observability and analytics"
---

import { Callout } from "nextra/components";

# Langfuse Exporter

[Langfuse](https://langfuse.com/) is an open-source observability platform specifically designed for LLM applications. The Langfuse exporter sends your AI traces to Langfuse, providing detailed insights into model performance, token usage, and conversation flows.

## When to Use Langfuse

Langfuse is ideal when you need:
- **LLM-specific analytics** - Token usage, costs, latency breakdown
- **Conversation tracking** - Session-based trace grouping
- **Quality scoring** - Manual and automated evaluation scores
- **Model comparison** - A/B testing and version comparisons
- **Self-hosted option** - Deploy on your own infrastructure

## Installation

```bash npm2yarn
npm install @mastra/langfuse
```

## Configuration

### Prerequisites

1. **Langfuse Account**: Sign up at [cloud.langfuse.com](https://cloud.langfuse.com) or deploy self-hosted
2. **API Keys**: Create public/secret key pair in Langfuse Settings → API Keys
3. **Environment Variables**: Set your credentials

```bash filename=".env"
LANGFUSE_PUBLIC_KEY=pk-lf-xxxxxxxxxxxx
LANGFUSE_SECRET_KEY=sk-lf-xxxxxxxxxxxx
LANGFUSE_BASE_URL=https://cloud.langfuse.com # Or your self-hosted URL
```

### Basic Setup

```typescript filename="src/mastra/index.ts"
import { Mastra } from "@mastra/core";
import { LangfuseExporter } from "@mastra/langfuse";

export const mastra = new Mastra({
  observability: {
    configs: {
      langfuse: {
        serviceName: 'my-service',
        exporters: [
          new LangfuseExporter({
            publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
            secretKey: process.env.LANGFUSE_SECRET_KEY!,
            baseUrl: process.env.LANGFUSE_BASE_URL,
            options: {
              environment: process.env.NODE_ENV,
            },
          }),
        ],
      },
    },
  },
});
```

## Configuration Options

### Realtime vs Batch Mode

The Langfuse exporter supports two modes for sending traces:

#### Realtime Mode (Development)
Traces appear immediately in the Langfuse dashboard, ideal for debugging:

```typescript
new LangfuseExporter({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  realtime: true, // Flush after each event
})
```

#### Batch Mode (Production)
Better performance with automatic batching:

```typescript
new LangfuseExporter({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  realtime: false, // Default - batch traces
})
```

### Complete Configuration

```typescript
new LangfuseExporter({
  // Required credentials
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,

  // Optional settings
  baseUrl: process.env.LANGFUSE_BASE_URL, // Default: https://cloud.langfuse.com
  realtime: process.env.NODE_ENV === 'development', // Dynamic mode selection
  logLevel: 'info', // Diagnostic logging: debug | info | warn | error

  // Langfuse-specific options
  options: {
    environment: process.env.NODE_ENV, // Shows in UI for filtering
    version: process.env.APP_VERSION, // Track different versions
    release: process.env.GIT_COMMIT, // Git commit hash
  },
})
```

## Related

- [AI Tracing Overview](/docs/observability/ai-tracing/overview)
- [Langfuse Documentation](https://langfuse.com/docs)

package/.docs/raw/observability/ai-tracing/exporters/langsmith.mdx
@@ -0,0 +1,88 @@
---
title: "LangSmith Exporter | AI Tracing | Observability | Mastra Docs"
description: "Send AI traces to LangSmith for LLM observability and evaluation"
---

import { Callout } from "nextra/components";

# LangSmith Exporter

[LangSmith](https://smith.langchain.com/) is LangChain's platform for monitoring and evaluating LLM applications. The LangSmith exporter sends your AI traces to LangSmith, providing insights into model performance, debugging capabilities, and evaluation workflows.

## When to Use LangSmith

LangSmith is ideal when you need:
- **LangChain ecosystem integration** - Native support for LangChain applications
- **Debugging and testing** - Detailed trace visualization and replay
- **Evaluation pipelines** - Built-in evaluation and dataset management
- **Prompt versioning** - Track and compare prompt variations
- **Collaboration features** - Team workspaces and shared projects

## Installation

```bash npm2yarn
npm install @mastra/langsmith
```

## Configuration

### Prerequisites

1. **LangSmith Account**: Sign up at [smith.langchain.com](https://smith.langchain.com)
2. **API Key**: Generate an API key in LangSmith Settings → API Keys
3. **Environment Variables**: Set your credentials

```bash filename=".env"
LANGSMITH_API_KEY=ls-xxxxxxxxxxxx
LANGSMITH_BASE_URL=https://api.smith.langchain.com # Optional for self-hosted
```

### Basic Setup

```typescript filename="src/mastra/index.ts"
import { Mastra } from "@mastra/core";
import { LangSmithExporter } from "@mastra/langsmith";

export const mastra = new Mastra({
  observability: {
    configs: {
      langsmith: {
        serviceName: 'my-service',
        exporters: [
          new LangSmithExporter({
            apiKey: process.env.LANGSMITH_API_KEY,
          }),
        ],
      },
    },
  },
});
```

## Configuration Options

### Complete Configuration

```typescript
new LangSmithExporter({
  // Required credentials
  apiKey: process.env.LANGSMITH_API_KEY!,

  // Optional settings
  apiUrl: process.env.LANGSMITH_BASE_URL, // Default: https://api.smith.langchain.com
  callerOptions: { // HTTP client options
    timeout: 30000, // Request timeout in ms
    maxRetries: 3, // Retry attempts
  },
  logLevel: 'info', // Diagnostic logging: debug | info | warn | error

  // LangSmith-specific options
  hideInputs: false, // Hide input data in UI
  hideOutputs: false, // Hide output data in UI
})
```

## Related

- [AI Tracing Overview](/docs/observability/ai-tracing/overview)
- [LangSmith Documentation](https://docs.smith.langchain.com/)

package/.docs/raw/observability/ai-tracing/exporters/otel.mdx
@@ -0,0 +1,250 @@
---
title: "OpenTelemetry Exporter | AI Tracing | Observability | Mastra Docs"
description: "Send AI traces to any OpenTelemetry-compatible observability platform"
---

import { Callout } from "nextra/components";

# OpenTelemetry Exporter

<Callout type="warning">
  The OpenTelemetry exporter is currently **experimental**. APIs and configuration options may change in future releases.
</Callout>

The OpenTelemetry (OTEL) exporter sends your AI traces to any OTEL-compatible observability platform using standardized [OpenTelemetry Semantic Conventions for GenAI](https://opentelemetry.io/docs/specs/semconv/gen-ai/). This ensures broad compatibility with platforms like Datadog, New Relic, SigNoz, Dash0, Traceloop, Laminar, and more.

## When to Use OTEL Exporter

The OTEL exporter is ideal when you need:
- **Platform flexibility** - Send traces to any OTEL-compatible backend
- **Standards compliance** - Follow OpenTelemetry GenAI semantic conventions
- **Multi-vendor support** - Configure once, switch providers easily
- **Enterprise platforms** - Integrate with existing observability infrastructure
- **Custom collectors** - Send to your own OTEL collector

## Installation

Each provider requires specific protocol packages. Install the base exporter plus the protocol package for your provider:

### For HTTP/Protobuf Providers (SigNoz, New Relic, Laminar)

```bash npm2yarn
npm install @mastra/otel-exporter @opentelemetry/exporter-trace-otlp-proto
```

### For gRPC Providers (Dash0)

```bash npm2yarn
npm install @mastra/otel-exporter @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js
```

### For HTTP/JSON Providers (Traceloop)

```bash npm2yarn
npm install @mastra/otel-exporter @opentelemetry/exporter-trace-otlp-http
```

## Provider Configurations

### Dash0

[Dash0](https://www.dash0.com/) provides real-time observability with automatic insights.

```typescript filename="src/mastra/index.ts"
import { Mastra } from "@mastra/core";
import { OtelExporter } from "@mastra/otel-exporter";

export const mastra = new Mastra({
  observability: {
    configs: {
      otel: {
        serviceName: 'my-service',
        exporters: [
          new OtelExporter({
            provider: {
              dash0: {
                apiKey: process.env.DASH0_API_KEY,
                endpoint: process.env.DASH0_ENDPOINT, // e.g., 'ingress.us-west-2.aws.dash0.com:4317'
                dataset: 'production', // Optional dataset name
              }
            },
          }),
        ],
      },
    },
  },
});
```

<Callout type="info">
  Get your Dash0 endpoint from your dashboard. It should be in the format `ingress.{region}.aws.dash0.com:4317`.
</Callout>

### SigNoz

[SigNoz](https://signoz.io/) is an open-source APM alternative with built-in AI tracing support.

```typescript filename="src/mastra/index.ts"
new OtelExporter({
  provider: {
    signoz: {
      apiKey: process.env.SIGNOZ_API_KEY,
      region: 'us', // 'us' | 'eu' | 'in'
      // endpoint: 'https://my-signoz.example.com', // For self-hosted
    }
  },
})
```

### New Relic

[New Relic](https://newrelic.com/) provides comprehensive observability with AI monitoring capabilities.

```typescript filename="src/mastra/index.ts"
new OtelExporter({
  provider: {
    newrelic: {
      apiKey: process.env.NEW_RELIC_LICENSE_KEY,
      // endpoint: 'https://otlp.eu01.nr-data.net', // For EU region
    }
  },
})
```

### Traceloop

[Traceloop](https://www.traceloop.com/) specializes in LLM observability with automatic prompt tracking.

```typescript filename="src/mastra/index.ts"
new OtelExporter({
  provider: {
    traceloop: {
      apiKey: process.env.TRACELOOP_API_KEY,
      destinationId: 'my-destination', // Optional
    }
  },
})
```

### Laminar

[Laminar](https://www.lmnr.ai/) provides specialized LLM observability and analytics.

```typescript filename="src/mastra/index.ts"
new OtelExporter({
  provider: {
    laminar: {
      apiKey: process.env.LMNR_PROJECT_API_KEY,
      // teamId: process.env.LAMINAR_TEAM_ID, // Optional, for backwards compatibility
    }
  },
})
```

### Custom/Generic OTEL Endpoints

For other OTEL-compatible platforms or custom collectors:

```typescript filename="src/mastra/index.ts"
new OtelExporter({
  provider: {
    custom: {
      endpoint: 'https://your-collector.example.com/v1/traces',
      protocol: 'http/protobuf', // 'http/json' | 'http/protobuf' | 'grpc'
      headers: {
        'x-api-key': process.env.API_KEY,
      },
    }
  },
})
```

## Configuration Options

### Complete Configuration

```typescript
new OtelExporter({
  // Provider configuration (required)
  provider: {
    // Use one of: dash0, signoz, newrelic, traceloop, laminar, custom
  },

  // Export configuration
  timeout: 30000, // Export timeout in milliseconds
  batchSize: 100, // Number of spans per batch

  // Debug options
  logLevel: 'info', // 'debug' | 'info' | 'warn' | 'error'
})
```

## OpenTelemetry Semantic Conventions

The exporter follows [OpenTelemetry Semantic Conventions for GenAI](https://opentelemetry.io/docs/specs/semconv/gen-ai/), ensuring compatibility with observability platforms:

### Span Naming
- **LLM Operations**: `chat {model}` or `tool_selection {model}`
- **Tool Execution**: `tool.execute {tool_name}`
- **Agent Runs**: `agent.{agent_id}`
- **Workflow Runs**: `workflow.{workflow_id}`

### Key Attributes
- `gen_ai.operation.name` - Operation type (chat, tool.execute, etc.)
- `gen_ai.system` - AI provider (openai, anthropic, etc.)
- `gen_ai.request.model` - Model identifier
- `gen_ai.usage.input_tokens` - Number of input tokens
- `gen_ai.usage.output_tokens` - Number of output tokens
- `gen_ai.request.temperature` - Sampling temperature
- `gen_ai.response.finish_reasons` - Completion reason

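For a concrete sense of the exported data, the sketch below shows how these attributes might appear on a single chat span. The attribute keys come from the list above; the values (model name, token counts, temperature) are illustrative only, not actual exporter output.

```typescript
// Illustrative example only: possible attribute values for one chat span.
// Under the naming scheme above, this span would be named "chat gpt-4o-mini".
const exampleChatSpanAttributes = {
  "gen_ai.operation.name": "chat",           // operation type
  "gen_ai.system": "openai",                 // AI provider
  "gen_ai.request.model": "gpt-4o-mini",     // model identifier (hypothetical)
  "gen_ai.request.temperature": 0.7,         // sampling temperature
  "gen_ai.usage.input_tokens": 812,          // made-up token counts
  "gen_ai.usage.output_tokens": 164,
  "gen_ai.response.finish_reasons": ["stop"],
};
```
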
## Buffering Strategy

The exporter buffers spans until a trace is complete:
1. Collects all spans for a trace
2. Waits 5 seconds after root span completes
3. Exports complete trace with preserved parent-child relationships
4. Ensures no orphaned spans

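As a rough mental model, the buffering behavior described above amounts to a per-trace map of spans plus a delayed flush once the root span ends. This is a sketch of the idea only, not the exporter's actual implementation; the `BufferedSpan` shape and `flush` callback are assumptions made for illustration.

```typescript
// Sketch: buffer spans per trace, then export the whole trace ~5s after the root span ends.
interface BufferedSpan {
  traceId: string;
  parentId?: string; // undefined for the root span
  name: string;
}

class TraceBuffer {
  private traces = new Map<string, BufferedSpan[]>();

  constructor(private flush: (spans: BufferedSpan[]) => void) {}

  onSpanEnd(span: BufferedSpan): void {
    const spans = this.traces.get(span.traceId) ?? [];
    spans.push(span);
    this.traces.set(span.traceId, spans);

    // Root span finished: wait briefly for stragglers, then export the complete trace.
    if (!span.parentId) {
      setTimeout(() => {
        const complete = this.traces.get(span.traceId);
        this.traces.delete(span.traceId);
        if (complete) this.flush(complete); // parent-child relationships stay intact
      }, 5_000);
    }
  }
}
```

Exporting only completed traces this way is what keeps child spans from arriving at the backend without their parents.
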
## Protocol Selection Guide

Choose the right protocol package based on your provider:

| Provider  | Protocol      | Required Package                           |
| --------- | ------------- | ------------------------------------------ |
| Dash0     | gRPC          | `@opentelemetry/exporter-trace-otlp-grpc`  |
| SigNoz    | HTTP/Protobuf | `@opentelemetry/exporter-trace-otlp-proto` |
| New Relic | HTTP/Protobuf | `@opentelemetry/exporter-trace-otlp-proto` |
| Traceloop | HTTP/JSON     | `@opentelemetry/exporter-trace-otlp-http`  |
| Laminar   | HTTP/Protobuf | `@opentelemetry/exporter-trace-otlp-proto` |
| Custom    | Varies        | Depends on your collector                  |

<Callout type="warning">
  Make sure to install the correct protocol package for your provider. The exporter will provide a helpful error message if the wrong package is installed.
</Callout>

## Troubleshooting

### Missing Dependency Error

If you see an error like:
```
HTTP/Protobuf exporter is not installed (required for signoz).
To use HTTP/Protobuf export, install the required package:
npm install @opentelemetry/exporter-trace-otlp-proto
```

Install the suggested package for your provider.

### Common Issues

1. **Wrong protocol package**: Verify you installed the correct exporter for your provider
2. **Invalid endpoint**: Check endpoint format matches provider requirements
3. **Authentication failures**: Verify API keys and headers are correct
4. **No traces appearing**: Check that traces complete (root span must end)

## Related

- [AI Tracing Overview](/docs/observability/ai-tracing/overview)
- [OpenTelemetry GenAI Conventions](https://opentelemetry.io/docs/specs/semconv/gen-ai/)
- [OTEL Exporter Reference](/reference/observability/ai-tracing/exporters/otel)