@azure/mcp-linux-arm64 2.0.0-beta.9 → 2.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/NOTICE.txt +5683 -4849
- package/README.md +148 -27
- package/dist/Azure.Mcp.Tools.AzureMigrate.xml +1060 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/ActivityProcessors.md +119 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/AddApplicationInsightsTelemetry.md +129 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/AddApplicationInsightsTelemetryWorkerService.md +115 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/AddOpenTelemetry.md +153 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/ApplicationInsightsWeb.md +103 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/AzureMonitorExporter.md +137 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/ConfigureOpenTelemetryProvider.md +218 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/ConfigureResource.md +119 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/ConsoleExporter.md +47 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/EntityFrameworkInstrumentation.md +56 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/HttpInstrumentation.md +109 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/LogProcessors.md +101 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/OpenTelemetrySdkCreate.md +146 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/OtlpExporter.md +88 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/RedisInstrumentation.md +63 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/Sampling.md +86 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/SdkCreateTracerProviderBuilder.md +127 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/SqlClientInstrumentation.md +53 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/TelemetryClient.md +122 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/TelemetryConfigurationBuilder.md +173 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/UseAzureMonitor.md +96 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/UseAzureMonitorExporter.md +146 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/WithLogging.md +109 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/WithMetrics.md +105 -0
- package/dist/Instrumentation/Resources/api-reference/dotnet/WithTracing.md +91 -0
- package/dist/Instrumentation/Resources/concepts/dotnet/appinsights-aspnetcore.md +113 -0
- package/dist/Instrumentation/Resources/concepts/dotnet/aspnet-classic-appinsights.md +95 -0
- package/dist/Instrumentation/Resources/concepts/dotnet/azure-monitor-distro.md +102 -0
- package/dist/Instrumentation/Resources/concepts/dotnet/opentelemetry-pipeline.md +57 -0
- package/dist/Instrumentation/Resources/concepts/nodejs/azure-monitor-overview.md +106 -0
- package/dist/Instrumentation/Resources/concepts/nodejs/opentelemetry-pipeline.md +201 -0
- package/dist/Instrumentation/Resources/concepts/python/azure-monitor-overview.md +122 -0
- package/dist/Instrumentation/Resources/concepts/python/opentelemetry-pipeline.md +154 -0
- package/dist/Instrumentation/Resources/examples/dotnet/aspnet-classic-setup.md +80 -0
- package/dist/Instrumentation/Resources/examples/dotnet/aspnetcore-distro-setup.md +156 -0
- package/dist/Instrumentation/Resources/examples/dotnet/aspnetcore-setup.md +160 -0
- package/dist/Instrumentation/Resources/examples/dotnet/workerservice-setup.md +154 -0
- package/dist/Instrumentation/Resources/examples/nodejs/bunyan-setup.md +301 -0
- package/dist/Instrumentation/Resources/examples/nodejs/console-setup.md +284 -0
- package/dist/Instrumentation/Resources/examples/nodejs/express-setup.md +169 -0
- package/dist/Instrumentation/Resources/examples/nodejs/fastify-setup.md +237 -0
- package/dist/Instrumentation/Resources/examples/nodejs/langchain-js-setup.md +310 -0
- package/dist/Instrumentation/Resources/examples/nodejs/mongodb-setup.md +185 -0
- package/dist/Instrumentation/Resources/examples/nodejs/mysql-setup.md +231 -0
- package/dist/Instrumentation/Resources/examples/nodejs/nestjs-setup.md +184 -0
- package/dist/Instrumentation/Resources/examples/nodejs/nextjs-setup.md +320 -0
- package/dist/Instrumentation/Resources/examples/nodejs/postgres-setup.md +147 -0
- package/dist/Instrumentation/Resources/examples/nodejs/redis-setup.md +198 -0
- package/dist/Instrumentation/Resources/examples/nodejs/winston-setup.md +260 -0
- package/dist/Instrumentation/Resources/examples/python/console-setup.md +392 -0
- package/dist/Instrumentation/Resources/examples/python/django-setup.md +269 -0
- package/dist/Instrumentation/Resources/examples/python/fastapi-setup.md +256 -0
- package/dist/Instrumentation/Resources/examples/python/flask-setup.md +218 -0
- package/dist/Instrumentation/Resources/examples/python/genai-setup.md +214 -0
- package/dist/Instrumentation/Resources/examples/python/generic-setup.md +164 -0
- package/dist/Instrumentation/Resources/migration/dotnet/aad-authentication-migration.md +150 -0
- package/dist/Instrumentation/Resources/migration/dotnet/appinsights-2x-to-3x-code-migration.md +164 -0
- package/dist/Instrumentation/Resources/migration/dotnet/appinsights-2x-to-3x-no-code-change.md +92 -0
- package/dist/Instrumentation/Resources/migration/dotnet/aspnet-classic-2x-to-3x-code-migration.md +190 -0
- package/dist/Instrumentation/Resources/migration/dotnet/console-2x-to-3x-code-migration.md +106 -0
- package/dist/Instrumentation/Resources/migration/dotnet/ilogger-migration.md +54 -0
- package/dist/Instrumentation/Resources/migration/dotnet/workerservice-2x-to-3x-code-migration.md +126 -0
- package/dist/Instrumentation/Resources/migration/dotnet/workerservice-2x-to-3x-no-code-change.md +102 -0
- package/dist/azmcp +0 -0
- package/package.json +1 -1

@@ -0,0 +1,218 @@ package/dist/Instrumentation/Resources/examples/python/flask-setup.md

# Basic Azure Monitor Setup for Flask

This guide shows how to add Azure Monitor OpenTelemetry to a Flask application.

## Prerequisites

- Python 3.8 or higher
- Flask application
- Azure Application Insights resource

## Step 1: Install Packages

```bash
pip install azure-monitor-opentelemetry flask
```

Or add to your `requirements.txt`:
```
azure-monitor-opentelemetry
flask
```

## Step 2: Initialize at Startup

Update your main application file (e.g., `app.py`):

```python
# IMPORTANT: Configure Azure Monitor BEFORE importing Flask
from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

# Now import Flask
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route('/')
def hello():
    return jsonify({"message": "Hello, World!"})

@app.route('/api/users')
def get_users():
    # This request is automatically tracked
    return jsonify([
        {"id": 1, "name": "Alice"},
        {"id": 2, "name": "Bob"}
    ])

if __name__ == '__main__':
    app.run(debug=True)
```

## Step 3: Configure Connection String

Create a `.env` file:
```env
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://...
FLASK_ENV=development
```

Load environment variables:
```python
from dotenv import load_dotenv
load_dotenv()

from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

from flask import Flask
# ... rest of your app
```
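The connection string's `Key=Value;Key2=Value2` shape makes it easy to fail fast at startup if it is malformed. A minimal stdlib sketch — the helper name is illustrative and not part of the distro:

```python
import os

def parse_connection_string(value):
    """Split 'Key=Val;Key2=Val2' into a dict and require an InstrumentationKey."""
    parts = dict(
        item.split("=", 1) for item in value.split(";") if "=" in item
    )
    if "InstrumentationKey" not in parts:
        raise ValueError("connection string is missing InstrumentationKey")
    return parts

conn = os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING", "")
if conn:
    fields = parse_connection_string(conn)
    print("Sending telemetry to", fields.get("IngestionEndpoint", "<default endpoint>"))
```

Calling this before `configure_azure_monitor()` turns a silently-dropped-telemetry situation into an immediate, readable error.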

## What Gets Auto-Instrumented

The Azure Monitor Distro automatically captures:
- ✅ All HTTP requests to your Flask routes
- ✅ Request duration, status codes, and paths
- ✅ Exceptions and error details
- ✅ Outbound HTTP calls (requests, urllib3)
- ✅ Database queries (psycopg2 for PostgreSQL)

## Step 4: Add Custom Telemetry (Optional)

```python
from flask import Flask, jsonify, request
from opentelemetry import trace

app = Flask(__name__)
tracer = trace.get_tracer(__name__)

@app.route('/api/orders/<order_id>')
def get_order(order_id):
    # Add custom span for business logic
    with tracer.start_as_current_span("fetch-order") as span:
        span.set_attribute("order.id", order_id)

        # Simulate database fetch
        order = fetch_order_from_db(order_id)

        span.set_attribute("order.status", order.get("status"))
        return jsonify(order)

@app.route('/api/process', methods=['POST'])
def process_data():
    span = trace.get_current_span()

    # Add context to current request span
    span.set_attribute("user.id", request.headers.get("X-User-ID"))
    span.set_attribute("request.size", len(request.data))

    try:
        result = do_processing(request.json)
        return jsonify(result)
    except Exception as e:
        span.record_exception(e)
        return jsonify({"error": str(e)}), 500
```

## Step 5: Add Logging Integration

```python
import logging
from flask import Flask, jsonify

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

app = Flask(__name__)

@app.route('/api/users/<user_id>')
def get_user(user_id):
    logger.info(f"Fetching user {user_id}")  # Correlated with trace

    user = fetch_user(user_id)

    if not user:
        logger.warning(f"User {user_id} not found")
        return jsonify({"error": "Not found"}), 404

    logger.info(f"Found user {user_id}")
    return jsonify(user)
```
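The correlation works because the distro's log handler stamps each `LogRecord` with the active trace context. A stdlib-only toy that illustrates the mechanism — the filter and the hard-coded `trace_id` are fakes for demonstration; the distro does this for you:

```python
import io
import logging

class FakeTraceContextFilter(logging.Filter):
    """Stamp every record with a (fake) trace id, as the distro's handler does."""
    def __init__(self, trace_id):
        super().__init__()
        self.trace_id = trace_id

    def filter(self, record):
        record.trace_id = self.trace_id
        return True

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(trace_id)s %(message)s"))
logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.addFilter(FakeTraceContextFilter("4bf92f3577b34da6a3ce929d0e0e4736"))
logger.setLevel(logging.INFO)

logger.info("Fetching user 42")
print(stream.getvalue().strip())
```

Every emitted line carries the id that ties it back to the request's trace, which is what lets the portal show logs inline with the transaction.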

## Complete Example

```python
# app.py
import os
import logging
from dotenv import load_dotenv

# Load environment variables first
load_dotenv()

# Configure Azure Monitor BEFORE importing Flask
from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

# Now import Flask and other dependencies
from flask import Flask, jsonify, request
from opentelemetry import trace

# Setup logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Create Flask app
app = Flask(__name__)
tracer = trace.get_tracer(__name__)

@app.route('/')
def index():
    logger.info("Index page accessed")
    return jsonify({"status": "healthy"})

@app.route('/api/items')
def list_items():
    with tracer.start_as_current_span("list-items") as span:
        items = [{"id": 1, "name": "Item 1"}, {"id": 2, "name": "Item 2"}]
        span.set_attribute("items.count", len(items))
        return jsonify(items)

@app.errorhandler(Exception)
def handle_error(error):
    logger.error(f"Unhandled error: {error}")
    return jsonify({"error": "Internal server error"}), 500

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

## Running the Application

```bash
# Set connection string
export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=..."

# Run Flask
python app.py
```

Or with Flask CLI:
```bash
flask run
```

## Verification

1. Make requests to your Flask endpoints
2. Go to Azure Portal → Application Insights
3. Check "Transaction search" for your requests
4. View "Application map" for dependencies

## Links

- [Flask Documentation](https://flask.palletsprojects.com/)
- [Azure Monitor Python](https://learn.microsoft.com/azure/azure-monitor/app/opentelemetry-enable?tabs=python)
@@ -0,0 +1,214 @@ package/dist/Instrumentation/Resources/examples/python/genai-setup.md

# Basic Azure Monitor Setup for GenAI Applications

This guide shows how to add Azure Monitor OpenTelemetry to applications using GenAI libraries like OpenAI, LangChain, or Anthropic.

## Prerequisites

- Python 3.8 or higher
- GenAI library (openai, langchain, anthropic, etc.)
- Azure Application Insights resource

## Step 1: Install Packages

For OpenAI applications:
```bash
pip install azure-monitor-opentelemetry openai opentelemetry-instrumentation-openai
```

For LangChain applications:
```bash
pip install azure-monitor-opentelemetry langchain opentelemetry-instrumentation-langchain
```

For Anthropic applications:
```bash
pip install azure-monitor-opentelemetry anthropic opentelemetry-instrumentation-anthropic
```

Or add to your `requirements.txt`:
```
azure-monitor-opentelemetry>=1.8.3
openai
opentelemetry-instrumentation-openai
```

## Step 2: Initialize at Startup

Update your main application file (e.g., `app.py`):

```python
# IMPORTANT: Configure Azure Monitor BEFORE importing GenAI libraries
from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

# Now import your GenAI libraries
from openai import OpenAI

client = OpenAI()

def generate_completion(prompt: str):
    """Generate a completion - automatically traced by Azure Monitor."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt}
        ]
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    result = generate_completion("What is the capital of France?")
    print(f"Response: {result}")
```

## Step 3: Configure Connection String

Create a `.env` file:
```env
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://...
OPENAI_API_KEY=sk-...
```

Load environment variables:
```python
from dotenv import load_dotenv
load_dotenv()

from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

from openai import OpenAI
# ... rest of your app
```

## What Gets Auto-Instrumented

The Azure Monitor Distro with GenAI instrumentations automatically captures:
- ✅ All LLM API calls (OpenAI, Anthropic, etc.)
- ✅ Request duration and latency
- ✅ Token usage (prompt tokens, completion tokens, total)
- ✅ Model names and parameters (temperature, max_tokens, etc.)
- ✅ Prompt and completion content (configurable)
- ✅ Error details and exceptions
- ✅ Chain execution in LangChain applications
- ✅ Agent interactions and tool calls

## Advanced: Custom Tracing

Add custom spans for business logic:

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def process_user_query(user_id: str, query: str):
    with tracer.start_as_current_span("process_query") as span:
        span.set_attribute("user.id", user_id)
        span.set_attribute("query.length", len(query))

        # Your LLM call is automatically traced as a child span
        response = generate_completion(query)

        span.set_attribute("response.length", len(response))
        return response
```

## Supported GenAI Libraries

| Library | Instrumentation Package | What's Traced |
|---------|------------------------|---------------|
| OpenAI | `opentelemetry-instrumentation-openai` | Chat completions, embeddings, fine-tuning |
| Anthropic | `opentelemetry-instrumentation-anthropic` | Messages API, Claude models |
| LangChain | `opentelemetry-instrumentation-langchain` | Chains, agents, tools, retrievers |
| OpenAI Agents | `opentelemetry-instrumentation-openai-agents` | Agent runs, function calls |

## Example: LangChain Application

```python
from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

llm = ChatOpenAI(model="gpt-4")
template = PromptTemplate(
    input_variables=["topic"],
    template="Tell me a joke about {topic}"
)
chain = LLMChain(llm=llm, prompt=template)

# This entire chain execution is traced
result = chain.run(topic="programming")
print(result)
```

## Example: OpenAI Agents

```python
from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

# The openai-agents SDK is imported as `agents`
from agents import Agent, Runner, function_tool

@function_tool
def get_weather(location: str) -> dict:
    """Get weather for a location."""
    return {"location": location, "temperature": 72}

agent = Agent(
    name="Weather Assistant",
    instructions="You are a helpful weather assistant.",
    tools=[get_weather],
    model="gpt-4"
)

# Agent runs and tool calls are automatically traced
result = Runner.run_sync(agent, "What's the weather in San Francisco?")
print(result.final_output)
```

## Configuration Options

### Disable Content Logging

To stop capturing prompt/completion content (for privacy), disable the content-capturing OpenAI instrumentation before configuring Azure Monitor:

```python
import os
os.environ["OTEL_PYTHON_DISABLED_INSTRUMENTATIONS"] = "openai-v2"

from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()
```

Note that this disables the OpenAI instrumentation entirely, not just its content capture.

### Control Sampling

To sample only 10% of traces (to reduce costs):

```python
import os
os.environ["OTEL_TRACES_SAMPLER"] = "traceidratio"
os.environ["OTEL_TRACES_SAMPLER_ARG"] = "0.1"

from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()
```
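A trace-id-ratio sampler keeps a trace when a value derived from its trace id falls below `rate × 2^64`, so the decision is consistent for every span in the trace. This stdlib sketch only mimics that idea with random 64-bit numbers standing in for trace-id bits; it is an illustration, not the SDK's actual sampler:

```python
import random

SAMPLE_RATE = 0.1
BOUND = int(SAMPLE_RATE * (1 << 64))

random.seed(42)  # deterministic demo
# Pretend each 64-bit number is the low bits of an incoming trace id
kept = sum(1 for _ in range(100_000) if random.getrandbits(64) < BOUND)

print(f"kept {kept} of 100000 traces (~{kept / 1000:.1f}%)")
```

With a 0.1 ratio, roughly 10% of traces survive; the exact count varies with the ids seen.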

## Viewing Telemetry

Once configured, view your GenAI telemetry in Azure Portal:
1. Go to your Application Insights resource
2. Navigate to **Performance** → **Dependencies** to see LLM calls
3. Check **Transaction search** for individual requests
4. Use **Application Map** to visualize your application topology
5. Create custom dashboards to track token usage and costs
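The token counts the instrumentation records on each LLM span can feed a simple cost estimate for such a dashboard. A stdlib sketch — the per-1K-token prices below are placeholders, not real rates:

```python
# Hypothetical per-1K-token prices; substitute your model's actual rates
PRICES = {"gpt-4": {"prompt": 0.03, "completion": 0.06}}

def estimate_cost(model, prompt_tokens, completion_tokens):
    """Estimate spend from the token counts captured on each LLM span."""
    p = PRICES[model]
    return (prompt_tokens / 1000) * p["prompt"] + (completion_tokens / 1000) * p["completion"]

# e.g. one call that used 1,000 prompt tokens and 500 completion tokens
print(f"${estimate_cost('gpt-4', 1000, 500):.2f}")
```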

## Next Steps

- OpenTelemetry Pipeline Concepts (see opentelemetry-pipeline-python.md)
- Azure Monitor Python Overview (see azure-monitor-python.md)
@@ -0,0 +1,164 @@ package/dist/Instrumentation/Resources/examples/python/generic-setup.md

# Basic Azure Monitor Setup for Python

This guide shows how to add Azure Monitor OpenTelemetry to a Python application.

## Prerequisites

- Python 3.8 or higher
- pip
- Azure Application Insights resource

## Step 1: Install Package

```bash
pip install azure-monitor-opentelemetry
```

Or add to your `requirements.txt`:
```
azure-monitor-opentelemetry
```

## Step 2: Initialize at Startup

Add the following to your application entry point:

```python
from azure.monitor.opentelemetry import configure_azure_monitor

# Configure Azure Monitor - MUST be called before importing other libraries
configure_azure_monitor()

# Now import your application code
# ...
```
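Why must this run before other imports? Auto-instrumentation works by replacing functions on library modules, and code that grabbed a reference before the patch keeps calling the unpatched version. A stdlib toy that illustrates the mechanism (not the distro's actual code):

```python
import types

# A stand-in for a third-party library module
lib = types.ModuleType("lib")

def fetch():
    return "plain"

lib.fetch = fetch

# An "early importer" binds the function before instrumentation runs
early_bound = lib.fetch

# Instrumentation patches the module attribute afterwards
def traced_fetch():
    return "traced:" + fetch()

lib.fetch = traced_fetch

print(early_bound())  # patched too late for this caller
print(lib.fetch())    # lookups through the module see the patch
```

The early binder keeps the plain function while module-level lookups get the traced one, which is exactly the failure mode `configure_azure_monitor()`-after-imports produces.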

## Step 3: Configure Connection String

### Option A: Environment Variable (Recommended)

```bash
export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://..."
```

### Option B: .env File

Create a `.env` file:
```env
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://...
```

Load it with python-dotenv:
```bash
pip install python-dotenv
```

```python
from dotenv import load_dotenv
load_dotenv()

from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()
```
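If you would rather not add a dependency, a `.env` file in the simple `KEY=value` form shown above can be loaded with a few lines of stdlib code. This is a minimal sketch; python-dotenv handles quoting, comments, and interpolation cases this does not:

```python
import os

def load_env_file(path=".env"):
    """Set KEY=value lines from a dotenv-style file into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

if os.path.exists(".env"):
    load_env_file()
```

`setdefault` means real environment variables still win over file values, matching python-dotenv's default behavior.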

### Option C: Direct Configuration

```python
from azure.monitor.opentelemetry import configure_azure_monitor

configure_azure_monitor(
    connection_string="InstrumentationKey=..."
)
```

## Step 4: Optional Configuration

### Service Name
```bash
export OTEL_SERVICE_NAME="my-python-app"
```

### Resource Attributes
```bash
export OTEL_RESOURCE_ATTRIBUTES="deployment.environment=production,service.version=1.0.0"
```

### Sampling
```bash
export OTEL_TRACES_SAMPLER=traceidratio
export OTEL_TRACES_SAMPLER_ARG=0.1  # 10% sampling
```
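`OTEL_RESOURCE_ATTRIBUTES` is a comma-separated list of `key=value` pairs. If you ever need to inspect what the SDK will see, parsing it is straightforward; an illustrative stdlib sketch (the parser function is mine, not part of the SDK):

```python
import os

def parse_resource_attributes(raw):
    """Turn 'k1=v1,k2=v2' into a dict, skipping malformed entries."""
    attrs = {}
    for pair in raw.split(","):
        key, sep, value = pair.partition("=")
        if sep and key.strip():
            attrs[key.strip()] = value.strip()
    return attrs

raw = os.environ.get(
    "OTEL_RESOURCE_ATTRIBUTES",
    "deployment.environment=production,service.version=1.0.0",
)
print(parse_resource_attributes(raw))
```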

## Step 5: Add Custom Telemetry (Optional)

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def process_order(order_id: str):
    with tracer.start_as_current_span("process-order") as span:
        span.set_attribute("order.id", order_id)

        try:
            # Your business logic
            result = do_processing(order_id)
            span.set_attribute("order.status", "completed")
            return result
        except Exception as e:
            span.record_exception(e)
            span.set_attribute("order.status", "failed")
            raise
```

## Verification

After setup, you should see telemetry in Azure Portal:
1. Navigate to your Application Insights resource
2. Go to "Transaction search" or "Live Metrics"
3. Make some requests to your application
4. Verify data appears within a few minutes

## Complete Example

```python
# app.py
import os
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Configure Azure Monitor FIRST
from azure.monitor.opentelemetry import configure_azure_monitor
configure_azure_monitor()

# Now import and run your application
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def main():
    logger.info("Application started")
    # Your application logic here
    logger.info("Application finished")

if __name__ == "__main__":
    main()
```

## Troubleshooting

### No Data in Azure Portal
1. Verify connection string is correct
2. Check for firewall/proxy blocking outbound HTTPS
3. Enable debug logging: `export OTEL_LOG_LEVEL=debug`

### Import Order Issues
Ensure `configure_azure_monitor()` is called before importing instrumented libraries.

## Links

- [Azure Monitor OpenTelemetry](https://learn.microsoft.com/azure/azure-monitor/app/opentelemetry-enable?tabs=python)
- [Troubleshooting Guide](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/telemetry/opentelemetry-troubleshooting-python)