llmops-observability-10.0.4.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,669 @@
+ Metadata-Version: 2.4
+ Name: llmops-observability
+ Version: 10.0.4
+ Summary: LLMOps Observability SDK with direct Langfuse integration (no SQS/batching)
+ Requires-Python: >=3.9
+ Description-Content-Type: text/markdown
+ Requires-Dist: langfuse>=2.0.0
+ Requires-Dist: httpx
+ Requires-Dist: python-dotenv
+
+ # LLMOps Observability SDK
+
+ A lightweight Python SDK for LLM observability with **direct Langfuse integration**. No SQS queues, no batching, no worker threads - just instant tracing to Langfuse.
+
+ ## Key Features
+
+ - ⚡ **Instant Tracing**: Sends traces directly to Langfuse in real-time
+ - 🎯 **Simple API**: Same decorators as veriskGO (`@track_function`, `@track_llm_call`)
+ - 🚫 **No Complexity**: No SQS queues, no batching, no background workers
+ - 🔄 **Sync & Async**: Supports both synchronous and asynchronous functions
+ - 🎨 **Provider Agnostic**: Works with any LLM provider (Bedrock, OpenAI, Anthropic, etc.)
+ - 🪆 **Nested Spans**: Automatic parent-child relationship tracking
+ - 🔍 **Locals Capture**: Capture function local variables for debugging
+ - 🌐 **ASGI Middleware**: Auto-tracing for FastAPI/Starlette apps
+ - 📊 **Smart Serialization**: Automatic data size limits (200KB) to prevent oversized trace payloads
+
+ ## Installation
+
+ ```bash
+ # From source (development)
+ cd llmops-observability_sdk
+ pip install -e .
+
+ # Or install from package (when published)
+ pip install llmops-observability
+ ```
+
+ ## Quick Start
+
+ ### 1. Configure Langfuse Credentials
+
+ **Option A: Environment Variables (Recommended)**
+
+ Create a `.env` file in your application directory:
+
+ ```bash
+ # Langfuse Configuration
+ LANGFUSE_PUBLIC_KEY=pk-lf-...
+ LANGFUSE_SECRET_KEY=sk-lf-...
+ LANGFUSE_BASE_URL=https://your-langfuse-instance.com
+ LANGFUSE_VERIFY_SSL=false  # Optional, default is false
+
+ # Project Configuration
+ PROJECT_ID=my_project  # Your project identifier (used as trace name in Langfuse)
+ ENV=production  # Environment: production, development, staging, etc.
+ ```
+
+ **Important Notes:**
+ - `PROJECT_ID`: Used as the trace name in Langfuse. Should be unique per project.
+ - `ENV`: Automatically mapped to `LANGFUSE_TRACING_ENVIRONMENT` for proper environment tracking in Langfuse.
+ - The SDK automatically reads these values from the `.env` file.
+
+ **Option B: Programmatic Configuration**
+
+ ```python
+ from llmops_observability import configure
+ import os
+
+ # Configure at application startup
+ configure(
+     public_key=os.getenv("LANGFUSE_PUBLIC_KEY"),
+     secret_key=os.getenv("LANGFUSE_SECRET_KEY"),
+     base_url=os.getenv("LANGFUSE_BASE_URL"),
+     verify_ssl=False
+ )
+
+ # Note: You still need PROJECT_ID and ENV in your .env file
+ # Or pass them explicitly in start_trace()
+ ```
+
+ > **Important**: Each application should have its own Langfuse project/keys for proper isolation. Never hardcode credentials!
+
+ ### 2. Basic Usage
+
+ **Method 1: Simple Start/End (with auto-loaded project and environment)**
+ ```python
+ from llmops_observability import TraceManager, track_function, track_llm_call
+
+ # Start a trace - PROJECT_ID and ENV are auto-loaded from .env
+ TraceManager.start_trace(
+     name="chat_message",  # Operation name
+     metadata={"user_id": "123", "session_id": "456"},
+     tags=["production", "v1.0"]
+ )
+ # Trace name in Langfuse will be: PROJECT_ID (from .env)
+ # Environment will be: ENV (from .env) - automatically tracked by Langfuse
+
+ # Track regular functions
+ @track_function()
+ def process_data(input_data):
+     # Your code here
+     return {"processed": input_data}
+
+ # Track LLM calls
+ @track_llm_call()
+ def call_bedrock(prompt):
+     # Call your LLM (converse expects content as a list of content blocks)
+     response = bedrock_client.converse(
+         modelId="anthropic.claude-3-sonnet",
+         messages=[{"role": "user", "content": [{"text": prompt}]}]
+     )
+     return response
+
+ # Use the functions
+ result = process_data("some data")
+ llm_response = call_bedrock("Hello, world!")
+
+ # End the trace (flushes to Langfuse)
+ TraceManager.end_trace()
+ ```
+
+ **Method 2: Explicit Project and Environment Override**
+ ```python
+ # Override PROJECT_ID and ENV from .env
+ TraceManager.start_trace(
+     name="chat_message",          # Operation name
+     project_id="custom_project",  # Override PROJECT_ID
+     environment="staging",        # Override ENV
+     metadata={"user_id": "123"},
+ )
+
+ # Your code...
+
+ TraceManager.end_trace()
+ ```
+
+ **Method 3: Using `finalize_and_send()` (veriskGO-compatible)**
+ ```python
+ # Start trace
+ TraceManager.start_trace(name="chat_session")
+
+ # Your code
+ user_input = "What is machine learning?"
+ response = await llm.generate(user_input)
+
+ # Finalize with input/output in one call
+ TraceManager.finalize_and_send(
+     user_id="user_123",
+     session_id="session_456",
+     trace_name="chat_message",
+     trace_input={"user_msg": user_input},
+     trace_output={"bot_response": str(response)}
+ )
+ ```
+
+ ### 3. Capture Local Variables (Debugging)
+
+ ```python
+ @track_function(capture_locals=True)
+ def complex_calculation(x, y, z):
+     intermediate = x + y
+     result = intermediate * z
+     final = result ** 2
+     # All local variables are captured in Langfuse
+     return final
+
+ # Capture specific variables only
+ @track_function(capture_locals=["important_var", "result"])
+ def selective_capture(data):
+     important_var = process(data)
+     temp_var = "not captured"
+     result = transform(important_var)
+     return result
+ ```
+
+ ### 4. Nested Spans (Parent-Child Tracking)
+
+ ```python
+ @track_function(name="parent_task")
+ def parent_function():
+     data = fetch_data()
+     # Child spans are automatically nested
+     processed = child_function(data)
+     return processed
+
+ @track_function(name="child_task")
+ def child_function(data):
+     return data.upper()
+
+ # Langfuse will show: parent_task → child_task
+ ```
+
+ ### 5. ASGI Middleware (FastAPI Auto-Tracing)
+
+ ```python
+ from fastapi import FastAPI
+ from llmops_observability import LLMOpsASGIMiddleware
+
+ app = FastAPI()
+ app.add_middleware(LLMOpsASGIMiddleware, service_name="my_api")
+
+ @app.get("/")
+ async def root():
+     # Request is automatically traced
+     return {"message": "Hello World"}
+
+ @app.post("/generate")
+ async def generate(prompt: str):
+     # All decorated functions within request are nested
+     result = await generate_text(prompt)
+     return result
+ ```
+
+ ### 6. Async Support
+
+ ```python
+ @track_function()
+ async def async_process(data):
+     return await some_async_operation(data)
+
+ @track_llm_call(name="summarize")
+ async def async_llm_call(text):
+     return await chain.ainvoke({"text": text})
+
+ # Both sync and async work seamlessly
+ ```
+
+ ## Configuration Guide
+
+ ### Per-Application Configuration
+
+ Each Gen AI application using this SDK should have **its own Langfuse project and credentials**. This ensures proper isolation and organization.
+
+ #### Step 1: Create Langfuse Project
+ 1. Go to your Langfuse instance
+ 2. Create a new project for your application (e.g., "chatbot-api", "doc-analyzer")
+ 3. Copy the project's public key, secret key, and base URL
+
+ #### Step 2: Configure in Your Application
+
+ **Method 1: Environment Variables** (Recommended for production)
+
+ ```bash
+ # .env file in your application root
+ LANGFUSE_PUBLIC_KEY=pk-lf-abc123...
+ LANGFUSE_SECRET_KEY=sk-lf-xyz789...
+ LANGFUSE_BASE_URL=https://langfuse.company.com
+ LANGFUSE_VERIFY_SSL=false
+ ```
+
+ ```python
+ from llmops_observability import TraceManager
+ from dotenv import load_dotenv
+
+ load_dotenv()  # Loads .env from current directory
+ # SDK auto-configures from environment variables
+ ```
+
+ **Method 2: Explicit Configuration** (Recommended for testing)
+
+ ```python
+ from llmops_observability import configure
+ import os
+
+ # At application startup (e.g., main.py)
+ configure(
+     public_key=os.getenv("LANGFUSE_PUBLIC_KEY"),
+     secret_key=os.getenv("LANGFUSE_SECRET_KEY"),
+     base_url=os.getenv("LANGFUSE_BASE_URL"),
+     verify_ssl=False
+ )
+ ```
+
+ ### Environment Variables Reference
+
+ | Variable | Required | Default | Description |
+ |----------|----------|---------|-------------|
+ | `LANGFUSE_PUBLIC_KEY` | Yes | None | Langfuse public key from your project |
+ | `LANGFUSE_SECRET_KEY` | Yes | None | Langfuse secret key from your project |
+ | `LANGFUSE_BASE_URL` | Yes | None | Langfuse instance URL |
+ | `LANGFUSE_VERIFY_SSL` | No | `false` | Whether to verify SSL certificates |
+ | `PROJECT_ID` | No | `unknown_project` | Project identifier (used as trace name in Langfuse) |
+ | `ENV` | No | `development` | Environment name (production, staging, development, etc.) - automatically mapped to `LANGFUSE_TRACING_ENVIRONMENT` |
+
+ **Environment Tracking:**
+ - The `ENV` variable is automatically mapped to Langfuse's `LANGFUSE_TRACING_ENVIRONMENT`
+ - This applies the environment as a top-level attribute to all traces and observations
+ - Allows easy filtering by environment in Langfuse UI
+ - Must follow the regex `^(?!langfuse)[a-z0-9-_]+$` and be at most 40 characters (a quick check is sketched below)
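+
+ A standalone sketch of that check using Python's `re` module (illustrative only, not part of the SDK):
+
+ ```python
+ import re
+
+ ENV_NAME_PATTERN = re.compile(r"^(?!langfuse)[a-z0-9-_]+$")
+
+ def is_valid_environment(name: str) -> bool:
+     # Must match the pattern and respect the 40-character limit
+     return len(name) <= 40 and bool(ENV_NAME_PATTERN.match(name))
+
+ assert is_valid_environment("production")
+ assert not is_valid_environment("langfuse-internal")  # "langfuse" prefix is reserved
+ assert not is_valid_environment("Staging")            # uppercase is not allowed
+ ```
+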
+ ## API Reference
+
+ ### TraceManager
+
+ #### `start_trace(name, project_id=None, environment=None, metadata=None, user_id=None, session_id=None, tags=None)`
+ Start a new trace with project and environment tracking.
+
+ ```python
+ TraceManager.start_trace(
+     name="chat_message",       # Operation name (required)
+     project_id="my_project",   # Optional: defaults to PROJECT_ID env var
+     environment="production",  # Optional: defaults to ENV env var
+     metadata={"custom": "data"},
+     user_id="user_123",
+     session_id="session_456",
+     tags=["experiment"]
+ )
+ ```
+
+ **Parameters:**
+ - `name` (required): Operation/trace name (e.g., "chat_message", "document_analysis")
+ - `project_id` (optional): Project identifier. Defaults to `PROJECT_ID` from `.env`. Used as trace name in Langfuse.
+ - `environment` (optional): Environment name (e.g., "production", "staging"). Defaults to `ENV` from `.env`. Automatically mapped to `LANGFUSE_TRACING_ENVIRONMENT`.
+ - `metadata` (optional): Custom metadata dictionary
+ - `user_id` (optional): User identifier
+ - `session_id` (optional): Session identifier
+ - `tags` (optional): List of tags
+
+ **Returns:** Trace ID (string)
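+
+ For example, you can keep the returned ID to correlate traces with your own logs (a minimal illustration; the `trace_id` variable name is ours):
+
+ ```python
+ trace_id = TraceManager.start_trace(name="chat_message")
+ print(f"Started trace {trace_id}")  # correlate with application logs
+ ```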
+
+ **Example with .env auto-loading:**
+ ```bash
+ # .env file
+ PROJECT_ID=chatbot-api
+ ENV=production
+ ```
+
+ ```python
+ # Automatically uses PROJECT_ID and ENV from .env
+ TraceManager.start_trace(
+     name="user_query",
+     metadata={"version": "2.0"}
+ )
+ # Trace name in Langfuse: "chatbot-api"
+ # Environment in Langfuse: "production"
+ ```
+
+ #### `end_trace()`
+ End the current trace and flush to Langfuse.
+
+ ```python
+ TraceManager.end_trace()
+ ```
+
+ #### `finalize_and_send(user_id, session_id, trace_name, trace_input, trace_output)`
+ Finalize and send the trace with input/output metadata.
+
+ This is a convenience method that combines setting trace metadata and ending the trace in one call.
+
+ ```python
+ TraceManager.start_trace(name="chat_message")
+
+ # ... your code executes ...
+
+ # Finalize with input/output details
+ TraceManager.finalize_and_send(
+     user_id="user_123",
+     session_id="session_456",
+     trace_name="bedrock_chat_message",
+     trace_input={"user_msg": "What is Python?"},
+     trace_output={"bot_response": "Python is a programming language..."}
+ )
+ ```
+
+ **Parameters:**
+ - `user_id`: User identifier
+ - `session_id`: Session identifier
+ - `trace_name`: Name for the trace (can override the initial name)
+ - `trace_input`: Dictionary containing the input data
+ - `trace_output`: Dictionary containing the output/response data
+
+ ### Decorators
+
+ #### `@track_function(name=None, tags=None, capture_locals=False, capture_self=True)`
+ Track regular function execution with optional local variable capture.
+
+ ```python
+ @track_function()
+ def my_function(x, y):
+     return x + y
+
+ @track_function(name="custom_name", tags={"version": "1.0"})
+ def another_function():
+     pass
+
+ # Capture all local variables for debugging
+ @track_function(capture_locals=True)
+ def debug_function(data):
+     step1 = process(data)
+     step2 = transform(step1)
+     return step2  # All locals captured in Langfuse
+
+ # Capture specific variables only
+ @track_function(capture_locals=["result", "important_var"])
+ def selective_function(input):
+     temp = input * 2                 # Not captured
+     result = temp + 10               # Captured
+     important_var = compute(result)  # Captured
+     return important_var
+ ```
+
+ **Parameters:**
+ - `name`: Custom span name (default: function name)
+ - `tags`: Dictionary of tags/metadata
+ - `capture_locals`: Capture local variables - `True` (all), `False` (none), or list of variable names
+ - `capture_self`: Whether to capture `self` in methods (default: `True`)
+
+ #### `@track_llm_call(name=None, tags=None)`
+ Track LLM generation calls.
+
+ ```python
+ @track_llm_call(name="answer_generation")
+ def call_llm(prompt):
+     return llm_client.generate(prompt)
+ ```
+
+ ## Advanced Features
+
+ ### Nested Spans & Parent-Child Relationships
+
+ The SDK automatically handles nested function calls, creating parent-child relationships in Langfuse:
+
+ ```python
+ @track_function(name="orchestrator")
+ def main_workflow(user_query):
+     # This is the parent span
+     context = retrieve_documents(user_query)         # Child span 1
+     answer = generate_response(user_query, context)  # Child span 2
+     return answer
+
+ @track_function(name="retrieval")
+ def retrieve_documents(query):
+     # This becomes a child of main_workflow
+     return db.search(query)
+
+ @track_function(name="generation")
+ def generate_response(query, context):
+     # This also becomes a child of main_workflow
+     return llm.generate(query, context)
+ ```
+
+ ### Data Size Management
+
+ The SDK automatically limits output size to **200KB** to prevent issues with large data (see the sketch below):
+
+ - Outputs larger than 200KB are truncated with metadata
+ - Preview of first ~1KB is included
+ - Prevents memory/network issues with large responses
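+
+ A rough sketch of the idea (illustrative only, not the SDK's actual code; the limits follow the figures above):
+
+ ```python
+ import json
+
+ MAX_OUTPUT_BYTES = 200 * 1024  # 200KB cap described above
+ PREVIEW_BYTES = 1024           # ~1KB preview
+
+ def limit_output(value):
+     """Illustrative truncation: oversized outputs become a preview plus metadata."""
+     serialized = json.dumps(value, default=str)
+     size = len(serialized.encode("utf-8"))
+     if size <= MAX_OUTPUT_BYTES:
+         return value
+     return {
+         "truncated": True,
+         "original_size_bytes": size,
+         "preview": serialized[:PREVIEW_BYTES],
+     }
+ ```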
+
+ ### ASGI Middleware for FastAPI
+
+ Automatically trace all HTTP requests:
+
+ ```python
+ from fastapi import FastAPI
+ from llmops_observability import LLMOpsASGIMiddleware, track_function
+
+ app = FastAPI()
+ app.add_middleware(LLMOpsASGIMiddleware, service_name="chatbot_api")
+
+ @app.post("/chat")
+ async def chat_endpoint(message: str):
+     # Entire request is automatically traced
+     response = process_message(message)
+     return {"response": response}
+
+ @track_function()
+ def process_message(msg):
+     # This becomes a child span of the HTTP request trace
+     return "Response"
+ ```
+
+ The middleware captures:
+ - Request method, path, headers
+ - Response status code
+ - Request duration
+ - User agent, client IP
+ - Automatic trace naming: `{project}_{hostname}` (see the conceptual sketch below)
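+
+ Conceptually, the middleware is equivalent to wrapping each request in a trace yourself. A simplified sketch of that behavior (not the middleware's actual implementation) using the SDK's documented `TraceManager` API:
+
+ ```python
+ from llmops_observability import TraceManager
+
+ async def traced_request(handler, request):
+     # Roughly what the middleware does around every request
+     TraceManager.start_trace(
+         name="http_request",
+         metadata={"method": request.method, "path": request.url.path},
+     )
+     try:
+         return await handler(request)
+     finally:
+         TraceManager.end_trace()  # flush even if the handler raises
+ ```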
+
+ ## Project Structure
+
+ ```
+ llmops-observability_sdk/
+ ├── src/
+ │   └── llmops_observability/
+ │       ├── __init__.py          # Public API
+ │       ├── config.py            # Langfuse client configuration
+ │       ├── trace_manager.py     # TraceManager & @track_function
+ │       ├── llm.py               # @track_llm_call decorator
+ │       ├── models.py            # SpanContext model
+ │       ├── asgi_middleware.py   # FastAPI middleware
+ │       └── pricing.py           # Token pricing calculations
+ ├── pyproject.toml
+ └── README.md
+ ```
+
+ ## Best Practices
+
+ ### 1. Configuration Management
+ - ✅ **Each application gets its own `.env` file** with unique Langfuse credentials
+ - ✅ Use `.gitignore` to exclude `.env` files from version control
+ - ✅ Call `configure()` at application startup before any tracing
+ - ❌ Never hardcode credentials in the SDK or application code
+
+ ### 2. Trace Organization
+ ```python
+ # Good: Descriptive trace names with context
+ TraceManager.start_trace(
+     name="document_analysis_pipeline",
+     user_id=user_id,
+     session_id=session_id,
+     metadata={"doc_type": "pdf", "version": "2.0"},
+     tags=["production", "critical"]
+ )
+
+ # Bad: Generic names without context
+ TraceManager.start_trace(name="process")
+ ```
+
+ ### 3. Local Variables Capture
+ ```python
+ # Use for debugging only - has performance impact
+ @track_function(capture_locals=True)  # Development
+ def debug_complex_logic(data):
+     # All locals captured
+     pass
+
+ # Production: Disable or be selective
+ @track_function(capture_locals=False)             # Production
+ @track_function(capture_locals=["final_result"])  # Selective
+ ```
+
+ ### 4. Always End Traces
+ ```python
+ def run_workflow():
+     try:
+         TraceManager.start_trace(name="workflow")
+         result = process()
+         return result
+     finally:
+         TraceManager.end_trace()  # Always flush
+ ```
+
+ ### 5. Trace Naming Convention
+ - **Trace Name (in Langfuse)**: Uses `PROJECT_ID` for easy project identification
+ - **Operation Name**: The `name` parameter describes what operation is being traced
+ - **Environment**: Tracked automatically from `ENV` variable
+
+ ```python
+ # Example:
+ # .env: PROJECT_ID=payment-service, ENV=production
+
+ TraceManager.start_trace(name="process_payment")
+ # In Langfuse UI:
+ # - Trace Name: "payment-service"
+ # - Environment: "production"
+ # - Operation: "process_payment" (in metadata)
+ ```
+
+ ## When to Use This SDK
+
+ ✅ **Use llmops-observability when:**
+ - Developing and testing LLM applications locally
+ - You want instant trace visibility in Langfuse (no delays)
+ - Simple, straightforward tracing without infrastructure setup
+ - Running internal tools, demos, or proof-of-concepts
+ - Need quick debugging with local variable capture
+ - Small to medium-scale deployments
+
+ ## Troubleshooting
+
+ ### Configuration Errors
+
+ **Error: "Langfuse not configured"**
+ ```python
+ # Solution: Ensure env vars are set or call configure()
+ from dotenv import load_dotenv
+ load_dotenv()  # Load .env file
+
+ # Or configure explicitly
+ from llmops_observability import configure
+ configure(public_key="...", secret_key="...", base_url="...")
+ ```
+
+ ### Trace Not Appearing in Langfuse
+
+ 1. Check that `TraceManager.end_trace()` is called
+ 2. Verify credentials are correct
+ 3. Check Langfuse URL is accessible
+ 4. Look for error messages in console output
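+
+ If credentials are in doubt, you can verify them directly with the underlying Langfuse client (assuming the Langfuse v2 Python SDK, which exposes `auth_check()`):
+
+ ```python
+ import os
+ from langfuse import Langfuse
+
+ client = Langfuse(
+     public_key=os.getenv("LANGFUSE_PUBLIC_KEY"),
+     secret_key=os.getenv("LANGFUSE_SECRET_KEY"),
+     host=os.getenv("LANGFUSE_BASE_URL"),
+ )
+ print(client.auth_check())  # True if the keys and URL are valid
+ ```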
+
+ ### SSL Certificate Issues
+
+ ```python
+ # Disable SSL verification if using self-signed certs
+ configure(
+     public_key="...",
+     secret_key="...",
+     base_url="...",
+     verify_ssl=False  # ← Disable SSL verification
+ )
+ ```
+
+ ## Version History
+
+ **v10.0.4** (Current)
+ - Direct Langfuse integration (no SQS/batching)
+ - Nested span support with automatic parent-child relationships
+ - Local variable capture for debugging
+ - ASGI middleware for FastAPI
+ - Smart data serialization with size limits
+ - Sync and async function support
+ - Dynamic configuration per application
+
+ ## License
+
+ Proprietary - Verisk Analytics
+
+ ## Contributing
+
+ Internal SDK - For questions or contributions, contact the LLMOps team.
+
+ ## Example: Complete Workflow
+
+ ```python
+ from llmops_observability import TraceManager, track_function, track_llm_call
+ import boto3
+
+ # Initialize Bedrock client
+ bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
+
+ @track_function()
+ def retrieve_context(query):
+     # Simulate RAG retrieval
+     return {"documents": ["Context doc 1", "Context doc 2"]}
+
+ @track_llm_call()
+ def generate_answer(prompt, context):
+     # converse expects content as a list of content blocks
+     response = bedrock.converse(
+         modelId="anthropic.claude-3-sonnet-20240229-v1:0",
+         messages=[{
+             "role": "user",
+             "content": [{"text": f"Context: {context}\n\nQuestion: {prompt}"}]
+         }]
+     )
+     return response
+
+ # Start trace
+ TraceManager.start_trace(
+     name="rag_pipeline",
+     user_id="user_123",
+     metadata={"pipeline": "v1"}
+ )
+
+ # Execute workflow
+ context = retrieve_context("What is Python?")
+ answer = generate_answer("What is Python?", context)
+
+ # End trace
+ TraceManager.end_trace()
+ ```
+
+ ## Thanks to
+ Verisk LLMOps Team ❤️