@latitude-data/telemetry 3.0.0-alpha.0 → 3.0.0-alpha.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -10,167 +10,467 @@ npm install @latitude-data/telemetry
10
10
 
11
11
  ## Quick Start
12
12
 
13
+ ### Bootstrap (Recommended)
14
+
15
+ The fastest way to start tracing your LLM calls. One function sets up everything:
16
+
13
17
  ```typescript
14
- import OpenAI from "openai"
15
- import { LatitudeTelemetry, Instrumentation } from "@latitude-data/telemetry"
18
+ import { initLatitude } from "@latitude-data/telemetry";
16
19
 
17
- const telemetry = new LatitudeTelemetry("your-api-key", "your-project-slug", {
18
- instrumentations: {
19
- [Instrumentation.OpenAI]: OpenAI,
20
- },
21
- })
20
+ const latitude = initLatitude({
21
+ apiKey: process.env.LATITUDE_API_KEY!,
22
+ projectSlug: process.env.LATITUDE_PROJECT_SLUG!,
23
+ instrumentations: ["openai"], // Providers to auto-instrument; add "anthropic", etc.
24
+ });
22
25
 
23
- // All OpenAI calls are now automatically traced
24
- const client = new OpenAI()
25
- const response = await client.chat.completions.create({
26
- model: "gpt-4o-mini",
27
- messages: [{ role: "user", content: "Hello!" }],
28
- })
26
+ // Optionally wait for instrumentations to be ready
27
+ await latitude.ready;
29
28
 
30
- // Ensure traces are sent before process exits
31
- await telemetry.flush()
29
+ // Your LLM calls will now be traced and sent to Latitude
30
+ const openai = new OpenAI(); // assumes `import OpenAI from "openai"` above
+ const response = await openai.chat.completions.create({
31
+ model: "gpt-4",
32
+ messages: [{ role: "user", content: "Hello" }],
33
+ });
34
+
35
+ await latitude.shutdown();
32
36
  ```
33
37
 
34
- ## Constructor
38
+ **What this does:**
39
+
40
+ - Creates a complete OpenTelemetry setup
41
+ - Registers LLM auto-instrumentation (OpenAI, Anthropic, etc.) **asynchronously in the background**
42
+ - Configures the Latitude span processor and exporter
43
+ - Sets up async context propagation (for passing context through async operations)
44
+
45
+ **Key point about async instrumentations:**
46
+
47
+ `initLatitude` returns **immediately** — no top-level await needed. Instrumentation registration happens in the background. This avoids top-level await issues in CommonJS environments while still supporting ESM.
48
+
49
+ - **Fire-and-forget**: Start using your LLM clients right away. Early spans will be captured once instrumentations finish registering.
50
+ - **Optional `await latitude.ready`**: If you want to ensure instrumentations are fully registered before making LLM calls, await the `ready` promise.
51
+
52
+ **When to use this:** Most applications should start here. It's the simplest path to get LLM observability into Latitude.
53
+
54
+ **When you might need the advanced setup:**
55
+
56
+ - You already have OpenTelemetry configured for other backends (Datadog, Sentry, Jaeger)
57
+ - You need custom span processing, sampling, or filtering
58
+ - You want multiple observability backends receiving the same spans
59
+
60
+ ### Existing OpenTelemetry Setup (Advanced)
61
+
62
+ If your app already uses OpenTelemetry, add Latitude alongside your existing setup:
35
63
 
36
64
  ```typescript
37
- new LatitudeTelemetry(apiKey: string, projectSlug: string, options?: TelemetryOptions)
65
+ import { NodeSDK } from "@opentelemetry/sdk-node";
66
+ import {
67
+ LatitudeSpanProcessor,
68
+ registerLatitudeInstrumentations,
69
+ } from "@latitude-data/telemetry";
70
+
71
+ const sdk = new NodeSDK({
72
+ spanProcessors: [
73
+ existingProcessor, // Your existing processor
74
+ new LatitudeSpanProcessor(
75
+ process.env.LATITUDE_API_KEY!,
76
+ process.env.LATITUDE_PROJECT_SLUG!,
77
+ ),
78
+ ],
79
+ });
80
+
81
+ sdk.start();
82
+
83
+ // Enable LLM auto-instrumentation
84
+ await registerLatitudeInstrumentations({
85
+ instrumentations: ["openai"],
86
+ tracerProvider: sdk.getTracerProvider(),
87
+ });
88
+
89
+ // Your LLM calls will now be traced and sent to Latitude
90
+ const openai = new OpenAI(); // assumes `import OpenAI from "openai"` above
+ const response = await openai.chat.completions.create({
91
+ model: "gpt-4",
92
+ messages: [{ role: "user", content: "Hello" }],
93
+ });
38
94
  ```
39
95
 
40
- | Parameter | Type | Description |
41
- |-----------|------|-------------|
42
- | `apiKey` | `string` | Your Latitude API key |
43
- | `projectSlug` | `string` | Your Latitude project slug |
44
- | `options` | `TelemetryOptions` | Optional configuration |
96
+ ### Why OpenTelemetry?
97
+
98
+ Latitude Telemetry is built entirely on OpenTelemetry standards. When you're ready to add other observability tools (Datadog, Sentry, Jaeger, etc.), you can use them alongside Latitude without conflicts:
99
+
100
+ - **Standard span processors** — `LatitudeSpanProcessor` works with any `NodeSDK` or `NodeTracerProvider`
101
+ - **Smart filtering** — Only LLM-relevant spans are exported to Latitude (spans with `gen_ai.*`, `llm.*`, `openinference.*`, or `ai.*` attributes, plus known LLM instrumentation scopes)
102
+ - **Compatible with existing instrumentations** — Works alongside HTTP, DB, and other OTel instrumentations
103
+ - **No vendor lock-in** — Standard OTLP export, no proprietary wire format
104
+
105
+ ## Using `capture()` for Context and Boundaries
106
+
107
+ The SDK automatically traces LLM calls when you use auto-instrumentation. However, you may want to add additional context (user ID, session ID, tags, or metadata) to group related spans together.
108
+
109
+ ### What `capture()` Does
45
110
 
46
- ### Options
111
+ `capture()` wraps your code to attach Latitude context to all LLM spans created inside the callback:
47
112
 
48
- | Option | Type | Default | Description |
49
- |--------|------|---------|-------------|
50
- | `serviceName` | `string` | `process.env.npm_package_name` | Service name reported in traces |
51
- | `disableBatch` | `boolean` | `false` | Send spans immediately instead of batching |
52
- | `exporter` | `SpanExporter` | OTLP to Latitude | Custom span exporter |
53
- | `processors` | `SpanProcessor[]` | Redact processor | Custom span processors |
54
- | `propagators` | `TextMapPropagator[]` | W3C Trace + Baggage | Custom context propagators |
55
- | `instrumentations` | `object` | `{}` | Provider modules to auto-instrument |
113
+ - Adds attributes like `user.id`, `session.id`, `latitude.tags`, and `latitude.metadata` to every span
114
+ - Creates a named boundary for grouping related traces
115
+ - Uses OpenTelemetry's native `context.with()` for reliable async propagation
56
116
 
57
- ## Auto-Instrumentation
117
+ ### When to Use It
58
118
 
59
- Pass the provider module to automatically instrument all calls:
119
+ You don't need `capture()` to get started—auto-instrumentation handles LLM calls automatically. Use `capture()` when you want to:
120
+
121
+ - **Group traces by user or session** — Track all LLM calls from a specific user or session
122
+ - **Add business context** — Tag traces with deployment environment, feature flags, or request IDs
123
+ - **Mark agent boundaries** — Wrap an entire agent run or conversation turn with a name and metadata
124
+ - **Filter and analyze** — Use tags and metadata to filter traces in Latitude
125
+
126
+ ### Example
60
127
 
61
128
  ```typescript
62
- import OpenAI from "openai"
63
- import Anthropic from "@anthropic-ai/sdk"
64
- import { LatitudeTelemetry, Instrumentation } from "@latitude-data/telemetry"
65
-
66
- const telemetry = new LatitudeTelemetry("your-api-key", "your-project-slug", {
67
- instrumentations: {
68
- [Instrumentation.OpenAI]: OpenAI,
69
- [Instrumentation.Anthropic]: Anthropic,
129
+ import { initLatitude, capture } from "@latitude-data/telemetry";
130
+
131
+ const latitude = initLatitude({
132
+ apiKey: process.env.LATITUDE_API_KEY!,
133
+ projectSlug: process.env.LATITUDE_PROJECT_SLUG!,
134
+ instrumentations: ["openai"],
135
+ });
136
+
137
+ // Optional: wait for instrumentations before starting
138
+ await latitude.ready;
139
+
140
+ // Wrap a request or agent run to add context
141
+ await capture(
142
+ "handle-user-request",
143
+ async () => {
144
+ const result = await agent.process(userMessage);
145
+ return result;
70
146
  },
71
- })
147
+ {
148
+ userId: "user_123",
149
+ sessionId: "session_abc",
150
+ tags: ["production", "v2-agent"],
151
+ metadata: { requestId: "req-xyz", featureFlag: "new-prompt" },
152
+ },
153
+ );
154
+
155
+ await latitude.shutdown();
72
156
  ```
73
157
 
74
- ### Supported Providers
158
+ **Important:** `capture()` does not create spans—it only attaches context to spans created by auto-instrumentation. Use one `capture()` call at the request or agent boundary. You can nest calls for granular context; child spans inherit from parent context with local overrides.
75
159
 
76
- | Instrumentation | Package |
77
- |-----------------|---------|
78
- | `Instrumentation.OpenAI` | `openai` |
79
- | `Instrumentation.Anthropic` | `@anthropic-ai/sdk` |
80
- | `Instrumentation.Bedrock` | `@aws-sdk/client-bedrock-runtime` |
81
- | `Instrumentation.Cohere` | `cohere-ai` |
82
- | `Instrumentation.Langchain` | `langchain` |
83
- | `Instrumentation.LlamaIndex` | `llamaindex` |
84
- | `Instrumentation.TogetherAI` | `together-ai` |
85
- | `Instrumentation.VertexAI` | `@google-cloud/vertexai` |
86
- | `Instrumentation.AIPlatform` | `@google-cloud/aiplatform` |
160
+ ## Key Concepts
87
161
 
88
- ## Capture
162
+ - **`initLatitude()`** — The primary way to use Latitude. Bootstraps a complete OpenTelemetry setup with LLM auto-instrumentation and the Latitude exporter. Best for most applications.
163
+ - **`LatitudeSpanProcessor`** — For advanced use cases where you already have an OpenTelemetry setup. Exports spans to Latitude alongside your existing observability stack.
164
+ - **`registerLatitudeInstrumentations()`** — Registers LLM auto-instrumentations (OpenAI, Anthropic, etc.) when using the advanced setup with your own provider.
165
+ - **`capture()`** — Optional. Wraps your code to attach Latitude context (tags, userId, sessionId, metadata) to all spans created inside the callback. Use this when you want to group traces by user, session, or add business context.
89
166
 
90
- Use `capture()` to set trace-wide context attributes. All spans created within the callback inherit these as baggage:
167
+ **Important:** Auto-instrumentation traces LLM calls without `capture()`. Use `capture()` only when you need to add context or mark boundaries. Wrap the request, job, or agent entrypoint once—you don't need to wrap every internal step.
168
+
169
+ ## Public API
91
170
 
92
171
  ```typescript
93
- await telemetry.capture(
94
- {
95
- tags: ["production", "chat"],
96
- metadata: { environment: "prod", version: "1.2.0" },
97
- sessionId: "session-abc-123",
98
- userId: "user-456",
99
- },
100
- async (ctx) => {
101
- // All spans created here will carry the tags, metadata, sessionId, userId
102
- const response = await client.chat.completions.create({
103
- model: "gpt-4o-mini",
104
- messages: [{ role: "user", content: "Hello!" }],
105
- })
106
- return response
107
- },
108
- )
172
+ import {
173
+ initLatitude,
174
+ LatitudeSpanProcessor,
175
+ capture,
176
+ registerLatitudeInstrumentations,
177
+ } from "@latitude-data/telemetry";
109
178
  ```
110
179
 
111
- ### Capture Options
180
+ ### `initLatitude(options)`
112
181
 
113
- | Option | Type | OTel Attribute | Description |
114
- |--------|------|----------------|-------------|
115
- | `tags` | `string[]` | `latitude.tags` | Tags for filtering traces |
116
- | `metadata` | `Record<string, unknown>` | `latitude.metadata` | Arbitrary key-value metadata |
117
- | `sessionId` | `string` | `session.id` | Group traces by session |
118
- | `userId` | `string` | `user.id` | Associate traces with a user |
182
+ The primary entry point. Bootstraps a complete OpenTelemetry setup with LLM instrumentations and Latitude export.
119
183
 
120
- ## Custom Spans
184
+ ```typescript
185
+ type InitLatitudeOptions = {
186
+ apiKey: string;
187
+ projectSlug: string;
188
+ instrumentations?: InstrumentationType[]; // ["openai", "anthropic", etc.]
189
+ serviceName?: string;
190
+ disableBatch?: boolean;
191
+ disableSmartFilter?: boolean;
192
+ shouldExportSpan?: (span: ReadableSpan) => boolean;
193
+ blockedInstrumentationScopes?: string[];
194
+ disableRedact?: boolean;
195
+ redact?: RedactSpanProcessorOptions;
196
+ exporter?: SpanExporter;
197
+ };
198
+
199
+ function initLatitude(options: InitLatitudeOptions): {
200
+ provider: NodeTracerProvider; // Access to underlying provider for advanced use
201
+ ready: Promise<void>; // Resolves when instrumentations are registered
202
+ flush(): Promise<void>;
203
+ shutdown(): Promise<void>;
204
+ };
205
+ ```
121
206
 
122
- The SDK exposes the underlying OpenTelemetry `tracer` for creating custom spans:
207
+ ### `LatitudeSpanProcessor`
208
+
209
+ Span processor for shared-provider setups. Reads Latitude context from OTel context and stamps attributes onto spans.
123
210
 
124
211
  ```typescript
125
- // Simple span
126
- const span = telemetry.tracer.startSpan("my-operation")
127
- span.setAttribute("custom.key", "value")
128
- span.end()
129
-
130
- // Nested spans with context
131
- telemetry.context.with(
132
- telemetry.tracer.startSpan("parent").context,
133
- () => {
134
- const child = telemetry.tracer.startSpan("child")
135
- child.end()
136
- },
137
- )
212
+ class LatitudeSpanProcessor implements SpanProcessor {
213
+ constructor(
214
+ apiKey: string,
215
+ projectSlug: string,
216
+ options?: LatitudeSpanProcessorOptions,
217
+ );
218
+ }
219
+
220
+ type LatitudeSpanProcessorOptions = {
221
+ disableBatch?: boolean;
222
+ disableSmartFilter?: boolean;
223
+ shouldExportSpan?: (span: ReadableSpan) => boolean;
224
+ blockedInstrumentationScopes?: string[];
225
+ disableRedact?: boolean;
226
+ redact?: RedactSpanProcessorOptions;
227
+ exporter?: SpanExporter;
228
+ };
229
+ ```
230
+
231
+ ### `capture(name, fn, options?)`
232
+
233
+ Wraps a function to attach Latitude context to all spans created inside. Uses OpenTelemetry's native `context.with()` for scoping.
234
+
235
+ ```typescript
236
+ type ContextOptions = {
237
+ name?: string;
238
+ userId?: string;
239
+ sessionId?: string;
240
+ tags?: string[];
241
+ metadata?: Record<string, unknown>;
242
+ };
243
+
244
+ function capture<T>(
245
+ name: string,
246
+ fn: () => T | Promise<T>,
247
+ options?: ContextOptions,
248
+ ): T | Promise<T>;
249
+ ```
250
+
251
+ **Nested `capture()` behavior:**
252
+
253
+ - `userId`: last-write-wins
254
+ - `sessionId`: last-write-wins
255
+ - `metadata`: shallow merge
256
+ - `tags`: append and dedupe while preserving order
257
+
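The merge rules above can be sketched as a plain function. This is an illustrative model only — the real SDK keeps this state on the OpenTelemetry context rather than in an object, and the `mergeCaptureContext` helper below is hypothetical:

```typescript
// Hypothetical model of nested capture() context merging, per the documented rules.
type CaptureContext = {
  userId?: string;
  sessionId?: string;
  tags: string[];
  metadata: Record<string, unknown>;
};

function mergeCaptureContext(
  parent: CaptureContext,
  child: Partial<CaptureContext>,
): CaptureContext {
  return {
    userId: child.userId ?? parent.userId, // last-write-wins
    sessionId: child.sessionId ?? parent.sessionId, // last-write-wins
    metadata: { ...parent.metadata, ...(child.metadata ?? {}) }, // shallow merge
    tags: [...new Set([...parent.tags, ...(child.tags ?? [])])], // append + dedupe, order preserved
  };
}
```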
258
+ ### `registerLatitudeInstrumentations(options)`
259
+
260
+ Registers patch-based AI SDK instrumentations against a specific tracer provider.
261
+
262
+ ```typescript
263
+ type InstrumentationType =
264
+ | "openai"
265
+ | "anthropic"
266
+ | "bedrock"
267
+ | "cohere"
268
+ | "langchain"
269
+ | "llamaindex"
270
+ | "togetherai"
271
+ | "vertexai"
272
+ | "aiplatform";
273
+
274
+ function registerLatitudeInstrumentations(options: {
275
+ instrumentations: InstrumentationType[];
276
+ tracerProvider: TracerProvider;
277
+ instrumentationModules?: Partial<Record<InstrumentationType, unknown>>;
278
+ }): Promise<void>;
279
+ ```
280
+
281
+ ## Vendor-Specific Integration Guides
282
+
283
+ For users with existing observability infrastructure.
284
+
285
+ ### With Datadog
286
+
287
+ Use Datadog's OTel `TracerProvider` with `LatitudeSpanProcessor`:
288
+
289
+ ```typescript
290
+ import tracer from "dd-trace";
291
+ import { LatitudeSpanProcessor } from "@latitude-data/telemetry";
292
+
293
+ const ddTracer = tracer.init({ service: "my-app", env: "production" });
294
+ const provider = new ddTracer.TracerProvider();
295
+
296
+ provider.addSpanProcessor(
297
+ new LatitudeSpanProcessor(
298
+ process.env.LATITUDE_API_KEY!,
299
+ process.env.LATITUDE_PROJECT_SLUG!,
300
+ ),
301
+ );
302
+
303
+ provider.register();
304
+
305
+ // LLM calls are now traced and sent to both Datadog and Latitude
138
306
  ```
139
307
 
140
- ## Context Management
308
+ **Adding context:** Use `capture()` if you want to add user IDs, session IDs, or tags to your traces (see [Using `capture()` for Context and Boundaries](#using-capture-for-context-and-boundaries)).
309
+
310
+ ### With Sentry
311
+
312
+ Use Sentry's custom OpenTelemetry setup with `skipOpenTelemetrySetup: true`:
141
313
 
142
314
  ```typescript
143
- // Get the current active context
144
- const ctx = telemetry.context.active()
145
-
146
- // Run code within a specific context
147
- telemetry.context.with(ctx, () => {
148
- // spans created here are children of ctx
149
- })
150
-
151
- // Resume a trace from W3C traceparent
152
- const ctx = telemetry.context.resume({
153
- traceparent: "00-trace-id-span-id-01",
154
- baggage: "key=value",
155
- })
315
+ import * as Sentry from "@sentry/node";
316
+ import {
317
+ SentrySpanProcessor,
318
+ SentrySampler,
319
+ SentryPropagator,
320
+ } from "@sentry/opentelemetry";
321
+ import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
322
+ import {
323
+ LatitudeSpanProcessor,
324
+ registerLatitudeInstrumentations,
325
+ } from "@latitude-data/telemetry";
326
+
327
+ // Initialize Sentry with custom OTel setup flag
328
+ const sentryClient = Sentry.init({
329
+ dsn: process.env.SENTRY_DSN,
330
+ skipOpenTelemetrySetup: true,
331
+ tracesSampleRate: 1.0,
332
+ });
333
+
334
+ // Create a shared provider with both Sentry and Latitude processors
335
+ const provider = new NodeTracerProvider({
336
+ sampler: sentryClient ? new SentrySampler(sentryClient) : undefined,
337
+ spanProcessors: [
338
+ new SentrySpanProcessor(), // Send spans to Sentry
339
+ new LatitudeSpanProcessor(
340
+ process.env.LATITUDE_API_KEY!,
341
+ process.env.LATITUDE_PROJECT_SLUG!,
342
+ ), // Send spans to Latitude
343
+ ],
344
+ });
345
+
346
+ // Register with Sentry's propagator and context manager
347
+ provider.register({
348
+ propagator: new SentryPropagator(),
349
+ contextManager: new Sentry.SentryContextManager(),
350
+ });
351
+
352
+ // Add LLM instrumentations
353
+ await registerLatitudeInstrumentations({
354
+ instrumentations: ["openai"],
355
+ tracerProvider: provider,
356
+ });
357
+
358
+ // Validate the setup
359
+ Sentry.validateOpenTelemetrySetup();
360
+
361
+ // LLM calls are now traced and sent to both Sentry and Latitude
156
362
  ```
157
363
 
158
- ## Lifecycle
364
+ **Adding context:** Use `capture()` if you want to add user IDs, session IDs, or tags to your traces (see [Using `capture()` for Context and Boundaries](#using-capture-for-context-and-boundaries)).
365
+
366
+ **Required Sentry packages:** `@sentry/node`, `@sentry/opentelemetry`
367
+
368
+ **Note:** See [Sentry's custom OTel setup documentation](https://docs.sentry.io/platforms/javascript/guides/node/opentelemetry/custom-setup/) for version-specific details.
369
+
370
+ ## Supported AI Providers
371
+
372
+ | Identifier | Package |
373
+ | -------------- | --------------------------------- |
374
+ | `"openai"` | `openai` |
375
+ | `"anthropic"` | `@anthropic-ai/sdk` |
376
+ | `"bedrock"` | `@aws-sdk/client-bedrock-runtime` |
377
+ | `"cohere"` | `cohere-ai` |
378
+ | `"langchain"` | `langchain` |
379
+ | `"llamaindex"` | `llamaindex` |
380
+ | `"togetherai"` | `together-ai` |
381
+ | `"vertexai"` | `@google-cloud/vertexai` |
382
+ | `"aiplatform"` | `@google-cloud/aiplatform` |
383
+
384
+ ## Context Options
385
+
386
+ `capture()` accepts these context options:
387
+
388
+ | Option | Type | OTel Attribute | Description |
389
+ | ----------- | ------------------------- | ----------------------- | ---------------------------- |
390
+ | `name` | `string` | `latitude.capture.name` | Name for the capture context |
391
+ | `tags` | `string[]` | `latitude.tags` | Tags for filtering traces |
392
+ | `metadata` | `Record<string, unknown>` | `latitude.metadata` | Arbitrary key-value metadata |
393
+ | `sessionId` | `string` | `session.id` | Group traces by session |
394
+ | `userId` | `string` | `user.id` | Associate traces with a user |
395
+
396
+ ## Configuration Options
397
+
398
+ ### Smart Filtering
399
+
400
+ By default, only LLM-relevant spans are exported:
159
401
 
160
402
  ```typescript
161
- // Force flush pending spans
162
- await telemetry.flush()
403
+ new LatitudeSpanProcessor(apiKey, projectSlug, {
404
+ disableSmartFilter: true, // Export all spans
405
+ });
406
+ ```
407
+
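Conceptually, the smart filter keeps a span when it carries LLM-related attributes (`gen_ai.*`, `llm.*`, `openinference.*`, or `ai.*`) or comes from a known LLM instrumentation scope. A rough model of that predicate — not the library's actual implementation, and the helper name is hypothetical:

```typescript
// Hypothetical sketch of the smart-filter decision, based on the documented rules.
const LLM_ATTR_PREFIXES = ["gen_ai.", "llm.", "openinference.", "ai."];

function isLlmRelevant(
  attributes: Record<string, unknown>,
  scopeName: string,
  llmScopes: string[],
): boolean {
  // Keep spans produced by a known LLM instrumentation scope.
  if (llmScopes.includes(scopeName)) return true;
  // Keep spans carrying any LLM-related attribute key.
  return Object.keys(attributes).some((key) =>
    LLM_ATTR_PREFIXES.some((prefix) => key.startsWith(prefix)),
  );
}
```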
408
+ ### Redaction
163
409
 
164
- // Shutdown (flushes then closes)
165
- await telemetry.shutdown()
410
+ PII redaction is enabled by default for security-sensitive attributes only:
411
+
412
+ **Redacted by default:**
413
+
414
+ - HTTP authorization headers
415
+ - HTTP cookies
416
+ - HTTP API key headers (`x-api-key`)
417
+ - Database statements
418
+
419
+ ```typescript
420
+ new LatitudeSpanProcessor(apiKey, projectSlug, {
421
+ disableRedact: true, // Disable all redaction
422
+ redact: {
423
+ attributes: [/^password$/i, /secret/i], // Add custom patterns
424
+ mask: (attr, value) => "[REDACTED]",
425
+ },
426
+ });
427
+ ```
428
+
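The `redact` options above pair attribute-name patterns with a mask function. A minimal sketch of how such masking could be applied to a span's attributes — an assumption for illustration, not the `RedactSpanProcessor` internals:

```typescript
// Hypothetical helper showing regex-pattern masking over span attributes.
function redactAttributes(
  attributes: Record<string, unknown>,
  patterns: RegExp[],
  mask: (attr: string, value: unknown) => unknown,
): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(attributes)) {
    // Mask the value when any pattern matches the attribute name.
    out[key] = patterns.some((p) => p.test(key)) ? mask(key, value) : value;
  }
  return out;
}
```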
429
+ ### Custom Filtering
430
+
431
+ ```typescript
432
+ new LatitudeSpanProcessor(apiKey, projectSlug, {
433
+ shouldExportSpan: (span) => span.attributes["custom"] === true,
434
+ blockedInstrumentationScopes: ["opentelemetry.instrumentation.fs"],
435
+ });
166
436
  ```
167
437
 
168
438
  ## Environment Variables
169
439
 
170
- | Variable | Default | Description |
171
- |----------|---------|-------------|
440
+ | Variable | Default | Description |
441
+ | ------------------------ | ------------------------------------------------------------------- | ---------------------- |
172
442
  | `LATITUDE_TELEMETRY_URL` | `http://localhost:3002` (dev) / `https://ingest.latitude.so` (prod) | OTLP exporter endpoint |
173
443
 
444
+ ## Troubleshooting
445
+
446
+ ### Spans not appearing in Latitude
447
+
448
+ 1. **Check API key and project slug** — Must be non-empty strings
449
+ 2. **Verify instrumentations are registered** — Use `await latitude.ready` (bootstrap) or `await registerLatitudeInstrumentations()` (advanced setup)
450
+ 3. **Flush before exit** — Call `await latitude.flush()` or `await provider.forceFlush()`
451
+ 4. **Check smart filter** — Only LLM spans are exported by default. Use `disableSmartFilter: true` to export all spans
452
+ 5. **Ensure `capture()` wraps the code that creates spans** — `capture()` itself doesn't create spans; it only attaches context to spans created by instrumentations
453
+
454
+ ### No spans created inside `capture()`
455
+
456
+ `capture()` only attaches context. You need:
457
+
458
+ 1. An active instrumentation (e.g., `@traceloop/instrumentation-openai`)
459
+ 2. That instrumentation to create spans for the operations inside your callback
460
+
461
+ ### Context not propagating
462
+
463
+ Ensure you have a functioning OpenTelemetry context manager registered:
464
+
465
+ ```typescript
466
+ import { AsyncLocalStorageContextManager } from "@opentelemetry/context-async-hooks";
467
+ import { context } from "@opentelemetry/api";
468
+
469
+ context.setGlobalContextManager(new AsyncLocalStorageContextManager());
470
+ ```
471
+
472
+ `initLatitude()` does this automatically. For shared-provider setups, your app's existing OTel setup should already have this.
473
+
174
474
  ## License
175
475
 
176
476
  MIT