@lelemondev/sdk 0.9.9 → 0.10.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -31,23 +31,189 @@ npm install @lelemondev/sdk
  ## Quick Start

  ```typescript
- // Import from provider-specific entry point for smaller bundle size
  import { init, observe } from '@lelemondev/sdk/openai';
  import OpenAI from 'openai';

- // 1. Initialize once
  init({ apiKey: process.env.LELEMON_API_KEY });

- // 2. Wrap your client
  const openai = observe(new OpenAI());

- // 3. Use normally - all calls traced automatically
  const response = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Hello!' }],
  });
  ```

+ That's the bare minimum — it works, but your traces will be flat and anonymous. The next section shows you how to get **structured, beautiful traces** in your dashboard from day one.
+
+ ## Recommended Usage
+
+ The difference between messy traces and clear, actionable ones comes down to three things: **naming your traces**, **identifying users and sessions**, and **grouping related calls together**.
+
+ ### 1. Wrap your workflow with `trace()`
+
+ Without `trace()`, each LLM call shows up as an isolated entry. With it, you get a hierarchical view that shows exactly what happened and why.
+
+ ```typescript
+ import { init, observe, trace, span, flush } from '@lelemondev/sdk/openai';
+ import OpenAI from 'openai';
+
+ init({ apiKey: process.env.LELEMON_API_KEY });
+ const openai = observe(new OpenAI());
+
+ // ✅ Named trace groups everything into a clear hierarchy
+ const result = await trace('answer-question', async () => {
+   // This LLM call appears as a child span — you can see it nested
+   const response = await openai.chat.completions.create({
+     model: 'gpt-4',
+     messages: [{ role: 'user', content: 'What is observability?' }],
+   });
+
+   return response.choices[0].message.content;
+ });
+ ```
+
+ **In the dashboard, this renders as:**
+ ```
+ ▼ answer-question (trace)   1.2s   $0.003
+   ▼ gpt-4 (llm)             1.2s   $0.003   320 tokens
+ ```
+
+ Without `trace()`, you'd just see a flat `gpt-4` entry with no context.
+
+ ### 2. Add `userId` and `sessionId`
+
+ These two fields unlock filtering, conversation grouping, and per-user analytics in the dashboard.
+
+ ```typescript
+ // ✅ Always pass userId and sessionId — this is what makes the dashboard useful
+ const openai = observe(new OpenAI(), {
+   userId: 'user-123',            // Who is making this request
+   sessionId: 'conversation-abc', // Groups multi-turn conversations together
+ });
+ ```
+
+ - **`userId`** → Filter traces by user, track costs per user, debug specific user issues
+ - **`sessionId`** → See an entire conversation across multiple requests as one session
+
+ ### 3. Add spans for non-LLM operations
+
+ LLM calls are traced automatically, but your pipeline probably does more: vector search, tool calls, reranking, guardrails. Use `span()` to capture these so the full picture shows up in the dashboard.
+
+ ```typescript
+ import { trace, span } from '@lelemondev/sdk/openai';
+
+ await trace('rag-pipeline', async () => {
+   // Capture vector search as a retrieval span
+   const t0 = Date.now();
+   const docs = await vectorDB.search(query, { topK: 5 });
+   span({
+     type: 'retrieval',     // Shows with a blue badge in dashboard
+     name: 'vector-search', // Descriptive name — not "search" or "step-1"
+     input: { query, topK: 5 },
+     output: { count: docs.length },
+     durationMs: Date.now() - t0,
+   });
+
+   // LLM call is captured automatically as a child
+   const response = await openai.chat.completions.create({
+     model: 'gpt-4',
+     messages: [
+       { role: 'system', content: 'Answer based on the provided context.' },
+       { role: 'user', content: `Context: ${docs.join('\n')}\n\nQuestion: ${query}` },
+     ],
+   });
+
+   return response.choices[0].message.content;
+ });
+ ```
+
+ **In the dashboard:**
+ ```
+ ▼ rag-pipeline (trace)          2.4s   $0.005
+   ▼ vector-search (retrieval)   0.3s
+   ▼ gpt-4 (llm)                 2.1s   $0.005   580 tokens
+ ```
+
+ **Available span types:** `retrieval`, `embedding`, `tool`, `guardrail`, `rerank`, `custom`
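If you find yourself timing operations by hand before every `span()` call, a small wrapper can bundle the timing and payload construction in one place. `timed` below is a hypothetical helper, not an SDK export; its payload fields simply mirror the `span()` options used in the examples above:

```typescript
// Hypothetical helper (not an SDK export): time an async operation and build
// the argument object for span() in one step.
type SpanType = 'retrieval' | 'embedding' | 'tool' | 'guardrail' | 'rerank' | 'custom';

interface SpanPayload {
  type: SpanType;
  name: string;
  input?: unknown;
  output?: unknown;
  durationMs: number;
}

async function timed<T>(
  type: SpanType,
  name: string,
  input: unknown,
  fn: () => Promise<T>,
): Promise<{ result: T; payload: SpanPayload }> {
  const t0 = Date.now();
  const result = await fn();
  // Field names mirror the documented span() options
  return { result, payload: { type, name, input, output: result, durationMs: Date.now() - t0 } };
}
```

Usage would then look like `const { result, payload } = await timed('tool', 'get-weather-forecast', { city }, () => lookupForecast(city)); span(payload);`, where `lookupForecast` stands in for your own tool function.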
+
+ ### 4. Use descriptive names
+
+ Names are the first thing you see in the dashboard. Good names make traces scannable; bad names make them useless.
+
+ ```typescript
+ // ❌ Bad — these all look the same in the dashboard
+ await trace('process', async () => { ... });
+ await trace('handle', async () => { ... });
+ span({ type: 'tool', name: 'function' });
+
+ // ✅ Good — you can immediately understand what happened
+ await trace('sales-agent', async () => { ... });
+ await trace('summarize-document', async () => { ... });
+ span({ type: 'tool', name: 'get-weather-forecast' });
+ ```
+
+ ### Full example: AI agent with everything wired up
+
+ Here's a complete example combining all the best practices — this is what we recommend as the starting point for any production app:
+
+ ```typescript
+ import { init, observe, trace, span, flush } from '@lelemondev/sdk/openai';
+ import OpenAI from 'openai';
+
+ // 1. Initialize once at app startup
+ init({ apiKey: process.env.LELEMON_API_KEY });
+
+ async function handleChat(userId: string, sessionId: string, message: string) {
+   // 2. Wrap client with user context
+   const openai = observe(new OpenAI(), { userId, sessionId });
+
+   // 3. Wrap the full workflow in a named trace
+   const result = await trace('customer-support-agent', async () => {
+     // 4. Capture non-LLM operations as typed spans
+     const t0 = Date.now();
+     const history = await db.getChatHistory(sessionId);
+     span({
+       type: 'retrieval',
+       name: 'load-chat-history',
+       input: { sessionId },
+       output: { messageCount: history.length },
+       durationMs: Date.now() - t0,
+     });
+
+     // 5. LLM call is automatically captured as a child span
+     const response = await openai.chat.completions.create({
+       model: 'gpt-4',
+       messages: [
+         { role: 'system', content: 'You are a helpful customer support agent.' },
+         ...history,
+         { role: 'user', content: message },
+       ],
+     });
+
+     return response.choices[0].message.content;
+   });
+
+   return result;
+ }
+
+ // In your server:
+ app.post('/chat', async (req, res) => {
+   const { message, conversationId } = req.body;
+   const answer = await handleChat(req.user.id, conversationId, message);
+   res.json({ answer });
+ });
+ ```
+
+ **In the dashboard, each request renders as:**
+ ```
+ ▼ customer-support-agent (trace)    1.8s   $0.004   user: user-123   session: conv-abc
+   ▼ load-chat-history (retrieval)   0.1s
+   ▼ gpt-4 (llm)                     1.7s   $0.004   420 tokens
+ ```
+
+ And because you set `sessionId`, all turns of the same conversation are grouped together — you can click into a session and see the entire conversation flow.
+
  ### Provider-Specific Imports

  Each provider has its own entry point for optimal bundle size:
@@ -62,9 +228,12 @@ import { init, observe, flush } from '@lelemondev/sdk/anthropic';
  // AWS Bedrock
  import { init, observe, flush } from '@lelemondev/sdk/bedrock';

- // Google Gemini
+ // Google Gemini (supports both @google/genai and @google/generative-ai)
  import { init, observe, flush } from '@lelemondev/sdk/gemini';

+ // Google GenAI (dedicated entry point for @google/genai)
+ import { init, observe, flush } from '@lelemondev/sdk/google-genai';
+
  // OpenRouter
  import { init, observe, flush } from '@lelemondev/sdk/openrouter';
  ```
@@ -77,7 +246,8 @@ import { init, observe, flush } from '@lelemondev/sdk/openrouter';
  | OpenRouter | ✅ | `chat.completions.create()` (access to 400+ models) |
  | Anthropic | ✅ | `messages.create()`, `messages.stream()` |
  | AWS Bedrock | ✅ | `ConverseCommand`, `ConverseStreamCommand`, `InvokeModelCommand` |
- | Google Gemini | ✅ | `generateContent()`, `generateContentStream()`, `chat.sendMessage()` |
+ | Google Gemini (`@google/generative-ai`) | ✅ | `generateContent()`, `generateContentStream()`, `chat.sendMessage()` |
+ | Google GenAI (`@google/genai`) | ✅ | `models.generateContent()`, `models.generateContentStream()`, `chats.create()`, `chat.sendMessage()` |

  ### OpenRouter

@@ -108,6 +278,116 @@ const response = await openrouter.chat.completions.create({

  Models are specified in `provider/model` format (e.g., `anthropic/claude-3-opus`, `openai/gpt-4`, `meta-llama/llama-3-70b`). See [OpenRouter Models](https://openrouter.ai/models) for the full list.
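The convention can be split mechanically on the first slash, which keeps any dashes inside the model name intact. A tiny illustrative helper (`parseModelId` is not an SDK export):

```typescript
// Illustrative only (not an SDK export): split an OpenRouter model id on the
// first "/" so "meta-llama/llama-3-70b" keeps the dashes in the model name.
function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf('/');
  if (slash === -1) throw new Error(`Expected "provider/model", got "${id}"`);
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}
```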

+ ### Google Gemini
+
+ The SDK supports both the **new** `@google/genai` (recommended) and the **old** `@google/generative-ai` package. Both are auto-detected by `observe()`.
+
+ #### Recommended: `@google/genai` (new SDK)
+
+ ```bash
+ npm install @google/genai
+ ```
+
+ ```typescript
+ import { init, observe, flush } from '@lelemondev/sdk/google-genai';
+ import { GoogleGenAI } from '@google/genai';
+
+ init({ apiKey: process.env.LELEMON_API_KEY });
+
+ const ai = observe(new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY }));
+
+ // Generate content
+ const response = await ai.models.generateContent({
+   model: 'gemini-2.5-flash',
+   contents: 'Explain how observability works',
+ });
+ console.log(response.text);
+
+ // Streaming
+ const stream = await ai.models.generateContentStream({
+   model: 'gemini-2.5-flash',
+   contents: 'Write a short poem about monitoring',
+ });
+ for await (const chunk of stream) {
+   process.stdout.write(chunk.text ?? '');
+ }
+
+ // Chat (multi-turn)
+ const chat = ai.chats.create({
+   model: 'gemini-2.5-flash',
+   history: [
+     { role: 'user', parts: [{ text: 'Hello' }] },
+     { role: 'model', parts: [{ text: 'Hi! How can I help?' }] },
+   ],
+ });
+
+ const reply = await chat.sendMessage({ message: 'What can you do?' });
+ console.log(reply.text);
+
+ // Streaming chat
+ const chatStream = await chat.sendMessageStream({ message: 'Tell me more' });
+ for await (const chunk of chatStream) {
+   process.stdout.write(chunk.text ?? '');
+ }
+
+ await flush();
+ ```
+
+ #### With config (system instructions, temperature, tools)
+
+ ```typescript
+ const response = await ai.models.generateContent({
+   model: 'gemini-2.5-flash',
+   contents: 'Analyze this data',
+   config: {
+     systemInstruction: 'You are a data analyst. Be concise.',
+     temperature: 0.3,
+     maxOutputTokens: 1000,
+     tools: [{ functionDeclarations: [myFunctionDecl] }],
+   },
+ });
+
+ // Access function calls if tools were used
+ if (response.functionCalls) {
+   console.log(response.functionCalls);
+ }
+ ```
+
+ #### Legacy: `@google/generative-ai` (old SDK)
+
+ Still supported, but not recommended for new projects:
+
+ ```typescript
+ import { init, observe, flush } from '@lelemondev/sdk/gemini';
+ import { GoogleGenerativeAI } from '@google/generative-ai';
+
+ init({ apiKey: process.env.LELEMON_API_KEY });
+
+ const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);
+
+ // You can observe the client or the model directly
+ const model = observe(genAI.getGenerativeModel({ model: 'gemini-2.5-flash' }));
+
+ const result = await model.generateContent('Hello!');
+ console.log(result.response.text());
+
+ await flush();
+ ```
+
+ #### Auto-detection with generic `observe()`
+
+ Both SDKs work with the generic entry point too:
+
+ ```typescript
+ import { init, observe } from '@lelemondev/sdk';
+ import { GoogleGenAI } from '@google/genai';
+
+ init({ apiKey: process.env.LELEMON_API_KEY });
+
+ // Auto-detects @google/genai
+ const ai = observe(new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY }));
+ ```
+
  ## Usage Without Framework Integrations

  The SDK works with **any Node.js application**. Framework integrations are optional; they just automate the `flush()` call.
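What the integrations automate can be sketched in a few lines: run the handler, then flush, even on errors. `withFlush` below is a hypothetical helper, not an SDK export; the `flushFn` argument stands in for the SDK's `flush` function:

```typescript
// Hypothetical helper (not an SDK export): ensure a flush-style callback runs
// after the handler completes, even when the handler throws. This mirrors what
// the framework integrations do on your behalf.
async function withFlush<T>(
  handler: () => Promise<T>,
  flushFn: () => Promise<void>,
): Promise<T> {
  try {
    return await handler();
  } finally {
    // Always flush pending traces before the request/invocation ends
    await flushFn();
  }
}
```

In a bare serverless handler you would call it as `await withFlush(() => runAgent(input), flush)`, where `runAgent` is your own code and `flush` is imported from the SDK.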
@@ -602,7 +882,7 @@ When you have multiple LLM clients or make calls from different places, use cre
  ```typescript
  import { init, createObserve } from '@lelemondev/sdk';
  import OpenAI from 'openai';
- import { GoogleGenerativeAI } from '@google/generative-ai';
+ import { GoogleGenAI } from '@google/genai';
  import { BedrockRuntimeClient } from '@aws-sdk/client-bedrock-runtime';

  init({ apiKey: process.env.LELEMON_API_KEY });
@@ -616,12 +896,12 @@ const observeForUser = createObserve({

  // All clients inherit the same context
  const openai = observeForUser(new OpenAI());
- const gemini = observeForUser(new GoogleGenerativeAI(apiKey));
+ const ai = observeForUser(new GoogleGenAI({ apiKey }));
  const bedrock = observeForUser(new BedrockRuntimeClient({}));

  // All these calls will be associated with user-123
  await openai.chat.completions.create({ ... });
- await gemini.getGenerativeModel({ model: 'gemini-pro' }).generateContent('...');
+ await ai.models.generateContent({ model: 'gemini-2.5-flash', contents: '...' });
  ```

  ### In Middleware (Express)
@@ -653,7 +933,7 @@ app.post('/chat', async (req, res) => {
  });

  app.post('/summarize', async (req, res) => {
-   const gemini = req.observe(new GoogleGenerativeAI(apiKey));
+   const ai = req.observe(new GoogleGenAI({ apiKey }));
    // Same user context, different endpoint
  });
  ```
@@ -736,16 +1016,18 @@ wss.on('connection', (ws, req) => {

    // Create observe function for this connection
    const observe = createUserObserve(userId, sessionId, 'websocket');
-   const gemini = observe(new GoogleGenerativeAI(apiKey));
+   const ai = observe(new GoogleGenAI({ apiKey }));

    ws.on('message', async (data) => {
      const { message } = JSON.parse(data);

      // This trace will be linked to the same session as the API calls
-     const model = gemini.getGenerativeModel({ model: 'gemini-pro' });
-     const result = await model.generateContent(message);
+     const result = await ai.models.generateContent({
+       model: 'gemini-2.5-flash',
+       contents: message,
+     });

-     ws.send(JSON.stringify({ response: result.response.text() }));
+     ws.send(JSON.stringify({ response: result.text }));
    });
  });
  ```
@@ -828,7 +1110,7 @@ for await (const chunk of stream) {

  Each LLM call automatically captures:

- - **Provider** - openai, anthropic, bedrock, gemini
+ - **Provider** - openai, anthropic, bedrock, gemini, openrouter
  - **Model** - gpt-4, claude-3-opus, gemini-pro, etc.
  - **Input** - Messages/prompt (sanitized)
  - **Output** - Response content
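Pieced together from the bundled source elsewhere in this diff, a captured event appears to carry roughly the fields below. This is a reconstruction for orientation, not a published SDK type, so treat the names as indicative:

```typescript
// Approximate shape of a captured event, reconstructed from the bundled
// source in this diff. Not a published SDK type; fields may differ.
interface CapturedEvent {
  provider: string;                   // e.g. 'openai'
  model: string;                      // e.g. 'gpt-4'
  input?: unknown;                    // sanitized messages/prompt
  output?: unknown;                   // response content
  durationMs: number;
  status: 'success' | 'error';
  streaming: boolean;
  sessionId?: string;
  userId?: string;
  traceId?: string;
  spanId?: string;
  parentSpanId?: string;
  metadata?: Record<string, unknown>;
  tags?: string[];
}

const example: CapturedEvent = {
  provider: 'openai',
  model: 'gpt-4',
  durationMs: 1200,
  status: 'success',
  streaming: false,
  userId: 'user-123',
  sessionId: 'conversation-abc',
};
```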
@@ -1,5 +1,5 @@
- import { O as ObserveOptions } from './capture-CiUL-4hv.mjs';
- export { C as CaptureSpanOptions, L as LelemonConfig, h as SpanOptions, d as SpanType, T as TraceContext, e as TraceOptions, c as captureSpan, f as flush, g as getTraceContext, i as init, a as isEnabled, s as span, t as trace } from './capture-CiUL-4hv.mjs';
+ import { O as ObserveOptions } from './capture-DOj4Z9MB.mjs';
+ export { C as CaptureSpanOptions, L as LelemonConfig, h as SpanOptions, d as SpanType, T as TraceContext, e as TraceOptions, c as captureSpan, f as flush, g as getTraceContext, i as init, a as isEnabled, s as span, t as trace } from './capture-DOj4Z9MB.mjs';

  /**
  * Anthropic Provider Entry Point
@@ -1,5 +1,5 @@
- import { O as ObserveOptions } from './capture-CiUL-4hv.js';
- export { C as CaptureSpanOptions, L as LelemonConfig, h as SpanOptions, d as SpanType, T as TraceContext, e as TraceOptions, c as captureSpan, f as flush, g as getTraceContext, i as init, a as isEnabled, s as span, t as trace } from './capture-CiUL-4hv.js';
+ import { O as ObserveOptions } from './capture-DOj4Z9MB.js';
+ export { C as CaptureSpanOptions, L as LelemonConfig, h as SpanOptions, d as SpanType, T as TraceContext, e as TraceOptions, c as captureSpan, f as flush, g as getTraceContext, i as init, a as isEnabled, s as span, t as trace } from './capture-DOj4Z9MB.js';

  /**
  * Anthropic Provider Entry Point
package/dist/anthropic.js CHANGED
@@ -1,4 +1,4 @@
  'use strict';var async_hooks=require('async_hooks');/* @lelemondev/sdk - LLM Observability */
- var ge=Object.defineProperty;var me=(e,t,n)=>t in e?ge(e,t,{enumerable:true,configurable:true,writable:true,value:n}):e[t]=n;var F=(e=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(e,{get:(t,n)=>(typeof require<"u"?require:t)[n]}):e)(function(e){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+e+'" is not supported')});var ye=(e,t)=>()=>(t||e((t={exports:{}}).exports,t),t.exports);var I=(e,t,n)=>me(e,typeof t!="symbol"?t+"":t,n);var ee=ye((it,Te)=>{Te.exports={name:"@lelemondev/sdk",version:"0.9.9",description:"Automatic LLM observability. Wrap your client, everything is traced.",author:"Lelemon <info@lelemon.dev>",license:"MIT",repository:{type:"git",url:"git+https://github.com/lelemondev/lelemondev-sdk.git"},homepage:"https://lelemon.dev",bugs:{url:"https://github.com/lelemondev/lelemondev-sdk/issues"},keywords:["llm","observability","tracing","openai","anthropic","nextjs","lambda","express","hono","claude","gpt","ai","monitoring","serverless"],main:"./dist/index.js",module:"./dist/index.mjs",types:"./dist/index.d.ts",exports:{".":{types:"./dist/index.d.ts",import:"./dist/index.mjs",require:"./dist/index.js"},"./openai":{types:"./dist/openai.d.ts",import:"./dist/openai.mjs",require:"./dist/openai.js"},"./anthropic":{types:"./dist/anthropic.d.ts",import:"./dist/anthropic.mjs",require:"./dist/anthropic.js"},"./bedrock":{types:"./dist/bedrock.d.ts",import:"./dist/bedrock.mjs",require:"./dist/bedrock.js"},"./gemini":{types:"./dist/gemini.d.ts",import:"./dist/gemini.mjs",require:"./dist/gemini.js"},"./openrouter":{types:"./dist/openrouter.d.ts",import:"./dist/openrouter.mjs",require:"./dist/openrouter.js"},"./next":{types:"./dist/next.d.ts",import:"./dist/next.mjs",require:"./dist/next.js"},"./lambda":{types:"./dist/lambda.d.ts",import:"./dist/lambda.mjs",require:"./dist/lambda.js"},"./express":{types:"./dist/express.d.ts",import:"./dist/express.mjs",require:"./dist/express.js"},"./hono":{types:"./dist/hono.d.ts",i
mport:"./dist/hono.mjs",require:"./dist/hono.js"},"./integrations":{types:"./dist/integrations.d.ts",import:"./dist/integrations.mjs",require:"./dist/integrations.js"},"./package.json":"./package.json"},typesVersions:{"*":{openai:["./dist/openai.d.ts"],anthropic:["./dist/anthropic.d.ts"],bedrock:["./dist/bedrock.d.ts"],gemini:["./dist/gemini.d.ts"],openrouter:["./dist/openrouter.d.ts"],next:["./dist/next.d.ts"],lambda:["./dist/lambda.d.ts"],express:["./dist/express.d.ts"],hono:["./dist/hono.d.ts"],integrations:["./dist/integrations.d.ts"],"*":["./dist/index.d.ts"]}},files:["dist/**/*.js","dist/**/*.mjs","dist/**/*.d.ts","dist/**/*.d.mts","README.md"],sideEffects:false,engines:{node:">=18.0.0"},scripts:{build:"tsup",dev:"tsup --watch",docs:"typedoc && node scripts/generate-llms-txt.mjs",prepublishOnly:"npm run build",lint:"eslint src/",test:"vitest","test:run":"vitest run","test:coverage":"vitest run --coverage","test:e2e":"vitest run tests/e2e",typecheck:"tsc --noEmit"},devDependencies:{"@aws-sdk/client-bedrock-runtime":"^3.962.0","@google/generative-ai":"^0.24.1","@types/node":"^20.0.0","@vitest/coverage-v8":"^2.0.0",dotenv:"^17.2.3",openai:"^6.15.0",tsup:"^8.5.1",typedoc:"^0.28.15",typescript:"^5.9.3",vitest:"^2.0.0"}};});var G=false;function V(e){G=e;}function y(){return G?true:be("LELEMON_DEBUG")==="true"}var g="[Lelemon]";function f(e,t){y()&&P("debug",e,t);}function q(e,t){y()&&P("info",e,t);}function C(e,t){P("warn",e,t);}function $(e,t,n,r){y()&&console.log(`${g} Captured trace: provider=${e} model=${t} duration=${n}ms status=${r}`);}function M(e,t){console.error(`${g} Failed to capture trace: provider=${e} error=${t.message}`);}function H(e){y()&&console.log(`${g} Wrapped client: provider=${e}`);}function W(e,t){y()&&console.log(`${g} Sending batch: count=${e} endpoint=${t}`);}function Y(e,t){y()&&console.log(`${g} Batch sent successfully: count=${e} duration=${t}ms`);}function X(e,t){let n=t instanceof Error?t.message:String(t);console.error(`${g} Batch 
send failed: count=${e} error=${n}`);}function Z(e,t,n){y()&&console.log(`${g} Request: ${e} ${t} (${n} bytes)`);}function Q(e,t){y()&&console.log(`${g} Response: status=${e} duration=${t}ms`);}function P(e,t,n){let r=e==="error"?console.error:e==="warn"?console.warn:console.log;n!==void 0?r(`${g} ${t}`,n):r(`${g} ${t}`);}function be(e){if(typeof process<"u"&&process.env)return process.env[e]}var he=10,ke=1e3,we=1e4,R=class{constructor(t){I(this,"config");I(this,"queue",[]);I(this,"flushPromise",null);I(this,"flushTimer",null);this.config={apiKey:t.apiKey,endpoint:t.endpoint,debug:t.debug,disabled:t.disabled,batchSize:t.batchSize??he,flushIntervalMs:t.flushIntervalMs??ke,requestTimeoutMs:t.requestTimeoutMs??we};}isEnabled(){return !this.config.disabled&&!!this.config.apiKey}enqueue(t){this.config.disabled||(this.queue.push(t),this.queue.length>=this.config.batchSize?this.flush():this.scheduleFlush());}async flush(){if(this.flushPromise)return this.flushPromise;if(this.queue.length===0)return;this.cancelScheduledFlush();let t=this.queue;return this.queue=[],this.flushPromise=this.sendBatch(t).finally(()=>{this.flushPromise=null;}),this.flushPromise}getPendingCount(){return this.queue.length}scheduleFlush(){this.flushTimer===null&&(this.flushTimer=setTimeout(()=>{this.flushTimer=null,this.flush();},this.config.flushIntervalMs));}cancelScheduledFlush(){this.flushTimer!==null&&(clearTimeout(this.flushTimer),this.flushTimer=null);}async sendBatch(t){if(t.length===0)return;let n=Date.now();W(t.length,`${this.config.endpoint}/api/v1/ingest`);try{await this.request("POST","/api/v1/ingest",{events:t}),Y(t.length,Date.now()-n);}catch(r){X(t.length,r);}}async request(t,n,r){let s=`${this.config.endpoint}${n}`,a=new AbortController,u=r?JSON.stringify(r):void 0;Z(t,s,u?.length??0);let c=setTimeout(()=>{a.abort();},this.config.requestTimeoutMs),d=Date.now();try{let o=await fetch(s,{method:t,headers:{"Content-Type":"application/json",Authorization:`Bearer 
${this.config.apiKey}`},body:u,signal:a.signal});if(clearTimeout(c),Q(o.status,Date.now()-d),!o.ok){let i=await o.text().catch(()=>"Unknown error");throw new Error(`HTTP ${o.status}: ${i}`)}let l=await o.text();return l?JSON.parse(l):{}}catch(o){throw clearTimeout(c),o instanceof Error&&o.name==="AbortError"?new Error(`Request timeout after ${this.config.requestTimeoutMs}ms`):o}}};var ve="@lelemondev/sdk",xe="nodejs";function Ie(){return typeof process<"u"&&process.versions?.node?{name:"nodejs",version:process.versions.node}:typeof Deno<"u"?{name:"deno",version:Deno.version?.deno??"unknown"}:typeof Bun<"u"?{name:"bun",version:Bun.version??"unknown"}:typeof window<"u"&&typeof navigator<"u"?{name:"browser",version:navigator.userAgent}:null}function Se(){if(typeof process<"u"&&process.platform){let e=process.platform;switch(e){case "darwin":return "darwin";case "win32":return "windows";case "linux":return "linux";default:return e}}if(typeof navigator<"u"){let e=navigator.userAgent.toLowerCase();if(e.includes("mac"))return "darwin";if(e.includes("win"))return "windows";if(e.includes("linux"))return "linux"}return null}function _e(){try{if(typeof F<"u")return ee().version??"unknown"}catch{}return "unknown"}var v=null;function ne(e){if(!v){let n=Ie(),r=Se();v={"telemetry.sdk.name":ve,"telemetry.sdk.version":_e(),"telemetry.sdk.language":xe},n&&(v["process.runtime.name"]=n.name,v["process.runtime.version"]=n.version),r&&(v["os.type"]=r);}let t={...v};return e?.name&&(t["service.name"]=e.name),e?.version&&(t["service.version"]=e.version),e?.environment&&(t["deployment.environment"]=e.environment),t}var N={},k=null,K=null,re="https://api.lelemon.dev";function Ce(e={}){N=e,e.debug&&V(true),K=ne(e.service),q("Initializing SDK",{endpoint:e.endpoint??re,debug:e.debug??false,disabled:e.disabled??false,telemetry:K}),k=se(e),k.isEnabled()?q("SDK initialized - tracing enabled"):f("SDK initialized - tracing disabled (no API key or explicitly disabled)");}function A(){return 
N}function D(){return K}function Me(){return w().isEnabled()}function w(){return k||(k=se(N)),k}async function Re(){k&&await k.flush();}function se(e){let t=e.apiKey??Ae("LELEMON_API_KEY");return !t&&!e.disabled&&C("No API key provided. Set apiKey in init() or LELEMON_API_KEY env var. Tracing disabled."),new R({apiKey:t??"",endpoint:e.endpoint??re,debug:e.debug??false,disabled:e.disabled??!t,batchSize:e.batchSize,flushIntervalMs:e.flushIntervalMs,requestTimeoutMs:e.requestTimeoutMs})}function Ae(e){if(typeof process<"u"&&process.env)return process.env[e]}var L=Symbol.for("@lelemondev/sdk:globalContext");function ae(){let e=globalThis;return e[L]||(e[L]={context:{}}),e[L]}function ue(e){ae().context=e,f("Global context updated",e);}function _(){return ae().context}function j(e){try{let t=w();if(!t.isEnabled()){f("Transport disabled, skipping trace capture");return}let n=_(),r=m(),s=x(),a=D(),u={provider:e.provider,model:e.model,input:B(e.input),rawResponse:e.rawResponse?S(e.rawResponse,0):void 0,durationMs:e.durationMs,status:e.status,streaming:e.streaming,firstTokenMs:e.firstTokenMs,sessionId:r?.sessionId??n.sessionId,userId:r?.userId??n.userId,traceId:r?.traceId,spanId:s,parentSpanId:r?.currentSpanId,metadata:{...n.metadata,...e.metadata,...r?{_traceName:r.name}:{},...a?{_telemetry:a}:{}},tags:n.tags,spanType:e.spanType,name:e.name};return $(e.provider,e.model,e.durationMs,e.status),t.enqueue(u),s}catch(t){M(e.provider,t instanceof Error?t:new Error(String(t)));return}}function E(e){try{let t=w();if(!t.isEnabled()){f("Transport disabled, skipping error capture");return}let 
n=_(),r=m(),s=D(),a={provider:e.provider,model:e.model,input:B(e.input),durationMs:e.durationMs,status:"error",errorMessage:e.error.message,streaming:e.streaming,sessionId:r?.sessionId??n.sessionId,userId:r?.userId??n.userId,traceId:r?.traceId,spanId:x(),parentSpanId:r?.currentSpanId,metadata:{...n.metadata,...e.metadata,...r?{_traceName:r.name}:{},...s?{_telemetry:s}:{}},tags:n.tags};$(e.provider,e.model,e.durationMs,"error"),f("Error details",{message:e.error.message,stack:e.error.stack}),t.enqueue(a);}catch(t){M(e.provider,t instanceof Error?t:new Error(String(t)));}}function U(e){try{let t=w();if(!t.isEnabled()){f("Transport disabled, skipping span capture");return}let n=_(),r=m(),s=e.metadata?._traceId,a=e.metadata?._parentSpanId,u=D(),c={...n.metadata,...e.metadata,...u?{_telemetry:u}:{}};delete c._traceId,delete c._parentSpanId;let d={spanType:e.type,name:e.name,provider:"unknown",model:e.name,input:B(e.input),output:S(e.output,0),durationMs:e.durationMs,status:e.status||"success",errorMessage:e.errorMessage,streaming:!1,sessionId:r?.sessionId??n.sessionId,userId:r?.userId??n.userId,traceId:s??r?.traceId,spanId:x(),parentSpanId:a??r?.currentSpanId,toolCallId:e.toolCallId,metadata:c,tags:n.tags};f(`Span captured: ${e.type}/${e.name}`,{durationMs:e.durationMs}),t.enqueue(d);}catch(t){M("unknown",t instanceof Error?t:new Error(String(t)));}}var oe=1e5,De=["api_key","apikey","password","secret","authorization"],je=["access_token","auth_token","bearer_token","refresh_token","id_token","session_token"],Oe=["inputtokens","outputtokens","totaltokens","prompttokens","completiontokens","cachereadtokens","cachewritetokens","cachereadinputtokens","cachewriteinputtokens","reasoningtokens"],ie={emails:/[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g,phones:/\b\d{9,}\b/g};function ce(){return A().redaction??{}}function qe(e){let t=e.toLowerCase();return 
Oe.includes(t)?false:!!(De.some(r=>t.includes(r))||je.some(r=>t.includes(r))||(ce().keys??[]).some(r=>t.includes(r.toLowerCase())))}function $e(e){let t=ce(),n=e;for(let r of t.patterns??[])r.lastIndex=0,n=n.replace(r,"[REDACTED]");return t.emails&&(n=n.replace(ie.emails,"[EMAIL]")),t.phones&&(n=n.replace(ie.phones,"[PHONE]")),n}function B(e){return S(e,0)}function S(e,t){if(t>10)return "[max depth exceeded]";if(e==null)return e;if(typeof e=="string"){let n=$e(e);return n.length>oe&&(n=n.slice(0,oe)+"...[truncated]"),n}if(typeof e=="number"||typeof e=="boolean")return e;if(Array.isArray(e))return e.map(n=>S(n,t+1));if(typeof e=="object"){let n={};for(let[r,s]of Object.entries(e))qe(r)?n[r]="[REDACTED]":n[r]=S(s,t+1);return n}return String(e)}var J=Symbol.for("@lelemondev/sdk:traceStorage");function Ke(){let e=globalThis;return e[J]||(e[J]=new async_hooks.AsyncLocalStorage),e[J]}var de=Ke();function x(){return typeof crypto<"u"&&crypto.randomUUID?crypto.randomUUID():`${Date.now().toString(36)}-${Math.random().toString(36).slice(2,11)}`}function m(){return de.getStore()}function O(e,t){let n=m();if(n)for(let r of e)n.pendingToolCalls.set(r,t),f(`Registered tool call ${r} \u2192 LLM span ${t}`);}function Ne(e){let t=m();if(t)return e&&t.pendingToolCalls.has(e)?t.pendingToolCalls.get(e):t.currentSpanId}function Le(e){let t=m();t&&t.pendingToolCalls.delete(e);}async function Ue(e,t){let n=typeof e=="string"?{name:e}:e,r=m(),s=r?.traceId??x(),a=x(),u={traceId:s,rootSpanId:a,currentSpanId:a,parentSpanId:r?.currentSpanId,name:n.name,startTime:Date.now(),input:n.input,metadata:n.metadata,tags:n.tags,outputKey:n.outputKey,outputTransform:n.outputTransform,sessionId:n.sessionId,userId:n.userId,pendingToolCalls:new Map};return de.run(u,async()=>{let c,d;try{return c=await t(),c}catch(o){throw d=o instanceof Error?o:new Error(String(o)),o}finally{ze(u,d?void 0:c,d);}})}var Be=["text","content","message","output","response","result","answer"];function Je(e,t,n){if(n)try{return 
n(e)}catch{return e}if(t===false)return e;if(typeof t=="string"&&e&&typeof e=="object"){let s=e;return t in s?s[t]:e}if(e==null||typeof e!="object"||Array.isArray(e))return e;let r=e;for(let s of Be)if(s in r&&typeof r[s]=="string")return r[s];return e}function ze(e,t,n){let r=w();if(!r.isEnabled()){f("Transport disabled, skipping root span");return}let s=_(),a=Date.now()-e.startTime,u=n?null:Je(t,e.outputKey,e.outputTransform),c={spanType:"agent",name:e.name,provider:"agent",model:e.name,traceId:e.traceId,spanId:e.rootSpanId,parentSpanId:e.parentSpanId,input:e.input,output:u,inputTokens:0,outputTokens:0,durationMs:a,status:n?"error":"success",errorMessage:n?.message,streaming:false,sessionId:e.sessionId??s.sessionId,userId:e.userId??s.userId,metadata:{...s.metadata,...e.metadata},tags:e.tags??s.tags};f(`Sending root span: ${e.name}`,{durationMs:a,hasError:!!n}),r.enqueue(c);}function Fe(e){let t=m();if(!t){process.env.NODE_ENV!=="production"&&console.warn("[Lelemon] span() called outside of trace() - span will not be captured");return}let n=Ne(e.toolCallId);U({type:e.type,name:e.name,input:e.input,output:e.output,durationMs:e.durationMs??0,status:e.status??"success",errorMessage:e.errorMessage,toolCallId:e.toolCallId,metadata:{...e.metadata,_traceId:t.traceId,_parentSpanId:n}}),e.toolCallId&&Le(e.toolCallId);}var T="anthropic";function le(e){if(!e||typeof e!="object")return false;if(e.constructor?.name==="Anthropic")return true;let n=e;return !!(n.messages&&typeof n.messages=="object")}function pe(e){return async function(...n){let r=Date.now(),s=n[0]||{},a=s.stream===true;try{let u=await e(...n);if(a&&Ge(u))return He(u,s,r);let c=Date.now()-r,d=u,o=j({provider:T,model:s.model||d.model||"unknown",input:{system:s.system,messages:s.messages},rawResponse:u,durationMs:c,status:"success",streaming:!1});if(o&&d.content){let l=z(d.content);l.length>0&&O(l,o);}return u}catch(u){let c=Date.now()-r;throw 
E({provider:T,model:s.model||"unknown",input:{system:s.system,messages:s.messages},error:u instanceof Error?u:new Error(String(u)),durationMs:c,streaming:a}),u}}}function fe(e){return function(...n){let r=Date.now(),s=n[0]||{};try{let a=e(...n);return a&&typeof a=="object"?Ve(a,s,r):a}catch(a){let u=Date.now()-r;throw E({provider:T,model:s.model||"unknown",input:{system:s.system,messages:s.messages},error:a instanceof Error?a:new Error(String(a)),durationMs:u,streaming:true}),a}}}function Ge(e){return e!=null&&typeof e[Symbol.asyncIterator]=="function"}function Ve(e,t,n){let r=e;if(!r[Symbol.asyncIterator])return e;let s={content:[],usage:{input_tokens:0,output_tokens:0}},a=false,u,c=false,d=null,o=[],l=async function*(){try{for await(let i of r){if(i.type==="message_start"&&i.message&&(s.model=i.message.model,s.id=i.message.id,s.role=i.message.role,i.message.usage&&(s.usage={...i.message.usage})),i.type==="content_block_start"&&i.content_block&&(d=i.index??o.length,o[d]={...i.content_block},i.content_block.type==="text"&&(o[d].text=""),i.content_block.type==="thinking"&&(o[d].thinking="")),i.type==="content_block_delta"&&i.delta&&d!==null){let p=o[d];p&&(i.delta.text&&(c||(c=!0,u=Date.now()-n),p.type==="thinking"?p.thinking=(p.thinking||"")+i.delta.text:p.type==="text"&&(p.text=(p.text||"")+i.delta.text)),i.delta.partial_json&&p.type==="tool_use"&&(p._inputJson=(p._inputJson||"")+i.delta.partial_json));}if(i.type==="content_block_stop"&&d!==null){let p=o[d];if(p&&p.type==="tool_use"){let b=p._inputJson;if(b){try{p.input=JSON.parse(b);}catch{}delete p._inputJson;}}d=null;}i.type==="message_delta"&&(i.usage?.output_tokens&&(s.usage.output_tokens=i.usage.output_tokens),i.delta?.stop_reason&&(s.stop_reason=i.delta.stop_reason)),yield i;}}catch(i){if(!a){a=true;let p=Date.now()-n;E({provider:T,model:s.model||t.model||"unknown",input:{system:t.system,messages:t.messages},error:i instanceof Error?i:new Error(String(i)),durationMs:p,streaming:true});}throw 
i}finally{if(!a){a=true;let i=Date.now()-n;s.content=o.filter(Boolean);let p=j({provider:T,model:s.model||t.model||"unknown",input:{system:t.system,messages:t.messages},rawResponse:s,durationMs:i,status:"success",streaming:true,firstTokenMs:u});if(p&&s.content){let b=z(s.content);b.length>0&&O(b,p);}}}};return new Proxy(e,{get(i,p,b){return p===Symbol.asyncIterator?()=>l()[Symbol.asyncIterator]():Reflect.get(i,p,b)}})}async function*He(e,t,n){let r={content:[],usage:{input_tokens:0,output_tokens:0}},s=null,a,u=false,c=null,d=[];try{for await(let o of e){if(o.type==="message_start"&&o.message&&(r.model=o.message.model,r.id=o.message.id,r.role=o.message.role,o.message.usage&&(r.usage={...o.message.usage})),o.type==="content_block_start"&&o.content_block&&(c=o.index??d.length,d[c]={...o.content_block},o.content_block.type==="text"&&(d[c].text=""),o.content_block.type==="thinking"&&(d[c].thinking="")),o.type==="content_block_delta"&&o.delta&&c!==null){let l=d[c];l&&(o.delta.text&&(u||(u=!0,a=Date.now()-n),l.type==="thinking"?l.thinking=(l.thinking||"")+o.delta.text:l.type==="text"&&(l.text=(l.text||"")+o.delta.text)),o.delta.partial_json&&l.type==="tool_use"&&(l._inputJson=(l._inputJson||"")+o.delta.partial_json));}if(o.type==="content_block_stop"&&c!==null){let l=d[c];if(l&&l.type==="tool_use"){let i=l._inputJson;if(i){try{l.input=JSON.parse(i);}catch{}delete l._inputJson;}}c=null;}o.type==="message_delta"&&(o.usage?.output_tokens&&(r.usage.output_tokens=o.usage.output_tokens),o.delta?.stop_reason&&(r.stop_reason=o.delta.stop_reason)),yield o;}}catch(o){throw s=o instanceof Error?o:new Error(String(o)),o}finally{let o=Date.now()-n;if(r.content=d.filter(Boolean),s)E({provider:T,model:r.model||t.model||"unknown",input:{system:t.system,messages:t.messages},error:s,durationMs:o,streaming:true});else {let 
l=j({provider:T,model:r.model||t.model||"unknown",input:{system:t.system,messages:t.messages},rawResponse:r,durationMs:o,status:"success",streaming:true,firstTokenMs:a});if(l&&r.content){let i=z(r.content);i.length>0&&O(i,l);}}}}function z(e){let t=[];for(let n of e)n.type==="tool_use"&&n.id&&t.push(n.id);return t}function jt(e,t){return t&&ue(t),A().disabled?(f("Tracing disabled, returning unwrapped client"),e):le(e)?(H("anthropic"),Ye(e)):(C("Client is not an Anthropic client. Use @lelemondev/sdk/anthropic only with Anthropic SDK."),e)}function Ye(e){let t=e;return new Proxy(t,{get(n,r,s){let a=Reflect.get(n,r,s);return r==="messages"&&a&&typeof a=="object"?Xe(a):a}})}function Xe(e){return new Proxy(e,{get(t,n,r){let s=Reflect.get(t,n,r);return n==="create"&&typeof s=="function"?pe(s.bind(t)):n==="stream"&&typeof s=="function"?fe(s.bind(t)):s}})}
- exports.captureSpan=U;exports.flush=Re;exports.getTraceContext=m;exports.init=Ce;exports.isEnabled=Me;exports.observe=jt;exports.span=Fe;exports.trace=Ue;//# sourceMappingURL=anthropic.js.map
+ var z=(e=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(e,{get:(t,n)=>(typeof require<"u"?require:t)[n]}):e)(function(e){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+e+'" is not supported')});var ge=(e,t)=>()=>(t||e((t={exports:{}}).exports,t),t.exports);var Q=ge((nt,he)=>{he.exports={name:"@lelemondev/sdk",version:"0.10.1",description:"Automatic LLM observability. Wrap your client, everything is traced.",author:"Lelemon <info@lelemon.dev>",license:"MIT",repository:{type:"git",url:"git+https://github.com/lelemondev/lelemondev-sdk.git"},homepage:"https://lelemon.dev",bugs:{url:"https://github.com/lelemondev/lelemondev-sdk/issues"},keywords:["llm","observability","tracing","openai","anthropic","nextjs","lambda","express","hono","claude","gpt","ai","monitoring","serverless"],main:"./dist/index.js",module:"./dist/index.mjs",types:"./dist/index.d.ts",exports:{".":{types:"./dist/index.d.ts",import:"./dist/index.mjs",require:"./dist/index.js"},"./openai":{types:"./dist/openai.d.ts",import:"./dist/openai.mjs",require:"./dist/openai.js"},"./anthropic":{types:"./dist/anthropic.d.ts",import:"./dist/anthropic.mjs",require:"./dist/anthropic.js"},"./bedrock":{types:"./dist/bedrock.d.ts",import:"./dist/bedrock.mjs",require:"./dist/bedrock.js"},"./gemini":{types:"./dist/gemini.d.ts",import:"./dist/gemini.mjs",require:"./dist/gemini.js"},"./google-genai":{types:"./dist/google-genai.d.ts",import:"./dist/google-genai.mjs",require:"./dist/google-genai.js"},"./openrouter":{types:"./dist/openrouter.d.ts",import:"./dist/openrouter.mjs",require:"./dist/openrouter.js"},"./next":{types:"./dist/next.d.ts",import:"./dist/next.mjs",require:"./dist/next.js"},"./lambda":{types:"./dist/lambda.d.ts",import:"./dist/lambda.mjs",require:"./dist/lambda.js"},"./express":{types:"./dist/express.d.ts",import:"./dist/express.mjs",require:"./dist/express.js"},"./hono":{types:"./dist/hono.d.ts",import:"./dist/hono.mjs",require:"./dist/hono.js"},"./int
egrations":{types:"./dist/integrations.d.ts",import:"./dist/integrations.mjs",require:"./dist/integrations.js"},"./package.json":"./package.json"},typesVersions:{"*":{openai:["./dist/openai.d.ts"],anthropic:["./dist/anthropic.d.ts"],bedrock:["./dist/bedrock.d.ts"],gemini:["./dist/gemini.d.ts"],"google-genai":["./dist/google-genai.d.ts"],openrouter:["./dist/openrouter.d.ts"],next:["./dist/next.d.ts"],lambda:["./dist/lambda.d.ts"],express:["./dist/express.d.ts"],hono:["./dist/hono.d.ts"],integrations:["./dist/integrations.d.ts"],"*":["./dist/index.d.ts"]}},files:["dist/**/*.js","dist/**/*.mjs","dist/**/*.d.ts","dist/**/*.d.mts","README.md"],sideEffects:false,engines:{node:">=18.0.0"},scripts:{build:"tsup",dev:"tsup --watch",docs:"typedoc && node scripts/generate-llms-txt.mjs",prepublishOnly:"npm run build",lint:"eslint src/",test:"vitest","test:run":"vitest run","test:coverage":"vitest run --coverage","test:e2e":"vitest run tests/e2e",typecheck:"tsc --noEmit"},devDependencies:{"@aws-sdk/client-bedrock-runtime":"^3.962.0","@google/genai":"^1.0.0","@google/generative-ai":"^0.24.1","@anthropic-ai/sdk":"^0.82.0","@types/node":"^22.0.0","@vitest/coverage-v8":"^4.0.0",dotenv:"^17.2.3",openai:"^6.15.0",tsup:"^8.5.1",vite:"^6.0.0",typedoc:"^0.28.18",typescript:"^6.0.0",vitest:"^4.0.0"}};});var F=false;function G(e){F=e;}function y(){return F?true:fe("LELEMON_DEBUG")==="true"}var f="[Lelemon]";function g(e,t){y()&&$("debug",e,t);}function j(e,t){y()&&$("info",e,t);}function E(e,t){$("warn",e,t);}function q(e,t,n,r){y()&&console.log(`${f} Captured trace: provider=${e} model=${t} duration=${n}ms status=${r}`);}function C(e,t){console.error(`${f} Failed to capture trace: provider=${e} error=${t.message}`);}function V(e){y()&&console.log(`${f} Wrapped client: provider=${e}`);}function H(e,t){y()&&console.log(`${f} Sending batch: count=${e} endpoint=${t}`);}function W(e,t){y()&&console.log(`${f} Batch sent successfully: count=${e} duration=${t}ms`);}function Y(e,t){let n=t 
instanceof Error?t.message:String(t);console.error(`${f} Batch send failed: count=${e} error=${n}`);}function X(e,t,n){y()&&console.log(`${f} Request: ${e} ${t} (${n} bytes)`);}function Z(e,t){y()&&console.log(`${f} Response: status=${e} duration=${t}ms`);}function $(e,t,n){let r=e==="error"?console.error:e==="warn"?console.warn:console.log;n!==void 0?r(`${f} ${t}`,n):r(`${f} ${t}`);}function fe(e){if(typeof process<"u"&&process.env)return process.env[e]}var me=10,ye=1e3,be=1e4,M=class{config;queue=[];flushPromise=null;flushTimer=null;constructor(t){this.config={apiKey:t.apiKey,endpoint:t.endpoint,debug:t.debug,disabled:t.disabled,batchSize:t.batchSize??me,flushIntervalMs:t.flushIntervalMs??ye,requestTimeoutMs:t.requestTimeoutMs??be};}isEnabled(){return !this.config.disabled&&!!this.config.apiKey}enqueue(t){this.config.disabled||(this.queue.push(t),this.queue.length>=this.config.batchSize?this.flush():this.scheduleFlush());}async flush(){if(this.flushPromise)return this.flushPromise;if(this.queue.length===0)return;this.cancelScheduledFlush();let t=this.queue;return this.queue=[],this.flushPromise=this.sendBatch(t).finally(()=>{this.flushPromise=null;}),this.flushPromise}getPendingCount(){return this.queue.length}scheduleFlush(){this.flushTimer===null&&(this.flushTimer=setTimeout(()=>{this.flushTimer=null,this.flush();},this.config.flushIntervalMs));}cancelScheduledFlush(){this.flushTimer!==null&&(clearTimeout(this.flushTimer),this.flushTimer=null);}async sendBatch(t){if(t.length===0)return;let n=Date.now();H(t.length,`${this.config.endpoint}/api/v1/ingest`);try{await this.request("POST","/api/v1/ingest",{events:t}),W(t.length,Date.now()-n);}catch(r){Y(t.length,r);}}async request(t,n,r){let s=`${this.config.endpoint}${n}`,a=new AbortController,u=r?JSON.stringify(r):void 0;X(t,s,u?.length??0);let c=setTimeout(()=>{a.abort();},this.config.requestTimeoutMs),d=Date.now();try{let o=await fetch(s,{method:t,headers:{"Content-Type":"application/json",Authorization:`Bearer 
${this.config.apiKey}`},body:u,signal:a.signal});if(clearTimeout(c),Z(o.status,Date.now()-d),!o.ok){let i=await o.text().catch(()=>"Unknown error");throw new Error(`HTTP ${o.status}: ${i}`)}let l=await o.text();return l?JSON.parse(l):{}}catch(o){throw clearTimeout(c),o instanceof Error&&o.name==="AbortError"?new Error(`Request timeout after ${this.config.requestTimeoutMs}ms`):o}}};var ke="@lelemondev/sdk",we="nodejs";function Te(){return typeof process<"u"&&process.versions?.node?{name:"nodejs",version:process.versions.node}:typeof Deno<"u"?{name:"deno",version:Deno.version?.deno??"unknown"}:typeof Bun<"u"?{name:"bun",version:Bun.version??"unknown"}:typeof window<"u"&&typeof navigator<"u"?{name:"browser",version:navigator.userAgent}:null}function ve(){if(typeof process<"u"&&process.platform){let e=process.platform;switch(e){case "darwin":return "darwin";case "win32":return "windows";case "linux":return "linux";default:return e}}if(typeof navigator<"u"){let e=navigator.userAgent.toLowerCase();if(e.includes("mac"))return "darwin";if(e.includes("win"))return "windows";if(e.includes("linux"))return "linux"}return null}function Se(){try{if(typeof z<"u")return Q().version??"unknown"}catch{}return "unknown"}var v=null;function te(e){if(!v){let n=Te(),r=ve();v={"telemetry.sdk.name":ke,"telemetry.sdk.version":Se(),"telemetry.sdk.language":we},n&&(v["process.runtime.name"]=n.name,v["process.runtime.version"]=n.version),r&&(v["os.type"]=r);}let t={...v};return e?.name&&(t["service.name"]=e.name),e?.version&&(t["service.version"]=e.version),e?.environment&&(t["deployment.environment"]=e.environment),t}var K={},k=null,P=null,ne="https://api.lelemon.dev";function Ie(e={}){K=e,e.debug&&G(true),P=te(e.service),j("Initializing SDK",{endpoint:e.endpoint??ne,debug:e.debug??false,disabled:e.disabled??false,telemetry:P}),k=re(e),k.isEnabled()?j("SDK initialized - tracing enabled"):g("SDK initialized - tracing disabled (no API key or explicitly disabled)");}function R(){return 
K}function A(){return P}function _e(){return w().isEnabled()}function w(){return k||(k=re(K)),k}async function Ee(){k&&await k.flush();}function re(e){let t=e.apiKey??Ce("LELEMON_API_KEY");return !t&&!e.disabled&&E("No API key provided. Set apiKey in init() or LELEMON_API_KEY env var. Tracing disabled."),new M({apiKey:t??"",endpoint:e.endpoint??ne,debug:e.debug??false,disabled:e.disabled??!t,batchSize:e.batchSize,flushIntervalMs:e.flushIntervalMs,requestTimeoutMs:e.requestTimeoutMs})}function Ce(e){if(typeof process<"u"&&process.env)return process.env[e]}var N=Symbol.for("@lelemondev/sdk:globalContext");function ie(){let e=globalThis;return e[N]||(e[N]={context:{}}),e[N]}function ae(e){ie().context=e,g("Global context updated",e);}function I(){return ie().context}function D(e){try{let t=w();if(!t.isEnabled()){g("Transport disabled, skipping trace capture");return}let n=I(),r=m(),s=S(),a=A(),u={provider:e.provider,model:e.model,input:U(e.input),rawResponse:e.rawResponse?x(e.rawResponse,0):void 0,durationMs:e.durationMs,status:e.status,streaming:e.streaming,firstTokenMs:e.firstTokenMs,sessionId:r?.sessionId??n.sessionId,userId:r?.userId??n.userId,traceId:r?.traceId,spanId:s,parentSpanId:r?.currentSpanId,metadata:{...n.metadata,...e.metadata,...r?{_traceName:r.name}:{},...a?{_telemetry:a}:{}},tags:n.tags,spanType:e.spanType,name:e.name};return q(e.provider,e.model,e.durationMs,e.status),t.enqueue(u),s}catch(t){C(e.provider,t instanceof Error?t:new Error(String(t)));return}}function _(e){try{let t=w();if(!t.isEnabled()){g("Transport disabled, skipping error capture");return}let 
n=I(),r=m(),s=A(),a={provider:e.provider,model:e.model,input:U(e.input),durationMs:e.durationMs,status:"error",errorMessage:e.error.message,streaming:e.streaming,sessionId:r?.sessionId??n.sessionId,userId:r?.userId??n.userId,traceId:r?.traceId,spanId:S(),parentSpanId:r?.currentSpanId,metadata:{...n.metadata,...e.metadata,...r?{_traceName:r.name}:{},...s?{_telemetry:s}:{}},tags:n.tags};q(e.provider,e.model,e.durationMs,"error"),g("Error details",{message:e.error.message,stack:e.error.stack}),t.enqueue(a);}catch(t){C(e.provider,t instanceof Error?t:new Error(String(t)));}}function L(e){try{let t=w();if(!t.isEnabled()){g("Transport disabled, skipping span capture");return}let n=I(),r=m(),s=e.metadata?._traceId,a=e.metadata?._parentSpanId,u=A(),c={...n.metadata,...e.metadata,...u?{_telemetry:u}:{}};delete c._traceId,delete c._parentSpanId;let d={spanType:e.type,name:e.name,provider:"unknown",model:e.name,input:U(e.input),output:x(e.output,0),durationMs:e.durationMs,status:e.status||"success",errorMessage:e.errorMessage,streaming:!1,sessionId:r?.sessionId??n.sessionId,userId:r?.userId??n.userId,traceId:s??r?.traceId,spanId:S(),parentSpanId:a??r?.currentSpanId,toolCallId:e.toolCallId,metadata:c,tags:n.tags};g(`Span captured: ${e.type}/${e.name}`,{durationMs:e.durationMs}),t.enqueue(d);}catch(t){C("unknown",t instanceof Error?t:new Error(String(t)));}}var se=1e5,Me=["api_key","apikey","password","secret","authorization"],Re=["access_token","auth_token","bearer_token","refresh_token","id_token","session_token"],Ae=["inputtokens","outputtokens","totaltokens","prompttokens","completiontokens","cachereadtokens","cachewritetokens","cachereadinputtokens","cachewriteinputtokens","reasoningtokens"],oe={emails:/[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g,phones:/\b\d{9,}\b/g};function ue(){return R().redaction??{}}function De(e){let t=e.toLowerCase();return 
Ae.includes(t)?false:!!(Me.some(r=>t.includes(r))||Re.some(r=>t.includes(r))||(ue().keys??[]).some(r=>t.includes(r.toLowerCase())))}function Oe(e){let t=ue(),n=e;for(let r of t.patterns??[])r.lastIndex=0,n=n.replace(r,"[REDACTED]");return t.emails&&(n=n.replace(oe.emails,"[EMAIL]")),t.phones&&(n=n.replace(oe.phones,"[PHONE]")),n}function U(e){return x(e,0)}function x(e,t){if(t>10)return "[max depth exceeded]";if(e==null)return e;if(typeof e=="string"){let n=Oe(e);return n.length>se&&(n=n.slice(0,se)+"...[truncated]"),n}if(typeof e=="number"||typeof e=="boolean")return e;if(Array.isArray(e))return e.map(n=>x(n,t+1));if(typeof e=="object"){let n={};for(let[r,s]of Object.entries(e))De(r)?n[r]="[REDACTED]":n[r]=x(s,t+1);return n}return String(e)}var B=Symbol.for("@lelemondev/sdk:traceStorage");function qe(){let e=globalThis;return e[B]||(e[B]=new async_hooks.AsyncLocalStorage),e[B]}var ce=qe();function S(){return typeof crypto<"u"&&crypto.randomUUID?crypto.randomUUID():`${Date.now().toString(36)}-${Math.random().toString(36).slice(2,11)}`}function m(){return ce.getStore()}function O(e,t){let n=m();if(n)for(let r of e)n.pendingToolCalls.set(r,t),g(`Registered tool call ${r} \u2192 LLM span ${t}`);}function $e(e){let t=m();if(t)return e&&t.pendingToolCalls.has(e)?t.pendingToolCalls.get(e):t.currentSpanId}function Pe(e){let t=m();t&&t.pendingToolCalls.delete(e);}async function Ke(e,t){let n=typeof e=="string"?{name:e}:e,r=m(),s=r?.traceId??S(),a=S(),u={traceId:s,rootSpanId:a,currentSpanId:a,parentSpanId:r?.currentSpanId,name:n.name,startTime:Date.now(),input:n.input,metadata:n.metadata,tags:n.tags,outputKey:n.outputKey,outputTransform:n.outputTransform,sessionId:n.sessionId,userId:n.userId,style:n.style,pendingToolCalls:new Map};return ce.run(u,async()=>{let c,d;try{return c=await t(),c}catch(o){throw d=o instanceof Error?o:new Error(String(o)),o}finally{Ue(u,d?void 0:c,d);}})}var Ne=["text","content","message","output","response","result","answer"];function 
Le(e,t,n){if(n)try{return n(e)}catch{return e}if(t===false)return e;if(typeof t=="string"&&e&&typeof e=="object"){let s=e;return t in s?s[t]:e}if(e==null||typeof e!="object"||Array.isArray(e))return e;let r=e;for(let s of Ne)if(s in r&&typeof r[s]=="string")return r[s];return e}function Ue(e,t,n){let r=w();if(!r.isEnabled()){g("Transport disabled, skipping root span");return}let s=I(),a=Date.now()-e.startTime,u=n?null:Le(t,e.outputKey,e.outputTransform),c={spanType:"agent",name:e.name,provider:"agent",model:e.name,traceId:e.traceId,spanId:e.rootSpanId,parentSpanId:e.parentSpanId,input:e.input,output:u,inputTokens:0,outputTokens:0,durationMs:a,status:n?"error":"success",errorMessage:n?.message,streaming:false,sessionId:e.sessionId??s.sessionId,userId:e.userId??s.userId,metadata:{...s.metadata,...e.metadata,...e.style?{_style:e.style}:{}},tags:e.tags??s.tags};g(`Sending root span: ${e.name}`,{durationMs:a,hasError:!!n}),r.enqueue(c);}function Be(e){let t=m();if(!t){process.env.NODE_ENV!=="production"&&console.warn("[Lelemon] span() called outside of trace() - span will not be captured");return}let n=$e(e.toolCallId);L({type:e.type,name:e.name,input:e.input,output:e.output,durationMs:e.durationMs??0,status:e.status??"success",errorMessage:e.errorMessage,toolCallId:e.toolCallId,metadata:{...e.metadata,_traceId:t.traceId,_parentSpanId:n,...e.style?{_style:e.style}:{}}}),e.toolCallId&&Pe(e.toolCallId);}var T="anthropic";function de(e){if(!e||typeof e!="object")return false;if(e.constructor?.name==="Anthropic")return true;let n=e;return !!(n.messages&&typeof n.messages=="object")}function le(e){return async function(...n){let r=Date.now(),s=n[0]||{},a=s.stream===true;try{let u=await e(...n);if(a&&Je(u))return Fe(u,s,r);let c=Date.now()-r,d=u,o=D({provider:T,model:s.model||d.model||"unknown",input:{system:s.system,messages:s.messages},rawResponse:u,durationMs:c,status:"success",streaming:!1});if(o&&d.content){let l=J(d.content);l.length>0&&O(l,o);}return u}catch(u){let 
c=Date.now()-r;throw _({provider:T,model:s.model||"unknown",input:{system:s.system,messages:s.messages},error:u instanceof Error?u:new Error(String(u)),durationMs:c,streaming:a}),u}}}function pe(e){return function(...n){let r=Date.now(),s=n[0]||{};try{let a=e(...n);return a&&typeof a=="object"?ze(a,s,r):a}catch(a){let u=Date.now()-r;throw _({provider:T,model:s.model||"unknown",input:{system:s.system,messages:s.messages},error:a instanceof Error?a:new Error(String(a)),durationMs:u,streaming:true}),a}}}function Je(e){return e!=null&&typeof e[Symbol.asyncIterator]=="function"}function ze(e,t,n){let r=e;if(!r[Symbol.asyncIterator])return e;let s={content:[],usage:{input_tokens:0,output_tokens:0}},a=false,u,c=false,d=null,o=[],l=async function*(){try{for await(let i of r){if(i.type==="message_start"&&i.message&&(s.model=i.message.model,s.id=i.message.id,s.role=i.message.role,i.message.usage&&(s.usage={...i.message.usage})),i.type==="content_block_start"&&i.content_block&&(d=i.index??o.length,o[d]={...i.content_block},i.content_block.type==="text"&&(o[d].text=""),i.content_block.type==="thinking"&&(o[d].thinking="")),i.type==="content_block_delta"&&i.delta&&d!==null){let p=o[d];p&&(i.delta.text&&(c||(c=!0,u=Date.now()-n),p.type==="thinking"?p.thinking=(p.thinking||"")+i.delta.text:p.type==="text"&&(p.text=(p.text||"")+i.delta.text)),i.delta.partial_json&&p.type==="tool_use"&&(p._inputJson=(p._inputJson||"")+i.delta.partial_json));}if(i.type==="content_block_stop"&&d!==null){let p=o[d];if(p&&p.type==="tool_use"){let b=p._inputJson;if(b){try{p.input=JSON.parse(b);}catch{}delete p._inputJson;}}d=null;}i.type==="message_delta"&&(i.usage?.output_tokens&&(s.usage.output_tokens=i.usage.output_tokens),i.delta?.stop_reason&&(s.stop_reason=i.delta.stop_reason)),yield i;}}catch(i){if(!a){a=true;let p=Date.now()-n;_({provider:T,model:s.model||t.model||"unknown",input:{system:t.system,messages:t.messages},error:i instanceof Error?i:new 
Error(String(i)),durationMs:p,streaming:true});}throw i}finally{if(!a){a=true;let i=Date.now()-n;s.content=o.filter(Boolean);let p=D({provider:T,model:s.model||t.model||"unknown",input:{system:t.system,messages:t.messages},rawResponse:s,durationMs:i,status:"success",streaming:true,firstTokenMs:u});if(p&&s.content){let b=J(s.content);b.length>0&&O(b,p);}}}};return new Proxy(e,{get(i,p,b){return p===Symbol.asyncIterator?()=>l()[Symbol.asyncIterator]():Reflect.get(i,p,b)}})}async function*Fe(e,t,n){let r={content:[],usage:{input_tokens:0,output_tokens:0}},s=null,a,u=false,c=null,d=[];try{for await(let o of e){if(o.type==="message_start"&&o.message&&(r.model=o.message.model,r.id=o.message.id,r.role=o.message.role,o.message.usage&&(r.usage={...o.message.usage})),o.type==="content_block_start"&&o.content_block&&(c=o.index??d.length,d[c]={...o.content_block},o.content_block.type==="text"&&(d[c].text=""),o.content_block.type==="thinking"&&(d[c].thinking="")),o.type==="content_block_delta"&&o.delta&&c!==null){let l=d[c];l&&(o.delta.text&&(u||(u=!0,a=Date.now()-n),l.type==="thinking"?l.thinking=(l.thinking||"")+o.delta.text:l.type==="text"&&(l.text=(l.text||"")+o.delta.text)),o.delta.partial_json&&l.type==="tool_use"&&(l._inputJson=(l._inputJson||"")+o.delta.partial_json));}if(o.type==="content_block_stop"&&c!==null){let l=d[c];if(l&&l.type==="tool_use"){let i=l._inputJson;if(i){try{l.input=JSON.parse(i);}catch{}delete l._inputJson;}}c=null;}o.type==="message_delta"&&(o.usage?.output_tokens&&(r.usage.output_tokens=o.usage.output_tokens),o.delta?.stop_reason&&(r.stop_reason=o.delta.stop_reason)),yield o;}}catch(o){throw s=o instanceof Error?o:new Error(String(o)),o}finally{let o=Date.now()-n;if(r.content=d.filter(Boolean),s)_({provider:T,model:r.model||t.model||"unknown",input:{system:t.system,messages:t.messages},error:s,durationMs:o,streaming:true});else {let 
l=D({provider:T,model:r.model||t.model||"unknown",input:{system:t.system,messages:t.messages},rawResponse:r,durationMs:o,status:"success",streaming:true,firstTokenMs:a});if(l&&r.content){let i=J(r.content);i.length>0&&O(i,l);}}}}function J(e){let t=[];for(let n of e)n.type==="tool_use"&&n.id&&t.push(n.id);return t}function Mt(e,t){return t&&ae(t),R().disabled?(g("Tracing disabled, returning unwrapped client"),e):de(e)?(V("anthropic"),Ve(e)):(E("Client is not an Anthropic client. Use @lelemondev/sdk/anthropic only with Anthropic SDK."),e)}function Ve(e){let t=e;return new Proxy(t,{get(n,r,s){let a=Reflect.get(n,r,s);return r==="messages"&&a&&typeof a=="object"?He(a):a}})}function He(e){return new Proxy(e,{get(t,n,r){let s=Reflect.get(t,n,r);return n==="create"&&typeof s=="function"?le(s.bind(t)):n==="stream"&&typeof s=="function"?pe(s.bind(t)):s}})}
+ exports.captureSpan=L;exports.flush=Ee;exports.getTraceContext=m;exports.init=Ie;exports.isEnabled=_e;exports.observe=Mt;exports.span=Be;exports.trace=Ke;//# sourceMappingURL=anthropic.js.map