teckel-ai 0.3.2 → 0.3.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,48 +1,425 @@
1
1
  # teckel-ai
2
2
 
3
+ TypeScript/JavaScript SDK for [Teckel AI](https://teckel.ai) - Monitor and improve your AI chatbot's accuracy and completeness.
4
+
3
5
  ## What is Teckel AI?
4
6
 
5
- Teckel AI helps you understand and improve your AI chatbots by analyzing conversations your users are having. We track what questions users ask, how well your AI answers them, and tell you which areas needs improvement.
7
+ Teckel AI helps you understand and improve your AI chatbots by analyzing conversations. Track what questions users ask, how well your AI answers them, and identify which areas need improvement.
6
8
 
7
- **Your Problem**: Your RAG or MCP referencing AI chat system gives incomplete or incorrect answers because your knowledge base has gaps. Users get frustrated, support tickets increase, and you don't know which areas to focus on.
9
+ **Your Problem:** Your RAG or AI chat system gives incomplete or incorrect answers because your knowledge base has gaps.
8
10
 
9
- **Our Service**: We analyze every AI response for accuracy and completeness, cluster similar queries to identify trending topics (or let you define an area of interest and see related questions), and tell you exactly which AI-retrieved documentation to add or update, prioritized by impact. We write your document templates for you to fill in with info your users need, and give you updates on your AI system's health via Slack.
11
+ **Our Service:** We analyze every AI response for accuracy and completeness, cluster similar queries to identify trending topics, and tell you exactly which documentation to add or update.
10
12
 
11
13
  ## What This SDK Does
12
14
 
13
- This lightweight SDK (one dependency, 20-minute integration) sends your AI conversations to Teckel for analysis. You get:
15
+ This lightweight SDK (zero dependencies, 20-minute integration) sends your AI conversations to Teckel for analysis:
14
16
 
15
- - **Completeness Scoring** - How well each response answers the user's question (0-100%)
17
+ - **Completeness Scoring** - How well each response answers the question (0-100%)
16
18
  - **Accuracy Analysis** - Whether AI claims are supported by your source documents
17
19
  - **Topic Intelligence** - Automatic clustering of queries to reveal documentation gaps
18
- - **Actionable Feedback** - Specific recommendations on what knowledge your AI needs to better answer user questions
20
+ - **Actionable Feedback** - Specific recommendations on what knowledge to add
19
21
 
20
- Works with any RAG system or AI conversational framework: LangChain, LlamaIndex, Vercel AI SDK, custom implementations using vector database retrieval, or direct OpenAI/Anthropic calls.
22
+ Works with any RAG system or AI framework: LangChain, LlamaIndex, Vercel AI SDK, or custom implementations.
21
23
 
22
- ## Getting Started
24
+ ## Installation
23
25
 
24
- 1. **Sign up** at [app.teckel.ai](https://app.teckel.ai) with a 14 day free trial
25
- 2. **Generate an API key** from your dashboard (Admin Panel > API Keys)
26
- 3. **Install the SDK**: `npm install teckel-ai`
27
- 4. **Follow the integration guide** at [docs.teckel.ai/docs/getting-started](https://docs.teckel.ai/docs/getting-started)
26
+ ```bash
27
+ npm install teckel-ai
28
+ ```
28
29
 
29
- ## Documentation
30
+ **Requirements:**
31
+ - Node.js 18+ (or Bun, Deno, serverless runtimes)
32
+ - TypeScript 4.5+ (optional but recommended)
30
33
 
31
- - [Getting Started Guide](https://docs.teckel.ai/docs/getting-started) - Step-by-step integration
32
- - [API & SDK Reference](https://docs.teckel.ai/docs/api-sdk-reference) - Complete field documentation and framework examples
33
- - [Troubleshooting](https://docs.teckel.ai/docs/troubleshooting) - Common issues and solutions
34
+ ## Quick Start
34
35
 
35
- ## Requirements
36
+ ```typescript
37
+ import { TeckelTracer } from 'teckel-ai';
36
38
 
37
- - Node.js 18+ (or Bun, Deno, serverless runtimes)
38
- - A Teckel AI account with API key
39
- - An LLM-based application (RAG system, chatbot, AI agent)
39
+ // Initialize once at startup
40
+ const tracer = new TeckelTracer({
41
+ apiKey: process.env.TECKEL_API_KEY
42
+ });
43
+
44
+ // In your API handler
45
+ async function handleChat(userQuestion: string, sessionId: string) {
46
+ // Start conversation
47
+ const conversation = tracer.start({
48
+ sessionRef: sessionId,
49
+ userId: 'user@example.com'
50
+ });
51
+
52
+ // Your existing RAG logic
53
+ const chunks = await vectorDB.search(userQuestion);
54
+ const answer = await llm.generate(userQuestion, chunks);
55
+
56
+ // Map chunks to Teckel format
57
+ const documents = chunks.map((chunk, index) => ({
58
+ documentRef: chunk.id,
59
+ documentName: chunk.title,
60
+ documentText: chunk.content,
61
+ documentLastUpdated: chunk.lastModified,
62
+ sourceUri: chunk.url,
63
+ similarity: chunk.score,
64
+ rank: index
65
+ }));
66
+
67
+ // Send trace (non-blocking)
68
+ conversation.trace({
69
+ query: userQuestion,
70
+ response: answer,
71
+ documents: documents,
72
+ model: 'gpt-5',
73
+ responseTimeMs: 1200
74
+ });
75
+
76
+ // For serverless: flush before returning
77
+ await conversation.flush(5000);
78
+
79
+ return answer;
80
+ }
81
+ ```
82
+
83
+ ## API Reference
84
+
85
+ ### TeckelTracer
86
+
87
+ Main SDK class.
88
+
89
+ #### Constructor
90
+
91
+ ```typescript
92
+ new TeckelTracer(config: TeckelConfig)
93
+ ```
94
+
95
+ | Field | Type | Required | Default | Description |
96
+ |-------|------|----------|---------|-------------|
97
+ | `apiKey` | string | Yes | - | Your Teckel API key (starts with `tk_live_`) |
98
+ | `endpoint` | string | No | `"https://app.teckel.ai/api"` | API base URL |
99
+ | `debug` | boolean | No | `false` | Enable debug logging |
100
+ | `timeoutMs` | number | No | `5000` | Network timeout in milliseconds |
101
+
102
+ #### tracer.start()
103
+
104
+ Start or continue a conversation.
105
+
106
+ ```typescript
107
+ tracer.start(options: ConversationOptions): Conversation
108
+ ```
109
+
110
+ | Field | Type | Required | Description |
111
+ |-------|------|----------|-------------|
112
+ | `sessionRef` | string | Yes | Your unique conversation identifier |
113
+ | `userId` | string | No | End-user identifier |
114
+ | `metadata` | object | No | Custom context |
115
+
116
+ ### Conversation
117
+
118
+ #### conversation.trace()
119
+
120
+ Record a query-response interaction. Fire-and-forget by default.
121
+
122
+ ```typescript
123
+ conversation.trace(data: TraceData): TraceResult
124
+ ```
125
+
126
+ **Returns:** `{ traceRef: string, turnNumber: number }`
127
+
128
+ | Field | Type | Required | Description |
129
+ |-------|------|----------|-------------|
130
+ | `query` | string | Yes | User's question |
131
+ | `response` | string | Yes | AI-generated answer |
132
+ | `documents` | Document[] | Recommended | Retrieved chunks (for RAG) |
133
+ | `traceRef` | string | No | Your correlation ID |
134
+ | `model` | string | No | LLM model (e.g., "gpt-5") |
135
+ | `responseTimeMs` | number | No | Latency in milliseconds |
136
+ | `tokens` | TokenUsage | No | Token usage |
137
+ | `metadata` | object | No | Custom context |
138
+
139
+ **Example:**
140
+
141
+ ```typescript
142
+ const result = conversation.trace({
143
+ query: "How do I reset my password?",
144
+ response: "Go to Settings > Security...",
145
+ model: "gpt-5",
146
+ documents: [
147
+ {
148
+ documentRef: "kb-123",
149
+ documentName: "Password Reset Guide",
150
+ documentText: "To reset your password...",
151
+ documentLastUpdated: "2025-01-15T10:00:00Z",
152
+ sourceUri: "https://kb.example.com/security",
153
+ similarity: 0.92,
154
+ rank: 0
155
+ }
156
+ ]
157
+ });
158
+
159
+ console.log(result.traceRef); // "session-123:1"
160
+ ```
161
+
162
+ #### conversation.feedback()
163
+
164
+ Add user feedback signal.
165
+
166
+ ```typescript
167
+ await conversation.feedback(data: FeedbackData): Promise<void>
168
+ ```
169
+
170
+ | Field | Type | Required | Description |
171
+ |-------|------|----------|-------------|
172
+ | `type` | FeedbackType | Yes | `"thumbs_up"`, `"thumbs_down"`, `"flag"`, or `"rating"` |
173
+ | `value` | string | No | For ratings: `"1"` to `"5"` |
174
+ | `comment` | string | No | User's explanation |
175
+ | `traceRef` | string | No | Link to specific trace |
176
+
177
+ **Example:**
178
+
179
+ ```typescript
180
+ await conversation.feedback({
181
+ type: "thumbs_down",
182
+ comment: "Information was outdated"
183
+ });
184
+ ```
185
+
186
+ #### conversation.flush()
187
+
188
+ Wait for queued traces to send. **Required for serverless** to prevent data loss.
189
+
190
+ ```typescript
191
+ await conversation.flush(timeoutMs?: number): Promise<void>
192
+ ```
193
+
194
+ **Throws:** Error on timeout
195
+
196
+ **Example:**
197
+
198
+ ```typescript
199
+ // Serverless: flush before returning
200
+ try {
201
+ await conversation.flush(5000);
202
+ } catch (err) {
203
+ logger.warn('Flush timeout', { err });
204
+ }
205
+ ```
206
+
207
+ #### conversation.end()
208
+
209
+ End conversation and flush pending traces.
210
+
211
+ ```typescript
212
+ await conversation.end(): Promise<void>
213
+ ```
214
+
215
+ #### Read-only Properties
216
+
217
+ ```typescript
218
+ conversation.id // session reference
219
+ conversation.turns // number of traces
220
+ conversation.started // start time
221
+ ```
222
+
223
+ ## Type Definitions
224
+
225
+ ### Document
226
+
227
+ ```typescript
228
+ interface Document {
229
+ // Required
230
+ documentRef: string; // Your document ID
231
+ documentName: string; // Human-readable name
232
+ documentText: string; // Chunk content
233
+
234
+ // Recommended
235
+ documentLastUpdated?: string; // ISO 8601 timestamp
236
+ sourceUri?: string; // URL or path
237
+
238
+ // Optional
239
+ sourceType?: string; // e.g., 'confluence', 'slack'
240
+ similarity?: number; // 0-1 score
241
+ rank?: number; // Position (0 = first)
242
+ ownerEmail?: string; // Owner email
243
+ documentType?: string; // e.g., 'pdf', 'markdown'
244
+ }
245
+ ```
246
+
247
+ ### TokenUsage
248
+
249
+ ```typescript
250
+ interface TokenUsage {
251
+ prompt: number; // Input tokens
252
+ completion: number; // Output tokens
253
+ total: number; // Total tokens
254
+ }
255
+ ```
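Provider SDKs usually report usage under their own field names. A minimal sketch of adapting an OpenAI-style usage object to this shape (the `prompt_tokens`/`completion_tokens` field names are an assumption about your LLM provider, not part of this SDK):

```typescript
interface TokenUsage {
  prompt: number;     // Input tokens
  completion: number; // Output tokens
  total: number;      // Total tokens
}

// Hypothetical adapter: map OpenAI-style usage fields to TokenUsage.
function toTokenUsage(usage: {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens?: number;
}): TokenUsage {
  return {
    prompt: usage.prompt_tokens,
    completion: usage.completion_tokens,
    // Fall back to the sum if the provider omits a total
    total: usage.total_tokens ?? usage.prompt_tokens + usage.completion_tokens,
  };
}
```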
256
+
257
+ ### FeedbackType
258
+
259
+ ```typescript
260
+ type FeedbackType = 'thumbs_up' | 'thumbs_down' | 'flag' | 'rating'
261
+ ```
262
+
263
+ ## Usage Patterns
264
+
265
+ ### Serverless (Vercel, Lambda, Cloudflare Workers)
266
+
267
+ **Must** call `flush()` before returning to prevent data loss.
268
+
269
+ ```typescript
270
+ const tracer = new TeckelTracer({
271
+ apiKey: process.env.TECKEL_API_KEY
272
+ });
273
+
274
+ export async function handler(request) {
275
+ const conversation = tracer.start({
276
+ sessionRef: request.sessionId
277
+ });
278
+
279
+ const answer = await generateAnswer(request.question);
280
+
281
+ conversation.trace({
282
+ query: request.question,
283
+ response: answer,
284
+ documents: retrievedDocs
285
+ });
286
+
287
+ // CRITICAL: Flush before returning (3-5s recommended)
288
+ await conversation.flush(5000);
289
+
290
+ return { statusCode: 200, body: answer };
291
+ }
292
+ ```
293
+
294
+ ### Long-Running Servers (Express, Fastify, etc.)
295
+
296
+ No `flush()` needed - traces send in background.
297
+
298
+ ```typescript
299
+ const tracer = new TeckelTracer({
300
+ apiKey: process.env.TECKEL_API_KEY
301
+ });
302
+
303
+ app.post('/chat', async (req, res) => {
304
+ const conversation = tracer.start({
305
+ sessionRef: req.session.id
306
+ });
307
+
308
+ const answer = await generateAnswer(req.body.question);
309
+
310
+ conversation.trace({
311
+ query: req.body.question,
312
+ response: answer,
313
+ documents: retrievedDocs
314
+ });
315
+
316
+ // No flush needed
317
+ res.json({ answer });
318
+ });
319
+ ```
320
+
321
+ ### Non-RAG Systems
322
+
323
+ Omit `documents` if not using retrieval.
324
+
325
+ ```typescript
326
+ conversation.trace({
327
+ query: userQuestion,
328
+ response: answer,
329
+ model: 'gpt-5'
330
+ });
331
+ ```
332
+
333
+ ## SDK Behavior
334
+
335
+ ### Error Handling
336
+
337
+ - `trace()`, `feedback()`, `end()` **never throw** - failures logged in debug mode
338
+ - `flush()` **throws on timeout** - catch to monitor potential data loss
339
+
340
+ ### Retry Logic
341
+
342
+ Automatically retries once on transient errors:
343
+ - HTTP 429 (rate limit)
344
+ - HTTP 5xx (server errors)
345
+ - Network failures
346
+
347
+ **Retry pattern:**
348
+ 1. Initial attempt
349
+ 2. Wait 250-350ms (jittered)
350
+ 3. Single retry
351
+ 4. Log failure (debug mode)
352
+
353
+ **Worst-case total time:** `2 * timeoutMs + ~300ms`
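The pattern above can be sketched as a small wrapper (a hypothetical illustration of the policy, not the SDK's internal code):

```typescript
// Sketch of the single-retry policy: one attempt, a 250-350ms jittered
// wait on a transient failure, then exactly one retry.
async function sendWithSingleRetry<T>(
  attempt: () => Promise<T>,
  isTransient: (err: unknown) => boolean,
): Promise<T> {
  try {
    return await attempt();
  } catch (err) {
    if (!isTransient(err)) throw err;
    const delayMs = 250 + Math.random() * 100; // jittered 250-350ms
    await new Promise((resolve) => setTimeout(resolve, delayMs));
    return attempt(); // single retry; a second failure propagates
  }
}
```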
354
+
355
+ ### Timeouts and Flush
356
+
357
+ - `timeoutMs`: Per-request network timeout for SDK HTTP calls. If a request exceeds this, it is aborted. With one retry, total worst-case send time is approximately `2 * timeoutMs + ~300ms`.
358
+ - `flush(timeoutMs?)`: Bounded wait for the internal send queue to drain. In serverless, call this before returning to avoid data loss. If no argument is passed, it uses the SDK `timeoutMs` value.
359
+ - Recommendation for serverless: call `await conversation.flush(timeoutMs)` with a timeout of 3000-5000 ms to balance reliability and latency.
360
+ - `end()`: Convenience that flushes pending sends and marks the conversation as ended. It throws on flush timeout just like `flush()`.
361
+
362
+ ### Rate Limits
363
+
364
+ - **Default:** 1,000 requests/hour per organization
365
+ - **Reset:** Top of each hour
366
+ - **Headers:** `X-RateLimit-Limit`, `X-RateLimit-Remaining`, `X-RateLimit-Reset`
367
+
368
+ Contact support@teckel.ai for increases.
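A small sketch of reading these headers from a `fetch` response, using the standard `Headers` API (the header names come from the list above; the epoch-seconds interpretation of the reset value is an assumption):

```typescript
interface RateLimitInfo {
  limit: number | null;      // requests allowed per window
  remaining: number | null;  // requests left in the current window
  reset: number | null;      // when the window resets (epoch seconds, assumed)
}

// Parse Teckel's rate-limit headers; missing headers yield null.
function parseRateLimit(headers: Headers): RateLimitInfo {
  const num = (name: string): number | null => {
    const raw = headers.get(name);
    return raw === null ? null : Number(raw);
  };
  return {
    limit: num('X-RateLimit-Limit'),
    remaining: num('X-RateLimit-Remaining'),
    reset: num('X-RateLimit-Reset'),
  };
}
```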
369
+
370
+ ## Runtime Compatibility
371
+
372
+ Uses standard Web APIs (`fetch`, `AbortSignal`):
373
+
374
+ - ✅ Node.js 18+
375
+ - ✅ Bun 1.0+
376
+ - ✅ Deno 1.35+ (`npm:teckel-ai`)
377
+ - ✅ Cloudflare Workers
378
+ - ✅ AWS Lambda
379
+ - ✅ Vercel Edge Runtime
380
+ - ✅ Google Cloud Run
381
+
382
+ **Security:** Never expose API keys in browser code. Always call from server/serverless backend.
383
+
384
+ ## Best Practices
385
+
386
+ 1. Initialize `TeckelTracer` once at startup, reuse across requests
387
+ 2. Always call `flush()` in serverless before returning
388
+ 3. Include `documentLastUpdated` when available (enables freshness scoring)
389
+ 4. Use consistent `sessionRef` and `traceRef` for tracking
390
+ 5. Include `model`, `responseTimeMs`, `tokens` for insights
391
+ 6. Set `debug: false` in production
392
+ 7. Call `conversation.end()` when chat session completes
393
+
394
+ ## Troubleshooting
395
+
396
+ **Traces not appearing?**
397
+ 1. Verify API key starts with `tk_live_`
398
+ 2. Check network connectivity to `https://app.teckel.ai/api`
399
+ 3. Enable `debug: true` to see errors
400
+ 4. Look for validation errors in console
401
+
402
+ **Serverless traces dropping?**
403
+ 1. Ensure `await conversation.flush(5000)` before returning
404
+ 2. Monitor flush timeout errors in logs
405
+ 3. Increase timeout if needed (up to 5s for slow networks)
406
+
407
+ **High latency?**
408
+ 1. Verify `trace()` is non-blocking (don't await it)
409
+ 2. Check `timeoutMs` configuration (default 5000ms)
410
+ 3. Review network connectivity
411
+
412
+ ## Documentation
413
+
414
+ - **Getting Started:** [docs.teckel.ai/docs/getting-started](https://docs.teckel.ai/docs/getting-started)
415
+ - **Complete SDK Reference:** [docs.teckel.ai/docs/typescript-sdk-reference](https://docs.teckel.ai/docs/typescript-sdk-reference)
416
+ - **HTTP API Reference:** [docs.teckel.ai/docs/http-api-reference](https://docs.teckel.ai/docs/http-api-reference)
417
+ - **Dashboard:** [app.teckel.ai](https://app.teckel.ai)
40
418
 
41
419
  ## Support
42
420
 
43
- - **Documentation**: [docs.teckel.ai](https://docs.teckel.ai)
44
- - **Dashboard**: [app.teckel.ai](https://app.teckel.ai)
45
- - **Email**: support@teckel.ai
421
+ - **Email:** support@teckel.ai
422
+ - **Issues:** Report bugs or request features via GitHub issues
46
423
 
47
424
  ## License
48
425
 
@@ -50,4 +427,4 @@ MIT
50
427
 
51
428
  ---
52
429
 
53
- Version 0.3.2
430
+ Version 0.3.3
package/dist/conversation.d.ts CHANGED
@@ -1,5 +1,5 @@
1
1
  /**
2
- * Conversation class for teckel-ai SDK v0.3.1
2
+ * Conversation class for teckel-ai SDK v0.3.2
3
3
  * Manages a single conversation with fire-and-forget semantics
4
4
  */
5
5
  import type { TraceData, FeedbackData, ConversationOptions, TraceResult } from './types';
@@ -24,14 +24,6 @@ export declare class Conversation {
24
24
  * For serverless, call flush() before function termination
25
25
  */
26
26
  trace(data: TraceData): TraceResult | void;
27
- /**
28
- * Synchronous trace: waits for HTTP send to complete.
29
- * Throws on network/validation errors.
30
- * Respects send queue to maintain trace ordering.
31
- */
32
- traceSync(data: TraceData, opt?: {
33
- timeoutMs?: number;
34
- }): Promise<TraceResult>;
35
27
  /**
36
28
  * Add user feedback signal
37
29
  * Never throws - gracefully handles errors
@@ -60,5 +52,4 @@ export declare class Conversation {
60
52
  * Returns when the queue is empty or the timeout elapses (whichever comes first).
61
53
  */
62
54
  flush(timeoutMs?: number): Promise<void>;
63
- private mapOTelAliases;
64
55
  }
package/dist/conversation.js CHANGED
@@ -1,6 +1,6 @@
1
1
  "use strict";
2
2
  /**
3
- * Conversation class for teckel-ai SDK v0.3.1
3
+ * Conversation class for teckel-ai SDK v0.3.2
4
4
  * Manages a single conversation with fire-and-forget semantics
5
5
  */
6
6
  Object.defineProperty(exports, "__esModule", { value: true });
@@ -41,24 +41,22 @@ class Conversation {
41
41
  try {
42
42
  // Validate with Zod
43
43
  const validated = schemas_1.TraceDataSchema.parse(data);
44
- // Apply OTel alias mapping (beta): infer canonical fields from metadata if missing
45
- const mapped = this.mapOTelAliases(validated);
46
44
  // Increment local turn number immediately (client-side counter)
47
45
  const localTurn = ++this.turnCount;
48
46
  // Ensure we have a client-provided traceRef; if missing, generate a deterministic one
49
- const finalTraceRef = mapped.traceRef && mapped.traceRef.length > 0
50
- ? mapped.traceRef
47
+ const finalTraceRef = validated.traceRef && validated.traceRef.length > 0
48
+ ? validated.traceRef
51
49
  : `${this.sessionRef}:${localTurn}`;
52
50
  if (this.debug) {
53
51
  console.log('[Teckel] Queueing trace (fire-and-forget):', {
54
52
  sessionRef: this.sessionRef,
55
53
  turnNumber: localTurn,
56
- queryLength: mapped.query.length,
57
- responseLength: mapped.response.length,
58
- documentCount: ((_a = mapped.documents) === null || _a === void 0 ? void 0 : _a.length) || 0
54
+ queryLength: validated.query.length,
55
+ responseLength: validated.response.length,
56
+ documentCount: ((_a = validated.documents) === null || _a === void 0 ? void 0 : _a.length) || 0
59
57
  });
60
58
  }
61
- const toSend = Object.assign(Object.assign({}, mapped), { traceRef: finalTraceRef });
59
+ const toSend = Object.assign(Object.assign({}, validated), { traceRef: finalTraceRef });
62
60
  // Fire-and-forget: enqueue and return immediately
63
61
  this.enqueueSend(async () => {
64
62
  try {
@@ -67,7 +65,8 @@ class Conversation {
67
65
  }
68
66
  catch (err) {
69
67
  if (this.debug) {
70
- console.warn('[Teckel] Trace send failed (non-blocking):', (err === null || err === void 0 ? void 0 : err.message) || err);
68
+ const msg = err instanceof Error ? err.message : String(err);
69
+ console.warn('[Teckel] Trace send failed (non-blocking):', msg);
71
70
  }
72
71
  }
73
72
  });
@@ -80,36 +79,6 @@ class Conversation {
80
79
  }
81
80
  }
82
81
  }
83
- /**
84
- * Synchronous trace: waits for HTTP send to complete.
85
- * Throws on network/validation errors.
86
- * Respects send queue to maintain trace ordering.
87
- */
88
- async traceSync(data, opt) {
89
- // Validate with Zod
90
- const validated = schemas_1.TraceDataSchema.parse(data);
91
- // Apply OTel alias mapping
92
- const mapped = this.mapOTelAliases(validated);
93
- // Increment local turn and compute final traceRef
94
- const localTurn = ++this.turnCount;
95
- const finalTraceRef = mapped.traceRef && mapped.traceRef.length > 0
96
- ? mapped.traceRef
97
- : `${this.sessionRef}:${localTurn}`;
98
- const toSend = Object.assign(Object.assign({}, mapped), { traceRef: finalTraceRef });
99
- try {
100
- await this.startPromise;
101
- // Wait for any queued operations to complete first (maintains ordering)
102
- await this.sendQueue.catch(() => { });
103
- await this._sendTrace(toSend, { timeoutMs: opt === null || opt === void 0 ? void 0 : opt.timeoutMs });
104
- return { traceRef: finalTraceRef, turnNumber: localTurn };
105
- }
106
- catch (err) {
107
- if (this.debug) {
108
- console.warn('[Teckel] Sync trace failed:', (err === null || err === void 0 ? void 0 : err.message) || err);
109
- }
110
- throw err;
111
- }
112
- }
113
82
  /**
114
83
  * Add user feedback signal
115
84
  * Never throws - gracefully handles errors
@@ -131,7 +100,8 @@ class Conversation {
131
100
  }
132
101
  catch (err) {
133
102
  if (this.debug) {
134
- console.warn('[Teckel] Feedback failed:', (err === null || err === void 0 ? void 0 : err.message) || err);
103
+ const msg = err instanceof Error ? err.message : String(err);
104
+ console.warn('[Teckel] Feedback failed:', msg);
135
105
  }
136
106
  }
137
107
  });
@@ -164,7 +134,8 @@ class Conversation {
164
134
  }
165
135
  catch (err) {
166
136
  if (this.debug) {
167
- console.warn('[Teckel] End failed:', (err === null || err === void 0 ? void 0 : err.message) || err);
137
+ const msg = err instanceof Error ? err.message : String(err);
138
+ console.warn('[Teckel] End failed:', msg);
168
139
  }
169
140
  }
170
141
  });
@@ -227,7 +198,7 @@ class Conversation {
227
198
  'Content-Type': 'application/json'
228
199
  },
229
200
  keepalive: true,
230
- signal: AbortSignal.timeout ? AbortSignal.timeout(this.timeoutMs) : undefined,
201
+ signal: getTimeoutSignal(this.timeoutMs),
231
202
  body: JSON.stringify({
232
203
  sessionRef: this.sessionRef,
233
204
  userId: this.userId,
@@ -247,7 +218,7 @@ class Conversation {
247
218
  'Content-Type': 'application/json'
248
219
  },
249
220
  keepalive: true,
250
- signal: AbortSignal.timeout ? AbortSignal.timeout((_a = opt === null || opt === void 0 ? void 0 : opt.timeoutMs) !== null && _a !== void 0 ? _a : this.timeoutMs) : undefined,
221
+ signal: getTimeoutSignal((_a = opt === null || opt === void 0 ? void 0 : opt.timeoutMs) !== null && _a !== void 0 ? _a : this.timeoutMs),
251
222
  body: JSON.stringify(data)
252
223
  }, { retries: 1, retryDelayMs: 300 });
253
224
  if (!response.ok) {
@@ -264,7 +235,7 @@ class Conversation {
264
235
  'Content-Type': 'application/json'
265
236
  },
266
237
  keepalive: true,
267
- signal: AbortSignal.timeout ? AbortSignal.timeout(this.timeoutMs) : undefined,
238
+ signal: getTimeoutSignal(this.timeoutMs),
268
239
  body: JSON.stringify(data)
269
240
  }, { retries: 1, retryDelayMs: 300 });
270
241
  if (!response.ok) {
@@ -279,7 +250,7 @@ class Conversation {
279
250
  'Content-Type': 'application/json'
280
251
  },
281
252
  keepalive: true,
282
- signal: AbortSignal.timeout ? AbortSignal.timeout(this.timeoutMs) : undefined,
253
+ signal: getTimeoutSignal(this.timeoutMs),
283
254
  body: JSON.stringify({
284
255
  durationMs: duration,
285
256
  turnCount: this.turnCount
@@ -316,7 +287,8 @@ class Conversation {
316
287
  }
317
288
  catch (err) {
318
289
  if (this.debug) {
319
- console.warn('[Teckel] Flush incomplete:', (err === null || err === void 0 ? void 0 : err.message) || err);
290
+ const msg = err instanceof Error ? err.message : String(err);
291
+ console.warn('[Teckel] Flush incomplete:', msg);
320
292
  }
321
293
  // Surface timeout so callers can react in serverless
322
294
  throw err;
@@ -326,38 +298,13 @@ class Conversation {
326
298
  clearTimeout(timer);
327
299
  }
328
300
  }
329
- // Utility: map OpenTelemetry-like metadata to canonical fields (beta)
330
- mapOTelAliases(data) {
331
- try {
332
- const meta = data.metadata;
333
- if (!meta)
334
- return data;
335
- const mapped = Object.assign({}, data);
336
- // Model
337
- if (!mapped.model && typeof meta['gen_ai.request.model'] === 'string') {
338
- mapped.model = meta['gen_ai.request.model'];
339
- }
340
- // Tokens
341
- const inTok = meta['gen_ai.usage.input_tokens'];
342
- const outTok = meta['gen_ai.usage.output_tokens'];
343
- const totTok = meta['gen_ai.usage.total_tokens'];
344
- if (!mapped.tokens && (typeof inTok === 'number' || typeof outTok === 'number')) {
345
- mapped.tokens = {
346
- prompt: typeof inTok === 'number' ? inTok : 0,
347
- completion: typeof outTok === 'number' ? outTok : 0,
348
- total: typeof totTok === 'number' ? totTok : ((typeof inTok === 'number' ? inTok : 0) + (typeof outTok === 'number' ? outTok : 0))
349
- };
350
- }
351
- // Latency
352
- const latency = meta['gen_ai.response.latency'];
353
- if (!mapped.responseTimeMs && typeof latency === 'number') {
354
- mapped.responseTimeMs = latency;
355
- }
356
- return mapped;
357
- }
358
- catch (_a) {
359
- return data;
360
- }
361
- }
362
301
  }
363
302
  exports.Conversation = Conversation;
303
+ // Helper: get AbortSignal.timeout if available
304
+ function getTimeoutSignal(ms) {
305
+ const timeout = AbortSignal.timeout;
306
+ if (typeof timeout === 'function' && typeof ms === 'number') {
307
+ return timeout(ms);
308
+ }
309
+ return undefined;
310
+ }
package/dist/index.d.ts CHANGED
@@ -1,5 +1,5 @@
1
1
  /**
2
- * teckel-ai v0.3.1
2
+ * teckel-ai v0.3.3
3
3
  * Simple SDK for AI conversation tracking and RAG observability
4
4
  *
5
5
  * @packageDocumentation
package/dist/index.js CHANGED
@@ -1,6 +1,6 @@
1
1
  "use strict";
2
2
  /**
3
- * teckel-ai v0.3.1
3
+ * teckel-ai v0.3.3
4
4
  * Simple SDK for AI conversation tracking and RAG observability
5
5
  *
6
6
  * @packageDocumentation
package/dist/schemas.d.ts CHANGED
@@ -1,5 +1,5 @@
1
1
  /**
2
- * Zod validation schemas for teckel-ai SDK v0.3.1
2
+ * Zod validation schemas for teckel-ai SDK v0.3.2
3
3
  */
4
4
  import { z } from 'zod';
5
5
  /**
@@ -110,7 +110,7 @@ export declare const TraceDataSchema: z.ZodObject<{
110
110
  completion: number;
111
111
  total: number;
112
112
  }>>;
113
- metadata: z.ZodOptional<z.ZodRecord<z.ZodString, z.ZodAny>>;
113
+ metadata: z.ZodOptional<z.ZodRecord<z.ZodString, z.ZodUnknown>>;
114
114
  traceRef: z.ZodOptional<z.ZodString>;
115
115
  }, "strip", z.ZodTypeAny, {
116
116
  query: string;
@@ -134,7 +134,7 @@ export declare const TraceDataSchema: z.ZodObject<{
134
134
  completion: number;
135
135
  total: number;
136
136
  } | undefined;
137
- metadata?: Record<string, any> | undefined;
137
+ metadata?: Record<string, unknown> | undefined;
138
138
  traceRef?: string | undefined;
139
139
  }, {
140
140
  query: string;
@@ -158,7 +158,7 @@ export declare const TraceDataSchema: z.ZodObject<{
158
158
  completion: number;
159
159
  total: number;
160
160
  } | undefined;
161
- metadata?: Record<string, any> | undefined;
161
+ metadata?: Record<string, unknown> | undefined;
162
162
  traceRef?: string | undefined;
163
163
  }>;
164
164
  /**
@@ -167,14 +167,14 @@ export declare const TraceDataSchema: z.ZodObject<{
167
167
  export declare const ConversationOptionsSchema: z.ZodObject<{
168
168
  sessionRef: z.ZodString;
169
169
  userId: z.ZodOptional<z.ZodString>;
170
- metadata: z.ZodOptional<z.ZodRecord<z.ZodString, z.ZodAny>>;
170
+ metadata: z.ZodOptional<z.ZodRecord<z.ZodString, z.ZodUnknown>>;
171
171
  }, "strip", z.ZodTypeAny, {
172
172
  sessionRef: string;
173
- metadata?: Record<string, any> | undefined;
173
+ metadata?: Record<string, unknown> | undefined;
174
174
  userId?: string | undefined;
175
175
  }, {
176
176
  sessionRef: string;
177
- metadata?: Record<string, any> | undefined;
177
+ metadata?: Record<string, unknown> | undefined;
178
178
  userId?: string | undefined;
179
179
  }>;
180
180
  /**
package/dist/schemas.js CHANGED
@@ -1,6 +1,6 @@
1
1
  "use strict";
2
2
  /**
3
- * Zod validation schemas for teckel-ai SDK v0.3.1
3
+ * Zod validation schemas for teckel-ai SDK v0.3.2
4
4
  */
5
5
  Object.defineProperty(exports, "__esModule", { value: true });
6
6
  exports.TeckelConfigSchema = exports.FeedbackDataSchema = exports.ConversationOptionsSchema = exports.TraceDataSchema = exports.TokenUsageSchema = exports.DocumentSchema = void 0;
@@ -38,7 +38,7 @@ exports.TraceDataSchema = zod_1.z.object({
38
38
  responseTimeMs: zod_1.z.number().positive().optional(),
39
39
  documents: zod_1.z.array(exports.DocumentSchema).max(50, 'Too many documents (max 50)').optional(),
40
40
  tokens: exports.TokenUsageSchema.optional(),
41
- metadata: zod_1.z.record(zod_1.z.string(), zod_1.z.any()).optional(),
41
+ metadata: zod_1.z.record(zod_1.z.string(), zod_1.z.unknown()).optional(),
42
42
  traceRef: zod_1.z.string().min(1).optional()
43
43
  });
44
44
  /**
@@ -47,7 +47,7 @@ exports.TraceDataSchema = zod_1.z.object({
47
47
  exports.ConversationOptionsSchema = zod_1.z.object({
48
48
  sessionRef: zod_1.z.string().min(1, 'sessionRef is required'),
49
49
  userId: zod_1.z.string().optional(),
50
- metadata: zod_1.z.record(zod_1.z.string(), zod_1.z.any()).optional()
50
+ metadata: zod_1.z.record(zod_1.z.string(), zod_1.z.unknown()).optional()
51
51
  });
52
52
  /**
53
53
  * Feedback data schema
package/dist/tracer.d.ts CHANGED
@@ -1,5 +1,5 @@
1
1
  /**
2
- * TeckelTracer - Main SDK class for teckel-ai v0.3.1
2
+ * TeckelTracer - Main SDK class for teckel-ai v0.3.2
3
3
  * Simple, lightweight SDK for AI conversation tracking
4
4
  */
5
5
  import { Conversation } from './conversation';
@@ -12,7 +12,7 @@ export declare class TeckelTracer {
12
12
  constructor(config: TeckelConfig);
13
13
  /**
14
14
  * Start a new conversation
15
- * sessionRef IS the public conversation identifier (sessionId is legacy alias)
15
+ * sessionRef IS the public conversation identifier
16
16
  */
17
17
  start(options: ConversationOptions): Conversation;
18
18
  }
package/dist/tracer.js CHANGED
@@ -1,6 +1,6 @@
1
1
  "use strict";
2
2
  /**
3
- * TeckelTracer - Main SDK class for teckel-ai v0.3.1
3
+ * TeckelTracer - Main SDK class for teckel-ai v0.3.2
4
4
  * Simple, lightweight SDK for AI conversation tracking
5
5
  */
6
6
  Object.defineProperty(exports, "__esModule", { value: true });
@@ -24,14 +24,14 @@ class TeckelTracer {
24
24
  if (this.debug) {
25
25
  console.log('[Teckel] SDK initialized:', {
26
26
  endpoint: this.endpoint,
27
- version: '0.3.1',
27
+ version: '0.3.2',
28
28
  timeoutMs: this.timeoutMs
29
29
  });
30
30
  }
31
31
  }
32
32
  /**
33
33
  * Start a new conversation
34
- * sessionRef IS the public conversation identifier (sessionId is legacy alias)
34
+ * sessionRef IS the public conversation identifier
35
35
  */
36
36
  start(options) {
37
37
  // Validate options with Zod
package/dist/types.d.ts CHANGED
@@ -1,5 +1,5 @@
1
1
  /**
2
- * Type definitions for teckel-ai SDK v0.3.1
2
+ * Type definitions for teckel-ai SDK v0.3.2
3
3
  * Simple, clean types matching existing database schema
4
4
  */
5
5
  /**
@@ -18,7 +18,7 @@ export interface TeckelConfig {
18
18
  export interface ConversationOptions {
19
19
  sessionRef: string;
20
20
  userId?: string;
21
- metadata?: Record<string, any>;
21
+ metadata?: Record<string, unknown>;
22
22
  }
23
23
  /**
24
24
  * Document structure for RAG systems
@@ -55,7 +55,7 @@ export interface TraceData {
55
55
  responseTimeMs?: number;
56
56
  documents?: Document[];
57
57
  tokens?: TokenUsage;
58
- metadata?: Record<string, any>;
58
+ metadata?: Record<string, unknown>;
59
59
  traceRef?: string;
60
60
  }
61
61
  /**
package/dist/types.js CHANGED
@@ -1,6 +1,6 @@
1
1
  "use strict";
2
2
  /**
3
- * Type definitions for teckel-ai SDK v0.3.1
3
+ * Type definitions for teckel-ai SDK v0.3.2
4
4
  * Simple, clean types matching existing database schema
5
5
  */
6
6
  Object.defineProperty(exports, "__esModule", { value: true });
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "teckel-ai",
3
- "version": "0.3.2",
3
+ "version": "0.3.3",
4
4
  "description": "Simple SDK for AI conversation tracking and RAG observability",
5
5
  "main": "dist/index.js",
6
6
  "types": "dist/index.d.ts",