@revenium/openai 1.0.11 → 1.0.12

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (66)
  1. package/.env.example +20 -0
  2. package/CHANGELOG.md +21 -47
  3. package/README.md +141 -690
  4. package/dist/cjs/core/config/loader.js +1 -1
  5. package/dist/cjs/core/config/loader.js.map +1 -1
  6. package/dist/cjs/core/tracking/api-client.js +1 -1
  7. package/dist/cjs/core/tracking/api-client.js.map +1 -1
  8. package/dist/cjs/index.js +2 -2
  9. package/dist/cjs/index.js.map +1 -1
  10. package/dist/cjs/utils/url-builder.js +32 -7
  11. package/dist/cjs/utils/url-builder.js.map +1 -1
  12. package/dist/esm/core/config/loader.js +1 -1
  13. package/dist/esm/core/config/loader.js.map +1 -1
  14. package/dist/esm/core/tracking/api-client.js +1 -1
  15. package/dist/esm/core/tracking/api-client.js.map +1 -1
  16. package/dist/esm/index.js +2 -2
  17. package/dist/esm/index.js.map +1 -1
  18. package/dist/esm/utils/url-builder.js +32 -7
  19. package/dist/esm/utils/url-builder.js.map +1 -1
  20. package/dist/types/index.d.ts +2 -2
  21. package/dist/types/types/index.d.ts +2 -2
  22. package/dist/types/types/index.d.ts.map +1 -1
  23. package/dist/types/utils/url-builder.d.ts +11 -3
  24. package/dist/types/utils/url-builder.d.ts.map +1 -1
  25. package/examples/README.md +250 -254
  26. package/examples/azure-basic.ts +25 -13
  27. package/examples/azure-responses-basic.ts +36 -7
  28. package/examples/azure-responses-streaming.ts +36 -7
  29. package/examples/azure-streaming.ts +40 -19
  30. package/examples/getting_started.ts +54 -0
  31. package/examples/openai-basic.ts +39 -17
  32. package/examples/openai-function-calling.ts +259 -0
  33. package/examples/openai-responses-basic.ts +36 -7
  34. package/examples/openai-responses-streaming.ts +36 -7
  35. package/examples/openai-streaming.ts +24 -13
  36. package/examples/openai-vision.ts +289 -0
  37. package/package.json +3 -9
  38. package/src/core/config/azure-config.ts +72 -0
  39. package/src/core/config/index.ts +23 -0
  40. package/src/core/config/loader.ts +66 -0
  41. package/src/core/config/manager.ts +94 -0
  42. package/src/core/config/validator.ts +89 -0
  43. package/src/core/providers/detector.ts +159 -0
  44. package/src/core/providers/index.ts +16 -0
  45. package/src/core/tracking/api-client.ts +78 -0
  46. package/src/core/tracking/index.ts +21 -0
  47. package/src/core/tracking/payload-builder.ts +132 -0
  48. package/src/core/tracking/usage-tracker.ts +189 -0
  49. package/src/core/wrapper/index.ts +9 -0
  50. package/src/core/wrapper/instance-patcher.ts +288 -0
  51. package/src/core/wrapper/request-handler.ts +423 -0
  52. package/src/core/wrapper/stream-wrapper.ts +100 -0
  53. package/src/index.ts +336 -0
  54. package/src/types/function-parameters.ts +251 -0
  55. package/src/types/index.ts +313 -0
  56. package/src/types/openai-augmentation.ts +233 -0
  57. package/src/types/responses-api.ts +308 -0
  58. package/src/utils/azure-model-resolver.ts +220 -0
  59. package/src/utils/constants.ts +21 -0
  60. package/src/utils/error-handler.ts +251 -0
  61. package/src/utils/metadata-builder.ts +219 -0
  62. package/src/utils/provider-detection.ts +257 -0
  63. package/src/utils/request-handler-factory.ts +285 -0
  64. package/src/utils/stop-reason-mapper.ts +74 -0
  65. package/src/utils/type-guards.ts +202 -0
  66. package/src/utils/url-builder.ts +68 -0
@@ -1,42 +1,25 @@
- # Revenium OpenAI Middleware - TypeScript Examples
+ # Revenium OpenAI Middleware - Examples

- **TypeScript-first** examples demonstrating how to use the Revenium OpenAI middleware with full type safety and IntelliSense support.
+ **TypeScript-first** examples demonstrating automatic Revenium usage tracking with the OpenAI SDK.

- ## TypeScript-First Approach
-
- This middleware is designed with **TypeScript developers in mind**:
-
- - **Full Type Safety**: All interfaces are strongly typed with comprehensive JSDoc
- - **IntelliSense Support**: Rich auto-completion and inline documentation
- - **Module Augmentation**: Native `usageMetadata` support in OpenAI SDK calls
- - **Dual Package Exports**: Works with both CommonJS and ES Modules
- - **Zero Configuration**: Auto-initialization with environment variables
-
- ## Installation & Setup
+ ## Getting Started - Step by Step

- ### 1. Install Dependencies
+ ### 1. Create Your Project

  ```bash
- npm install @revenium/openai openai@^5.8.0
- npm install -D typescript tsx @types/node # For TypeScript development
- ```
+ # Create project directory
+ mkdir my-openai-project
+ cd my-openai-project

- ### 2. TypeScript Configuration
+ # Initialize Node.js project
+ npm init -y
+ ```

- Ensure your `tsconfig.json` includes:
+ ### 2. Install Dependencies

- ```json
- {
- "compilerOptions": {
- "target": "ES2020",
- "module": "ESNext",
- "moduleResolution": "node",
- "esModuleInterop": true,
- "allowSyntheticDefaultImports": true,
- "strict": true,
- "skipLibCheck": true
- }
- }
+ ```bash
+ npm install @revenium/openai openai dotenv
+ npm install -D typescript tsx @types/node # For TypeScript
  ```

  ### 3. Environment Setup
@@ -44,15 +27,13 @@ Ensure your `tsconfig.json` includes:
  Create a `.env` file in your project root:

  ```bash
- # Required - OpenAI API key
- OPENAI_API_KEY=sk-your_openai_api_key_here
-
- # Required - Revenium API key
+ # Required
  REVENIUM_METERING_API_KEY=hak_your_revenium_api_key
+ OPENAI_API_KEY=sk_your_openai_api_key

- # Optional (uses defaults if not set)
- REVENIUM_METERING_BASE_URL=https://api.revenium.io/meter
- REVENIUM_DEBUG=false # Set to true for detailed logging
+ # Optional
+ REVENIUM_METERING_BASE_URL=https://api.revenium.io
+ REVENIUM_DEBUG=false

  # Optional - For Azure OpenAI examples
  AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/
@@ -61,301 +42,316 @@ AZURE_OPENAI_DEPLOYMENT=your-deployment-name
  AZURE_OPENAI_API_VERSION=2024-12-01-preview
  ```

- ### 4. Run TypeScript Examples
+ ### 4. Run Examples
+
+ **If you cloned from GitHub:**

  ```bash
- # OpenAI examples
+ # Run examples directly
+ npx tsx examples/getting_started.ts
  npx tsx examples/openai-basic.ts
  npx tsx examples/openai-streaming.ts
- npx tsx examples/openai-responses-basic.ts
- npx tsx examples/openai-responses-streaming.ts
-
- # Azure OpenAI examples
- npx tsx examples/azure-basic.ts
- npx tsx examples/azure-streaming.ts
- npx tsx examples/azure-responses-basic.ts
- npx tsx examples/azure-responses-streaming.ts
  ```

- ## Getting Started - Step by Step
+ **If you installed via npm:**

- This guide walks you through creating a complete project from scratch. For GitHub users who cloned this repository, you can run the included examples directly. For npm users, these examples are also available in your `node_modules/@revenium/openai/examples/` directory.
+ Examples are included in your `node_modules/@revenium/openai/examples/` directory:

- ### Step 1: Create Your First Test
+ ```bash
+ npx tsx node_modules/@revenium/openai/examples/getting_started.ts
+ npx tsx node_modules/@revenium/openai/examples/openai-basic.ts
+ npx tsx node_modules/@revenium/openai/examples/openai-streaming.ts
+ ```

- #### TypeScript Test
+ ## Available Examples

- Create `test-openai.ts`:
+ ### `getting_started.ts` - Simple Entry Point

- ```typescript
- // test-openai.ts
- import "dotenv/config";
- import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
- import OpenAI from "openai";
+ The simplest example to get you started with Revenium tracking:

- async function testOpenAI() {
- try {
- // Initialize Revenium middleware
- initializeReveniumFromEnv();
-
- // Create and patch OpenAI client (returns patched instance)
- const openai = patchOpenAIInstance(new OpenAI());
-
- // Make API call with optional metadata
- const response = await openai.chat.completions.create({
- model: "gpt-4o",
- messages: [{ role: "user", content: "What is artificial intelligence?" }],
- max_tokens: 100,
- usageMetadata: {
- subscriber: {
- id: "test-user",
- email: "test@example.com",
- },
- organizationId: "test-org",
- taskType: "test-query",
- },
- });
-
- console.log("Response:", response.choices[0].message.content);
- // Success! Response and metering data sent to Revenium
- } catch (error) {
- console.error("Error:", error);
- }
- }
+ - **Minimal setup** - Just import, configure, and start tracking
+ - **Complete metadata example** - Shows all 11 optional metadata fields in comments
+ - **Ready to customize** - Uncomment the metadata section to add tracking context

- testOpenAI();
- ```
+ **Key Features:**

- #### JavaScript Test
+ - Auto-initialization from environment variables
+ - Native `usageMetadata` support via module augmentation
+ - All metadata fields documented with examples
+ - Single API call demonstration

- Create `test-openai.js`:
+ **Perfect for:** First-time users, quick validation, understanding metadata structure

- ```javascript
- // test-openai.js
- require("dotenv").config();
- const { initializeReveniumFromEnv, patchOpenAIInstance } = require("@revenium/openai");
- const OpenAI = require("openai");
+ **See the file for complete code examples.**
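To make the description above concrete, here is a minimal sketch of the getting-started pattern, assuming the `initializeReveniumFromEnv`/`patchOpenAIInstance` exports and `usageMetadata` fields shown elsewhere in this diff; the published `getting_started.ts` may differ in its model, prompt, and metadata values.

```typescript
// Illustrative sketch only; not the published getting_started.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import OpenAI from "openai";

async function main(): Promise<void> {
  // Read REVENIUM_METERING_API_KEY (and optional settings) from the environment
  initializeReveniumFromEnv();

  // Wrap the OpenAI client so every call is metered automatically
  const openai = patchOpenAIInstance(new OpenAI());

  const response = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Say hello in one sentence." }],
    // Optional Revenium metadata, enabled by module augmentation (values are placeholders)
    usageMetadata: {
      subscriber: { id: "demo-user", email: "demo@example.com" },
      organizationId: "demo-org",
      taskType: "getting-started",
    },
  });

  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```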

- async function testOpenAI() {
- try {
- // Initialize Revenium middleware
- initializeReveniumFromEnv();
+ ### `openai-basic.ts` - Chat Completions and Embeddings

- // Create and patch OpenAI client (returns patched instance)
- const openai = patchOpenAIInstance(new OpenAI());
+ Demonstrates standard OpenAI API usage with automatic tracking:

- // Make API call (metadata optional)
- const response = await openai.chat.completions.create({
- model: "gpt-4o-mini",
- messages: [{ role: "user", content: "What is artificial intelligence?" }],
- max_tokens: 100,
- });
+ - **Chat completions** - Basic chat API with metadata tracking
+ - **Embeddings** - Text embedding generation with usage tracking
+ - **Multiple API calls** - Batch operations with consistent metadata

- console.log("Response:", response.choices[0].message.content);
- // Success! Response and metering data sent to Revenium
- } catch (error) {
- console.error("Error:", error);
- }
- }
+ **Key Features:**

- testOpenAI();
- ```
+ - TypeScript module augmentation for native `usageMetadata` support
+ - Full type safety with IntelliSense
+ - Comprehensive metadata examples
+ - Error handling patterns

- ### Step 2: Update package.json
+ **Perfect for:** Understanding basic OpenAI API patterns with tracking

- Add a test script to easily run your example:
+ **See the file for complete code examples.**
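A rough sketch of the chat-plus-embeddings flow this example describes (not the published `openai-basic.ts`); the model names and prompts are placeholders, and the embeddings call relies on the patched client's automatic tracking rather than extra parameters.

```typescript
// Illustrative sketch only; not the published openai-basic.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import OpenAI from "openai";

initializeReveniumFromEnv();
const openai = patchOpenAIInstance(new OpenAI());

async function main(): Promise<void> {
  // Chat completion with optional Revenium metadata (values are placeholders)
  const chat = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Explain embeddings in one sentence." }],
    usageMetadata: { organizationId: "demo-org", taskType: "basic-demo" },
  });
  console.log(chat.choices[0].message.content);

  // Embeddings are metered automatically by the patched client
  const embedding = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: "Embeddings map text to vectors.",
  });
  console.log("Vector length:", embedding.data[0].embedding.length);
}

main().catch(console.error);
```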

- ```json
- {
- "scripts": {
- "test": "tsx test-openai.ts"
- },
- "dependencies": {
- "@revenium/openai": "^1.0.10",
- "openai": "^5.8.0",
- "dotenv": "^16.0.0"
- },
- "devDependencies": {
- "typescript": "^5.0.0",
- "tsx": "^4.0.0",
- "@types/node": "^20.0.0"
- }
- }
- ```
+ ### `openai-streaming.ts` - Real-time Streaming

- ### Step 3: Run Your Tests
+ Demonstrates streaming responses with automatic token tracking:

- ```bash
- # TypeScript
- npm run test
+ - **Streaming chat completions** - Real-time token streaming with metadata
+ - **Batch embeddings** - Multiple embedding requests efficiently
+ - **Stream processing** - Type-safe event handling

- # Or directly
- npx tsx test-openai.ts
+ **Key Features:**

- # JavaScript
- node test-openai.js
- ```
+ - Automatic tracking when stream completes
+ - Real-time token counting
+ - Time-to-first-token metrics
+ - Stream error handling

- ### Step 4: Create Advanced Examples
+ **Perfect for:** Real-time applications, chatbots, interactive AI assistants

- Explore our comprehensive examples for different use cases:
+ **See the file for complete code examples.**
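A minimal streaming sketch for orientation (not the published `openai-streaming.ts`); it assumes the same patched client and that the middleware records usage once the stream finishes, as described above.

```typescript
// Illustrative sketch only; not the published openai-streaming.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import OpenAI from "openai";

initializeReveniumFromEnv();
const openai = patchOpenAIInstance(new OpenAI());

async function main(): Promise<void> {
  const stream = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Count from 1 to 5, one number per line." }],
    stream: true,
    // Optional metadata; field values here are placeholders
    usageMetadata: { taskType: "streaming-demo" },
  });

  for await (const chunk of stream) {
    // Each chunk carries an incremental delta; print tokens as they arrive
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
  process.stdout.write("\n");
}

main().catch(console.error);
```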

- **Standard Chat Completions API:**
- - `openai-basic.ts` - Chat completions with metadata
- - `openai-streaming.ts` - Streaming responses
- - `azure-basic.ts` - Azure OpenAI chat completions
- - `azure-streaming.ts` - Azure OpenAI streaming
+ ### `openai-function-calling.ts` - Function Calling with Tools

- **Responses API (OpenAI SDK 5.8+):**
- - `openai-responses-basic.ts` - New Responses API
- - `openai-responses-streaming.ts` - Responses API streaming
- - `azure-responses-basic.ts` - Azure Responses API
- - `azure-responses-streaming.ts` - Azure Responses API streaming
+ Demonstrates OpenAI function calling (tools) with tracking:

- ### Step 5: Project Structure
+ - **Tool definitions** - Define functions for the AI to call
+ - **Function execution** - Handle tool calls and return results
+ - **Multi-turn conversations** - Manage conversation flow with function results

- Here's a recommended project structure:
+ **Key Features:**

- ```
- my-openai-project/
- ├── .env # Environment variables (never commit!)
- ├── .gitignore # Include .env in here
- ├── package.json # Dependencies and scripts
- ├── tsconfig.json # TypeScript configuration
- ├── src/
- │ └── index.ts # Your application code
- ├── examples/ # Copy examples here for reference
- │ ├── openai-basic.ts
- │ └── ...
- └── node_modules/
- └── @revenium/openai/
- └── examples/ # npm users: examples available here too
- ```
+ - Complete function calling workflow
+ - Type-safe tool definitions
+ - Automatic tracking of tool usage
+ - Multi-turn conversation handling

- ## TypeScript Integration Patterns
+ **Perfect for:** AI agents, function-calling applications, structured outputs

- ### Pattern A: Auto-Initialization (Recommended)
+ **See the file for complete code examples.**
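A compressed sketch of the function-calling round trip described above; the `get_weather` tool, its schema, and the prompts are illustrative stand-ins rather than the contents of the published `openai-function-calling.ts`.

```typescript
// Illustrative sketch only; not the published openai-function-calling.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import OpenAI from "openai";

initializeReveniumFromEnv();
const openai = patchOpenAIInstance(new OpenAI());

// Hypothetical tool definition for the model to call
const tools: OpenAI.Chat.Completions.ChatCompletionTool[] = [
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];

async function main(): Promise<void> {
  const messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[] = [
    { role: "user", content: "What's the weather in Paris?" },
  ];

  const first = await openai.chat.completions.create({ model: "gpt-4o-mini", messages, tools });
  const toolCall = first.choices[0].message.tool_calls?.[0];
  if (!toolCall || toolCall.type !== "function") {
    console.log(first.choices[0].message.content);
    return;
  }

  // Execute the tool locally (stubbed here) and send the result back for a final answer
  const args = JSON.parse(toolCall.function.arguments) as { city: string };
  const result = JSON.stringify({ city: args.city, tempC: 21, sky: "clear" });

  messages.push(first.choices[0].message, {
    role: "tool",
    tool_call_id: toolCall.id,
    content: result,
  });

  const second = await openai.chat.completions.create({ model: "gpt-4o-mini", messages, tools });
  console.log(second.choices[0].message.content);
}

main().catch(console.error);
```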

- ```typescript
- import { patchOpenAIInstance } from "@revenium/openai";
- import OpenAI from "openai";
+ ### `openai-vision.ts` - Vision API with Images

- // Auto-initializes from environment variables
- const openai = patchOpenAIInstance(new OpenAI());
- // Already tracked! No explicit init needed if env vars are set
- ```
+ Demonstrates OpenAI Vision API with image analysis:

- ### Pattern B: Explicit Initialization
+ - **Image URL analysis** - Analyze images from URLs
+ - **Base64 images** - Process local images as base64
+ - **Multi-image analysis** - Compare multiple images in one request

- ```typescript
- import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
- import OpenAI from "openai";
+ **Key Features:**

- // Explicit initialization with error handling
- const result = initializeReveniumFromEnv();
- if (!result.success) {
- throw new Error(result.message);
- }
+ - Image analysis with GPT-4 Vision
+ - Multiple image input methods
+ - Detailed image descriptions
+ - Usage tracking for vision API calls

- const openai = patchOpenAIInstance(new OpenAI());
- ```
+ **Perfect for:** Image analysis applications, visual search, accessibility tools

- ### Pattern C: Manual Configuration
+ **See the file for complete code examples.**
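An illustrative image-URL request in the style this example describes (not the published `openai-vision.ts`); the image URL, prompt, and metadata values are placeholders.

```typescript
// Illustrative sketch only; not the published openai-vision.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import OpenAI from "openai";

initializeReveniumFromEnv();
const openai = patchOpenAIInstance(new OpenAI());

async function main(): Promise<void> {
  const response = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      {
        role: "user",
        // Mixed content: a text part plus an image part referenced by URL
        content: [
          { type: "text", text: "Describe this image in one sentence." },
          { type: "image_url", image_url: { url: "https://example.com/photo.jpg" } },
        ],
      },
    ],
    usageMetadata: { taskType: "vision-demo" },
  });

  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```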

- ```typescript
- import { patchOpenAIInstance } from "@revenium/openai";
- import OpenAI from "openai";
+ ### `openai-responses-basic.ts` - New Responses API

- const openai = patchOpenAIInstance(new OpenAI(), {
- meteringApiKey: "hak_your_key",
- meteringBaseUrl: "https://api.revenium.io/meter",
- });
- ```
+ Demonstrates OpenAI's new Responses API (SDK 5.8+):
+
+ - **Simplified interface** - Uses `input` instead of `messages` parameter
+ - **Stateful API** - Enhanced capabilities for agent-like applications
+ - **Unified experience** - Combines chat completions and assistants features
+
+ **Key Features:**
+
+ - New Responses API patterns
+ - Automatic tracking with new API
+ - Metadata support
+ - Backward compatibility notes
+
+ **Perfect for:** Applications using OpenAI's latest API features
+
+ **See the file for complete code examples.**
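A minimal Responses API sketch for orientation (not the published `openai-responses-basic.ts`); it assumes OpenAI SDK 5.8+ and relies on the middleware's automatic tracking rather than explicit metadata.

```typescript
// Illustrative sketch only; not the published openai-responses-basic.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import OpenAI from "openai";

initializeReveniumFromEnv();
const openai = patchOpenAIInstance(new OpenAI());

async function main(): Promise<void> {
  // The Responses API takes `input` instead of a `messages` array
  const response = await openai.responses.create({
    model: "gpt-4o-mini",
    input: "Give me one fun fact about TypeScript.",
  });

  // output_text is the SDK's convenience accessor for the concatenated text output
  console.log(response.output_text);
}

main().catch(console.error);
```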
+
+ ### `openai-responses-streaming.ts` - Responses API Streaming

- ## Available TypeScript Examples
+ Demonstrates streaming with the new Responses API:

- ### OpenAI Examples
+ - **Streaming responses** - Real-time responses with new API
+ - **Event handling** - Process response events as they arrive
+ - **Usage tracking** - Automatic tracking for streaming responses

- | File | Description | Features |
- |------|-------------|----------|
- | `openai-basic.ts` | Basic chat completions | Chat, embeddings, metadata tracking |
- | `openai-streaming.ts` | Streaming responses | Real-time token streaming |
- | `openai-responses-basic.ts` | Responses API (new) | New response format (SDK 5.8+) |
- | `openai-responses-streaming.ts` | Responses API streaming | Streaming with new API |
+ **Key Features:**
+
+ - Responses API streaming patterns
+ - Type-safe event processing
+ - Automatic usage metrics
+ - Stream completion tracking
+
+ **Perfect for:** Real-time applications using the new Responses API
+
+ **See the file for complete code examples.**
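A short streaming sketch under the same assumptions (not the published `openai-responses-streaming.ts`); the `response.output_text.delta` event type comes from the OpenAI SDK's Responses streaming events.

```typescript
// Illustrative sketch only; not the published openai-responses-streaming.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import OpenAI from "openai";

initializeReveniumFromEnv();
const openai = patchOpenAIInstance(new OpenAI());

async function main(): Promise<void> {
  const stream = await openai.responses.create({
    model: "gpt-4o-mini",
    input: "Stream a haiku about observability.",
    stream: true,
  });

  for await (const event of stream) {
    // Text deltas arrive incrementally; other event types mark lifecycle steps
    if (event.type === "response.output_text.delta") {
      process.stdout.write(event.delta);
    }
  }
  process.stdout.write("\n");
}

main().catch(console.error);
```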

  ### Azure OpenAI Examples

- | File | Description | Features |
- |------|-------------|----------|
- | `azure-basic.ts` | Azure chat completions | Azure integration, auto-detection |
- | `azure-streaming.ts` | Azure streaming | Real-time Azure responses |
- | `azure-responses-basic.ts` | Azure Responses API | New API with Azure |
- | `azure-responses-streaming.ts` | Azure Responses streaming | Streaming with Azure Responses API |
+ #### `azure-basic.ts` - Azure Chat Completions
+
+ Demonstrates Azure OpenAI integration with automatic detection:
+
+ - **Azure configuration** - Environment-based Azure setup
+ - **Chat completions** - Azure-hosted models with tracking
+ - **Automatic detection** - Middleware detects Azure vs OpenAI automatically
+
+ **Key Features:**
+
+ - Azure OpenAI deployment configuration
+ - Model name resolution for Azure
+ - Accurate Azure pricing
+ - Metadata tracking with Azure
+
+ **Perfect for:** Enterprise applications using Azure OpenAI
+
+ **See the file for complete code examples.**
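A hypothetical sketch of the Azure pattern (not the published `azure-basic.ts`); it assumes `patchOpenAIInstance` accepts the SDK's `AzureOpenAI` client, which subclasses `OpenAI`, and uses the Azure variables from the `.env` section above.

```typescript
// Illustrative sketch only; not the published azure-basic.ts.
import "dotenv/config";
import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
import { AzureOpenAI } from "openai";

initializeReveniumFromEnv();

// The middleware is described as detecting Azure clients automatically;
// here we assume the patched AzureOpenAI instance is metered like a standard client.
const azure = patchOpenAIInstance(
  new AzureOpenAI({
    endpoint: process.env.AZURE_OPENAI_ENDPOINT,
    apiKey: process.env.AZURE_OPENAI_API_KEY,
    apiVersion: process.env.AZURE_OPENAI_API_VERSION,
    deployment: process.env.AZURE_OPENAI_DEPLOYMENT,
  })
);

async function main(): Promise<void> {
  const response = await azure.chat.completions.create({
    // With a deployment configured on the client, the model field maps to that deployment
    model: process.env.AZURE_OPENAI_DEPLOYMENT ?? "gpt-4o-mini",
    messages: [{ role: "user", content: "Hello from Azure OpenAI!" }],
  });

  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```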
+
+ #### `azure-streaming.ts` - Azure Streaming
+
+ Demonstrates streaming with Azure OpenAI:
+
+ - **Azure streaming** - Real-time responses from Azure-hosted models
+ - **Deployment resolution** - Automatic Azure deployment name handling
+ - **Usage tracking** - Azure-specific metrics and pricing
+
+ **Perfect for:** Real-time Azure OpenAI applications
+
+ **See the file for complete code examples.**
+
+ #### `azure-responses-basic.ts` - Azure Responses API

- ## TypeScript Requirements
+ Demonstrates new Responses API with Azure OpenAI:

- **Minimum versions:**
- - TypeScript: 4.5+
- - Node.js: 16.0+
- - OpenAI SDK: 5.0+ (5.8+ for Responses API)
+ - **Azure + Responses API** - Combine Azure hosting with new API
+ - **Unified interface** - Same Responses API patterns on Azure
+ - **Automatic tracking** - Azure-aware usage tracking
+
+ **Perfect for:** Azure applications using latest OpenAI features
+
+ **See the file for complete code examples.**
+
+ #### `azure-responses-streaming.ts` - Azure Responses Streaming
+
+ Demonstrates Responses API streaming with Azure OpenAI:
+
+ - **Azure streaming** - Real-time Responses API on Azure
+ - **Event handling** - Process Azure response events
+ - **Complete tracking** - Azure metrics with new API
+
+ **Perfect for:** Real-time Azure applications with Responses API
+
+ **See the file for complete code examples.**
+
+ ## TypeScript Configuration
+
+ Ensure your `tsconfig.json` includes:

- **TypeScript compiler options:**
  ```json
  {
- "esModuleInterop": true,
- "allowSyntheticDefaultImports": true,
- "strict": true,
- "skipLibCheck": true
+ "compilerOptions": {
+ "target": "ES2020",
+ "module": "ESNext",
+ "moduleResolution": "node",
+ "esModuleInterop": true,
+ "allowSyntheticDefaultImports": true,
+ "strict": true,
+ "skipLibCheck": true
+ }
  }
  ```

- ## TypeScript Troubleshooting
+ ## Requirements

- ### Issue: `usageMetadata` property not recognized
+ - **Node.js 16+** with TypeScript support
+ - **TypeScript 4.5+** for module augmentation features
+ - **Valid Revenium API key** (starts with `hak_`)
+ - **Valid OpenAI API key** (starts with `sk-`) or Azure OpenAI credentials
+ - **OpenAI SDK 5.0+** (5.8+ for Responses API)

- **Solution**: Ensure you import the middleware **before** OpenAI:
+ ## Troubleshooting
+
+ ### Module Augmentation Not Working
+
+ **Problem:** TypeScript doesn't recognize `usageMetadata` in OpenAI SDK calls
+
+ **Solution:**

  ```typescript
- // Correct order
- import "@revenium/openai";
- import OpenAI from "openai";
+ // ❌ Wrong - missing module augmentation import
+ import { initializeReveniumFromEnv } from "@revenium/openai";

- // Wrong order
+ // ✅ Correct - import for module augmentation
+ import { initializeReveniumFromEnv, patchOpenAIInstance } from "@revenium/openai";
  import OpenAI from "openai";
- import "@revenium/openai";
  ```

- ### Issue: Type errors with streaming responses
+ ### Environment Variables Not Loading

- **Solution**: The middleware extends OpenAI types automatically. If you see errors:
+ **Problem:** `REVENIUM_METERING_API_KEY` or `OPENAI_API_KEY` not found

- ```typescript
- // Ensure you're using the patched client (correct pattern)
- import { patchOpenAIInstance } from "@revenium/openai";
- import OpenAI from "openai";
+ **Solutions:**
+
+ - Ensure `.env` file is in project root
+ - Check variable names match exactly
+ - Verify you're importing `dotenv/config` before the middleware
+ - Check API keys have correct prefixes (`hak_` for Revenium, `sk-` for OpenAI)
+
+ ### TypeScript Compilation Errors
+
+ **Problem:** Module resolution or import errors
+
+ **Solution:** Verify your `tsconfig.json` settings:

- const openai = patchOpenAIInstance(new OpenAI()); // Returns patched instance
+ ```json
+ {
+ "compilerOptions": {
+ "moduleResolution": "node",
+ "esModuleInterop": true,
+ "allowSyntheticDefaultImports": true,
+ "strict": true
+ }
+ }
  ```

- ### Issue: Examples not found after npm install
+ ### Azure Configuration Issues
+
+ **Problem:** Azure OpenAI not working or incorrect pricing
+
+ **Solutions:**
+
+ - Verify all Azure environment variables are set (`AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_DEPLOYMENT`)
+ - Check deployment name matches your Azure resource
+ - Ensure API version is compatible (`2024-12-01-preview` or later recommended)
+ - Verify endpoint URL format: `https://your-resource-name.openai.azure.com/`

- **Solution**: Examples are included in the package:
+ ### Debug Mode
+
+ Enable detailed logging to troubleshoot issues:

  ```bash
- # Check examples are present
- ls node_modules/@revenium/openai/examples/
+ # In .env file
+ REVENIUM_DEBUG=true

- # Copy to your project
- cp -r node_modules/@revenium/openai/examples/ ./examples/
+ # Then run examples
+ npx tsx examples/getting_started.ts
  ```

- ## Getting Help
+ ## Additional Resources

  - **Main Documentation**: See root [README.md](https://github.com/revenium/revenium-middleware-openai-node/blob/HEAD/README.md)
- - **Troubleshooting**: See [TROUBLESHOOTING.md](https://github.com/revenium/revenium-middleware-openai-node/blob/HEAD/TROUBLESHOOTING.md)
+ - **API Reference**: [Revenium Metadata Fields](https://revenium.readme.io/reference/meter_ai_completion)
+ - **OpenAI Documentation**: [OpenAI API Reference](https://platform.openai.com/docs)
  - **Issues**: [Report bugs](https://github.com/revenium/revenium-middleware-openai-node/issues)
- - **Support**: support@revenium.io
-
- ---
-
- **Built by Revenium** | [Documentation](https://docs.revenium.io) | [npm Package](https://www.npmjs.com/package/@revenium/openai)