logixia 1.0.3 → 1.1.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (37)
  1. package/README.md +523 -2852
  2. package/dist/{build-C67p8wVr.js → build-DIEB3doa.js} +7 -2
  3. package/dist/build-DIEB3doa.js.map +1 -0
  4. package/dist/{build-MpmEusc_.mjs → build-MmD3T4bV.mjs} +8 -3
  5. package/dist/build-MmD3T4bV.mjs.map +1 -0
  6. package/dist/{chunk-C41Io3cc.mjs → chunk-uEZWKkIX.mjs} +1 -1
  7. package/dist/{esm-BYmTa3gi.mjs → esm-BRY8ugtK.mjs} +438 -276
  8. package/dist/esm-BRY8ugtK.mjs.map +1 -0
  9. package/dist/{esm-BTpcNBX_.js → esm-CzjF801-.js} +437 -275
  10. package/dist/esm-CzjF801-.js.map +1 -0
  11. package/dist/index.d.mts +1014 -97
  12. package/dist/index.d.mts.map +1 -1
  13. package/dist/index.d.ts +1014 -97
  14. package/dist/index.d.ts.map +1 -1
  15. package/dist/index.js +3559 -1435
  16. package/dist/index.js.map +1 -1
  17. package/dist/index.mjs +3518 -1408
  18. package/dist/index.mjs.map +1 -1
  19. package/dist/{lib-DvMm_tAr.mjs → lib-BNWFXK2z.mjs} +2211 -786
  20. package/dist/lib-BNWFXK2z.mjs.map +1 -0
  21. package/dist/{lib-xHiD5O-N.js → lib-Bb_wxP5g.js} +2210 -785
  22. package/dist/lib-Bb_wxP5g.js.map +1 -0
  23. package/dist/{promise-CnIyndHL.mjs → promise-BAWXE7C8.mjs} +820 -1449
  24. package/dist/promise-BAWXE7C8.mjs.map +1 -0
  25. package/dist/{promise-C7YeyZbJ.js → promise-Tbon3Kaq.js} +819 -1448
  26. package/dist/promise-Tbon3Kaq.js.map +1 -0
  27. package/dist/{sqlite3--ZdiJYT3.mjs → sqlite3-BUpkBlte.mjs} +2 -2
  28. package/dist/{sqlite3--ZdiJYT3.mjs.map → sqlite3-BUpkBlte.mjs.map} +1 -1
  29. package/package.json +118 -54
  30. package/dist/build-C67p8wVr.js.map +0 -1
  31. package/dist/build-MpmEusc_.mjs.map +0 -1
  32. package/dist/esm-BTpcNBX_.js.map +0 -1
  33. package/dist/esm-BYmTa3gi.mjs.map +0 -1
  34. package/dist/lib-DvMm_tAr.mjs.map +0 -1
  35. package/dist/lib-xHiD5O-N.js.map +0 -1
  36. package/dist/promise-C7YeyZbJ.js.map +0 -1
  37. package/dist/promise-CnIyndHL.mjs.map +0 -1
package/README.md CHANGED
@@ -1,3061 +1,732 @@
- # Logixia
-
- **Enterprise-grade TypeScript logging library with comprehensive transport system, database integration, and advanced log management capabilities**
-
- [![npm version](https://badge.fury.io/js/logixia.svg)](https://badge.fury.io/js/logixia)
- [![TypeScript](https://img.shields.io/badge/TypeScript-5.0+-blue.svg)](https://www.typescriptlang.org/)
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
- [![Node.js](https://img.shields.io/badge/Node.js-16+-green.svg)](https://nodejs.org/)
-
- ## Features
-
- ### Core Logging Capabilities
- **TypeScript-First Architecture**: Complete type safety with intelligent IntelliSense support
- **Custom Log Levels**: Define application-specific log levels with configurable priorities and colors
- **Structured Logging**: Rich metadata support with nested object logging
- **Async/Await Support**: Full asynchronous operation support with proper error handling
- **Child Logger System**: Hierarchical logger creation with inherited context
-
- ### Transport System
- **Multiple Output Destinations**: Simultaneous logging to console, files, and databases
- **Console Transport**: Configurable console output with colorization and formatting options
- **File Transport**: Advanced file logging with rotation, compression, and cleanup
- **Database Transport**: Native support for MongoDB, PostgreSQL, MySQL, and SQLite
- **Custom Transport Support**: Extensible architecture for implementing custom log destinations
-
- ### Advanced Log Management
- **Log Rotation**: Time-based and size-based rotation with configurable intervals
- **Batch Processing**: Efficient batch writing with configurable batch sizes and flush intervals
- **Compression Support**: Automatic compression of rotated log files
- **Retention Policies**: Configurable cleanup of old log files based on count or age
- **Health Monitoring**: Built-in transport health checking and status reporting
-
- ### Performance and Monitoring
- **Performance Timing**: Built-in timing utilities for operation measurement
- **Trace ID Support**: Request tracing with async context propagation
- **Metrics Collection**: Transport-level metrics and performance monitoring
- **Resource Management**: Proper connection pooling and resource cleanup
-
- ### Framework Integration
- **NestJS Integration**: First-class NestJS module with dependency injection support
- **Express Middleware**: Ready-to-use Express middleware for request tracking
- **Field Configuration**: Granular control over log field inclusion and formatting
+ # logixia
 
- ## Installation
-
- ```bash
- # Using npm
- npm install logixia
+ <p align="center">
+ <strong>The async-first logging library that ships complete.</strong><br/>
+ TypeScript-first · Non-blocking by design · NestJS · Database · Tracing · OTel
+ </p>
 
- # Using yarn
- yarn add logixia
-
- # Using pnpm
- pnpm add logixia
- ```
+ <p align="center">
+ <a href="https://www.npmjs.com/package/logixia"><img src="https://img.shields.io/npm/v/logixia" alt="npm version"/></a>
+ <a href="https://www.npmjs.com/package/logixia"><img src="https://img.shields.io/npm/dm/logixia" alt="npm downloads"/></a>
+ <a href="https://bundlephobia.com/package/logixia"><img src="https://img.shields.io/bundlephobia/minzip/logixia" alt="bundle size"/></a>
+ <a href="https://github.com/Logixia/logixia/actions/workflows/ci.yml"><img src="https://github.com/Logixia/logixia/actions/workflows/ci.yml/badge.svg" alt="CI"/></a>
+ <a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="MIT"/></a>
+ <a href="https://www.typescriptlang.org/"><img src="https://img.shields.io/badge/TypeScript-5.0%2B-blue" alt="TypeScript"/></a>
+ </p>
 
- ### Optional Database Dependencies
+ ---
 
- For database transport functionality, install the appropriate database drivers:
+ ## The logging setup you copy-paste into every new project
 
  ```bash
- # MongoDB support
- npm install mongodb
+ # The pino route:
+ npm install pino pino-pretty pino-roll pino-redact pino-nestjs pino-http
 
- # PostgreSQL support
- npm install pg @types/pg
+ # The winston route:
+ npm install winston winston-daily-rotate-file
+ # ...then wire 4 separate config objects
+ # ...then discover there's no built-in DB transport
+ # ...then discover request tracing is manual
+ # ...then discover both block your event loop under I/O pressure
 
- # MySQL support
- npm install mysql2
-
- # SQLite support
- npm install sqlite3 sqlite
+ # Or:
+ npm install logixia
  ```
 
- ## Quick Start
-
- ### Basic Usage
+ logixia ships **console + file rotation + database + request tracing + NestJS module + field redaction + log search + OpenTelemetry** in one package — non-blocking on every transport, zero extra installs.
 
  ```typescript
  import { createLogger } from 'logixia';
 
- // Create a basic logger instance
  const logger = createLogger({
- appName: 'MyApplication',
- environment: 'development'
- });
+ appName: 'api',
+ environment: 'production',
+ file: { filename: 'app.log', dirname: './logs', maxSize: '50MB' },
+ database: { type: 'postgresql', host: 'localhost', database: 'appdb', table: 'logs' },
+ });
+
+ await logger.info('Server started', { port: 3000 });
+ // Writes to console + file + postgres simultaneously. Non-blocking. Done.
+ ```
+
+ ---
+
+ ## Table of Contents
+
+ - [Why logixia?](#why-logixia)
+ - [Feature comparison](#feature-comparison)
+ - [Performance](#performance)
+ - [Installation](#installation)
+ - [Quick start](#quick-start)
+ - [Core concepts](#core-concepts)
+ - [Log levels](#log-levels)
+ - [Structured logging](#structured-logging)
+ - [Child loggers](#child-loggers)
+ - [Transports](#transports)
+ - [Console](#console)
+ - [File with rotation](#file-with-rotation)
+ - [Database](#database)
+ - [Analytics](#analytics)
+ - [Multiple transports simultaneously](#multiple-transports-simultaneously)
+ - [Custom transport](#custom-transport)
+ - [Request tracing](#request-tracing)
+ - [NestJS integration](#nestjs-integration)
+ - [Log redaction](#log-redaction)
+ - [Log search](#log-search)
+ - [OpenTelemetry](#opentelemetry)
+ - [Graceful shutdown](#graceful-shutdown)
+ - [Configuration reference](#configuration-reference)
+ - [Contributing](#contributing)
+ - [License](#license)
+
+ ---
+
+ ## Why logixia?
+
+ `console.log` doesn't scale. `pino` is fast but leaves database persistence, NestJS integration, log search, and field redaction entirely to plugins. `winston` is flexible but synchronous and requires substantial boilerplate to get production-ready.
+
+ logixia takes a different approach: **everything ships built-in, and nothing blocks your event loop.**
+
+ - ⚡ **Async by design** — every log call is non-blocking, even to file and database transports
+ - 🗄️ **Built-in database transports** — PostgreSQL, MySQL, MongoDB, SQLite with zero extra drivers
+ - 🏗️ **NestJS module** — plug in with `LogixiaLoggerModule.forRoot()`, inject with `@InjectLogger()`
+ - 📁 **File rotation** — `maxSize`, `maxFiles`, gzip archive — no `winston-daily-rotate-file` needed
+ - 🔍 **Log search** — query your in-memory log store without shipping to an external service
+ - 🔒 **Field redaction** — mask passwords, tokens, and PII before they touch any transport
+ - 🕸️ **Request tracing** — `AsyncLocalStorage`-based trace propagation, no manual thread-locals
+ - 📡 **OpenTelemetry** — W3C `traceparent` and `tracestate` support, zero extra dependencies
+ - 🧩 **Multi-transport** — write to console, file, and database concurrently with one log call
+ - 🛡️ **TypeScript-first** — typed log entries, typed metadata, full IntelliSense throughout
+ - 🌱 **Adaptive log level** — auto-configures based on `NODE_ENV` and CI environment
+ - 🔌 **Custom transports** — ship to Slack, Datadog, S3, or anywhere else with a simple interface
+
+ ---
+
+ ## Feature comparison
+
+ | Feature | **logixia** | pino | winston | bunyan |
+ | ----------------------------------- | :---------: | :------------: | :--------------------------: | :----: |
+ | TypeScript-first | ✅ | ⚠️ | ⚠️ | ⚠️ |
+ | Async / non-blocking writes | ✅ | ❌ | ❌ | ❌ |
+ | NestJS module (built-in) | ✅ | ❌ | ❌ | ❌ |
+ | Database transports (built-in) | ✅ | ❌ | ❌ | ❌ |
+ | File rotation (built-in) | ✅ | ⚠️ pino-roll | ⚠️ winston-daily-rotate-file | ❌ |
+ | Multi-transport concurrent | ✅ | ❌ | ✅ | ❌ |
+ | Log search | ✅ | ❌ | ❌ | ❌ |
+ | Field redaction (built-in) | ✅ | ⚠️ pino-redact | ❌ | ❌ |
+ | Request tracing (AsyncLocalStorage) | ✅ | ❌ | ❌ | ❌ |
+ | OpenTelemetry / W3C headers | ✅ | ❌ | ❌ | ❌ |
+ | Graceful shutdown / flush | ✅ | ❌ | ❌ | ❌ |
+ | Custom log levels | ✅ | ✅ | ✅ | ✅ |
+ | Adaptive log level (NODE_ENV) | ✅ | ❌ | ❌ | ❌ |
+ | Actively maintained | ✅ | ✅ | ✅ | ❌ |
+
+ > ⚠️ = requires a separate package or manual implementation
+
+ ---
+
+ ## Performance
+
+ logixia is **faster than winston in every benchmark** and outperforms pino on the workloads that matter most in production — structured metadata and error serialization:
+
+ | Library | Simple log (ops/sec) | Structured log (ops/sec) | Error log (ops/sec) | p99 latency |
+ | ----------- | -------------------: | -----------------------: | ------------------: | -----------: |
+ | pino | 1,258,000 | 630,000 | 390,000 | 2.5–12µs |
+ | **logixia** | **840,000** | **696,000** | **654,000** | **4.8–10µs** |
+ | winston | 738,000 | 371,000 | 433,000 | 9–16µs |
+
+ logixia is **10% faster than pino on structured logging** and **68% faster on error serialization**. It beats winston across the board.
+
+ **Why pino leads on simple strings:** pino uses synchronous direct writes to `process.stdout` — a trade-off that blocks the event loop under heavy I/O and that disappears as soon as you add real metadata. logixia is non-blocking on every call while still winning where it counts.
+
+ To reproduce: `node benchmarks/run.mjs`
+
+ ---
 
- // Basic logging operations
- await logger.info('Application started', { version: '1.0.1', port: 3000 });
- await logger.warn('High memory usage detected', { memoryUsage: '85%' });
- await logger.error('Database connection failed', new Error('Connection timeout'));
- ```
+ ## Installation
 
- ### Multi-Transport Configuration
+ ```bash
+ # npm
+ npm install logixia
 
- ```typescript
- import { createLogger } from 'logixia';
+ # pnpm
+ pnpm add logixia
 
- const logger = createLogger({
- appName: 'ProductionApp',
- environment: 'production',
- transports: {
- console: {
- level: 'info',
- colorize: true,
- format: 'text'
- },
- file: {
- filename: './logs/application.log',
- level: 'debug',
- format: 'json',
- rotation: {
- interval: '1d',
- maxFiles: 30,
- compress: true
- }
- },
- database: {
- type: 'mongodb',
- connectionString: 'mongodb://localhost:27017/logs',
- database: 'application_logs',
- collection: 'error_logs',
- batchSize: 100,
- flushInterval: 5000
- }
- }
- });
+ # yarn
+ yarn add logixia
 
- // Logs will be written to console, file, and database simultaneously
- await logger.error('Critical system error', {
- errorCode: 'SYS_001',
- component: 'database',
- severity: 'critical'
- });
+ # bun
+ bun add logixia
  ```
 
- ### Performance Monitoring
+ **For database transports**, install the relevant driver alongside logixia:
 
- ```typescript
- // Simple timing
- logger.time('database-query');
- const users = await database.findUsers();
- const duration = await logger.timeEnd('database-query');
-
- // Async timing with automatic logging
- const result = await logger.timeAsync('api-call', async () => {
- const response = await fetch('/api/users');
- return response.json();
- });
+ ```bash
+ npm install pg # PostgreSQL
+ npm install mysql2 # MySQL
+ npm install mongodb # MongoDB
+ npm install sqlite3 # SQLite
  ```
 
- ## Custom Log Levels
+ **Requirements:** TypeScript 5.0+, Node.js 18+
 
- Define application-specific log levels with custom priorities and visual styling:
+ ---
+
+ ## Quick start
 
  ```typescript
  import { createLogger } from 'logixia';
 
  const logger = createLogger({
- appName: 'EcommerceApplication',
- levelOptions: {
- level: 'info',
- levels: {
- // Standard logging levels
- error: 0,
- warn: 1,
- info: 2,
- debug: 3,
- // Business-specific levels
- order: 2, // Order processing events
- payment: 1, // Payment processing (high priority)
- inventory: 2, // Inventory management operations
- customer: 3, // Customer interaction tracking
- audit: 0, // Audit trail (highest priority)
- },
- colors: {
- error: 'red',
- warn: 'yellow',
- info: 'blue',
- debug: 'green',
- order: 'brightBlue',
- payment: 'brightYellow',
- inventory: 'cyan',
- customer: 'brightGreen',
- audit: 'brightRed',
- }
- }
+ appName: 'api',
+ environment: 'production',
  });
 
- // Utilize custom levels with complete TypeScript support
- await logger.order('Order processing initiated', {
- orderId: 'ORD-12345',
- customerId: 'CUST-67890',
- items: 3,
- totalAmount: 299.97
- });
+ await logger.info('Server started', { port: 3000 });
+ await logger.warn('High memory usage', { used: '87%' });
+ await logger.error('Request failed', new Error('Connection timeout'));
+ ```
 
- await logger.payment('Payment transaction completed', {
- transactionId: 'TXN-98765',
- amount: 299.97,
- method: 'credit_card',
- processor: 'stripe'
- });
+ That's it. Logs go to the console by default, structured JSON in production, colorized text in development. Add a `file` or `database` key to write there too — all transports run concurrently.
 
- await logger.inventory('Stock level updated', {
- productId: 'PROD-ABC123',
- previousQuantity: 25,
- newQuantity: 50,
- operation: 'restock'
- });
+ ---
 
- await logger.audit('User permission modified', {
- userId: 'USER-456',
- action: 'permission_grant',
- permission: 'admin_access',
- modifiedBy: 'USER-123'
- });
- ```
+ ## Core concepts
 
- ## NestJS Integration
+ ### Log levels
 
- ### Module Configuration
+ logixia ships with six built-in levels: `trace`, `debug`, `info`, `warn`, `error`, and `fatal`. The minimum level is automatically inferred from `NODE_ENV` and CI environment — no manual setup in most projects.
 
- Integrate Logixia seamlessly into your NestJS application with full dependency injection support:
+ You can also define custom levels for your domain:
 
  ```typescript
- import { Module } from '@nestjs/common';
- import { LogixiaLoggerModule } from 'logixia';
+ const logger = createLogger({
+ appName: 'payments',
+ environment: 'production',
+ levelOptions: {
+ level: 'info',
+ customLevels: {
+ audit: { priority: 35, color: 'blue' },
+ security: { priority: 45, color: 'red' },
+ },
+ },
+ });
 
- @Module({
- imports: [
- LogixiaLoggerModule.forRoot({
- appName: 'NestJS-Application',
- environment: 'production',
- traceId: true,
- levelOptions: {
- level: 'info',
- levels: {
- error: 0,
- warn: 1,
- info: 2,
- debug: 3,
- verbose: 4
- }
- },
- transports: {
- console: {
- level: 'info',
- colorize: true,
- format: 'text'
- },
- file: {
- filename: './logs/nestjs-app.log',
- level: 'debug',
- format: 'json',
- rotation: {
- interval: '1d',
- maxFiles: 30
- }
- },
- database: {
- type: 'mongodb',
- connectionString: process.env.MONGODB_URI,
- database: 'application_logs',
- collection: 'nestjs_logs'
- }
- }
- })
- ],
- })
- export class AppModule {}
+ await logger.log('audit', 'Payment processed', { orderId: 'ord_123', amount: 99.99 });
+ await logger.log('security', 'Suspicious login attempt', { ip: '1.2.3.4', userId: 'usr_456' });
  ```
 
- ### Service Usage
-
- ```typescript
- import { Injectable } from '@nestjs/common';
- import { LogixiaLoggerService } from 'logixia';
-
- @Injectable()
- export class UserService {
- constructor(private readonly logger: LogixiaLoggerService) {
- this.logger.setContext('UserService');
- }
+ ### Structured logging
 
- async findUser(id: string) {
- await this.logger.info('Fetching user', { userId: id });
-
- try {
- const user = await this.userRepository.findById(id);
- await this.logger.info('User found', { userId: id });
- return user;
- } catch (error) {
- await this.logger.error('User not found', { userId: id, error });
- throw error;
- }
- }
+ Every log call accepts metadata as its second argument — serialized as structured fields alongside the message, never concatenated into a string:
 
- async createUser(userData: any) {
- const childLogger = this.logger.child('createUser', { operation: 'create' });
-
- return await childLogger.timeAsync('user-creation', async () => {
- await childLogger.info('Creating user', { userData });
- const user = await this.userRepository.create(userData);
- await childLogger.info('User created', { userId: user.id });
- return user;
- });
- }
- }
+ ```typescript
+ await logger.info('User authenticated', {
+ userId: 'usr_123',
+ method: 'oauth',
+ provider: 'google',
+ durationMs: 42,
+ ip: '203.0.113.4',
+ });
  ```
 
- ## Interceptors
-
- ### Kafka and WebSocket Interceptors
+ Output in development (colorized text):
 
- Logixia provides specialized interceptors for Kafka and WebSocket applications to automatically extract and propagate trace IDs across distributed systems.
-
- #### Kafka Trace Interceptor
+ ```
+ [INFO] User authenticated userId=usr_123 method=oauth provider=google durationMs=42
+ ```
 
- Automatically extract trace IDs from Kafka messages and add them to the logging context:
+ Output in production (JSON):
 
- ```typescript
- import { KafkaTraceInterceptor } from 'logixia';
- import { Controller, Post, Body, UseInterceptors } from '@nestjs/common';
-
- @Controller('kafka')
- @UseInterceptors(KafkaTraceInterceptor)
- export class KafkaController {
- @Post('process-message')
- async processMessage(@Body() message: any) {
- // Trace ID is automatically extracted from message headers or body
- // and added to the logging context
- await this.logger.info('Processing Kafka message', {
- messageId: message.id,
- topic: message.topic
- });
-
- return { status: 'processed' };
- }
+ ```json
+ {
+ "level": "info",
+ "message": "User authenticated",
+ "userId": "usr_123",
+ "method": "oauth",
+ "provider": "google",
+ "durationMs": 42,
+ "timestamp": "2025-03-14T10:22:01.412Z",
+ "traceId": "abc123def456"
  }
  ```
 
- #### WebSocket Trace Interceptor
+ ### Child loggers
 
- Extract trace IDs from WebSocket messages and maintain trace context across WebSocket connections:
+ Create child loggers that inherit parent context and add their own. Every log from the child carries both sets of fields automatically:
 
  ```typescript
- import { WebSocketTraceInterceptor } from 'logixia';
- import { WebSocketGateway, SubscribeMessage, UseInterceptors } from '@nestjs/websockets';
-
- @WebSocketGateway()
- @UseInterceptors(WebSocketTraceInterceptor)
- export class ChatGateway {
- @SubscribeMessage('message')
- async handleMessage(client: any, payload: any) {
- // Trace ID is automatically extracted from payload headers or query parameters
- await this.logger.info('WebSocket message received', {
- clientId: client.id,
- messageType: payload.type
- });
-
- return { event: 'response', data: 'Message processed' };
- }
- }
+ const reqLogger = logger.child({
+ requestId: req.id,
+ userId: req.user.id,
+ route: req.path,
+ });
+
+ await reqLogger.info('Processing order'); // carries requestId + userId + route
+ await reqLogger.info('Payment confirmed'); // same context, no repetition
  ```
 
- #### Configuration
+ ---
 
- Configure interceptors through the LogixiaLoggerModule:
+ ## Transports
 
- ```typescript
- import { Module } from '@nestjs/common';
- import { LogixiaLoggerModule } from 'logixia';
+ ### Console
 
- @Module({
- imports: [
- LogixiaLoggerModule.forRoot({
- appName: 'MyApplication',
- environment: 'production',
- traceId: {
- enabled: true,
- extractor: {
- header: ['x-trace-id', 'x-request-id'],
- query: ['traceId'],
- body: ['traceId', 'requestId']
- }
- },
- transports: {
- console: { level: 'info' },
- file: { filename: './logs/app.log', level: 'debug' }
- }
- })
- ]
- })
- export class AppModule {}
+ ```typescript
+ const logger = createLogger({
+ appName: 'api',
+ environment: 'development',
+ console: {
+ colorize: true,
+ timestamp: true,
+ format: 'text', // 'text' (human-readable) or 'json' (structured)
+ },
+ });
  ```
 
- #### Key Features
-
- **Automatic Trace Extraction**: Extracts trace IDs from headers, query parameters, or message body
- **Enable/Disable Support**: Can be enabled or disabled through configuration
- **No Auto-Generation**: Only uses existing trace IDs, doesn't generate new ones
- **Flexible Configuration**: Supports multiple extraction sources and patterns
- **Performance Optimized**: Minimal overhead when disabled
+ ### File with rotation
 
- #### Usage with Custom Configuration
+ No extra packages. Rotation by size, automatic compression, and configurable retention — all built-in:
 
  ```typescript
- // Async configuration with custom trace settings
- LogixiaLoggerModule.forRootAsync({
- useFactory: async (configService: ConfigService) => ({
- appName: configService.get('APP_NAME'),
- environment: configService.get('NODE_ENV'),
- traceId: {
- enabled: configService.get('TRACE_ENABLED', true),
- extractor: {
- header: ['x-correlation-id', 'x-trace-id'],
- query: ['correlationId'],
- body: ['traceId']
- }
- },
- transports: {
- console: { level: 'info' },
- database: {
- type: 'mongodb',
- connectionString: configService.get('MONGODB_URI'),
- database: 'logs',
- collection: 'application_logs'
- }
- }
- }),
- inject: [ConfigService]
- })
+ const logger = createLogger({
+ appName: 'api',
+ environment: 'production',
+ file: {
+ filename: 'app.log',
+ dirname: './logs',
+ maxSize: '50MB', // Rotate when file hits 50 MB
+ maxFiles: 14, // Keep 14 rotated files (~ 2 weeks)
+ zippedArchive: true, // Compress old logs with gzip
+ format: 'json',
+ },
+ });
  ```
 
- ## Express Integration
+ ### Database
 
- Integrate comprehensive logging into Express applications with automatic request tracking and performance monitoring:
+ Write structured logs directly to your database, batched and non-blocking, with configurable flush intervals:
 
  ```typescript
- import express from 'express';
- import { createLogger, traceMiddleware, getCurrentTraceId } from 'logixia';
-
- const app = express();
+ // PostgreSQL
  const logger = createLogger({
- appName: 'ExpressApplication',
+ appName: 'api',
  environment: 'production',
- transports: {
- console: { level: 'info', colorize: true },
- file: {
- filename: './logs/express-app.log',
- level: 'debug',
- format: 'json'
- },
- database: {
- type: 'mongodb',
- connectionString: process.env.MONGODB_URI,
- database: 'application_logs',
- collection: 'express_logs'
- }
- }
+ database: {
+ type: 'postgresql',
+ host: 'localhost',
+ port: 5432,
+ database: 'appdb',
+ table: 'logs',
+ username: 'dbuser',
+ password: process.env.DB_PASSWORD,
+ batchSize: 100, // Write in batches of 100
+ flushInterval: 5000, // Flush every 5 seconds
+ },
  });
 
- // Configure trace middleware for request tracking
- app.use(traceMiddleware({
- enabled: true,
- extractor: {
- header: ['x-trace-id', 'x-request-id', 'x-correlation-id'],
- query: ['traceId', 'requestId']
+ // MongoDB
+ const logger = createLogger({
+ appName: 'api',
+ environment: 'production',
+ database: {
+ type: 'mongodb',
+ connectionString: process.env.MONGO_URI,
+ database: 'appdb',
+ collection: 'logs',
  },
- generator: () => `req_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`
- }));
-
- // Request logging middleware with comprehensive context
- app.use(async (req, res, next) => {
- const startTime = Date.now();
- const traceId = getCurrentTraceId();
-
- await logger.info('HTTP request initiated', {
- method: req.method,
- path: req.path,
- query: req.query,
- traceId,
- userAgent: req.get('User-Agent'),
- clientIp: req.ip,
- contentType: req.get('Content-Type')
- });
-
- // Log response when request completes
- res.on('finish', async () => {
- const duration = Date.now() - startTime;
- await logger.info('HTTP request completed', {
- method: req.method,
- path: req.path,
- statusCode: res.statusCode,
- duration: `${duration}ms`,
- traceId,
- contentLength: res.get('Content-Length')
- });
- });
-
- next();
  });
 
- // Error handling middleware
- app.use(async (error, req, res, next) => {
- await logger.error('HTTP request error', {
- error: error.message,
- stack: error.stack,
- method: req.method,
- path: req.path,
- traceId: getCurrentTraceId(),
- statusCode: error.statusCode || 500
- });
-
- res.status(error.statusCode || 500).json({
- error: 'Internal Server Error',
- traceId: getCurrentTraceId()
- });
+ // MySQL
+ const logger = createLogger({
+ appName: 'api',
+ environment: 'production',
+ database: {
+ type: 'mysql',
+ host: 'localhost',
+ database: 'appdb',
+ table: 'logs',
+ username: 'root',
+ password: process.env.MYSQL_PASSWORD,
+ },
  });
 
522
- // Example route with contextual logging
523
- app.get('/users/:id', async (req, res) => {
524
- const { id } = req.params;
525
- const userLogger = logger.child('UserController', {
526
- userId: id,
527
- operation: 'fetchUser'
528
- });
529
-
530
- try {
531
- const userData = await userLogger.timeAsync('database-fetch-user', async () => {
532
- // Simulate database operation
533
- await new Promise(resolve => setTimeout(resolve, 100));
534
- return {
535
- id,
536
- name: 'John Doe',
537
- email: 'john.doe@example.com',
538
- lastLogin: new Date().toISOString()
539
- };
540
- });
541
-
542
- await userLogger.info('User data retrieved successfully', {
543
- userId: id,
544
- fieldsReturned: Object.keys(userData).length
545
- });
546
-
547
- res.json(userData);
548
- } catch (error) {
549
- await userLogger.error('Failed to retrieve user data', {
550
- userId: id,
551
- error: error.message
552
- });
553
-
554
- res.status(500).json({
555
- error: 'Failed to retrieve user',
556
- traceId: getCurrentTraceId()
557
- });
558
- }
354
+ // SQLite (great for local development and small apps)
355
+ const logger = createLogger({
356
+ appName: 'api',
357
+ environment: 'development',
358
+ database: {
359
+ type: 'sqlite',
360
+ filename: './logs/app.sqlite',
361
+ table: 'logs',
362
+ },
559
363
  });
364
+ ```
560
365
 
561
- // Health check endpoint with logging
562
- app.get('/health', async (req, res) => {
563
- const healthLogger = logger.child('HealthCheck');
564
-
565
- try {
566
- // Check database connectivity
567
- const dbHealth = await healthLogger.timeAsync('database-health-check', async () => {
568
- // Simulate health check
569
- return { status: 'healthy', latency: 45 };
570
- });
571
-
572
- await healthLogger.info('Health check completed', {
573
- database: dbHealth,
574
- uptime: process.uptime(),
575
- memory: process.memoryUsage()
576
- });
577
-
578
- res.json({
579
- status: 'healthy',
580
- timestamp: new Date().toISOString(),
581
- services: {
582
- database: dbHealth
583
- }
584
- });
585
- } catch (error) {
586
- await healthLogger.error('Health check failed', {
587
- error: error.message
588
- });
589
-
590
- res.status(503).json({
591
- status: 'unhealthy',
592
- error: 'Service unavailable'
593
- });
594
- }
595
- });
366
+ ### Analytics
367
+
368
+ Send log events to your analytics platform:
596
369
 
597
- const PORT = process.env.PORT || 3000;
598
- app.listen(PORT, async () => {
599
- await logger.info('Express server started', {
600
- port: PORT,
601
- environment: process.env.NODE_ENV || 'development',
602
- nodeVersion: process.version
603
- });
370
+ ```typescript
371
+ const logger = createLogger({
372
+ appName: 'api',
373
+ environment: 'production',
374
+ analytics: {
375
+ endpoint: 'https://analytics.example.com/events',
376
+ apiKey: process.env.ANALYTICS_KEY,
377
+ batchSize: 50,
378
+ flushInterval: 10_000,
379
+ },
604
380
  });
605
381
  ```
606
382
 
607
- ## Performance Monitoring
383
+ ### Multiple transports simultaneously
608
384
 
609
- Built-in performance monitoring capabilities for tracking operation durations and system metrics:
385
+ All configured transports receive every log call concurrently, with no sequential bottleneck:
610
386
 
611
387
  ```typescript
612
- import { createLogger } from 'logixia';
613
-
614
388
  const logger = createLogger({
615
- appName: 'PerformanceMonitoringApp',
389
+ appName: 'api',
616
390
  environment: 'production',
617
- transports: {
618
- console: { level: 'info', format: 'text' },
619
- file: {
620
- filename: './logs/performance.log',
621
- level: 'debug',
622
- format: 'json' // Structured format for performance analysis
623
- },
624
- database: {
625
- type: 'mongodb',
626
- connectionString: process.env.MONGODB_URI,
627
- database: 'performance_logs',
628
- collection: 'timing_metrics'
629
- }
630
- }
391
+ console: { colorize: false, format: 'json' },
392
+ file: { filename: 'app.log', dirname: './logs', maxSize: '100MB' },
393
+ database: {
394
+ type: 'postgresql',
395
+ host: 'localhost',
396
+ database: 'appdb',
397
+ table: 'logs',
398
+ },
631
399
  });
632
400
 
633
- class DatabaseService {
634
- private db: any; // Your database instance
635
-
636
- async findUser(id: string) {
637
- // Manual timing with detailed context
638
- const timerLabel = `database-find-user-${id}`;
639
- logger.time(timerLabel);
640
-
641
- try {
642
- const user = await this.db.findById(id);
643
- const duration = await logger.timeEnd(timerLabel);
644
-
645
- await logger.info('Database query completed successfully', {
646
- operation: 'findUser',
647
- userId: id,
648
- duration: `${duration}ms`,
649
- recordsFound: user ? 1 : 0,
650
- queryType: 'single_record_lookup'
651
- });
652
-
653
- return user;
654
- } catch (error) {
655
- await logger.timeEnd(timerLabel); // Still record timing on error
656
- await logger.error('Database query failed', {
657
- operation: 'findUser',
658
- userId: id,
659
- error: error.message
660
- });
661
- throw error;
662
- }
663
- }
664
-
665
- async createUser(userData: any) {
666
- // Automatic timing with comprehensive logging
667
- return await logger.timeAsync('database-create-user', async () => {
668
- await logger.debug('User creation initiated', {
669
- operation: 'createUser',
670
- dataFields: Object.keys(userData),
671
- estimatedSize: JSON.stringify(userData).length
672
- });
673
-
674
- const user = await this.db.create(userData);
675
-
676
- await logger.info('User created successfully', {
677
- operation: 'createUser',
678
- userId: user.id,
679
- createdFields: Object.keys(user).length
680
- });
681
-
682
- return user;
683
- }, {
684
- operationType: 'database_write',
685
- tableName: 'users',
686
- recordType: 'user_profile'
687
- });
688
- }
689
-
690
- async findUsersByQuery(query: any, limit: number = 100) {
691
- // Complex operation timing with nested operations
692
- const queryLogger = logger.child('DatabaseQuery', {
693
- operation: 'findUsersByQuery',
694
- queryComplexity: Object.keys(query).length,
695
- resultLimit: limit
696
- });
697
-
698
- return await queryLogger.timeAsync('complex-user-query', async () => {
699
- // Simulate query parsing time
700
- await queryLogger.timeAsync('query-parsing', async () => {
701
- await new Promise(resolve => setTimeout(resolve, 10));
702
- });
703
-
704
- // Simulate database execution time
705
- const results = await queryLogger.timeAsync('database-execution', async () => {
706
- const users = await this.db.find(query).limit(limit);
707
- return users;
708
- });
709
-
710
- // Simulate result processing time
711
- const processedResults = await queryLogger.timeAsync('result-processing', async () => {
712
- return results.map(user => ({
713
- ...user,
714
- lastAccessed: new Date().toISOString()
715
- }));
716
- });
717
-
718
- await queryLogger.info('Complex query completed', {
719
- resultsCount: processedResults.length,
720
- queryParameters: Object.keys(query),
721
- processingSteps: 3
722
- });
723
-
724
- return processedResults;
725
- });
726
- }
727
- }
401
+ // One call → console + file + postgres. All concurrent. All non-blocking.
402
+ await logger.info('Order placed', { orderId: 'ord_789' });
403
+ ```
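Conceptually, that fan-out can be sketched as follows (an illustration, not logixia's internals; `Transport` and `fanOut` are hypothetical names):

```typescript
// Illustrative sketch of concurrent fan-out: every transport's write()
// starts immediately, so a slow transport never delays the others.
interface Transport {
  name: string;
  write(message: string): Promise<void>;
}

async function fanOut(transports: Transport[], message: string): Promise<void> {
  // allSettled: one failing transport cannot reject the whole log call
  const results = await Promise.allSettled(
    transports.map((t) => t.write(message)),
  );
  results.forEach((r, i) => {
    if (r.status === 'rejected') {
      console.error(`transport ${transports[i].name} failed:`, r.reason);
    }
  });
}
```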
728
404
 
729
- class CacheService {
730
- private cache: Map<string, any> = new Map();
731
-
732
- async get(key: string) {
733
- return await logger.timeAsync('cache-get', async () => {
734
- const value = this.cache.get(key);
735
-
736
- await logger.debug('Cache access', {
737
- operation: 'get',
738
- key,
739
- hit: value !== undefined,
740
- cacheSize: this.cache.size
741
- });
742
-
743
- return value;
744
- }, {
745
- cacheOperation: 'read',
746
- keyPattern: key.split(':')[0] // Log key pattern for analysis
747
- });
748
- }
749
-
750
- async set(key: string, value: any, ttl?: number) {
751
- return await logger.timeAsync('cache-set', async () => {
752
- this.cache.set(key, value);
753
-
754
- if (ttl) {
755
- setTimeout(() => this.cache.delete(key), ttl * 1000);
756
- }
757
-
758
- await logger.debug('Cache write completed', {
759
- operation: 'set',
760
- key,
761
- valueSize: JSON.stringify(value).length,
762
- ttl: ttl || 'permanent',
763
- cacheSize: this.cache.size
764
- });
765
- }, {
766
- cacheOperation: 'write',
767
- keyPattern: key.split(':')[0]
768
- });
769
- }
770
- }
405
+ ### Custom transport
771
406
 
772
- // Performance monitoring for HTTP requests
773
- class APIService {
774
- async fetchExternalData(endpoint: string) {
775
- const apiLogger = logger.child('ExternalAPI', {
776
- endpoint,
777
- service: 'third_party_api'
778
- });
779
-
780
- return await apiLogger.timeAsync('external-api-call', async () => {
781
- const response = await fetch(endpoint);
782
-
783
- await apiLogger.info('External API response received', {
784
- statusCode: response.status,
785
- contentLength: response.headers.get('content-length'),
786
- contentType: response.headers.get('content-type')
787
- });
788
-
789
- if (!response.ok) {
790
- throw new Error(`API request failed: ${response.status}`);
791
- }
792
-
793
- return response.json();
794
- }, {
795
- requestType: 'external_api',
796
- protocol: 'https',
797
- method: 'GET'
407
+ Implement `ITransport` to send logs anywhere — Slack, Datadog, S3, an internal queue:
408
+
409
+ ```typescript
410
+ import type { ITransport, TransportLogEntry } from 'logixia';
411
+
412
+ class SlackTransport implements ITransport {
413
+ name = 'slack';
414
+
415
+ async write(entry: TransportLogEntry): Promise<void> {
416
+ if (entry.level !== 'error' && entry.level !== 'fatal') return;
417
+ await fetch(process.env.SLACK_WEBHOOK_URL!, {
418
+ method: 'POST',
419
+ headers: { 'Content-Type': 'application/json' },
420
+ body: JSON.stringify({
421
+ text: `🚨 *[${entry.level.toUpperCase()}]* ${entry.message}`,
422
+ attachments: [{ text: JSON.stringify(entry.metadata, null, 2) }],
423
+ }),
798
424
  });
799
425
  }
800
426
  }
427
+
428
+ const logger = createLogger({
429
+ appName: 'api',
430
+ environment: 'production',
431
+ transports: [new SlackTransport()],
432
+ });
801
433
  ```
802
434
 
803
- ## Field Configuration
435
+ ---
436
+
437
+ ## Request tracing
804
438
 
805
- Configure global fields that are automatically included in all log entries for consistent metadata:
439
+ logixia uses `AsyncLocalStorage` to propagate trace IDs through your entire async call graph automatically: no passing of context objects, no manual threading.
806
440
 
807
441
  ```typescript
808
- import { createLogger } from 'logixia';
442
+ import { runWithTraceId, getCurrentTraceId } from 'logixia';
809
443
 
810
- const logger = createLogger({
811
- appName: 'UserManagementService',
812
- environment: 'production',
813
- fields: {
814
- version: '2.1.4',
815
- service: 'user-management-api',
816
- region: 'us-east-1',
817
- datacenter: 'aws-virginia',
818
- buildNumber: process.env.BUILD_NUMBER || 'unknown',
819
- deploymentId: process.env.DEPLOYMENT_ID || 'local',
820
- nodeVersion: process.version,
821
- platform: process.platform
822
- },
823
- transports: {
824
- console: { level: 'info', format: 'text' },
825
- file: {
826
- filename: './logs/application.log',
827
- level: 'debug',
828
- format: 'json'
829
- },
830
- database: {
831
- type: 'mongodb',
832
- connectionString: process.env.MONGODB_URI,
833
- database: 'application_logs',
834
- collection: 'service_logs'
835
- }
836
- }
444
+ // Express / Fastify middleware
445
+ app.use((req, res, next) => {
446
+ const traceId = (req.headers['x-trace-id'] as string) ?? crypto.randomUUID();
447
+ runWithTraceId(traceId, next);
837
448
  });
838
449
 
839
- // Example usage with automatic field inclusion
840
- class UserService {
841
- async authenticateUser(credentials: any) {
842
- // All logs automatically include the configured global fields
843
- await logger.info('User authentication attempt', {
844
- userId: credentials.username,
845
- authMethod: 'password',
846
- clientIP: credentials.ip,
847
- userAgent: credentials.userAgent
848
- });
849
-
850
- try {
851
- const user = await this.validateCredentials(credentials);
852
-
853
- await logger.info('User authentication successful', {
854
- userId: user.id,
855
- userRole: user.role,
856
- lastLogin: user.lastLogin,
857
- sessionId: user.sessionId
858
- });
859
-
860
- return user;
861
- } catch (error) {
862
- await logger.error('User authentication failed', {
863
- userId: credentials.username,
864
- errorCode: error.code,
865
- errorMessage: error.message,
866
- attemptCount: credentials.attemptCount || 1
867
- });
868
-
869
- throw error;
870
- }
450
+ // Service layer: no parameters, no context objects
451
+ class OrderService {
452
+ async createOrder(data: OrderData) {
453
+ await logger.info('Creating order', { items: data.items.length });
454
+ // trace ID is automatically included in this log entry
455
+ await this.processPayment(data);
871
456
  }
872
-
873
- async createUser(userData: any) {
874
- await logger.info('User creation initiated', {
875
- requestedUsername: userData.username,
876
- userRole: userData.role,
877
- registrationSource: userData.source || 'direct'
878
- });
879
-
880
- const user = await this.database.createUser(userData);
881
-
882
- await logger.info('User created successfully', {
883
- userId: user.id,
884
- username: user.username,
885
- userRole: user.role,
886
- accountStatus: user.status,
887
- createdAt: user.createdAt
888
- });
889
-
890
- return user;
457
+
458
+ async processPayment(data: OrderData) {
459
+ await logger.info('Processing payment', { amount: data.total });
460
+ // ↑ same trace ID, propagated automatically
891
461
  }
892
462
  }
893
-
894
- // Dynamic field configuration for different environments
895
- const createEnvironmentLogger = (env: string) => {
896
- const baseFields = {
897
- version: '2.1.4',
898
- service: 'user-management-api',
899
- nodeVersion: process.version
900
- };
901
-
902
- const environmentFields = {
903
- development: {
904
- ...baseFields,
905
- region: 'local',
906
- datacenter: 'development',
907
- debugMode: true
908
- },
909
- staging: {
910
- ...baseFields,
911
- region: 'us-west-2',
912
- datacenter: 'aws-oregon',
913
- testingEnabled: true
914
- },
915
- production: {
916
- ...baseFields,
917
- region: 'us-east-1',
918
- datacenter: 'aws-virginia',
919
- performanceMonitoring: true,
920
- securityAudit: true
921
- }
922
- };
923
-
924
- return createLogger({
925
- appName: 'UserManagementService',
926
- environment: env,
927
- fields: environmentFields[env] || environmentFields.development,
928
- transports: {
929
- console: {
930
- level: env === 'production' ? 'warn' : 'debug',
931
- format: env === 'production' ? 'json' : 'text'
932
- },
933
- file: {
934
- filename: `./logs/${env}-application.log`,
935
- level: 'debug',
936
- format: 'json',
937
- rotation: {
938
- interval: '1d',
939
- maxFiles: env === 'production' ? 30 : 7
940
- }
941
- }
942
- }
943
- });
944
- };
945
-
946
- const logger = createEnvironmentLogger(process.env.NODE_ENV || 'development');
947
-
948
- // All logs will automatically include the configured fields:
949
- // {
950
- // level: 'info',
951
- // message: 'User authentication successful',
952
- // userId: 'user123',
953
- // userRole: 'admin',
954
- // lastLogin: '2024-01-15T10:30:00Z',
955
- // sessionId: 'sess_abc123',
956
- // version: '2.1.4',
957
- // service: 'user-management-api',
958
- // region: 'us-east-1',
959
- // datacenter: 'aws-virginia',
960
- // buildNumber: '1234',
961
- // deploymentId: 'deploy_xyz789',
962
- // nodeVersion: 'v18.17.0',
963
- // platform: 'linux',
964
- // performanceMonitoring: true,
965
- // securityAudit: true,
966
- // timestamp: '2024-01-15T10:30:15.123Z',
967
- // appName: 'UserManagementService',
968
- // environment: 'production'
969
- // }
970
463
  ```
971
464
 
972
- ## Field Management
973
-
974
- Dynamically control which fields are included in log entries at runtime with persistent state management:
465
+ Every log entry automatically includes the current trace ID — even across `await` boundaries, `Promise.all`, and background jobs that were started in the request context.
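Under the hood, `AsyncLocalStorage` is what makes this possible. A minimal standalone sketch of the pattern (the function names mirror the API above, but this is not logixia's source):

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

// Minimal sketch of AsyncLocalStorage-based trace propagation.
// The stored value travels with the async context, surviving awaits and timers.
const traceStore = new AsyncLocalStorage<string>();

function runWithTraceId<T>(traceId: string, fn: () => T): T {
  return traceStore.run(traceId, fn);
}

function getCurrentTraceId(): string | undefined {
  return traceStore.getStore();
}

runWithTraceId('trace-123', async () => {
  await Promise.resolve(); // cross an await boundary
  console.log(getCurrentTraceId()); // still 'trace-123'
});
```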
975
466
 
976
- ```typescript
977
- import { createLogger } from 'logixia';
467
+ ---
978
468
 
979
- const logger = createLogger({
980
- appName: 'FieldManagementApp',
981
- environment: 'development',
982
- fields: {
983
- version: '1.0.1',
984
- service: 'api-gateway',
985
- region: 'us-east-1'
986
- },
987
- transports: {
988
- console: { level: 'info', format: 'text' },
989
- file: { filename: './logs/app.log', level: 'debug', format: 'json' }
990
- }
991
- });
469
+ ## NestJS integration
992
470
 
993
- // Enable specific fields for inclusion in logs
994
- await logger.enableField('userId');
995
- await logger.enableField('requestId');
996
- await logger.enableField('sessionId');
997
-
998
- // Check if a field is currently enabled
999
- const isUserIdEnabled = logger.isFieldEnabled('userId'); // true
1000
- const isEmailEnabled = logger.isFieldEnabled('email'); // false
1001
-
1002
- // Get current field state
1003
- const fieldState = logger.getFieldState();
1004
- console.log(fieldState);
1005
- // Output: { userId: true, requestId: true, sessionId: true }
1006
-
1007
- // Disable specific fields
1008
- await logger.disableField('sessionId');
1009
-
1010
- // Log with automatic field filtering
1011
- await logger.info('User action performed', {
1012
- userId: 'user123', // Included (enabled)
1013
- requestId: 'req456', // Included (enabled)
1014
- sessionId: 'sess789', // Excluded (disabled)
1015
- email: 'user@example.com', // Excluded (not enabled)
1016
- action: 'profile_update' // Included (always included)
1017
- });
471
+ Drop-in module with zero boilerplate. Supports both synchronous and async configuration:
1018
472
 
1019
- // Reset field state to default (all fields enabled)
1020
- await logger.resetFieldState();
473
+ ```typescript
474
+ // app.module.ts
475
+ import { Module } from '@nestjs/common';
476
+ import { LogixiaLoggerModule } from 'logixia';
1021
477
 
1022
- // Batch field management
1023
- await logger.enableField(['userId', 'requestId', 'traceId']);
1024
- await logger.disableField(['sessionId', 'deviceId']);
478
+ @Module({
479
+ imports: [
480
+ LogixiaLoggerModule.forRoot({
481
+ appName: 'nestjs-api',
482
+ environment: process.env.NODE_ENV ?? 'development',
483
+ console: { colorize: true },
484
+ file: { filename: 'app.log', dirname: './logs', maxSize: '50MB' },
485
+ }),
486
+ ],
487
+ })
488
+ export class AppModule {}
1025
489
  ```
1026
490
 
1027
- ### Field Management Use Cases
1028
-
1029
491
  ```typescript
1030
- // Privacy compliance - disable PII fields in production
1031
- if (process.env.NODE_ENV === 'production') {
1032
- await logger.disableField(['email', 'phoneNumber', 'address']);
1033
- }
492
+ // my.service.ts
493
+ import { Injectable } from '@nestjs/common';
494
+ import { InjectLogger, LogixiaLoggerService } from 'logixia';
1034
495
 
1035
- // Debug mode - enable all diagnostic fields
1036
- if (process.env.DEBUG_MODE === 'true') {
1037
- await logger.enableField(['stackTrace', 'memoryUsage', 'cpuUsage']);
1038
- }
496
+ @Injectable()
497
+ export class OrderService {
498
+ constructor(@InjectLogger() private readonly logger: LogixiaLoggerService) {}
1039
499
 
1040
- // Feature-specific logging
1041
- class PaymentService {
1042
- constructor() {
1043
- // Enable payment-specific fields
1044
- logger.enableField(['transactionId', 'paymentMethod', 'amount']);
1045
- }
1046
-
1047
- async processPayment(paymentData: any) {
1048
- await logger.info('Payment processing started', {
1049
- transactionId: paymentData.id,
1050
- paymentMethod: paymentData.method,
1051
- amount: paymentData.amount,
1052
- customerEmail: paymentData.email, // Only included if enabled
1053
- internalRef: paymentData.ref // Always included
1054
- });
500
+ async createOrder(data: CreateOrderDto) {
501
+ await this.logger.info('Creating order', { userId: data.userId });
502
+ // ...
1055
503
  }
1056
504
  }
1057
505
  ```
1058
506
 
1059
- ## Transport Level Selection
1060
-
1061
- Configure different log levels for each transport with interactive prompting and programmatic control:
507
+ **Async configuration** (e.g. loading database credentials from a config service):
1062
508
 
1063
509
  ```typescript
1064
- import { createLogger } from 'logixia';
1065
-
1066
- const logger = createLogger({
1067
- appName: 'TransportLevelApp',
1068
- environment: 'development',
1069
- transports: {
1070
- console: { level: 'info', format: 'text' },
1071
- file: [
1072
- { filename: './logs/app.log', level: 'debug', format: 'json' },
1073
- { filename: './logs/error.log', level: 'error', format: 'json' }
1074
- ],
510
+ LogixiaLoggerModule.forRootAsync({
511
+ useFactory: async (configService: ConfigService) => ({
512
+ appName: 'nestjs-api',
513
+ environment: configService.get('NODE_ENV'),
1075
514
  database: {
1076
- type: 'mongodb',
1077
- connectionString: 'mongodb://localhost:27017/logs',
1078
- database: 'app_logs',
1079
- collection: 'entries'
1080
- }
1081
- }
515
+ type: 'postgresql',
516
+ host: configService.get('DB_HOST'),
517
+ password: configService.get('DB_PASSWORD'),
518
+ },
519
+ }),
520
+ inject: [ConfigService],
1082
521
  });
522
+ ```
1083
523
 
1084
- // Programmatic transport level configuration
1085
- await logger.setTransportLevels({
1086
- 'console': 'warn', // Only warnings and errors to console
1087
- 'file-0': 'debug', // All logs to main file
1088
- 'file-1': 'error', // Only errors to error file
1089
- 'database': 'info' // Info and above to database
1090
- });
524
+ ---
1091
525
 
1092
- // Get current transport levels
1093
- const currentLevels = logger.getTransportLevels();
1094
- console.log(currentLevels);
1095
- // Output: { 'console': 'warn', 'file-0': 'debug', 'file-1': 'error', 'database': 'info' }
526
+ ## Log redaction
1096
527
 
1097
- // Get available transports for configuration
1098
- const availableTransports = logger.getAvailableTransports();
1099
- console.log(availableTransports);
1100
- // Output: ['console', 'file-0', 'file-1', 'database']
528
+ Redact sensitive fields before they reach any transport — passwords, tokens, PII, credit card numbers. Fields are masked in-place before serialization:
1101
529
 
1102
- // Enable interactive transport level prompting
1103
- await logger.enableTransportLevelPrompting();
530
+ ```typescript
531
+ const logger = createLogger({
532
+ appName: 'api',
533
+ environment: 'production',
534
+ redaction: {
535
+ paths: [
536
+ 'password',
537
+ 'token',
538
+ 'accessToken',
539
+ 'refreshToken',
540
+ 'creditCard',
541
+ 'ssn',
542
+ '*.secret', // Wildcard: any field named 'secret' at any depth
543
+ 'user.email', // Nested path
544
+ ],
545
+ censor: '[REDACTED]', // Default: '[REDACTED]'
546
+ },
547
+ });
1104
548
 
1105
- // Test logging with different levels
1106
- await logger.debug('Debug message'); // Only to file-0
1107
- await logger.info('Info message'); // To file-0 and database
1108
- await logger.warn('Warning message'); // To console, file-0, and database
1109
- await logger.error('Error message'); // To all transports
549
+ await logger.info('User login', {
550
+ username: 'alice',
551
+ password: 'hunter2', // '[REDACTED]'
552
+ token: 'eyJhbGc...', // '[REDACTED]'
553
+ creditCard: '4111...', // '[REDACTED]'
554
+ ip: '203.0.113.4', // ← untouched
555
+ });
1110
556
  ```
1111
557
 
1112
- ## Analytics Transports
558
+ Redaction is applied once, before the entry is dispatched to any transport — no risk of a transport accidentally logging sensitive data.
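For intuition, path-based masking of this kind can be implemented as a recursive walk. A simplified sketch (assumption: only top-level keys and `*.key` wildcards are handled here; nested paths like `user.email` are omitted for brevity, and `redact` is an illustrative name, not logixia's implementation):

```typescript
// Illustrative sketch of path-based redaction: exact keys are masked at the
// top level; '*.key' wildcards mask that key name at any depth.
function redact(
  obj: Record<string, unknown>,
  paths: string[],
  censor = '[REDACTED]',
): Record<string, unknown> {
  const wildcardKeys = new Set(
    paths.filter((p) => p.startsWith('*.')).map((p) => p.slice(2)),
  );
  const exact = new Set(paths.filter((p) => !p.includes('.')));
  const walk = (value: unknown, depth: number): unknown => {
    if (value === null || typeof value !== 'object') return value;
    const out: Record<string, unknown> = {};
    for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
      if (wildcardKeys.has(k) || (depth === 0 && exact.has(k))) {
        out[k] = censor; // mask before the value can reach any transport
      } else {
        out[k] = walk(v, depth + 1);
      }
    }
    return out;
  };
  return walk(obj, 0) as Record<string, unknown>;
}
```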
1113
559
 
1114
- Logitron supports integration with popular analytics and monitoring platforms to track application events, user behavior, and system metrics.
560
+ ---
1115
561
 
1116
- ### Supported Analytics Platforms
562
+ ## Log search
1117
563
 
1118
- - **Mixpanel** - Event tracking and user analytics
1119
- - **DataDog** - Application monitoring and log forwarding
1120
- - **Google Analytics** - Web analytics and event tracking
1121
- - **Segment** - Unified analytics platform
564
+ Query your in-memory log history without shipping to Elasticsearch, Datadog, or any external service. Great for development environments and lightweight production setups:
1122
565
 
1123
- ### Mixpanel Integration
566
+ ```typescript
567
+ import { SearchManager } from 'logixia';
1124
568
 
1125
- Track user events and behavior with Mixpanel:
569
+ const search = new SearchManager({ maxEntries: 10_000 });
1126
570
 
1127
- ```typescript
1128
- import { LogixiaLogger } from 'logitron';
1129
-
1130
- const logger = new LogixiaLogger({
1131
- appName: 'MyApp',
1132
- transports: {
1133
- analytics: {
1134
- mixpanel: {
1135
- token: 'your-mixpanel-token',
1136
- apiKey: 'your-mixpanel-api-key',
1137
- batchSize: 50,
1138
- flushInterval: 5000,
1139
- level: 'info'
1140
- }
1141
- }
1142
- }
1143
- });
571
+ // Index a batch of entries (e.g. from a file or database)
572
+ await search.index(logEntries);
1144
573
 
1145
- // Track user events
1146
- logger.info('User signed up', {
1147
- userId: 'user-123',
1148
- email: 'user@example.com',
1149
- plan: 'premium',
1150
- source: 'landing_page'
574
+ // Search by text, level, and time range
575
+ const results = await search.search({
576
+ query: 'payment failed',
577
+ level: 'error',
578
+ from: new Date('2025-01-01'),
579
+ to: new Date(),
580
+ limit: 50,
1151
581
  });
1152
582
 
1153
- logger.info('Feature used', {
1154
- feature: 'export_data',
1155
- userId: 'user-123',
1156
- exportFormat: 'csv'
1157
- });
583
+ // results → sorted by relevance, includes matched entries with full metadata
1158
584
  ```
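Conceptually, such a query is just filtering over buffered entries. A minimal sketch (the `LogEntry` shape and `searchLogs` helper are illustrative, not the real `SearchManager` internals):

```typescript
// Minimal sketch of in-memory log search: filter by level, substring,
// and time range, then cap the result count.
interface LogEntry { level: string; message: string; timestamp: Date; }

function searchLogs(
  entries: LogEntry[],
  q: { query?: string; level?: string; from?: Date; to?: Date; limit?: number },
): LogEntry[] {
  return entries
    .filter((e) => (q.level ? e.level === q.level : true))
    .filter((e) =>
      q.query ? e.message.toLowerCase().includes(q.query.toLowerCase()) : true,
    )
    .filter((e) => (q.from ? e.timestamp >= q.from : true))
    .filter((e) => (q.to ? e.timestamp <= q.to : true))
    .slice(0, q.limit ?? 100);
}
```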
1159
585
 
1160
- ### DataDog Integration
586
+ ---
1161
587
 
1162
- Send logs and metrics to DataDog for monitoring:
588
+ ## OpenTelemetry
1163
589
 
1164
- ```typescript
1165
- const logger = new LogixiaLogger({
1166
- appName: 'MyApp',
1167
- transports: {
1168
- analytics: {
1169
- datadog: {
1170
- apiKey: 'your-datadog-api-key',
1171
- site: 'datadoghq.com',
1172
- service: 'my-service',
1173
- version: '1.0.1',
1174
- batchSize: 100,
1175
- flushInterval: 10000,
1176
- level: 'warn'
1177
- }
1178
- }
1179
- }
1180
- });
590
+ W3C `traceparent` and `tracestate` headers are extracted from incoming requests and attached to every log entry automatically — enabling correlation between distributed traces and log events in tools like Jaeger, Zipkin, Honeycomb, and Datadog:
1181
591
 
1182
- // Send application metrics
1183
- logger.error('API Error', {
1184
- endpoint: '/api/users',
1185
- statusCode: 500,
1186
- responseTime: 1200,
1187
- errorType: 'database_timeout'
592
+ ```typescript
593
+ // With tracing enabled (zero extra packages required)
594
+ const logger = createLogger({
595
+ appName: 'checkout-service',
596
+ environment: 'production',
597
+ otel: {
598
+ enabled: true,
599
+ serviceName: 'checkout-service',
600
+ propagate: ['traceparent', 'tracestate', 'baggage'],
601
+ },
1188
602
  });
1189
603
 
1190
- logger.warn('High memory usage', {
1191
- memoryUsage: 85,
1192
- threshold: 80,
1193
- service: 'user-service'
604
+ // In an Express handler receiving a traced request:
605
+ app.post('/checkout', async (req, res) => {
606
+ await logger.info('Checkout initiated', { cartId: req.body.cartId });
607
+ // ^ log entry carries the W3C traceparent from the incoming request
1194
608
  });
1195
609
  ```
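For reference, a `traceparent` header follows the W3C Trace Context format `version-traceid-parentid-flags`. A small standalone parser (a sketch, not logixia's implementation):

```typescript
// Sketch of W3C traceparent parsing, version "00":
// e.g. 00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01
function parseTraceparent(header: string) {
  const m = /^([0-9a-f]{2})-([0-9a-f]{32})-([0-9a-f]{16})-([0-9a-f]{2})$/.exec(
    header.trim(),
  );
  if (!m) return null;
  const [, version, traceId, parentId, flags] = m;
  // all-zero trace or parent IDs are invalid per the spec
  if (/^0+$/.test(traceId) || /^0+$/.test(parentId)) return null;
  return { version, traceId, parentId, sampled: (parseInt(flags, 16) & 1) === 1 };
}
```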
1196
610
 
1197
- ### Google Analytics Integration
611
+ ---
1198
612
 
1199
- Track web analytics and custom events:
613
+ ## Graceful shutdown
1200
614
 
1201
- ```typescript
1202
- const logger = new LogixiaLogger({
1203
- appName: 'MyApp',
1204
- transports: {
1205
- analytics: {
1206
- googleAnalytics: {
1207
- measurementId: 'G-XXXXXXXXXX',
1208
- apiSecret: 'your-ga-api-secret',
1209
- apiKey: 'your-ga-api-key',
1210
- clientId: 'client-123',
1211
- batchSize: 25,
1212
- flushInterval: 3000,
1213
- level: 'info'
1214
- }
1215
- }
1216
- }
1217
- });
615
+ `flushOnExit` ensures all buffered log entries are flushed to every transport before the process exits. This is critical for database and analytics transports that batch writes:
1218
616
 
1219
- // Track page views and events
1220
- logger.info('Page view', {
1221
- page: '/dashboard',
1222
- userId: 'user-123',
1223
- sessionDuration: 1250,
1224
- referrer: 'https://google.com'
1225
- });
617
+ ```typescript
618
+ import { flushOnExit } from 'logixia';
1226
619
 
1227
- logger.info('Conversion event', {
1228
- eventType: 'purchase',
1229
- value: 99.99,
1230
- currency: 'USD',
1231
- transactionId: 'txn-456'
1232
- });
620
+ // Register once at startup — handles SIGTERM, SIGINT, and uncaught exceptions
621
+ flushOnExit(logger);
1233
622
  ```
1234
623
 
1235
- ### Segment Integration
1236
-
1237
- Unify analytics across multiple platforms:
624
+ Alternatively, flush manually:
1238
625
 
1239
626
  ```typescript
1240
- const logger = new LogixiaLogger({
1241
- appName: 'MyApp',
1242
- transports: {
1243
- analytics: {
1244
- segment: {
1245
- writeKey: 'your-segment-write-key',
1246
- apiKey: 'your-segment-api-key',
1247
- dataPlaneUrl: 'https://api.segment.io',
1248
- batchSize: 75,
1249
- flushInterval: 7000,
1250
- level: 'info'
1251
- }
1252
- }
1253
- }
627
+ // In a Kubernetes SIGTERM handler:
628
+ process.on('SIGTERM', async () => {
629
+ await logger.flush(); // Wait for all in-flight writes to complete
630
+ process.exit(0);
1254
631
  });
632
+ ```
1255
633
 
1256
- // Track user events
1257
- logger.info('Product purchased', {
1258
- productId: 'prod-456',
1259
- productName: 'Premium Plan',
1260
- price: 29.99,
1261
- currency: 'USD',
1262
- userId: 'user-123',
1263
- category: 'subscription'
1264
- });
634
+ ---
1265
635
 
1266
- logger.info('User identified', {
1267
- userId: 'user-123',
1268
- email: 'user@example.com',
1269
- name: 'John Doe',
1270
- plan: 'premium'
1271
- });
1272
- ```
1273
-
1274
- ### Multiple Analytics Providers
1275
-
1276
- Configure multiple analytics providers simultaneously:
1277
-
1278
- ```typescript
1279
- const logger = new LogixiaLogger({
1280
- appName: 'MyApp',
1281
- transports: {
1282
- console: { level: 'debug' },
1283
- analytics: {
1284
- mixpanel: {
1285
- token: 'mixpanel-token',
1286
- apiKey: 'mixpanel-api-key',
1287
- level: 'info'
1288
- },
1289
- datadog: {
1290
- apiKey: 'datadog-api-key',
1291
- site: 'datadoghq.com',
1292
- service: 'my-service',
1293
- level: 'warn'
1294
- },
1295
- segment: {
1296
- writeKey: 'segment-write-key',
1297
- apiKey: 'segment-api-key',
1298
- level: 'info'
1299
- }
1300
- }
1301
- }
1302
- });
1303
-
1304
- // Events will be sent to all configured analytics providers
1305
- logger.info('User action', {
1306
- action: 'button_click',
1307
- buttonId: 'signup-cta',
1308
- userId: 'user-123',
1309
- timestamp: new Date().toISOString()
1310
- });
1311
- ```
1312
-
1313
- ### Analytics Configuration Options
1314
-
1315
- #### Common Options
1316
-
1317
- - `apiKey`: API key for authentication
1318
- - `batchSize`: Number of events to batch before sending (default: 50)
1319
- - `flushInterval`: Time in milliseconds between automatic flushes (default: 5000)
1320
- - `level`: Minimum log level to send to analytics platform
1321
-
1322
- #### Platform-Specific Options
1323
-
1324
- **Mixpanel:**
1325
- - `token`: Project token from Mixpanel dashboard
1326
-
1327
- **DataDog:**
1328
- - `site`: DataDog site (e.g., 'datadoghq.com', 'datadoghq.eu')
1329
- - `service`: Service name for log correlation
1330
- - `version`: Application version
1331
-
1332
- **Google Analytics:**
1333
- - `measurementId`: GA4 Measurement ID
1334
- - `apiSecret`: Measurement Protocol API secret
1335
- - `clientId`: Client identifier for user tracking
1336
-
1337
- **Segment:**
1338
- - `writeKey`: Write key from Segment dashboard
1339
- - `dataPlaneUrl`: Custom data plane URL (optional)
1340
-
1341
- ### Best Practices
1342
-
1343
- 1. **Environment Variables**: Store API keys in environment variables
1344
- 2. **Batch Configuration**: Adjust batch sizes based on your traffic volume
1345
- 3. **Log Levels**: Use appropriate log levels for different analytics platforms
1346
- 4. **Error Handling**: Monitor transport metrics for failed deliveries
1347
- 5. **Privacy**: Ensure compliance with data privacy regulations
1348
-
1349
- ```typescript
1350
- // Production configuration example
1351
- const logger = new LogixiaLogger({
1352
- appName: process.env.APP_NAME,
1353
- environment: process.env.NODE_ENV,
1354
- transports: {
1355
- console: { level: 'error' },
1356
- analytics: {
1357
- mixpanel: {
1358
- token: process.env.MIXPANEL_TOKEN,
1359
- apiKey: process.env.MIXPANEL_API_KEY,
1360
- batchSize: 100,
1361
- flushInterval: 10000,
1362
- level: 'info'
1363
- },
1364
- datadog: {
1365
- apiKey: process.env.DATADOG_API_KEY,
1366
- site: process.env.DATADOG_SITE || 'datadoghq.com',
1367
- service: process.env.SERVICE_NAME,
1368
- version: process.env.APP_VERSION,
1369
- level: 'warn'
1370
- }
1371
- }
1372
- }
1373
- });
1374
- ```
1375
-
- ### Interactive Transport Configuration
-
- ```typescript
- // Enable interactive prompting for transport level selection
- await logger.enableTransportLevelPrompting();
-
- // While logging, the user is prompted to select a level for each transport.
- // Example interactive session:
- // ? Select log level for transport 'console': (Use arrow keys)
- // ❯ error
- //   warn
- //   info
- //   debug
- //   trace
-
- // ? Select log level for transport 'file-0': (Use arrow keys)
- //   error
- //   warn
- // ❯ info
- //   debug
- //   trace
-
- // Disable interactive prompting
- await logger.disableTransportLevelPrompting();
-
- // Clear all transport level preferences (reset to defaults)
- await logger.clearTransportLevelPreferences();
- ```
1404
-
- ### Advanced Transport Level Management
-
- ```typescript
- class ApplicationService {
-   constructor() {
-     // Fire-and-forget: constructors cannot be async, so the promise is not awaited
-     void this.configureTransportLevels();
-   }
-
-   private async configureTransportLevels() {
-     const environment = process.env.NODE_ENV;
-
-     if (environment === 'development') {
-       // Development: verbose logging to console and file
-       await logger.setTransportLevels({
-         'console': 'debug',
-         'file-0': 'trace',
-         'database': 'info'
-       });
-     } else if (environment === 'production') {
-       // Production: minimal console, comprehensive file and database
-       await logger.setTransportLevels({
-         'console': 'error',
-         'file-0': 'warn',
-         'file-1': 'error',
-         'database': 'info'
-       });
-     }
-   }
-
-   async handleRequest(request: any) {
-     // These logs are filtered by each transport's own level
-     await logger.debug('Request received', { requestId: request.id });
-     await logger.info('Processing request', { userId: request.userId });
-     await logger.warn('High load detected', { activeConnections: 150 });
-     await logger.error('Request failed', { error: 'Database timeout' });
-   }
- }
-
- // Runtime transport level adjustment
- class MonitoringService {
-   async adjustLoggingBasedOnLoad(systemLoad: number) {
-     if (systemLoad > 0.8) {
-       // High load: reduce logging verbosity
-       await logger.setTransportLevels({
-         'console': 'error',
-         'file-0': 'warn',
-         'database': 'error'
-       });
-       await logger.warn('Reduced logging verbosity due to high system load');
-     } else if (systemLoad < 0.3) {
-       // Low load: increase logging for debugging
-       await logger.setTransportLevels({
-         'console': 'info',
-         'file-0': 'debug',
-         'database': 'info'
-       });
-       await logger.info('Increased logging verbosity due to low system load');
-     }
-   }
- }
- ```
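Under the hood, per-transport levels reduce to a numeric priority comparison: a transport emits a record only when the record's level is at least as severe as the level configured for that transport. A minimal standalone sketch of that filtering logic (illustrative only; logixia's internals may differ):

```typescript
// Illustrative sketch of per-transport level filtering (not logixia's actual code).
const PRIORITY: Record<string, number> = {
  trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60,
};

// True when the record's level is at least as severe as the transport's level.
function shouldEmit(recordLevel: string, transportLevel: string): boolean {
  return PRIORITY[recordLevel] >= PRIORITY[transportLevel];
}

// Example: with { console: 'error', 'file-0': 'warn' }, a warn record
// reaches file-0 but is dropped by console.
const levels = { console: 'error', 'file-0': 'warn' };
const targets = Object.entries(levels)
  .filter(([, lvl]) => shouldEmit('warn', lvl))
  .map(([name]) => name);
// targets contains only 'file-0'
```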
1466
-
- ## Child Loggers
-
- Create contextual child loggers for better organization and hierarchical logging:
-
- ```typescript
- import { createLogger } from 'logixia';
-
- const mainLogger = createLogger({
-   appName: 'EcommerceApplication',
-   environment: 'production',
-   transports: {
-     console: { level: 'info', format: 'text' },
-     file: {
-       filename: './logs/application.log',
-       level: 'debug',
-       format: 'json'
-     }
-   }
- });
-
- // Create a service-level child logger with persistent context
- const userLogger = mainLogger.child('UserService', {
-   module: 'user-management',
-   version: '2.1.0',
-   component: 'authentication'
- });
-
- // The child logger inherits the parent configuration and adds its own context
- await userLogger.info('User service initialized', {
-   maxConcurrentUsers: 1000,
-   cacheEnabled: true
- });
- // Output includes: context: 'UserService', module: 'user-management', version: '2.1.0', component: 'authentication'
-
- // Create operation-specific nested child loggers
- class UserService {
-   private logger = userLogger;
-
-   async authenticateUser(userId: string, sessionId: string) {
-     // Create an operation-specific logger with additional context
-     const operationLogger = this.logger.child('AuthenticateUser', {
-       operation: 'authentication',
-       userId,
-       sessionId,
-       startTime: new Date().toISOString()
-     });
-
-     await operationLogger.info('Authentication process started', {
-       authMethod: 'password',
-       ipAddress: '192.168.1.100'
-     });
-
-     try {
-       // Simulate authentication steps with detailed logging
-       await operationLogger.debug('Validating user credentials');
-       const user = await this.validateCredentials(userId);
-
-       await operationLogger.debug('Checking user permissions', {
-         userRole: user.role,
-         permissions: user.permissions.length
-       });
-
-       await operationLogger.info('Authentication successful', {
-         userId: user.id,
-         userRole: user.role,
-         lastLogin: user.lastLogin
-       });
-
-       return user;
-     } catch (error) {
-       await operationLogger.error('Authentication failed', {
-         errorCode: error.code,
-         errorMessage: error.message,
-         attemptNumber: error.attemptNumber || 1
-       });
-       throw error;
-     }
-   }
-
-   async createUser(userData: any) {
-     const startedAt = Date.now();
-
-     // Create another operation-specific logger
-     // (named creationLogger so it does not shadow the imported createLogger)
-     const creationLogger = this.logger.child('CreateUser', {
-       operation: 'user_creation',
-       requestId: `req_${startedAt}`,
-       targetRole: userData.role
-     });
-
-     await creationLogger.info('User creation initiated', {
-       username: userData.username,
-       email: userData.email,
-       registrationSource: userData.source
-     });
-
-     // Create a validation-specific sub-logger
-     const validationLogger = creationLogger.child('Validation', {
-       step: 'input_validation'
-     });
-
-     await validationLogger.debug('Validating user input', {
-       fieldsToValidate: Object.keys(userData)
-     });
-
-     // Validation logic here...
-     await validationLogger.info('Input validation completed');
-
-     // Create a database-specific sub-logger
-     const dbLogger = creationLogger.child('Database', {
-       step: 'database_operation'
-     });
-
-     await dbLogger.debug('Inserting user record');
-     const user = await this.database.createUser(userData);
-     await dbLogger.info('User record created', {
-       userId: user.id,
-       createdAt: user.createdAt
-     });
-
-     await creationLogger.info('User creation completed successfully', {
-       userId: user.id,
-       totalProcessingTime: Date.now() - startedAt
-     });
-
-     return user;
-   }
- }
-
- // Example of service-to-service communication logging
- class OrderService {
-   private logger = mainLogger.child('OrderService', {
-     module: 'order-management',
-     version: '1.5.0'
-   });
-
-   async processOrder(orderId: string) {
-     const orderLogger = this.logger.child('ProcessOrder', {
-       orderId,
-       operation: 'order_processing'
-     });
-
-     await orderLogger.info('Order processing started');
-
-     // When calling the user service, create a cross-service logger
-     const userServiceLogger = orderLogger.child('UserServiceCall', {
-       targetService: 'user-service',
-       operation: 'user_lookup'
-     });
-
-     await userServiceLogger.debug('Fetching user information for order');
-     // User service call here...
-     await userServiceLogger.info('User information retrieved');
-
-     await orderLogger.info('Order processing completed');
-   }
- }
-
- // All child loggers maintain the full context hierarchy:
- // {
- //   level: 'info',
- //   message: 'Authentication successful',
- //   context: 'UserService > AuthenticateUser',
- //   module: 'user-management',
- //   version: '2.1.0',
- //   component: 'authentication',
- //   operation: 'authentication',
- //   userId: 'user123',
- //   sessionId: 'sess_abc',
- //   startTime: '2024-01-15T10:30:00Z',
- //   userRole: 'admin',
- //   lastLogin: '2024-01-14T15:20:00Z',
- //   appName: 'EcommerceApplication',
- //   environment: 'production',
- //   timestamp: '2024-01-15T10:30:15.123Z'
- // }
- ```
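The context-merging behaviour shown above can be sketched in a few lines: a child keeps the parent's context, layers its own fields on top, and extends the context name into a 'Parent > Child' path. This is an illustrative model, not logixia's actual implementation:

```typescript
// Minimal model of child-logger context merging (illustrative only).
type Context = Record<string, unknown>;

class SketchLogger {
  constructor(
    private readonly name: string,
    private readonly context: Context = {},
  ) {}

  // A child inherits the parent context and appends its name to the path.
  child(name: string, extra: Context = {}): SketchLogger {
    return new SketchLogger(`${this.name} > ${name}`, { ...this.context, ...extra });
  }

  // Every entry carries the merged context chain plus the call-site payload.
  entry(message: string, payload: Context = {}): Context {
    return { context: this.name, ...this.context, ...payload, message };
  }
}

const root = new SketchLogger('UserService', { module: 'user-management' });
const op = root.child('AuthenticateUser', { operation: 'authentication' });
const e = op.entry('Authentication successful', { userId: 'user123' });
// e.context is 'UserService > AuthenticateUser'; module, operation, and userId all present
```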
1641
-
- ## Trace ID Support
-
- Built-in request tracing for tracking request flow across your application:
+ ## Configuration reference
 
  ```typescript
- import { createLogger, runWithTraceId, getCurrentTraceId } from 'logixia';
-
- const logger = createLogger({
-   appName: 'DistributedApplication',
-   environment: 'production',
-   traceId: true,
-   transports: {
-     console: { level: 'info', format: 'text' },
-     file: {
-       filename: './logs/traced-application.log',
-       level: 'debug',
-       format: 'json'
-     },
-     database: {
-       type: 'mongodb',
-       connectionString: process.env.MONGODB_URI,
-       database: 'application_logs',
-       collection: 'traced_requests'
-     }
-   }
- });
-
- // Manual trace ID management for specific operations
- class PaymentService {
-   async processPayment(paymentData: any) {
-     // Use a custom trace ID for payment processing
-     const traceId = `payment_${paymentData.orderId}_${Date.now()}`;
-
-     return await runWithTraceId(traceId, async () => {
-       await logger.info('Payment processing initiated', {
-         orderId: paymentData.orderId,
-         amount: paymentData.amount,
-         currency: paymentData.currency,
-         paymentMethod: paymentData.method
-       });
-
-       // Get the current trace ID for external service calls
-       const currentTraceId = getCurrentTraceId();
-       console.log('Processing with trace ID:', currentTraceId); // 'payment_ORD123_1642234567890'
-
-       try {
-         // Simulate payment gateway call
-         await this.callPaymentGateway(paymentData, currentTraceId);
-
-         // Simulate fraud detection
-         await this.performFraudCheck(paymentData, currentTraceId);
-
-         await logger.info('Payment processed successfully', {
-           transactionId: 'txn_abc123',
-           processingTime: '1.2s',
-           gatewayResponse: 'approved'
-         });
-
-         return { success: true, transactionId: 'txn_abc123' };
-       } catch (error) {
-         await logger.error('Payment processing failed', {
-           errorCode: error.code,
-           errorMessage: error.message,
-           gatewayError: error.gatewayError
-         });
-         throw error;
-       }
-     });
-   }
-
-   private async callPaymentGateway(paymentData: any, traceId: string) {
-     await logger.debug('Calling payment gateway', {
-       gateway: 'stripe',
-       endpoint: '/v1/charges',
-       traceId // Explicitly log the trace ID for external calls
-     });
-
-     // Simulate API call with trace ID in headers
-     // fetch('/api/payment', { headers: { 'X-Trace-ID': traceId } })
-   }
-
-   private async performFraudCheck(paymentData: any, traceId: string) {
-     await logger.debug('Performing fraud detection', {
-       service: 'fraud-detection',
-       riskScore: 'calculating',
-       traceId
-     });
-
-     // Fraud detection logic here
-     await logger.info('Fraud check completed', {
-       riskScore: 0.15,
-       decision: 'approved',
-       factors: ['amount_normal', 'location_verified', 'device_known']
-     });
-   }
- }
-
- // Automatic trace ID generation for web requests
- class OrderService {
-   async createOrder(orderData: any) {
-     // Auto-generate a trace ID for new operations
-     return await runWithTraceId(async () => {
-       const traceId = getCurrentTraceId();
-
-       await logger.info('Order creation started', {
-         customerId: orderData.customerId,
-         itemCount: orderData.items.length,
-         totalAmount: orderData.total,
-         autoGeneratedTraceId: traceId
-       });
-
-       // Call child services within the same trace context
-       const inventoryResult = await this.checkInventory(orderData.items);
-       const paymentResult = await this.processPayment(orderData.payment);
-
-       await logger.info('Order created successfully', {
-         orderId: 'ORD-12345',
-         inventoryReserved: inventoryResult.reserved,
-         paymentProcessed: paymentResult.success
-       });
-
-       return { orderId: 'ORD-12345', status: 'confirmed' };
-     });
-   }
-
-   private async checkInventory(items: any[]) {
-     // This automatically uses the same trace ID
-     await logger.debug('Checking inventory availability', {
-       itemsToCheck: items.length,
-       operation: 'inventory_check'
-     });
-
-     // Inventory check logic
-     return { reserved: true, availableQuantity: 100 };
-   }
-
-   private async processPayment(paymentData: any) {
-     // This automatically uses the same trace ID
-     await logger.debug('Processing order payment', {
-       amount: paymentData.amount,
-       method: paymentData.method,
-       operation: 'payment_processing'
-     });
-
-     // Payment processing logic
-     return { success: true, transactionId: 'txn_xyz789' };
-   }
- }
-
- // Cross-service trace ID propagation
- class NotificationService {
-   async sendOrderConfirmation(orderId: string, customerEmail: string) {
-     // Inherit the trace ID from the calling context or create a new one
-     const existingTraceId = getCurrentTraceId();
-
-     if (existingTraceId) {
-       // Continue with the existing trace
-       await logger.info('Sending order confirmation', {
-         orderId,
-         customerEmail,
-         notificationType: 'order_confirmation',
-         inheritedTrace: true
-       });
-     } else {
-       // Create a new trace for a standalone notification
-       await runWithTraceId(`notification_${orderId}_${Date.now()}`, async () => {
-         await logger.info('Sending standalone notification', {
-           orderId,
-           customerEmail,
-           notificationType: 'order_confirmation',
-           newTrace: true
-         });
-       });
-     }
-   }
- }
-
- // Example usage in an Express route
- app.post('/orders', async (req, res) => {
-   // Extract the trace ID from request headers or generate a new one
-   const incomingTraceId = req.headers['x-trace-id'] ||
-                           req.headers['x-request-id'] ||
-                           `api_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
-
-   await runWithTraceId(incomingTraceId, async () => {
-     await logger.info('API request received', {
-       endpoint: '/orders',
-       method: 'POST',
-       clientIP: req.ip,
-       userAgent: req.get('User-Agent')
-     });
-
-     try {
-       const orderService = new OrderService();
-       const result = await orderService.createOrder(req.body);
-
-       await logger.info('API request completed successfully', {
-         endpoint: '/orders',
-         orderId: result.orderId,
-         responseStatus: 201
-       });
-
-       res.status(201).json({
-         ...result,
-         traceId: getCurrentTraceId() // Return the trace ID to the client
-       });
-     } catch (error) {
-       await logger.error('API request failed', {
-         endpoint: '/orders',
-         errorMessage: error.message,
-         responseStatus: 500
-       });
-
-       res.status(500).json({
-         error: 'Order creation failed',
-         traceId: getCurrentTraceId()
-       });
-     }
-   });
- });
-
- // All logs within the trace context will include the trace ID:
- // {
- //   level: 'info',
- //   message: 'Payment processed successfully',
- //   traceId: 'payment_ORD123_1642234567890',
- //   transactionId: 'txn_abc123',
- //   processingTime: '1.2s',
- //   gatewayResponse: 'approved',
- //   appName: 'DistributedApplication',
- //   environment: 'production',
- //   timestamp: '2024-01-15T10:30:15.123Z'
- // }
- ```
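APIs in the style of `runWithTraceId`/`getCurrentTraceId` are typically built on Node's `AsyncLocalStorage`, which scopes a value to everything that runs inside a callback, including awaited calls. A self-contained sketch of that pattern (an assumed mechanism, not necessarily logixia's exact code):

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';
import { randomUUID } from 'node:crypto';

// Standalone sketch of trace-ID propagation (illustrative only).
const storage = new AsyncLocalStorage<{ traceId: string }>();

// Runs fn with the given trace ID (or a generated one) in scope.
function runWithTrace<T>(traceId: string | undefined, fn: () => T): T {
  return storage.run({ traceId: traceId ?? randomUUID() }, fn);
}

// Returns the trace ID of the current async context, if any.
function currentTraceId(): string | undefined {
  return storage.getStore()?.traceId;
}

// Every call in the same async context sees the same trace ID.
runWithTrace('trace_123', () => {
  console.log(currentTraceId()); // 'trace_123'
});
console.log(currentTraceId()); // undefined outside the context
```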
1876
-
- ## Custom Formatters
-
- Create custom log formatters for specialized output requirements:
+ interface LoggerConfig {
+   // Required
+   appName: string;
+   environment: string;
 
- ```typescript
- import { ILogFormatter, LogEntry } from 'logixia';
-
- // Production-ready JSON formatter with structured output
- class StructuredJSONFormatter implements ILogFormatter {
-   format(entry: LogEntry): string {
-     const structured = {
-       '@timestamp': entry.timestamp,
-       '@version': '1',
-       level: entry.level.toUpperCase(),
-       logger_name: entry.context || 'root',
-       message: entry.message,
-       application: {
-         name: entry.appName,
-         environment: entry.environment,
-         version: entry.version
-       },
-       trace: {
-         id: entry.traceId
-       },
-       metadata: entry.payload || {},
-       host: {
-         name: require('os').hostname(),
-         platform: process.platform,
-         arch: process.arch
-       },
-       process: {
-         pid: process.pid,
-         memory_usage: process.memoryUsage(),
-         uptime: process.uptime()
-       }
-     };
-
-     return JSON.stringify(structured);
-   }
- }
+   // Optional — general
+   silent?: boolean; // Suppress all output (useful in tests)
 
- // Human-readable console formatter with colors and alignment
- class EnhancedConsoleFormatter implements ILogFormatter {
-   private colors = {
-     error: '\x1b[31m', // Red
-     warn: '\x1b[33m',  // Yellow
-     info: '\x1b[36m',  // Cyan
-     debug: '\x1b[32m', // Green
-     trace: '\x1b[35m', // Magenta
-     reset: '\x1b[0m'   // Reset
+   levelOptions?: {
+     level?: 'trace' | 'debug' | 'info' | 'warn' | 'error' | 'fatal';
+     customLevels?: Record<string, { priority: number; color: string }>;
+     namespaces?: Record<string, string>; // Per-namespace level overrides
    };
-
-   format(entry: LogEntry): string {
-     const timestamp = new Date(entry.timestamp).toISOString();
-     const level = entry.level.toUpperCase().padEnd(5);
-     const context = entry.context ? `[${entry.context}]` : '';
-     const traceId = entry.traceId ? `{${entry.traceId.slice(-8)}}` : '';
-     const color = this.colors[entry.level] || this.colors.reset;
-
-     let formatted = `${timestamp} ${color}${level}${this.colors.reset} ${context}${traceId} ${entry.message}`;
-
-     if (entry.payload && Object.keys(entry.payload).length > 0) {
-       const payloadStr = JSON.stringify(entry.payload, null, 2)
-         .split('\n')
-         .map(line => `  ${line}`)
-         .join('\n');
-       formatted += `\n${payloadStr}`;
-     }
-
-     return formatted;
-   }
- }
 
- // Metrics-focused formatter for performance monitoring
- class MetricsFormatter implements ILogFormatter {
-   format(entry: LogEntry): string {
-     if (entry.payload?.duration || entry.payload?.timeTaken) {
-       const metrics = {
-         timestamp: entry.timestamp,
-         metric_type: 'performance',
-         operation: entry.context || 'unknown',
-         duration_ms: entry.payload.duration || entry.payload.timeTaken,
-         trace_id: entry.traceId,
-         service: entry.appName,
-         environment: entry.environment,
-         additional_data: { ...entry.payload }
-       };
-
-       delete metrics.additional_data.duration;
-       delete metrics.additional_data.timeTaken;
-
-       return JSON.stringify(metrics);
-     }
-
-     // For non-performance logs, use a standard format
-     return JSON.stringify({
-       timestamp: entry.timestamp,
-       level: entry.level,
-       message: entry.message,
-       context: entry.context,
-       trace_id: entry.traceId,
-       data: entry.payload
-     });
-   }
- }
-
- // Security audit formatter for compliance logging
- class SecurityAuditFormatter implements ILogFormatter {
-   format(entry: LogEntry): string {
-     const auditEntry = {
-       audit_timestamp: entry.timestamp,
-       event_type: entry.level,
-       event_description: entry.message,
-       actor: {
-         user_id: entry.payload?.userId,
-         session_id: entry.payload?.sessionId,
-         ip_address: entry.payload?.clientIP,
-         user_agent: entry.payload?.userAgent
-       },
-       resource: {
-         type: entry.payload?.resourceType,
-         id: entry.payload?.resourceId,
-         action: entry.payload?.action
-       },
-       outcome: {
-         success: entry.level !== 'error',
-         error_code: entry.payload?.errorCode,
-         error_message: entry.payload?.errorMessage
-       },
-       context: {
-         application: entry.appName,
-         environment: entry.environment,
-         trace_id: entry.traceId,
-         component: entry.context
-       },
-       compliance: {
-         retention_period: '7_years',
-         classification: entry.payload?.dataClassification || 'internal',
-         regulation: ['SOX', 'GDPR', 'HIPAA']
-       }
-     };
-
-     return JSON.stringify(auditEntry);
-   }
- }
-
- // Configure the logger with a different formatter per transport
- const logger = createLogger({
-   appName: 'EnterpriseApplication',
-   environment: 'production',
-   transports: {
-     console: {
-       level: 'info',
-       formatter: new EnhancedConsoleFormatter()
-     },
-     file: {
-       filename: './logs/application.log',
-       level: 'debug',
-       formatter: new StructuredJSONFormatter()
-     },
-     database: {
-       type: 'mongodb',
-       connectionString: process.env.MONGODB_URI,
-       database: 'application_logs',
-       collection: 'structured_logs',
-       formatter: new StructuredJSONFormatter()
-     }
-   },
-   // Separate transport for metrics
-   metricsTransport: {
-     file: {
-       filename: './logs/metrics.log',
-       level: 'info',
-       formatter: new MetricsFormatter()
-     }
-   },
-   // Separate transport for security audit logs
-   auditTransport: {
-     file: {
-       filename: './logs/security-audit.log',
-       level: 'info',
-       formatter: new SecurityAuditFormatter()
-     },
-     database: {
-       type: 'mongodb',
-       connectionString: process.env.AUDIT_DB_URI,
-       database: 'security_audit',
-       collection: 'audit_events',
-       formatter: new SecurityAuditFormatter()
-     }
-   }
- });
-
- // Usage examples with different formatters
- class UserAuthenticationService {
-   async authenticateUser(credentials: any, clientInfo: any) {
-     // This entry is formatted differently by each transport
-     await logger.info('User authentication attempt', {
-       userId: credentials.username,
-       clientIP: clientInfo.ip,
-       userAgent: clientInfo.userAgent,
-       authMethod: 'password',
-       resourceType: 'user_account',
-       resourceId: credentials.username,
-       action: 'authenticate',
-       dataClassification: 'confidential'
-     });
-
-     // Performance timing with the metrics formatter
-     const result = await logger.timeAsync('user-authentication', async () => {
-       // Authentication logic here
-       return { success: true, userId: 'user123', role: 'admin' };
-     });
-
-     await logger.info('User authentication successful', {
-       userId: result.userId,
-       userRole: result.role,
-       sessionId: 'sess_abc123',
-       resourceType: 'user_session',
-       resourceId: 'sess_abc123',
-       action: 'create_session',
-       dataClassification: 'confidential'
-     });
-
-     return result;
-   }
- }
-
- // Console output (EnhancedConsoleFormatter):
- // 2024-01-15T10:30:15.123Z INFO  [UserAuthenticationService]{abc123ef} User authentication attempt
- //   {
- //     "userId": "john.doe",
- //     "clientIP": "192.168.1.100",
- //     "userAgent": "Mozilla/5.0...",
- //     "authMethod": "password"
- //   }
-
- // Application log file (StructuredJSONFormatter):
- // {"@timestamp":"2024-01-15T10:30:15.123Z","@version":"1","level":"INFO","logger_name":"UserAuthenticationService","message":"User authentication attempt","application":{"name":"EnterpriseApplication","environment":"production"},"trace":{"id":"abc123ef"},"metadata":{"userId":"john.doe","clientIP":"192.168.1.100"}}
-
- // Security audit log (SecurityAuditFormatter):
- // {"audit_timestamp":"2024-01-15T10:30:15.123Z","event_type":"info","event_description":"User authentication attempt","actor":{"user_id":"john.doe","ip_address":"192.168.1.100"},"resource":{"type":"user_account","id":"john.doe","action":"authenticate"},"compliance":{"retention_period":"7_years","classification":"confidential"}}
- ```
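A custom formatter only needs a `format(entry)` method that returns a string. A minimal standalone example in the same spirit as the formatters above (the `Entry` shape here is a simplified stand-in for logixia's `LogEntry`, not its exact type):

```typescript
// Simplified stand-in for the LogEntry shape used by the formatters above.
interface Entry {
  timestamp: string;
  level: string;
  message: string;
  context?: string;
  payload?: Record<string, unknown>;
}

class KeyValueFormatter {
  // Renders `ts LEVEL [context] message key=value ...` on a single line.
  format(entry: Entry): string {
    const ctx = entry.context ? ` [${entry.context}]` : '';
    const kv = Object.entries(entry.payload ?? {})
      .map(([k, v]) => `${k}=${JSON.stringify(v)}`)
      .join(' ');
    return `${entry.timestamp} ${entry.level.toUpperCase()}${ctx} ${entry.message}${kv ? ' ' + kv : ''}`;
  }
}

const line = new KeyValueFormatter().format({
  timestamp: '2024-01-15T10:30:15.123Z',
  level: 'info',
  message: 'User login',
  context: 'Auth',
  payload: { userId: 'u1' },
});
// line is '2024-01-15T10:30:15.123Z INFO [Auth] User login userId="u1"'
```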
2120
-
2121
- ## Configuration Options
2122
-
2123
- Comprehensive configuration interface for all logger features and transport systems:
2124
-
2125
- ```typescript
2126
- interface LoggerConfig {
2127
- // Core application settings
2128
- appName: string;
2129
- environment?: 'development' | 'staging' | 'production' | string;
2130
- level?: LogLevel; // Global minimum log level
2131
-
2132
- // Global field configuration
2133
- fields?: Record<string, any>; // Fields included in all log entries
2134
-
2135
- // Trace ID configuration
2136
- traceId?: boolean | {
2137
- enabled: boolean;
2138
- generator?: () => string; // Custom trace ID generation
653
+ redaction?: {
654
+ paths: string[]; // Field paths or wildcards to redact
655
+ censor?: string; // Replacement string (default: '[REDACTED]')
2139
656
  };
2140
-
2141
- // Output formatting
2142
- format?: {
2143
- json?: boolean; // JSON vs text format
2144
- timestamp?: boolean | string; // Include timestamp, custom format
2145
- colorize?: boolean; // Console color output
2146
- prettyPrint?: boolean; // Pretty-printed JSON
2147
- includeStack?: boolean; // Include stack traces for errors
2148
- };
2149
-
2150
- // Transport configuration
2151
- transports?: {
2152
- console?: ConsoleTransportConfig;
2153
- file?: FileTransportConfig | FileTransportConfig[]; // Multiple file outputs
2154
- database?: DatabaseTransportConfig | DatabaseTransportConfig[]; // Multiple databases
2155
- http?: HttpTransportConfig; // HTTP endpoint logging
2156
- syslog?: SyslogTransportConfig; // System log integration
2157
- custom?: CustomTransportConfig[]; // Custom transport implementations
2158
- };
2159
-
2160
- // Performance and monitoring
2161
- performance?: {
2162
- enableTiming?: boolean; // Enable performance timing
2163
- enableMetrics?: boolean; // Enable metrics collection
2164
- metricsInterval?: number; // Metrics collection interval (ms)
2165
- slowOperationThreshold?: number; // Threshold for slow operation warnings (ms)
2166
- };
2167
-
2168
- // Batch processing configuration
2169
- batching?: {
657
+
658
+ gracefulShutdown?: {
2170
659
  enabled?: boolean;
2171
- batchSize?: number; // Number of logs per batch
2172
- flushInterval?: number; // Time interval for batch flushing (ms)
2173
- maxRetries?: number; // Retry attempts for failed batches
2174
- };
2175
-
2176
- // Error handling
2177
- errorHandling?: {
2178
- suppressErrors?: boolean; // Suppress logger internal errors
2179
- fallbackTransport?: 'console' | 'file'; // Fallback when primary transport fails
2180
- errorCallback?: (error: Error) => void; // Custom error handler
2181
- };
2182
-
2183
- // Security and compliance
2184
- security?: {
2185
- sanitizeFields?: string[]; // Fields to sanitize in logs
2186
- encryptFields?: string[]; // Fields to encrypt
2187
- auditMode?: boolean; // Enable audit logging
2188
- retentionPolicy?: {
2189
- days?: number;
2190
- maxSize?: string; // '100MB', '1GB', etc.
2191
- };
660
+ timeout?: number; // Max ms to wait for transports to flush
2192
661
  };
2193
-
2194
- // Legacy options for backward compatibility
2195
- silent?: boolean; // Disable all output
2196
- levelOptions?: {
2197
- level?: string; // Current log level
2198
- levels?: Record<string, number>; // Custom levels with priorities
2199
- colors?: Record<string, LogColor>; // Custom colors for levels
2200
- };
2201
- }
2202
662
 
2203
- // Console transport configuration
2204
- interface ConsoleTransportConfig {
2205
- level?: LogLevel;
2206
- format?: 'text' | 'json';
2207
- colorize?: boolean;
2208
- timestamp?: boolean;
2209
- formatter?: ILogFormatter;
2210
- silent?: boolean; // Disable console output
2211
- }
663
+ otel?: {
664
+ enabled?: boolean;
665
+ serviceName?: string;
666
+ propagate?: ('traceparent' | 'tracestate' | 'baggage')[];
667
+ };
2212
668
 
2213
- // File transport configuration
2214
- interface FileTransportConfig {
2215
- filename: string;
2216
- level?: LogLevel;
2217
- format?: 'text' | 'json';
2218
- formatter?: ILogFormatter;
2219
-
2220
- // File rotation settings
2221
- rotation?: {
2222
- interval?: '1h' | '6h' | '12h' | '1d' | '1w' | '1m'; // Time-based rotation
2223
- maxSize?: string; // Size-based rotation: '10MB', '100MB', '1GB'
2224
- maxFiles?: number; // Maximum number of rotated files to keep
2225
- compress?: boolean; // Compress rotated files
2226
- datePattern?: string; // Custom date pattern for file naming
669
+ // Transports (all optional, can be combined freely)
670
+ console?: {
671
+ colorize?: boolean;
672
+ timestamp?: boolean;
673
+ format?: 'text' | 'json';
2227
674
  };
2228
-
2229
- // File handling options
2230
- options?: {
2231
- flags?: string; // File system flags ('a', 'w', etc.)
2232
- mode?: number; // File permissions
2233
- encoding?: string; // File encoding
2234
- highWaterMark?: number; // Stream buffer size
675
+
676
+ file?: {
677
+ filename: string;
678
+ dirname: string;
679
+ maxSize?: string; // e.g. '50MB', '1GB'
680
+ maxFiles?: number;
681
+ zippedArchive?: boolean;
682
+ format?: 'text' | 'json';
2235
683
  };
2236
- }
2237
684
 
2238
- // Database transport configuration
2239
- interface DatabaseTransportConfig {
2240
- type: 'mongodb' | 'postgresql' | 'mysql' | 'sqlite' | 'redis';
2241
- connectionString?: string;
2242
-
2243
- // Connection options
2244
- connection?: {
685
+ database?: {
686
+ type: 'postgresql' | 'mysql' | 'mongodb' | 'sqlite';
687
+ // PostgreSQL / MySQL
2245
688
  host?: string;
2246
689
  port?: number;
2247
690
  database?: string;
691
+ table?: string;
2248
692
  username?: string;
2249
693
  password?: string;
2250
- ssl?: boolean;
2251
- poolSize?: number;
2252
- timeout?: number;
2253
- };
2254
-
2255
- // Database-specific settings
2256
- mongodb?: {
2257
- collection: string;
2258
- capped?: boolean; // Capped collection
2259
- cappedSize?: number; // Capped collection size
2260
- indexes?: string[]; // Fields to index
2261
- };
2262
-
2263
- postgresql?: {
2264
- table: string;
2265
- schema?: string;
2266
- createTable?: boolean; // Auto-create table
2267
- columns?: Record<string, string>; // Custom column definitions
2268
- };
2269
-
2270
- mysql?: {
2271
- table: string;
2272
- database?: string;
2273
- createTable?: boolean;
2274
- engine?: 'InnoDB' | 'MyISAM';
2275
- };
2276
-
2277
- sqlite?: {
2278
- filename: string;
2279
- table: string;
2280
- createTable?: boolean;
2281
- };
2282
-
2283
- redis?: {
2284
- key: string; // Redis key for log storage
2285
- listType?: 'list' | 'stream'; // Storage type
2286
- maxLength?: number; // Maximum list/stream length
2287
- ttl?: number; // Time to live (seconds)
2288
- };
2289
-
2290
- level?: LogLevel;
2291
- formatter?: ILogFormatter;
2292
-
2293
- // Batch processing for database writes
2294
- batching?: {
2295
- enabled?: boolean;
694
+ // MongoDB
695
+ connectionString?: string;
696
+ collection?: string;
697
+ // SQLite
698
+ filename?: string;
699
+ // Batching
2296
700
  batchSize?: number;
2297
- flushInterval?: number;
701
+ flushInterval?: number; // ms
2298
702
  };
2299
- }
2300
703
 
- // HTTP transport configuration
- interface HttpTransportConfig {
-   url: string;
-   method?: 'POST' | 'PUT' | 'PATCH';
-   headers?: Record<string, string>;
-   level?: LogLevel;
-   formatter?: ILogFormatter;
-
-   // HTTP-specific options
-   options?: {
-     timeout?: number;
-     retries?: number;
-     retryDelay?: number;
-     auth?: {
-       username: string;
-       password: string;
-     } | {
-       bearer: string;
-     };
-   };
-
-   // Batch processing for HTTP requests
-   batching?: {
-     enabled?: boolean;
+   analytics?: {
+     endpoint: string;
+     apiKey?: string;
      batchSize?: number;
-     flushInterval?: number;
+     flushInterval?: number; // ms
    };
- }
-
- // Example comprehensive configuration
- const productionConfig: LoggerConfig = {
-   appName: 'EnterpriseApplication',
-   environment: 'production',
-   level: 'info',
-
-   fields: {
-     version: '2.1.4',
-     service: 'user-management-api',
-     region: process.env.AWS_REGION || 'us-east-1',
-     datacenter: 'aws-virginia',
-     buildNumber: process.env.BUILD_NUMBER,
-     deploymentId: process.env.DEPLOYMENT_ID
-   },
-
-   traceId: {
-     enabled: true,
-     header: 'x-trace-id',
-     generator: () => `trace_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`
-   },
-
-   format: {
-     json: true,
-     timestamp: true,
-     prettyPrint: false,
-     includeStack: true
-   },
-
-   transports: {
-     console: {
-       level: 'warn',
-       format: 'text',
-       colorize: false
-     },
-
-     file: [
-       {
-         filename: './logs/application.log',
-         level: 'info',
-         format: 'json',
-         rotation: {
-           interval: '1d',
-           maxFiles: 30,
-           compress: true
-         }
-       },
-       {
-         filename: './logs/error.log',
-         level: 'error',
-         format: 'json',
-         rotation: {
-           maxSize: '100MB',
-           maxFiles: 10
-         }
-       }
-     ],
-
-     database: [
-       {
-         type: 'mongodb',
-         connectionString: process.env.MONGODB_URI,
-         mongodb: {
-           collection: 'application_logs',
-           capped: true,
-           cappedSize: 1000000000, // 1GB
-           indexes: ['timestamp', 'level', 'traceId']
-         },
-         level: 'debug',
-         batching: {
-           enabled: true,
-           batchSize: 100,
-           flushInterval: 5000
-         }
-       }
-     ],
-
-     http: {
-       url: 'https://logs.example.com/api/logs',
-       method: 'POST',
-       headers: {
-         'Authorization': `Bearer ${process.env.LOG_API_TOKEN}`,
-         'Content-Type': 'application/json'
-       },
-       level: 'error',
-       options: {
-         timeout: 10000,
-         retries: 3,
-         retryDelay: 1000
-       },
-       batching: {
-         enabled: true,
-         batchSize: 50,
-         flushInterval: 10000
-       }
-     }
-   },
-
-   performance: {
-     enableTiming: true,
-     enableMetrics: true,
-     metricsInterval: 60000, // 1 minute
-     slowOperationThreshold: 1000 // 1 second
-   },
-
-   batching: {
-     enabled: true,
-     batchSize: 100,
-     flushInterval: 5000,
-     maxRetries: 3
-   },
-
-   errorHandling: {
-     suppressErrors: false,
-     fallbackTransport: 'console',
-     errorCallback: (error) => {
-       console.error('Logger error:', error);
-       // Send to monitoring service
-     }
-   },
-
-   security: {
-     sanitizeFields: ['password', 'creditCard', 'ssn'],
-     encryptFields: ['personalData', 'sensitiveInfo'],
-     auditMode: true,
-     retentionPolicy: {
-       days: 2555, // 7 years for compliance
-       maxSize: '10GB'
-     }
-   }
- };
 
- const logger = createLogger(productionConfig);
- ```
-
- ## API Reference
-
- ### Logger Creation
-
- #### createLogger(config: LoggerConfig): ILogger
-
- Creates a new logger instance with the specified configuration.
-
- ```typescript
- import { createLogger, LogLevel } from 'logixia';
-
- const logger = createLogger({
-   appName: 'MyApplication',
-   environment: 'production',
-   levelOptions: {
-     level: LogLevel.INFO,
-     colors: {
-       error: 'red',
-       warn: 'yellow',
-       info: 'green',
-       debug: 'blue'
-     }
-   },
-   transports: {
-     console: { level: 'info', colorize: true },
-     file: { filename: './logs/app.log', level: 'debug' },
-     database: { type: 'mongodb', connectionString: 'mongodb://localhost:27017' }
-   }
- });
- ```
-
- ### Core Logging Methods
-
- #### Standard Log Levels
-
- ```typescript
- // Error level - critical errors that require immediate attention
- await logger.error(message: string | Error, context?: Record<string, any>): Promise<void>
-
- // Warning level - potentially harmful situations
- await logger.warn(message: string, context?: Record<string, any>): Promise<void>
-
- // Info level - general application flow information
- await logger.info(message: string, context?: Record<string, any>): Promise<void>
-
- // Debug level - detailed diagnostic information
- await logger.debug(message: string, context?: Record<string, any>): Promise<void>
-
- // Trace level - most detailed diagnostic information
- await logger.trace(message: string, context?: Record<string, any>): Promise<void>
-
- // Verbose level - extremely detailed diagnostic information
- await logger.verbose(message: string, context?: Record<string, any>): Promise<void>
- ```
-
- #### Custom Level Logging
-
- ```typescript
- // Log with custom level
- await logger.logLevel(level: string, message: string, context?: Record<string, any>): Promise<void>
-
- // Example with custom business levels
- await logger.logLevel('order', 'Order processing started', { orderId: '12345' });
- await logger.logLevel('payment', 'Payment processed', { amount: 99.99, method: 'card' });
- ```
-
- ### Performance Monitoring
-
- #### Timing Operations
-
- ```typescript
- // Start a timer
- logger.time(label: string): void
-
- // End timer and return duration in milliseconds
- await logger.timeEnd(label: string): Promise<number | undefined>
-
- // Automatic timing wrapper for async operations
- await logger.timeAsync<T>(label: string, operation: () => Promise<T>, context?: Record<string, any>): Promise<T>
- ```
-
- **Usage Examples:**
-
- ```typescript
- // Manual timing
- logger.time('database-query');
- const users = await database.findUsers();
- const duration = await logger.timeEnd('database-query');
-
- // Automatic timing with context
- const result = await logger.timeAsync('api-call', async () => {
-   const response = await fetch('/api/users');
-   return response.json();
- }, { endpoint: '/api/users', method: 'GET' });
- ```
-
- ### Context Management
-
- #### Logger Context
-
- ```typescript
- // Set context for all subsequent log entries
- logger.setContext(context: string): void
-
- // Get current context
- logger.getContext(): string | undefined
-
- // Set minimum log level
- logger.setLevel(level: string): void
-
- // Get current log level
- logger.getLevel(): string
- ```
-
- ### Field Management
-
- #### Dynamic Field Control
-
- ```typescript
- // Enable specific fields for inclusion in logs
- await logger.enableField(fieldName: string | string[]): Promise<void>
-
- // Disable specific fields from logs
- await logger.disableField(fieldName: string | string[]): Promise<void>
-
- // Check if a field is currently enabled
- logger.isFieldEnabled(fieldName: string): boolean
-
- // Get current field state
- logger.getFieldState(): Record<string, boolean>
-
- // Reset field state to default (all fields enabled)
- await logger.resetFieldState(): Promise<void>
- ```
-
- **Usage Examples:**
-
- ```typescript
- // Enable multiple fields
- await logger.enableField(['userId', 'requestId', 'sessionId']);
-
- // Disable sensitive fields
- await logger.disableField(['email', 'phoneNumber']);
-
- // Check field status
- if (logger.isFieldEnabled('userId')) {
-   // Field is enabled
- }
-
- // Get all field states
- const fieldStates = logger.getFieldState();
- console.log(fieldStates); // { userId: true, email: false, ... }
- ```
-
- ### Transport Level Selection
-
- #### Transport-Specific Level Configuration
-
- ```typescript
- // Set log levels for specific transports
- await logger.setTransportLevels(levels: Record<string, string>): Promise<void>
-
- // Get current transport level configuration
- logger.getTransportLevels(): Record<string, string>
-
- // Get available transport identifiers
- logger.getAvailableTransports(): string[]
-
- // Enable interactive transport level prompting
- await logger.enableTransportLevelPrompting(): Promise<void>
-
- // Disable interactive transport level prompting
- await logger.disableTransportLevelPrompting(): Promise<void>
-
- // Clear all transport level preferences
- await logger.clearTransportLevelPreferences(): Promise<void>
- ```
-
- **Usage Examples:**
-
- ```typescript
- // Configure different levels for each transport
- await logger.setTransportLevels({
-   'console': 'warn',
-   'file-0': 'debug',
-   'database': 'info'
- });
-
- // Get current configuration
- const levels = logger.getTransportLevels();
- console.log(levels); // { console: 'warn', 'file-0': 'debug', database: 'info' }
-
- // List available transports
- const transports = logger.getAvailableTransports();
- console.log(transports); // ['console', 'file-0', 'database']
-
- // Enable interactive configuration
- await logger.enableTransportLevelPrompting();
- ```
-
- #### Child Loggers
-
- ```typescript
- // Create child logger with additional context
- logger.child(context: string, persistentData?: Record<string, any>): ILogger
- ```
-
- **Usage Example:**
-
- ```typescript
- const userLogger = logger.child('UserService', { module: 'authentication' });
- const operationLogger = userLogger.child('LoginOperation', { sessionId: 'sess_123' });
-
- // All logs from operationLogger will include both contexts
- await operationLogger.info('User login attempt', { userId: 'user_456' });
- ```
-
- ### Batch Processing
-
- #### Manual Batch Management
-
- ```typescript
- // Add entry to batch queue
- logger.addToBatch(entry: {
-   level: string;
-   message: string;
-   context?: Record<string, any>;
- }): void
-
- // Manually flush all batched entries
- await logger.flushBatch(): Promise<void>
-
- // Get current batch size
- logger.getBatchSize(): number
- ```
-
- ### Transport Management
-
- #### Health Monitoring
-
- ```typescript
- // Check health status of all configured transports
- await logger.checkTransportHealth(): Promise<Record<string, TransportHealthStatus>>
- ```
-
- **Response Format:**
-
- ```typescript
- interface TransportHealthStatus {
-   status: 'healthy' | 'degraded' | 'unhealthy';
-   latency: number; // Response time in milliseconds
-   error?: string; // Error message if unhealthy
-   lastCheck: Date; // Timestamp of last health check
- }
- ```
-
- ### Resource Management
-
- #### Cleanup Operations
-
- ```typescript
- // Gracefully close logger and all transports
- await logger.close(): Promise<void>
-
- // Force flush all pending operations
- await logger.flush(): Promise<void>
- ```
-
- ### NestJS Integration
-
- #### LogixiaLoggerService Methods
-
- ```typescript
- // Standard NestJS LoggerService interface
- log(message: any, context?: string): void
- error(message: any, trace?: string, context?: string): void
- warn(message: any, context?: string): void
- debug(message: any, context?: string): void
- verbose(message: any, context?: string): void
-
- // Extended Logixia methods (async versions)
- await info(message: string, context?: Record<string, any>): Promise<void>
- await trace(message: string, context?: Record<string, any>): Promise<void>
-
- // Trace ID management
- getCurrentTraceId(): string | undefined
-
- // Child logger creation
- child(context: string, persistentData?: Record<string, any>): LogixiaLoggerService
-
- // Context management
- setContext(context: string): void
- getContext(): string | undefined
- ```
-
- ### Trace ID Management
-
- #### Global Trace Functions
-
- ```typescript
- // Run operation with specific trace ID
- runWithTraceId(traceId: string, operation: () => Promise<void>): Promise<void>
- runWithTraceId(operation: () => Promise<void>): Promise<void> // Auto-generate ID
-
- // Get current trace ID from async context
- getCurrentTraceId(): string | undefined
-
- // Express middleware for automatic trace ID extraction
- traceMiddleware(options: TraceMiddlewareOptions): express.RequestHandler
- ```
-
- **TraceMiddlewareOptions:**
-
- ```typescript
- interface TraceMiddlewareOptions {
-   enabled: boolean;
-   extractor: {
-     header?: string[]; // Header names to check for trace ID
-     query?: string[]; // Query parameter names to check
-     body?: string[]; // Body field names to check
-   };
-   generator?: () => string; // Custom trace ID generator
+   transports?: ITransport[]; // Additional custom transports
  }
  ```
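For orientation, the `database` and `analytics` options added above can be sketched as plain objects. This is only a sketch: the interfaces below mirror the `+` lines in this diff rather than the library's exported declarations, and every host, credential, and endpoint value is illustrative.

```typescript
// Local approximation of the 1.1.x config additions shown above.
// The real exported types in logixia's .d.ts may differ in name/detail.
interface DatabaseOptions {
  type: 'postgresql' | 'mysql' | 'mongodb' | 'sqlite';
  host?: string;             // PostgreSQL / MySQL
  port?: number;
  database?: string;
  table?: string;
  username?: string;
  password?: string;
  connectionString?: string; // MongoDB
  collection?: string;
  filename?: string;         // SQLite
  batchSize?: number;        // Batching
  flushInterval?: number;    // ms
}

interface AnalyticsOptions {
  endpoint: string;
  apiKey?: string;
  batchSize?: number;
  flushInterval?: number; // ms
}

// Illustrative values only.
const database: DatabaseOptions = {
  type: 'postgresql',
  host: 'localhost',
  port: 5432,
  database: 'app',
  table: 'logs',
  batchSize: 100,
  flushInterval: 5000, // flush queued rows every 5 s
};

const analytics: AnalyticsOptions = {
  endpoint: 'https://analytics.example.com/ingest',
  batchSize: 50,
  flushInterval: 10000,
};

console.log(database.type, analytics.endpoint);
```

Both blocks share the same batching knobs (`batchSize`, `flushInterval`), so writes can be buffered instead of hitting the sink once per log call.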
 
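The new `transports?: ITransport[]` field accepts user-supplied transports. Below is a minimal in-memory transport sketch; the `ITransport` shape used here (a `write` method plus optional `flush`/`close`) is an assumption for illustration, so check the package's type declarations for the actual contract.

```typescript
// Assumed transport contract -- not copied from logixia's .d.ts.
interface LogEntry {
  level: string;
  message: string;
  context?: Record<string, unknown>;
}

interface ITransport {
  write(entry: LogEntry): Promise<void> | void;
  flush?(): Promise<void>;
  close?(): Promise<void>;
}

// Buffers entries and "flushes" them in batches; useful as a test
// double or as a template for a real sink (HTTP, queue, database).
class MemoryTransport implements ITransport {
  readonly entries: LogEntry[] = [];
  private pending: LogEntry[] = [];

  constructor(private batchSize = 10) {}

  write(entry: LogEntry): void {
    this.pending.push(entry);
    if (this.pending.length >= this.batchSize) {
      this.drain();
    }
  }

  async flush(): Promise<void> {
    this.drain();
  }

  private drain(): void {
    this.entries.push(...this.pending);
    this.pending = [];
  }
}

const transport = new MemoryTransport(2);
transport.write({ level: 'info', message: 'first' });
transport.write({ level: 'warn', message: 'second' }); // reaches batchSize, drains
console.log(transport.entries.length); // 2
```

A real implementation would ship the drained batch to its sink in `drain()` and call `flush()` from a timer keyed to `flushInterval`.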
- ## Examples
-
- The `/examples` directory contains comprehensive usage demonstrations:
-
- ### Available Examples
-
- - **Basic Usage** (`examples/basic-usage.ts`) - Fundamental logging operations and setup
- - **Advanced Logging** (`examples/advanced-logging.ts`) - Multi-transport configuration with database integration
- - **Custom Levels** (`examples/custom-levels.ts`) - Business-specific log levels and custom priorities
- - **NestJS Integration** (`examples/nestjs-example.ts`) - Complete NestJS module integration
- - **Express Integration** (`examples/express-example.ts`) - Express middleware and request tracking
- - **Performance Monitoring** (`examples/performance-monitoring.ts`) - Timing utilities and performance metrics
- - **Field Configuration** (`examples/field-configuration.ts`) - Custom field formatting and inclusion
- - **Field and Transport Management** (`examples/field-and-transport-management.ts`) - Dynamic field control and transport-specific log levels
- - **Database Transport** (`examples/database-transport.ts`) - Database-specific transport configurations
- - **Log Rotation** (`examples/log-rotation.ts`) - File rotation and retention policies
-
- ### Running Examples
-
- ```bash
- # Execute basic usage demonstration
- npm run dev:basic-usage
-
- # Run advanced multi-transport example
- npm run dev:advanced-logging
-
- # Test custom business log levels
- npm run dev:custom-levels
-
- # Demonstrate NestJS integration
- npm run dev:nestjs
-
- # Show Express middleware usage
- npm run dev:express
-
- # Performance monitoring examples
- npm run dev:performance
-
- # Field configuration demonstration
- npm run dev:fields
-
- # Field and transport management demonstration
- npx ts-node examples/field-and-transport-management.ts
-
- # Interactive field and transport management
- npx ts-node examples/field-and-transport-management.ts --interactive
-
- # Database transport examples
- npm run dev:database
-
- # Log rotation demonstration
- npm run dev:rotation
- ```
-
- ## Development
-
- ### Setup and Build
-
- ```bash
- # Install project dependencies
- npm install
-
- # Build TypeScript source
- npm run build
-
- # Build with watch mode for development
- npm run build:watch
-
- # Clean build artifacts
- npm run clean
- ```
-
- ### Testing and Quality Assurance
-
- ```bash
- # Execute test suite
- npm test
-
- # Run tests with coverage report
- npm run test:coverage
-
- # Execute tests in watch mode
- npm run test:watch
-
- # Run ESLint code analysis
- npm run lint
-
- # Fix automatically correctable linting issues
- npm run lint:fix
-
- # Format code with Prettier
- npm run format
-
- # Validate code formatting
- npm run format:check
- ```
-
- ### Documentation
-
- ```bash
- # Generate API documentation
- npm run docs:generate
-
- # Serve documentation locally
- npm run docs:serve
-
- # Validate documentation completeness
- npm run docs:validate
- ```
-
- ## System Requirements
-
- ### Runtime Requirements
-
- - **Node.js**: Version 16.0.0 or higher
- - **Operating System**: Cross-platform (Windows, macOS, Linux)
- - **Memory**: Minimum 512MB available RAM
-
- ### Development Requirements
-
- - **TypeScript**: Version 5.0.0 or higher
- - **npm**: Version 8.0.0 or higher (or equivalent package manager)
- - **Git**: Version 2.20.0 or higher
-
- ### Optional Database Dependencies
-
- - **MongoDB**: Version 4.4 or higher (for MongoDB transport)
- - **PostgreSQL**: Version 12 or higher (for PostgreSQL transport)
- - **MySQL**: Version 8.0 or higher (for MySQL transport)
- - **SQLite**: Version 3.35 or higher (for SQLite transport)
+ ---
 
  ## Contributing
 
- 🚀 **We're building the world's most advanced TypeScript logging library!** 🚀
-
- Logixia is an **open source project** and we welcome contributions from developers worldwide. Whether you're fixing bugs, adding features, improving documentation, or sharing ideas, your contribution helps make Logixia better for everyone.
-
- ### Why Contribute?
-
- - 🌟 **Impact**: Help shape the future of logging in TypeScript/Node.js ecosystem
- - 🎯 **Learning**: Work with cutting-edge TypeScript patterns and enterprise architecture
- - 🤝 **Community**: Join a growing community of passionate developers
- - 📈 **Recognition**: Get recognized for your contributions in our contributors hall of fame
-
- ### Quick Start for Contributors
-
  ```bash
- # 1. Fork and clone the repository
  git clone https://github.com/Logixia/logixia.git
  cd logixia
-
- # 2. Install dependencies
  npm install
-
- # 3. Run tests to ensure everything works
  npm test
-
- # 4. Start developing!
- npm run build:watch
  ```
 
- ### Ways to Contribute
-
- #### 🐛 **Bug Reports & Fixes**
- - Found a bug? [Open an issue](https://github.com/Logixia/logixia/issues/new)
- - Want to fix it? Submit a pull request!
-
- #### ✨ **New Features**
- - **Transport Integrations**: Add support for new logging services (Elasticsearch, Splunk, etc.)
- - **Performance Optimizations**: Help us make Logixia even faster
- - **Developer Tools**: Build tools that make Logixia easier to use
-
- #### 📚 **Documentation & Examples**
- - Improve existing documentation
- - Create tutorials and guides
- - Add real-world examples
- - Translate documentation
-
- #### 🧪 **Testing & Quality**
- - Add test cases
- - Improve test coverage
- - Performance benchmarking
- - Security auditing
-
- ### Contribution Guidelines
-
- Please read our detailed [CONTRIBUTING.md](CONTRIBUTING.md) for:
-
- - 📋 **Development setup** and workflow
- - 🎨 **Code style** and standards
- - 🧪 **Testing** requirements
- - 📝 **Documentation** guidelines
- - 🔄 **Pull request** process
-
- ### Recognition
-
- All contributors are recognized in:
- - 🏆 **Contributors section** below
- - 📦 **Package.json** contributors field
- - 🎉 **Release notes** for significant contributions
- - 💫 **Special mentions** in our community channels
+ Pull requests are welcome. For significant changes, please open an issue first to discuss what you'd like to change.
 
- ### Community
-
- - 💬 **Discussions**: [GitHub Discussions](https://github.com/Logixia/logixia/discussions)
- - 🐛 **Issues**: [GitHub Issues](https://github.com/Logixia/logixia/issues)
- - 📧 **Email**: logixia@example.com
- - 🐦 **Twitter**: [@LogixiaJS](https://twitter.com/LogixiaJS)
-
- ### Contributors
-
- Thanks to all our amazing contributors! 🙏
-
- <!-- Contributors will be automatically added here -->
-
- ### Hacktoberfest
-
- 🎃 **Hacktoberfest participants welcome!** We participate in Hacktoberfest and have issues labeled `hacktoberfest` for easy contribution.
+ ---
 
  ## License
 
- 📄 **Open Source & Free Forever**
-
- Logixia is proudly **open source** and licensed under the [MIT License](LICENSE). This means:
-
- ✅ **Free to use** - Commercial and personal projects
- ✅ **Free to modify** - Customize to your needs
- ✅ **Free to distribute** - Share with your team
- ✅ **No attribution required** - Though we appreciate it!
-
- ### What this means for you:
-
- - 🏢 **Enterprise-friendly**: Use in commercial applications without licensing fees
- - 🔧 **Modification rights**: Fork, modify, and customize as needed
- - 📦 **Distribution rights**: Include in your own packages and applications
- - 🤝 **Community-driven**: Benefit from community contributions and improvements
-
- ### MIT License Summary
-
- ```
- MIT License
-
- Copyright (c) 2025 Logixia Contributors
-
- Permission is hereby granted, free of charge, to any person obtaining a copy
- of this software and associated documentation files (the "Software"), to deal
- in the Software without restriction...
- ```
-
- See the complete [LICENSE](LICENSE) file for full terms and conditions.
-
- ### Third-Party Licenses
-
- Logixia respects all third-party licenses. See [THIRD-PARTY-NOTICES](THIRD-PARTY-NOTICES.md) for details about dependencies and their licenses.
-
- ## Acknowledgments
-
- ### Technical Foundation
-
- - **TypeScript**: Leveraging advanced type system for enhanced developer experience
- - **Node.js**: Built on the robust Node.js runtime environment
- - **Modern JavaScript**: Utilizing latest ECMAScript features and best practices
-
- ### Design Philosophy
-
- - **Enterprise-Ready**: Designed for production environments and scalable applications
- - **Developer Experience**: Prioritizing intuitive APIs and comprehensive documentation
- - **Performance**: Optimized for high-throughput logging scenarios
- - **Extensibility**: Architected for easy customization and extension
-
- ### Community
-
- Built for and by the TypeScript and Node.js development community, with a focus on modern logging requirements and enterprise-grade reliability.
+ [MIT](https://opensource.org/licenses/MIT) © [Sanjeev Sharma](https://github.com/webcoderspeed)