@drarzter/kafka-client 0.3.0 → 0.5.0

package/README.md CHANGED
@@ -4,11 +4,11 @@
  [![CI](https://github.com/drarzter/kafka-client/actions/workflows/publish.yml/badge.svg)](https://github.com/drarzter/kafka-client/actions/workflows/publish.yml)
  [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
 
- Type-safe Kafka client for Node.js. Framework-agnostic core with a first-class NestJS adapter. Built on top of [kafkajs](https://kafka.js.org/).
+ Type-safe Kafka client for Node.js. Framework-agnostic core with a first-class NestJS adapter. Built on top of [`@confluentinc/kafka-javascript`](https://github.com/confluentinc/confluent-kafka-javascript) (librdkafka).
 
  ## What is this?
 
- An opinionated, type-safe abstraction over kafkajs. Works standalone (Express, Fastify, raw Node) or as a NestJS DynamicModule. Not a full-featured framework — just a clean, typed layer for producing and consuming Kafka messages.
+ An opinionated, type-safe abstraction over `@confluentinc/kafka-javascript` (librdkafka). Works standalone (Express, Fastify, raw Node) or as a NestJS DynamicModule. Not a full-featured framework — just a clean, typed layer for producing and consuming Kafka messages.
 
  ## Why?
 
@@ -22,7 +22,11 @@ An opinionated, type-safe abstraction over kafkajs. Works standalone (Express, F
  - **Partition key support** — route related messages to the same partition
  - **Custom headers** — attach metadata headers to messages
  - **Transactions** — exactly-once semantics with `producer.transaction()`
- - **Consumer interceptors** — before/after/onError hooks for message processing
+ - **EventEnvelope** — every consumed message is wrapped in `EventEnvelope<T>` with `eventId`, `correlationId`, `timestamp`, `schemaVersion`, `traceparent`, and Kafka metadata
+ - **Correlation ID propagation** — auto-generated on send, auto-propagated through `AsyncLocalStorage` so nested sends inherit the same correlation ID
+ - **OpenTelemetry support** — `@drarzter/kafka-client/otel` entrypoint with `otelInstrumentation()` for W3C Trace Context propagation
+ - **Consumer interceptors** — before/after/onError hooks with `EventEnvelope` access
+ - **Client-wide instrumentation** — `KafkaInstrumentation` hooks for cross-cutting concerns (tracing, metrics)
  - **Auto-create topics** — `autoCreateTopics: true` for dev mode — no need to pre-create topics
  - **Error classes** — `KafkaProcessingError` and `KafkaRetryExhaustedError` with topic, message, and attempt metadata
  - **Health check** — built-in health indicator for monitoring
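
The `EventEnvelope` bullet above implies a shape roughly like the following sketch. It is inferred from this README only: field types, optionality, and the exact Kafka metadata fields are assumptions, not the library's published interface.

```typescript
// Sketch of the envelope shape implied by the feature list.
// Types and optionality are assumptions inferred from this README.
interface EventEnvelope<T> {
  payload: T;             // the parsed (and, with schemas, validated) message body
  eventId: string;        // auto-generated UUID v4
  correlationId: string;  // inherited from AsyncLocalStorage context, or newly generated
  timestamp: string;      // producer/broker timestamp (exact type assumed)
  schemaVersion: number;  // defaults to 1
  traceparent?: string;   // W3C Trace Context header when OTel is wired up
  topic: string;          // Kafka metadata, used throughout this README
  partition: number;      // assumed
  offset: string;         // assumed; passed to meta.resolveOffset() in batch mode
}

const example: EventEnvelope<{ orderId: string }> = {
  payload: { orderId: '123' },
  eventId: 'evt-1',
  correlationId: 'corr-1',
  timestamp: '1700000000000',
  schemaVersion: 1,
  topic: 'order.created',
  partition: 0,
  offset: '42',
};
console.log(example.payload.orderId);
```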
@@ -37,6 +41,21 @@ See the [Roadmap](./ROADMAP.md) for upcoming features and version history.
  npm install @drarzter/kafka-client
  ```
 
+ `@confluentinc/kafka-javascript` uses a native librdkafka addon. On most systems it builds automatically. For faster installs (skips compilation), install the system library first:
+
+ ```bash
+ # Arch / CachyOS
+ sudo pacman -S librdkafka
+
+ # Debian / Ubuntu
+ sudo apt-get install librdkafka-dev
+
+ # macOS
+ brew install librdkafka
+ ```
+
+ Then install with `BUILD_LIBRDKAFKA=0 npm install`.
+
  For NestJS projects, install peer dependencies: `@nestjs/common`, `@nestjs/core`, `reflect-metadata`, `rxjs`.
 
  For standalone usage (Express, Fastify, raw Node), no extra dependencies needed — import from `@drarzter/kafka-client/core`.
@@ -54,9 +73,9 @@ await kafka.connectProducer();
  // Send
  await kafka.sendMessage(OrderCreated, { orderId: '123', amount: 100 });
 
- // Consume
- await kafka.startConsumer([OrderCreated], async (message, topic) => {
- console.log(`${topic}:`, message.orderId);
+ // Consume — handler receives an EventEnvelope
+ await kafka.startConsumer([OrderCreated], async (envelope) => {
+ console.log(`${envelope.topic}:`, envelope.payload.orderId);
  });
 
  // Custom logger (winston, pino, etc.)
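
The correlation-ID inheritance behind this envelope API rests on Node's `AsyncLocalStorage`. The following standalone sketch shows the mechanism itself; the names (`send`, `consume`) are hypothetical and not this library's internals.

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';
import { randomUUID } from 'node:crypto';

// Context store carrying the ambient correlation ID.
const als = new AsyncLocalStorage<{ correlationId: string }>();

// Hypothetical "send": reuse the ambient correlation ID if one
// exists, otherwise mint a fresh UUID.
function send(topic: string): string {
  return als.getStore()?.correlationId ?? randomUUID();
}

// Hypothetical "consume": run the handler inside a context that
// carries the incoming message's correlation ID.
function consume<T>(correlationId: string, handler: () => T): T {
  return als.run({ correlationId }, handler);
}

const incoming = randomUUID();
const nested = consume(incoming, () => send('order.shipped'));
console.log(nested === incoming); // prints true: the nested send inherited the ID
```

Any send issued inside a consumer handler therefore carries the consumer's correlation ID without the handler passing it explicitly.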
@@ -99,7 +118,7 @@ export class AppModule {}
  ```typescript
  // app.service.ts
  import { Injectable } from '@nestjs/common';
- import { InjectKafkaClient, KafkaClient, SubscribeTo } from '@drarzter/kafka-client';
+ import { InjectKafkaClient, KafkaClient, SubscribeTo, EventEnvelope } from '@drarzter/kafka-client';
  import { MyTopics } from './types';
 
  @Injectable()
@@ -113,8 +132,8 @@ export class AppService {
  }
 
  @SubscribeTo('hello')
- async onHello(message: MyTopics['hello']) {
- console.log('Received:', message.text);
+ async onHello(envelope: EventEnvelope<MyTopics['hello']>) {
+ console.log('Received:', envelope.payload.text);
  }
  }
  ```
@@ -192,7 +211,7 @@ await kafka.transaction(async (tx) => {
 
  // Consuming (decorator)
  @SubscribeTo(OrderCreated)
- async handleOrder(message: OrdersTopicMap['order.created']) { ... }
+ async handleOrder(envelope: EventEnvelope<OrdersTopicMap['order.created']>) { ... }
 
  // Consuming (imperative)
  await kafka.startConsumer([OrderCreated], handler);
@@ -217,7 +236,7 @@ import { OrdersTopicMap } from './orders.types';
  export class OrdersModule {}
  ```
 
- `autoCreateTopics` calls `admin.createTopics()` (idempotent — no-op if topic already exists) before the first send/consume for each topic. Useful in development, not recommended for production.
+ `autoCreateTopics` calls `admin.createTopics()` (idempotent — no-op if topic already exists) before the first send **and** before each `startConsumer` / `startBatchConsumer` call. librdkafka errors on unknown topics at subscribe time, so consumer-side creation is required. Useful in development, not recommended for production.
 
  Or with `ConfigService`:
 
@@ -301,13 +320,13 @@ import { SubscribeTo } from '@drarzter/kafka-client';
  @Injectable()
  export class OrdersHandler {
  @SubscribeTo('order.created')
- async handleOrderCreated(message: OrdersTopicMap['order.created'], topic: string) {
- console.log('New order:', message.orderId);
+ async handleOrderCreated(envelope: EventEnvelope<OrdersTopicMap['order.created']>) {
+ console.log('New order:', envelope.payload.orderId);
  }
 
  @SubscribeTo('order.completed', { retry: { maxRetries: 3 }, dlq: true })
- async handleOrderCompleted(message: OrdersTopicMap['order.completed'], topic: string) {
- console.log('Order completed:', message.orderId);
+ async handleOrderCompleted(envelope: EventEnvelope<OrdersTopicMap['order.completed']>) {
+ console.log('Order completed:', envelope.payload.orderId);
  }
  }
  ```
@@ -327,8 +346,8 @@ export class OrdersService implements OnModuleInit {
  async onModuleInit() {
  await this.kafka.startConsumer(
  ['order.created', 'order.completed'],
- async (message, topic) => {
- console.log(`${topic}:`, message);
+ async (envelope) => {
+ console.log(`${envelope.topic}:`, envelope.payload);
  },
  {
  retry: { maxRetries: 3, backoffMs: 1000 },
@@ -343,7 +362,7 @@ export class OrdersService implements OnModuleInit {
 
  ### Per-consumer groupId
 
- Override the default consumer group for specific consumers. Each unique `groupId` creates a separate kafkajs Consumer internally:
+ Override the default consumer group for specific consumers. Each unique `groupId` creates a separate librdkafka Consumer internally:
 
  ```typescript
  // Default group from constructor
@@ -354,7 +373,7 @@ await kafka.startConsumer(['orders'], auditHandler, { groupId: 'orders-audit' })
 
  // Works with @SubscribeTo too
  @SubscribeTo('orders', { groupId: 'orders-audit' })
- async auditOrders(message) { ... }
+ async auditOrders(envelope) { ... }
  ```
 
  **Important:** You cannot mix `eachMessage` and `eachBatch` consumers on the same `groupId`. The library throws a clear error if you try:
@@ -404,7 +423,7 @@ Same with `@SubscribeTo()` — use `clientName` to target a specific named clien
 
  ```typescript
  @SubscribeTo('payment.received', { clientName: 'payments' }) // ← matches name: 'payments'
- async handlePayment(message: PaymentsTopicMap['payment.received']) {
+ async handlePayment(envelope: EventEnvelope<PaymentsTopicMap['payment.received']>) {
  // ...
  }
  ```
@@ -460,16 +479,16 @@ await this.kafka.sendBatch('order.created', [
 
  ## Batch consuming
 
- Process messages in batches for higher throughput. The handler receives an array of parsed messages and a `BatchMeta` object with offset management controls:
+ Process messages in batches for higher throughput. The handler receives an array of `EventEnvelope`s and a `BatchMeta` object with offset management controls:
 
  ```typescript
  await this.kafka.startBatchConsumer(
  ['order.created'],
- async (messages, topic, meta) => {
- // messages: OrdersTopicMap['order.created'][]
- for (const msg of messages) {
- await processOrder(msg);
- meta.resolveOffset(/* ... */);
+ async (envelopes, meta) => {
+ // envelopes: EventEnvelope<OrdersTopicMap['order.created']>[]
+ for (const env of envelopes) {
+ await processOrder(env.payload);
+ meta.resolveOffset(env.offset);
  }
  await meta.commitOffsetsIfNecessary();
  },
@@ -481,8 +500,8 @@ With `@SubscribeTo()`:
 
  ```typescript
  @SubscribeTo('order.created', { batch: true })
- async handleOrders(messages: OrdersTopicMap['order.created'][], topic: string) {
- // messages is an array
+ async handleOrders(envelopes: EventEnvelope<OrdersTopicMap['order.created']>[], meta: BatchMeta) {
+ for (const env of envelopes) { ... }
  }
  ```
 
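The `resolveOffset` / `commitOffsetsIfNecessary` pair implies bookkeeping roughly like the sketch below. This is a hypothetical illustration of the at-least-once pattern, not the library's `BatchMeta` implementation.

```typescript
// Conceptual offset bookkeeping behind a BatchMeta-style API.
// Only resolved (fully processed) offsets are ever committed, so a
// crash mid-batch re-delivers the unresolved tail: at-least-once.
class OffsetTracker {
  private resolved = -1;   // highest offset marked fully processed
  private committed = -1;  // last offset persisted to the broker

  resolveOffset(offset: number): void {
    if (offset > this.resolved) this.resolved = offset;
  }

  commitOffsetsIfNecessary(): number {
    if (this.resolved > this.committed) this.committed = this.resolved;
    return this.committed;
  }
}
```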
@@ -513,20 +532,20 @@ await this.kafka.transaction(async (tx) => {
 
  ## Consumer interceptors
 
- Add before/after/onError hooks to message processing:
+ Add before/after/onError hooks to message processing. Interceptors receive the full `EventEnvelope`:
 
  ```typescript
  import { ConsumerInterceptor } from '@drarzter/kafka-client';
 
  const loggingInterceptor: ConsumerInterceptor<OrdersTopicMap> = {
- before: (message, topic) => {
- console.log(`Processing ${topic}`, message);
+ before: (envelope) => {
+ console.log(`Processing ${envelope.topic}`, envelope.payload);
  },
- after: (message, topic) => {
- console.log(`Done ${topic}`);
+ after: (envelope) => {
+ console.log(`Done ${envelope.topic}`);
  },
- onError: (message, topic, error) => {
- console.error(`Failed ${topic}:`, error.message);
+ onError: (envelope, error) => {
+ console.error(`Failed ${envelope.topic}:`, error.message);
  },
  };
 
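Conceptually, a chain of such interceptors wraps the handler like the sketch below. This is a simplified model, not the library's internals, and the names in it are hypothetical.

```typescript
interface Interceptor<E> {
  before?: (envelope: E) => void;
  after?: (envelope: E) => void;
  onError?: (envelope: E, error: Error) => void;
}

// Run all before hooks in registration order, then the handler,
// then all after hooks; on failure, run onError hooks and rethrow
// so retry/DLQ handling still sees the error.
async function runWithInterceptors<E>(
  envelope: E,
  interceptors: Interceptor<E>[],
  handler: (envelope: E) => Promise<void>,
): Promise<void> {
  try {
    for (const i of interceptors) i.before?.(envelope);
    await handler(envelope);
    for (const i of interceptors) i.after?.(envelope);
  } catch (err) {
    for (const i of interceptors) i.onError?.(envelope, err as Error);
    throw err;
  }
}
```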
@@ -537,18 +556,48 @@ await this.kafka.startConsumer(['order.created'], handler, {
 
  Multiple interceptors run in order. All hooks are optional.
 
+ ## Instrumentation
+
+ For client-wide cross-cutting concerns (tracing, metrics), use `KafkaInstrumentation` hooks instead of per-consumer interceptors:
+
+ ```typescript
+ import { otelInstrumentation } from '@drarzter/kafka-client/otel';
+
+ const kafka = new KafkaClient('my-app', 'my-group', brokers, {
+ instrumentation: [otelInstrumentation()],
+ });
+ ```
+
+ `otelInstrumentation()` injects `traceparent` on send, extracts it on consume, and creates `CONSUMER` spans automatically. Requires `@opentelemetry/api` as a peer dependency.
+
+ Custom instrumentation:
+
+ ```typescript
+ import { KafkaInstrumentation } from '@drarzter/kafka-client';
+
+ const metrics: KafkaInstrumentation = {
+ beforeSend(topic, headers) { /* inject headers, start timer */ },
+ afterSend(topic) { /* record send latency */ },
+ beforeConsume(envelope) { /* start span */ return () => { /* end span */ }; },
+ onConsumeError(envelope, error) { /* record error metric */ },
+ };
+ ```
+
  ## Options reference
 
  ### Send options
 
  Options for `sendMessage()` — the third argument:
 
- | Option | Default | Description |
- |-----------|---------|--------------------------------------------------|
- | `key` | — | Partition key for message routing |
- | `headers` | — | Custom metadata headers (`Record<string, string>`) |
+ | Option | Default | Description |
+ |--------|---------|-------------|
+ | `key` | — | Partition key for message routing |
+ | `headers` | — | Custom metadata headers (merged with auto-generated envelope headers) |
+ | `correlationId` | auto | Override the auto-propagated correlation ID (default: inherited from ALS context or new UUID) |
+ | `schemaVersion` | `1` | Schema version for the payload |
+ | `eventId` | auto | Override the auto-generated event ID (UUID v4) |
 
- `sendBatch()` accepts `key` and `headers` per message inside the array items.
+ `sendBatch()` accepts the same options per message inside the array items.
 
  ### Consumer options
 
@@ -579,6 +628,7 @@ Passed to `KafkaModule.register()` or returned from `registerAsync()` factory:
  | `autoCreateTopics` | `false` | Auto-create topics on first send (dev only) |
  | `numPartitions` | `1` | Number of partitions for auto-created topics |
  | `strictSchemas` | `true` | Validate string topic keys against schemas registered via TopicDescriptor |
+ | `instrumentation` | `[]` | Client-wide instrumentation hooks (e.g. OTel). Applied to both send and consume paths |
 
  **Module-scoped** (default) — import `KafkaModule` in each module that needs it:
 
@@ -641,7 +691,7 @@ err.cause; // the original error
  ```typescript
  // In an onError interceptor:
  const interceptor: ConsumerInterceptor<MyTopics> = {
- onError: (message, topic, error) => {
+ onError: (envelope, error) => {
  if (error instanceof KafkaRetryExhaustedError) {
  console.log(`Failed after ${error.attempts} attempts on ${error.topic}`);
  console.log('Last error:', error.cause);
@@ -658,7 +708,7 @@ When `retry.maxRetries` is set and all attempts fail, `KafkaRetryExhaustedError`
  import { KafkaValidationError } from '@drarzter/kafka-client';
 
  const interceptor: ConsumerInterceptor<MyTopics> = {
- onError: (message, topic, error) => {
+ onError: (envelope, error) => {
  if (error instanceof KafkaValidationError) {
  console.log(`Bad message on ${error.topic}:`, error.cause?.message);
  }
@@ -707,9 +757,9 @@ await kafka.sendMessage(OrderCreated, { orderId: '1', userId: '2', amount: -5 })
 
  ```typescript
  @SubscribeTo(OrderCreated, { dlq: true })
- async handleOrder(message) {
- // `message` is guaranteed to match the schema
- console.log(message.orderId); // string — validated at runtime
+ async handleOrder(envelope) {
+ // `envelope.payload` is guaranteed to match the schema
+ console.log(envelope.payload.orderId); // string — validated at runtime
  }
  ```
 
@@ -773,7 +823,73 @@ export class HealthService {
 
  ## Testing
 
- Unit tests (mocked kafkajs):
+ ### Testing utilities
+
+ Import from `@drarzter/kafka-client/testing` — zero runtime deps, only `jest` and `@testcontainers/kafka` as peer dependencies.
+
+ > Unit tests mock `@confluentinc/kafka-javascript` — no Kafka broker needed. Integration tests use Testcontainers (requires Docker).
+
+ #### `createMockKafkaClient<T>()`
+
+ Fully typed mock with `jest.fn()` on every `IKafkaClient` method. All methods resolve to sensible defaults:
+
+ ```typescript
+ import { createMockKafkaClient } from '@drarzter/kafka-client/testing';
+
+ const kafka = createMockKafkaClient<MyTopics>();
+
+ const service = new OrdersService(kafka);
+ await service.createOrder();
+
+ expect(kafka.sendMessage).toHaveBeenCalledWith(
+ 'order.created',
+ expect.objectContaining({ orderId: '123' }),
+ );
+
+ // Override return values
+ kafka.checkStatus.mockResolvedValueOnce({ topics: ['order.created'] });
+
+ // Mock rejections
+ kafka.sendMessage.mockRejectedValueOnce(new Error('broker down'));
+ ```
+
+ #### `KafkaTestContainer`
+
+ Thin wrapper around `@testcontainers/kafka` that handles common setup pain points — transaction coordinator warmup, topic pre-creation:
+
+ ```typescript
+ import { KafkaTestContainer } from '@drarzter/kafka-client/testing';
+ import { KafkaClient } from '@drarzter/kafka-client/core';
+
+ let container: KafkaTestContainer;
+ let brokers: string[];
+
+ beforeAll(async () => {
+ container = new KafkaTestContainer({
+ topics: ['orders', { topic: 'payments', numPartitions: 3 }],
+ });
+ brokers = await container.start();
+ }, 120_000);
+
+ afterAll(() => container.stop());
+
+ it('sends and receives', async () => {
+ const kafka = new KafkaClient('test', 'test-group', brokers);
+ // ...
+ });
+ ```
+
+ Options:
+
+ | Option | Default | Description |
+ |--------|---------|-------------|
+ | `image` | `"confluentinc/cp-kafka:7.7.0"` | Docker image |
+ | `transactionWarmup` | `true` | Warm up transaction coordinator on start |
+ | `topics` | `[]` | Topics to pre-create (string or `{ topic, numPartitions }`) |
+
+ ### Running tests
+
+ Unit tests (mocked `@confluentinc/kafka-javascript` — no broker needed):
 
  ```bash
  npm test
@@ -787,15 +903,18 @@ npm run test:integration
 
  The integration suite spins up a single-node KRaft Kafka container and tests sending, consuming, batching, transactions, retry + DLQ, interceptors, health checks, and `fromBeginning` — no mocks.
 
- Both suites run in CI on every push to `main`.
+ Both suites run in CI on every push to `main` and on pull requests.
 
  ## Project structure
 
- ```
+ ```text
  src/
- ├── client/ # Core — KafkaClient, types, topic(), error classes (0 framework deps)
+ ├── client/ # Core — KafkaClient, types, envelope, consumer pipeline, topic(), errors (0 framework deps)
  ├── nest/ # NestJS adapter — Module, Explorer, decorators, health
+ ├── testing/ # Testing utilities — mock client, testcontainer wrapper
  ├── core.ts # Standalone entrypoint (@drarzter/kafka-client/core)
+ ├── otel.ts # OpenTelemetry entrypoint (@drarzter/kafka-client/otel)
+ ├── testing.ts # Testing entrypoint (@drarzter/kafka-client/testing)
  └── index.ts # Full entrypoint — core + NestJS adapter
  ```
 
@@ -0,0 +1,17 @@
+ var __defProp = Object.defineProperty;
+ var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
+ var __decorateClass = (decorators, target, key, kind) => {
+ var result = kind > 1 ? void 0 : kind ? __getOwnPropDesc(target, key) : target;
+ for (var i = decorators.length - 1, decorator; i >= 0; i--)
+ if (decorator = decorators[i])
+ result = (kind ? decorator(target, key, result) : decorator(result)) || result;
+ if (kind && result) __defProp(target, key, result);
+ return result;
+ };
+ var __decorateParam = (index, decorator) => (target, key) => decorator(target, key, index);
+
+ export {
+ __decorateClass,
+ __decorateParam
+ };
+ //# sourceMappingURL=chunk-EQQGB2QZ.mjs.map
@@ -0,0 +1 @@
+ {"version":3,"sources":[],"sourcesContent":[],"mappings":"","names":[]}