@venizia/ignis-docs 0.0.7-2 → 0.0.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,24 +1,28 @@
  # Kafka

- Apache Kafka event streaming with producer, consumer, and admin helpers. Built on [`@platformatic/kafka`](https://github.com/platformatic/kafka) v1.30.0 — a pure TypeScript Kafka client with zero native dependencies.
+ Apache Kafka event streaming with producer, consumer, admin, and schema registry helpers. Built on [`@platformatic/kafka`](https://github.com/platformatic/kafka) v1.30.0 — a pure TypeScript Kafka client with zero native dependencies.

  ## Overview

- The Kafka module provides three **thin wrapper** classes around `@platformatic/kafka`:
+ The Kafka module provides four helper classes built on a shared `BaseKafkaHelper` base:

  | Class | Wraps | Use Case |
  |-------|-------|----------|
- | `KafkaProducerHelper` | `Producer` | Publish messages to Kafka topics |
- | `KafkaConsumerHelper` | `Consumer` | Consume messages with consumer groups |
+ | `KafkaProducerHelper` | `Producer` | Publish messages, transactions |
+ | `KafkaConsumerHelper` | `Consumer` | Consume messages with consumer groups, lag monitoring |
  | `KafkaAdminHelper` | `Admin` | Manage topics, partitions, groups, ACLs, configs |
+ | `KafkaSchemaRegistryHelper` | `ConfluentSchemaRegistry` | Schema validation and automatic serialization/deserialization |
+
+ All helpers (except the schema registry) extend `BaseKafkaHelper`, which provides:

- Each helper provides:
  - **Scoped logging** via `BaseHelper` (Winston with daily rotation)
+ - **Health tracking** — `isHealthy()`, `isReady()`, `getHealthStatus()`
+ - **Broker event callbacks** — `onBrokerConnect`, `onBrokerDisconnect`
+ - **Graceful shutdown** — timeout-based with force fallback
  - **Sensible defaults** via `KafkaDefaults` constants
  - **Factory pattern** via `newInstance()` static method
- - **Lifecycle management** via `close()` method

- They do **not** re-implement or passthrough `@platformatic/kafka` methods. Use `getProducer()`, `getConsumer()`, or `getAdmin()` to access the full underlying API directly.
+ Use `getProducer()`, `getConsumer()`, or `getAdmin()` to access the full underlying `@platformatic/kafka` API directly.

  ### Import Path

@@ -28,19 +32,39 @@ import {
    KafkaProducerHelper,
    KafkaConsumerHelper,
    KafkaAdminHelper,
+   KafkaSchemaRegistryHelper,
+   BaseKafkaHelper,
    KafkaDefaults,
    KafkaAcks,
    KafkaGroupProtocol,
+   KafkaHealthStatuses,
+   KafkaClientEvents,
  } from '@venizia/ignis-helpers/kafka';

  // Types
  import type {
    IKafkaConnectionOptions,
-   IKafkaProducerOpts,
-   IKafkaConsumerOpts,
-   IKafkaAdminOpts,
+   IKafkaProducerOptions,
+   IKafkaConsumerOptions,
+   IKafkaAdminOptions,
+   IKafkaConsumeStartOptions,
+   IKafkaSchemaRegistryOptions,
+   IKafkaTransactionContext,
+   IKafkaBaseOptions,
    TKafkaAcks,
    TKafkaGroupProtocol,
+   TKafkaHealthStatus,
+   TKafkaBrokerEventCallback,
+   TKafkaMessageCallback,
+   TKafkaMessageDoneCallback,
+   TKafkaMessageErrorCallback,
+   TKafkaGroupJoinCallback,
+   TKafkaGroupLeaveCallback,
+   TKafkaGroupRebalanceCallback,
+   TKafkaHeartbeatErrorCallback,
+   TKafkaLagCallback,
+   TKafkaLagErrorCallback,
+   TKafkaTransactionCallback,
  } from '@venizia/ignis-helpers/kafka';

  // @platformatic/kafka (direct usage)
@@ -66,7 +90,43 @@ import type {
  bun add @platformatic/kafka
  ```

- ---
+ ## Architecture
+
+ ### Class Hierarchy
+
+ ```
+ BaseHelper (scoped logging, identifier)
+ └── BaseKafkaHelper<TClient> (health tracking, broker events, graceful shutdown)
+     ├── KafkaProducerHelper<K,V,HK,HV>
+     ├── KafkaConsumerHelper<K,V,HK,HV>
+     └── KafkaAdminHelper
+
+ BaseHelper
+ └── KafkaSchemaRegistryHelper<K,V,HK,HV> (no broker connection)
+ ```
+
+ ### BaseKafkaHelper
+
+ All Kafka helpers (except the schema registry) extend `BaseKafkaHelper<TClient>`, which provides:
+
+ ```typescript
+ abstract class BaseKafkaHelper<TClient extends Base<BaseOptions>> extends BaseHelper {
+   // Health
+   isHealthy(): boolean;                  // healthStatus === 'connected'
+   isReady(): boolean;                    // healthStatus === 'connected' (consumer overrides: + isActive())
+   getHealthStatus(): TKafkaHealthStatus; // 'connected' | 'disconnected' | 'unknown'
+
+   // Shutdown (used by subclasses)
+   protected closeClient(): Promise<void>;
+   protected gracefulCloseClient(): Promise<void>; // races closeClient vs shutdownTimeout
+ }
+ ```
+
+ Health status transitions automatically via broker events:
+ - `client:broker:connect` → `'connected'`
+ - `client:broker:disconnect` → `'disconnected'`
+ - `client:broker:failed` → `'disconnected'`
+ - `close()` → `'disconnected'`
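The timeout-with-force-fallback behavior that `gracefulCloseClient()` is described as using can be sketched in isolation. This is a hypothetical standalone illustration, not the helper's actual implementation; `closeWithTimeout` and its parameters are invented for this example:

```typescript
// Race a graceful close against a shutdown budget; fall back to a force
// close when the budget is exceeded. Mirrors the documented semantics of
// gracefulCloseClient() (assumption — not the package's real code).
async function closeWithTimeout(
  close: () => Promise<void>,      // graceful close of the underlying client
  forceClose: () => Promise<void>, // immediate close, used as fallback
  shutdownTimeout = 30_000,        // same default as KafkaDefaults.SHUTDOWN_TIMEOUT
): Promise<'graceful' | 'forced'> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  try {
    const winner = await Promise.race([
      close().then(() => 'closed' as const),
      new Promise<'timeout'>((resolve) => {
        timer = setTimeout(() => resolve('timeout'), shutdownTimeout);
      }),
    ]);
    if (winner === 'timeout') {
      // Graceful path exceeded the budget — force it.
      await forceClose();
      return 'forced';
    }
    return 'graceful';
  } finally {
    if (timer !== undefined) clearTimeout(timer);
  }
}
```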
 
  ## Connection Options

@@ -95,6 +155,17 @@ interface IKafkaConnectionOptions extends ConnectionOptions {
  | `connectTimeout` | `number` | — | TCP connection timeout in milliseconds |
  | `requestTimeout` | `number` | — | Kafka request timeout in milliseconds |

+ ### Shared Helper Options
+
+ These options are available on all three broker-connected helpers (`IKafkaProducerOptions`, `IKafkaConsumerOptions`, `IKafkaAdminOptions`):
+
+ | Option | Type | Default | Description |
+ |--------|------|---------|-------------|
+ | `identifier` | `string` | `'kafka-{type}'` | Scoped logging identifier |
+ | `shutdownTimeout` | `number` | `30000` | Graceful shutdown timeout in ms |
+ | `onBrokerConnect` | `TKafkaBrokerEventCallback` | — | Called when a broker connects |
+ | `onBrokerDisconnect` | `TKafkaBrokerEventCallback` | — | Called when a broker disconnects |
+
  ### SASL Authentication

  `@platformatic/kafka` supports five SASL mechanisms:
@@ -132,6 +203,7 @@ const helper = KafkaConsumerHelper.newInstance({
    },
    connectTimeout: 30_000,
    requestTimeout: 30_000,
+   onBrokerConnect: ({ broker }) => console.log(`Connected to ${broker.host}:${broker.port}`),
  });
  ```

@@ -167,8 +239,6 @@ const helper = KafkaProducerHelper.newInstance({
  });
  ```

- ---
-
  ## Serialization & Deserialization

  `@platformatic/kafka`'s default wire format is `Buffer`. The helpers default generic types to `string` (matching common usage), but you must provide serializers/deserializers explicitly.
@@ -191,51 +261,30 @@ const helper = KafkaProducerHelper.newInstance({
  | `serializersFrom(s)` | `<T>(s: Serializer<T>) => Serializers<T, T, T, T>` | Create full serializers from a single serializer |
  | `deserializersFrom(d)` | `<T>(d: Deserializer<T>) => Deserializers<T, T, T, T>` | Create full deserializers from a single deserializer |

- ### Serializers/Deserializers Interface
-
- ```typescript
- interface Serializers<Key, Value, HeaderKey, HeaderValue> {
-   key: SerializerWithHeaders<Key, HeaderKey, HeaderValue>;
-   value: SerializerWithHeaders<Value, HeaderKey, HeaderValue>;
-   headerKey: Serializer<HeaderKey>;
-   headerValue: Serializer<HeaderValue>;
- }
-
- interface Deserializers<Key, Value, HeaderKey, HeaderValue> {
-   key: DeserializerWithHeaders<Key, HeaderKey, HeaderValue>;
-   value: DeserializerWithHeaders<Value, HeaderKey, HeaderValue>;
-   headerKey: Deserializer<HeaderKey>;
-   headerValue: Deserializer<HeaderValue>;
- }
- ```
-
  ### String Serialization

- The simplest approach — all keys, values, and headers are strings:
-
  ```typescript
  import { stringSerializers, stringDeserializers } from '@platformatic/kafka';

- // Producer
  const producer = KafkaProducerHelper.newInstance({
    bootstrapBrokers: ['localhost:9092'],
    clientId: 'my-producer',
    serializers: stringSerializers,
  });

- // Consumer
  const consumer = KafkaConsumerHelper.newInstance({
    bootstrapBrokers: ['localhost:9092'],
    clientId: 'my-consumer',
    groupId: 'my-group',
    deserializers: stringDeserializers,
+   onMessage: async ({ message }) => {
+     console.log(message.key, message.value); // both strings
+   },
  });
  ```
 
  ### JSON Serialization

- For structured data — serialize objects as JSON:
-
  ```typescript
  import {
    jsonSerializer, jsonDeserializer,
@@ -243,66 +292,62 @@ import {
    serializersFrom, deserializersFrom,
  } from '@platformatic/kafka';

- // JSON values with string keys
  const producer = KafkaProducerHelper.newInstance({
    bootstrapBrokers: ['localhost:9092'],
    clientId: 'my-producer',
    serializers: { ...serializersFrom(jsonSerializer), key: stringSerializer },
  });

- const p = producer.getProducer();
- await p.send({
+ await producer.getProducer().send({
    messages: [{
      topic: 'orders',
-     key: 'order-123', // string key
-     value: { id: '123', status: 'created', amount: 99 }, // object value → auto-serialized to JSON
+     key: 'order-123',
+     value: { id: '123', status: 'created', amount: 99 },
    }],
  });

- // Consumer with matching deserializers
  const consumer = KafkaConsumerHelper.newInstance({
    bootstrapBrokers: ['localhost:9092'],
    clientId: 'my-consumer',
    groupId: 'my-group',
    deserializers: { ...deserializersFrom(jsonDeserializer), key: stringDeserializer },
+   onMessage: async ({ message }) => {
+     console.log(message.value.id, message.value.status); // typed object
+   },
  });
  ```
 
- ### Custom Serialization
+ ### Schema Registry Serialization

- For advanced use cases (Avro, Protobuf, MessagePack):
+ For schema-validated serialization (Avro, Protobuf, JSON Schema), use the schema registry helper:

  ```typescript
- import type { Serializer, Deserializer } from '@platformatic/kafka';
- import * as msgpack from '@msgpack/msgpack';
-
- const msgpackSerializer: Serializer<unknown> = (data) => {
-   if (data === undefined) return undefined;
-   return Buffer.from(msgpack.encode(data));
- };
+ const registry = KafkaSchemaRegistryHelper.newInstance({
+   url: 'http://localhost:8081',
+ });

- const msgpackDeserializer: Deserializer<unknown> = (data) => {
-   if (!data) return undefined;
-   return msgpack.decode(data);
- };
+ const producer = KafkaProducerHelper.newInstance({
+   bootstrapBrokers: ['localhost:9092'],
+   clientId: 'my-producer',
+   registry: registry.getRegistry(),
+ });

- const producer = KafkaProducerHelper.newInstance<string, unknown, string, string>({
+ const consumer = KafkaConsumerHelper.newInstance({
    bootstrapBrokers: ['localhost:9092'],
-   clientId: 'msgpack-producer',
-   serializers: {
-     key: stringSerializer,
-     value: msgpackSerializer,
-     headerKey: stringSerializer,
-     headerValue: stringSerializer,
+   clientId: 'my-consumer',
+   groupId: 'my-group',
+   registry: registry.getRegistry(),
+   onMessage: async ({ message }) => {
+     // message.value is auto-deserialized using the registered schema
    },
  });
  ```

- ---
+ See **[Schema Registry](./schema-registry)** for full documentation.

  ## Generic Type Parameters

- All three helpers (and their option interfaces) support generic type parameters controlling the serialization types:
+ All helpers (and their option interfaces) support generic type parameters controlling the serialization types:

  ```typescript
  class KafkaProducerHelper<
@@ -332,39 +377,71 @@ const helper = KafkaProducerHelper.newInstance<string, MyEvent, string, string>(
    serializers: { ...serializersFrom(jsonSerializer), key: stringSerializer },
    ...
  });
-
- // Custom: Buffer keys, Buffer values (raw wire format)
- const helper = KafkaProducerHelper.newInstance<Buffer, Buffer, Buffer, Buffer>({
-   // No serializers needed — @platformatic/kafka defaults to Buffer
-   ...
- });
  ```

- ---
-
  ## Constants

  ### KafkaDefaults

- Centralized default values used by all three helpers.
+ Centralized default values used by all helpers.

  ```typescript
  import { KafkaDefaults } from '@venizia/ignis-helpers/kafka';
  ```

- | Constant | Value | Scope | Used By | Description |
- |----------|-------|-------|---------|-------------|
- | `RETRIES` | `3` | Shared | All helpers | Connection retry count |
- | `RETRY_DELAY` | `1000` | Shared | All helpers | Retry delay in ms |
- | `STRICT` | `true` | Producer | `KafkaProducerHelper` | Fail on unknown topics |
- | `AUTOCREATE_TOPICS` | `false` | Producer | `KafkaProducerHelper` | Auto-create topics on produce |
- | `AUTOCOMMIT` | `false` | Consumer | `KafkaConsumerHelper` | Auto-commit offsets |
- | `SESSION_TIMEOUT` | `30000` | Consumer | `KafkaConsumerHelper` | Session timeout in ms |
- | `HEARTBEAT_INTERVAL` | `3000` | Consumer | `KafkaConsumerHelper` | Heartbeat interval in ms |
- | `HIGH_WATER_MARK` | `1024` | Consumer | `KafkaConsumerHelper` | Stream buffer size (messages) |
- | `MIN_BYTES` | `1` | Consumer | `KafkaConsumerHelper` | Min bytes per fetch |
- | `METADATA_MAX_AGE` | `300000` | Consumer | `KafkaConsumerHelper` | Metadata cache TTL in ms |
- | `GROUP_PROTOCOL` | `'classic'` | Consumer | `KafkaConsumerHelper` | Default group protocol |
+ | Constant | Value | Scope | Description |
+ |----------|-------|-------|-------------|
+ | `RETRIES` | `3` | Shared | Connection retry count |
+ | `RETRY_DELAY` | `1000` | Shared | Retry delay in ms |
+ | `SHUTDOWN_TIMEOUT` | `30000` | Shared | Graceful shutdown timeout in ms |
+ | `STRICT` | `true` | Producer | Fail on unknown topics |
+ | `AUTOCREATE_TOPICS` | `false` | Producer | Auto-create topics on produce |
+ | `AUTOCOMMIT` | `false` | Consumer | Auto-commit offsets |
+ | `SESSION_TIMEOUT` | `30000` | Consumer | Session timeout in ms |
+ | `HEARTBEAT_INTERVAL` | `3000` | Consumer | Heartbeat interval in ms |
+ | `HIGH_WATER_MARK` | `1024` | Consumer | Stream buffer size (messages) |
+ | `MIN_BYTES` | `1` | Consumer | Min bytes per fetch |
+ | `METADATA_MAX_AGE` | `300000` | Consumer | Metadata cache TTL in ms |
+ | `GROUP_PROTOCOL` | `'classic'` | Consumer | Default group protocol |
+ | `CONSUME_MODE` | `'committed'` | Consumer | Default consume mode |
+ | `CONSUME_FALLBACK_MODE` | `'latest'` | Consumer | Default consume fallback mode |
+ | `LAG_MONITOR_INTERVAL` | `30000` | Consumer | Lag monitoring poll interval in ms |
+
+ ### KafkaHealthStatuses
+
+ Health status values used by all Kafka helpers.
+
+ ```typescript
+ import { KafkaHealthStatuses } from '@venizia/ignis-helpers/kafka';
+ ```
+
+ | Constant | Value | Description |
+ |----------|-------|-------------|
+ | `CONNECTED` | `'connected'` | Broker connection established |
+ | `DISCONNECTED` | `'disconnected'` | Broker connection lost or closed |
+ | `UNKNOWN` | `'unknown'` | Initial state before first broker event |
+
+ ### KafkaClientEvents
+
+ Event name constants for `@platformatic/kafka` event emitters.
+
+ ```typescript
+ import { KafkaClientEvents } from '@venizia/ignis-helpers/kafka';
+ ```
+
+ | Constant | Value | Scope |
+ |----------|-------|-------|
+ | `BROKER_CONNECT` | `'client:broker:connect'` | All clients |
+ | `BROKER_DISCONNECT` | `'client:broker:disconnect'` | All clients |
+ | `BROKER_FAILED` | `'client:broker:failed'` | All clients |
+ | `CONSUMER_GROUP_JOIN` | `'consumer:group:join'` | Consumer |
+ | `CONSUMER_GROUP_LEAVE` | `'consumer:group:leave'` | Consumer |
+ | `CONSUMER_GROUP_REBALANCE` | `'consumer:group:rebalance'` | Consumer |
+ | `CONSUMER_HEARTBEAT_ERROR` | `'consumer:heartbeat:error'` | Consumer |
+ | `CONSUMER_LAG` | `'consumer:lag'` | Consumer |
+ | `CONSUMER_LAG_ERROR` | `'consumer:lag:error'` | Consumer |
+ | `STREAM_DATA` | `'data'` | Stream |
+ | `STREAM_ERROR` | `'error'` | Stream |
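A minimal sketch of why the named constants beat hard-coded strings when wiring listeners. The real emitters live inside `@platformatic/kafka`, so this runs against a plain Node `EventEmitter` with two values mirrored locally (a hypothetical stand-in, not the package's export):

```typescript
import { EventEmitter } from 'node:events';

// Local mirror of two values from the table above, so the sketch runs
// without the package installed (assumption for illustration only).
const KafkaClientEvents = {
  BROKER_CONNECT: 'client:broker:connect',
  BROKER_DISCONNECT: 'client:broker:disconnect',
} as const;

const emitter = new EventEmitter();
const seen: string[] = [];

// Listeners registered via the constants cannot drift from the event
// names the helpers rely on for health tracking.
emitter.on(KafkaClientEvents.BROKER_CONNECT, ({ host, port }: { host: string; port: number }) => {
  seen.push(`connected ${host}:${port}`);
});

emitter.emit(KafkaClientEvents.BROKER_CONNECT, { host: 'localhost', port: 9092 });
```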
 
  ### KafkaAcks

@@ -380,20 +457,6 @@ import { KafkaAcks } from '@venizia/ignis-helpers/kafka';
  | `LEADER` | `1` | Leader broker acknowledges | Fast, leader-durable |
  | `ALL` | `-1` | All in-sync replicas acknowledge | Slowest, fully durable |

- **Static methods:**
-
- | Method | Signature | Description |
- |--------|-----------|-------------|
- | `isValid(ack)` | `(ack: number): boolean` | Check if value is a valid ack level |
- | `SCHEME_SET` | `Set<number>` | Set of valid values: `{0, 1, -1}` |
-
- ```typescript
- KafkaAcks.ALL; // -1
- KafkaAcks.isValid(-1); // true
- KafkaAcks.isValid(2); // false
- KafkaAcks.SCHEME_SET; // Set { 0, 1, -1 }
- ```
-
  ### KafkaGroupProtocol

  Consumer group protocol versions.
@@ -407,32 +470,16 @@ import { KafkaGroupProtocol } from '@venizia/ignis-helpers/kafka';
  | `CLASSIC` | `'classic'` | Classic consumer group protocol (default, all Kafka versions) |
  | `CONSUMER` | `'consumer'` | New consumer group protocol — KIP-848 (Kafka 3.7+) |

- **Static methods:**
-
- | Method | Signature | Description |
- |--------|-----------|-------------|
- | `isValid(mode)` | `(mode: string): boolean` | Check if value is a valid protocol |
- | `SCHEME_SET` | `Set<string>` | Set of valid values: `{'classic', 'consumer'}` |
-
- ```typescript
- KafkaGroupProtocol.CLASSIC; // 'classic'
- KafkaGroupProtocol.isValid('classic'); // true
- KafkaGroupProtocol.isValid('foo'); // false
- ```
-
  ### Derived Types

  ```typescript
- import type { TKafkaAcks, TKafkaGroupProtocol } from '@venizia/ignis-helpers/kafka';
+ import type { TKafkaAcks, TKafkaGroupProtocol, TKafkaHealthStatus } from '@venizia/ignis-helpers/kafka';

  // TKafkaAcks = 0 | 1 | -1
  // TKafkaGroupProtocol = 'classic' | 'consumer'
+ // TKafkaHealthStatus = 'connected' | 'disconnected' | 'unknown'
  ```

- These union types are derived using `TConstValue<T>` from the constant classes.
-
- ---
-
  ## Compression

  `@platformatic/kafka` supports five compression algorithms:
@@ -450,24 +497,131 @@ const helper = KafkaProducerHelper.newInstance({
    bootstrapBrokers: ['localhost:9092'],
    clientId: 'my-producer',
    serializers: stringSerializers,
-   compression: 'zstd', // Applied to all messages by default
+   compression: 'zstd',
  });

  // Override per-send
- const producer = helper.getProducer();
- await producer.send({
+ await helper.getProducer().send({
    messages: [{ topic: 'logs', key: 'l1', value: largePayload }],
    compression: 'lz4',
  });
  ```
 
464
- ---
510
+ ## Quick Usage Comparison
511
+
512
+ ### Construction
513
+
514
+ ```typescript
515
+ // Admin
516
+ const admin = KafkaAdminHelper.newInstance({
517
+ bootstrapBrokers: ['127.0.0.1:29092'],
518
+ clientId: 'my-admin',
519
+ onBrokerConnect: ({ broker }) => console.log(`Connected to ${broker.host}`),
520
+ onBrokerDisconnect: ({ broker }) => console.log(`Disconnected from ${broker.host}`),
521
+ });
522
+
523
+ // Producer
524
+ const producer = KafkaProducerHelper.newInstance({
525
+ bootstrapBrokers: ['127.0.0.1:29092'],
526
+ clientId: 'my-producer',
527
+ acks: -1,
528
+ idempotent: true,
529
+ transactionalId: 'my-tx',
530
+ onBrokerConnect: ({ broker }) => console.log(`Connected to ${broker.host}`),
531
+ onBrokerDisconnect: ({ broker }) => console.log(`Disconnected from ${broker.host}`),
532
+ });
533
+
534
+ // Consumer
535
+ const consumer = KafkaConsumerHelper.newInstance({
536
+ bootstrapBrokers: ['127.0.0.1:29092'],
537
+ clientId: 'my-consumer',
538
+ groupId: 'my-group',
539
+ onBrokerConnect: ({ broker }) => console.log(`Connected to ${broker.host}`),
540
+ onBrokerDisconnect: ({ broker }) => console.log(`Disconnected from ${broker.host}`),
541
+ onMessage: async ({ message }) => {
542
+ console.log('Received:', message.value);
543
+ await message.commit();
544
+ },
545
+ onMessageDone: ({ message }) => console.log('Done:', message.key),
546
+ onMessageError: ({ error, message }) => console.error('Error:', error),
547
+ onGroupJoin: ({ groupId, memberId }) => console.log(`Joined ${groupId}`),
548
+ onGroupLeave: ({ groupId }) => console.log(`Left ${groupId}`),
549
+ onGroupRebalance: ({ groupId }) => console.log(`Rebalance ${groupId}`),
550
+ onHeartbeatError: ({ error }) => console.error('Heartbeat:', error),
551
+ onLag: ({ lag }) => console.log('Lag:', lag),
552
+ onLagError: ({ error }) => console.error('Lag error:', error),
553
+ });
554
+ ```
555
+
556
+ ### Core Operations
557
+
558
+ | Admin | Producer | Consumer |
559
+ |-------|----------|----------|
560
+ | `admin.getAdmin()` | `producer.getProducer()` | `consumer.getConsumer()` |
561
+ | — | `producer.getProducer().send(...)` | `await consumer.start({ topics: ['t1'] })` |
562
+ | — | `await producer.runInTransaction(async ({ send, addConsumer, addOffset }) => { ... })` | `consumer.startLagMonitoring({ topics: ['t1'], interval: 10_000 })` |
563
+ | — | — | `consumer.stopLagMonitoring()` |
564
+ | — | — | `consumer.getStream()` |
565
+
566
+ ### Health Checks
567
+
568
+ ```typescript
569
+ // All three — identical API
570
+ helper.isHealthy(); // true when broker connected
571
+ helper.isReady(); // Admin/Producer: same as isHealthy()
572
+ // Consumer: isHealthy() + consumer.isActive()
573
+ helper.getHealthStatus(); // 'connected' | 'disconnected' | 'unknown'
574
+ ```
575
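These checks map naturally onto HTTP liveness/readiness probes. A hypothetical wiring sketch — the `IKafkaHealthLike` shape below just mirrors the three methods above and is not an exported interface:

```typescript
type TKafkaHealthStatus = 'connected' | 'disconnected' | 'unknown';

// Structural stand-in for any of the Kafka helpers (assumption for
// illustration; the real helpers expose these same three methods).
interface IKafkaHealthLike {
  isHealthy(): boolean;
  isReady(): boolean;
  getHealthStatus(): TKafkaHealthStatus;
}

function toProbeResponse(helper: IKafkaHealthLike): { status: number; body: object } {
  // 200 only when the helper reports both healthy and ready;
  // 503 otherwise, so orchestrators stop routing traffic to the instance.
  const ready = helper.isHealthy() && helper.isReady();
  return {
    status: ready ? 200 : 503,
    body: { kafka: helper.getHealthStatus(), ready },
  };
}
```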
+
+ ### Shutdown
+
+ ```typescript
+ // All three — identical API
+ await helper.close();                  // graceful (timeout → force fallback)
+ await helper.close({ isForce: true }); // immediate force close
+ ```
+
+ ### With Schema Registry
+
+ ```typescript
+ const registry = KafkaSchemaRegistryHelper.newInstance({ url: 'http://localhost:8081' });
+
+ const producer = KafkaProducerHelper.newInstance({
+   ...,
+   registry: registry.getRegistry(),
+   // or use registry.getSerializers() for manual serializer config
+ });
+
+ const consumer = KafkaConsumerHelper.newInstance({
+   ...,
+   registry: registry.getRegistry(),
+   // or use registry.getDeserializers() for manual deserializer config
+ });
+ ```
+
+ ### Transaction (Producer Only)
+
+ ```typescript
+ const result = await producer.runInTransaction(async ({ send, addConsumer, addOffset }) => {
+   // Send messages within the transaction
+   const result = await send({
+     messages: [{ topic: 'orders', key: 'o1', value: '{"status":"created"}' }],
+   });
+
+   // Optionally add a consumer for exactly-once semantics
+   await addConsumer(consumer.getConsumer());
+   await addOffset(message);
+
+   return result;
+ });
+ ```
 
  ## Pages

- - **[Producer](./producer)** — Producer helper setup, usage, and full `@platformatic/kafka` Producer API reference
- - **[Consumer](./consumer)** — Consumer helper setup, usage, and full `@platformatic/kafka` Consumer API reference
- - **[Admin](./admin)** — Admin helper setup, usage, and full `@platformatic/kafka` Admin API reference
+ - **[Producer](./producer)** — Producer helper, transactions, and full `@platformatic/kafka` Producer API reference
+ - **[Consumer](./consumer)** — Consumer helper, message callbacks, lag monitoring, and full Consumer API reference
+ - **[Admin](./admin)** — Admin helper and full Admin API reference
+ - **[Schema Registry](./schema-registry)** — Schema registry helper for Avro/Protobuf/JSON Schema validation
  - **[Examples & Troubleshooting](./examples)** — Complete examples, IoC integration, and troubleshooting guide

  ## See Also