@venizia/ignis-docs 0.0.7-0 → 0.0.7-2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,305 +1,482 @@
1
- # Kafka <Badge type="warning" text="Experimental" />
1
+ # Kafka
2
2
 
3
- Apache Kafka event streaming with producer, consumer, and admin helpers. Built on [`@platformatic/kafka`](https://github.com/platformatic/kafka).
3
+ Apache Kafka event streaming with producer, consumer, and admin helpers. Built on [`@platformatic/kafka`](https://github.com/platformatic/kafka) v1.30.0 — a pure TypeScript Kafka client with zero native dependencies.
4
4
 
5
- > [!WARNING]
6
- > This helper is **experimental**. The API may change in future releases.
5
+ ## Overview
7
6
 
8
- ## Quick Reference
7
+ The Kafka module provides three **thin wrapper** classes around `@platformatic/kafka`:
9
8
 
10
- | Class | Extends | Peer Dependency | Use Case |
11
- |-------|---------|-----------------|----------|
12
- | **KafkaProducerHelper** | `BaseHelper` | `@platformatic/kafka` | Publish messages to Kafka topics |
13
- | **KafkaConsumerHelper** | `BaseHelper` | `@platformatic/kafka` | Consume messages from Kafka topics with consumer groups |
14
- | **KafkaAdminHelper** | `BaseHelper` | `@platformatic/kafka` | Manage topics, partitions, consumer groups, and configs |
9
+ | Class | Wraps | Use Case |
10
+ |-------|-------|----------|
11
+ | `KafkaProducerHelper` | `Producer` | Publish messages to Kafka topics |
12
+ | `KafkaConsumerHelper` | `Consumer` | Consume messages with consumer groups |
13
+ | `KafkaAdminHelper` | `Admin` | Manage topics, partitions, groups, ACLs, configs |
14
+
15
+ Each helper provides:
16
+ - **Scoped logging** via `BaseHelper` (Winston with daily rotation)
17
+ - **Sensible defaults** via `KafkaDefaults` constants
18
+ - **Factory pattern** via `newInstance()` static method
19
+ - **Lifecycle management** via `close()` method
20
+
21
+ They do **not** re-implement or pass through `@platformatic/kafka` methods. Use `getProducer()`, `getConsumer()`, or `getAdmin()` to access the full underlying API directly.
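For example, the producer helper exposes the underlying `Producer` directly. A minimal sketch (the broker address, topic, and payload are illustrative):

```typescript
import { KafkaProducerHelper } from '@venizia/ignis-helpers/kafka';
import { stringSerializers } from '@platformatic/kafka';

const helper = KafkaProducerHelper.newInstance({
  bootstrapBrokers: ['localhost:9092'],
  clientId: 'example-producer',
  serializers: stringSerializers,
});

// Drop down to the full @platformatic/kafka Producer API
const producer = helper.getProducer();
await producer.send({
  messages: [{ topic: 'orders', key: 'order-1', value: '{"status":"created"}' }],
});

await helper.close();
```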
15
22
 
16
23
  ### Import Path
17
24
 
18
25
  ```typescript
26
+ // Helpers & constants
19
27
  import {
20
28
  KafkaProducerHelper,
21
29
  KafkaConsumerHelper,
22
30
  KafkaAdminHelper,
23
31
  KafkaDefaults,
24
32
  KafkaAcks,
25
- KafkaConfigResourceTypes,
33
+ KafkaGroupProtocol,
26
34
  } from '@venizia/ignis-helpers/kafka';
27
35
 
36
+ // Types
28
37
  import type {
29
38
  IKafkaConnectionOptions,
30
- IKafkaProducerOptions,
31
- IKafkaConsumerOptions,
32
- IKafkaAdminOptions,
33
- IKafkaProduceMessage,
34
- IKafkaSendOptions,
35
- IKafkaConsumedMessage,
36
- IKafkaCommitOptions,
39
+ IKafkaProducerOpts,
40
+ IKafkaConsumerOpts,
41
+ IKafkaAdminOpts,
42
+ TKafkaAcks,
43
+ TKafkaGroupProtocol,
37
44
  } from '@venizia/ignis-helpers/kafka';
45
+
46
+ // @platformatic/kafka (direct usage)
47
+ import {
48
+ Producer, Consumer, Admin, MessagesStream,
49
+ stringSerializers, stringDeserializers,
50
+ stringSerializer, stringDeserializer,
51
+ jsonSerializer, jsonDeserializer,
52
+ serializersFrom, deserializersFrom,
53
+ } from '@platformatic/kafka';
54
+
55
+ import type {
56
+ Message, MessageToProduce,
57
+ SendOptions, ConsumeOptions,
58
+ Serializers, Deserializers,
59
+ SASLOptions, ConnectionOptions,
60
+ } from '@platformatic/kafka';
38
61
  ```
39
62
 
40
- ## Installation
63
+ ### Installation
41
64
 
42
65
  ```bash
43
66
  bun add @platformatic/kafka
44
67
  ```
45
68
 
46
- ## Producer
69
+ ---
47
70
 
48
- The `KafkaProducerHelper` wraps `@platformatic/kafka`'s `Producer` for publishing messages to Kafka topics.
71
+ ## Connection Options
72
+
73
+ All three helpers share a common base interface, `IKafkaConnectionOptions`, which extends `@platformatic/kafka`'s `ConnectionOptions`.
49
74
 
50
75
  ```typescript
51
- import { KafkaProducerHelper, KafkaAcks } from '@venizia/ignis-helpers/kafka';
76
+ interface IKafkaConnectionOptions extends ConnectionOptions {
77
+ bootstrapBrokers: string[];
78
+ clientId: string;
79
+ retries?: number; // Default: 3
80
+ retryDelay?: number; // Default: 1000ms
81
+ }
82
+ ```
52
83
 
53
- const producer = new KafkaProducerHelper({
54
- identifier: 'order-producer',
55
- bootstrapBrokers: ['localhost:9092'],
56
- acks: KafkaAcks.ALL,
57
- autocreateTopics: true,
58
- onConnected: () => console.log('Producer connected'),
59
- onError: ({ error }) => console.error('Producer error:', error),
84
+ ### Full Options Table
85
+
86
+ | Option | Type | Default | Description |
87
+ |--------|------|---------|-------------|
88
+ | `bootstrapBrokers` | `string[]` | — | Kafka broker addresses (`host:port`). **Required** |
89
+ | `clientId` | `string` | — | Unique client identifier. **Required** |
90
+ | `retries` | `number` | `3` | Number of connection retries before failing |
91
+ | `retryDelay` | `number` | `1000` | Delay between retries in milliseconds |
92
+ | `sasl` | `SASLOptions` | — | SASL authentication configuration |
93
+ | `tls` | `TLSConnectionOptions` | — | TLS/SSL connection options |
94
+ | `ssl` | `TLSConnectionOptions` | — | Alias for `tls` |
95
+ | `connectTimeout` | `number` | — | TCP connection timeout in milliseconds |
96
+ | `requestTimeout` | `number` | — | Kafka request timeout in milliseconds |
97
+
98
+ ### SASL Authentication
99
+
100
+ `@platformatic/kafka` supports five SASL mechanisms:
101
+
102
+ | Mechanism | Use Case |
103
+ |-----------|----------|
104
+ | `PLAIN` | Simple username/password (use with TLS in production) |
105
+ | `SCRAM-SHA-256` | Challenge-response, password never sent in plaintext |
106
+ | `SCRAM-SHA-512` | Same as SHA-256 with stronger hash |
107
+ | `OAUTHBEARER` | Token-based (Azure Event Hubs, Confluent Cloud) |
108
+ | `GSSAPI` | Kerberos authentication |
109
+
110
+ ```typescript
111
+ interface SASLOptions {
112
+ mechanism: 'PLAIN' | 'SCRAM-SHA-256' | 'SCRAM-SHA-512' | 'OAUTHBEARER' | 'GSSAPI';
113
+ username?: string | CredentialProvider;
114
+ password?: string | CredentialProvider;
115
+ token?: string | CredentialProvider;
116
+ oauthBearerExtensions?: Record<string, string> | CredentialProvider<Record<string, string>>;
117
+ authenticate?: SASLCustomAuthenticator;
118
+ }
119
+ ```
120
+
121
+ #### SCRAM-SHA-512 Example
122
+
123
+ ```typescript
124
+ const helper = KafkaConsumerHelper.newInstance({
125
+ bootstrapBrokers: ['broker1:9092', 'broker2:9092', 'broker3:9092'],
126
+ clientId: 'my-consumer',
127
+ groupId: 'my-group',
128
+ sasl: {
129
+ mechanism: 'SCRAM-SHA-512',
130
+ username: 'kafka-user',
131
+ password: 'kafka-password',
132
+ },
133
+ connectTimeout: 30_000,
134
+ requestTimeout: 30_000,
60
135
  });
136
+ ```
61
137
 
62
- // Send messages
63
- await producer.send({
64
- messages: [
65
- { topic: 'orders', key: 'order-123', value: JSON.stringify({ status: 'created' }) },
66
- ],
138
+ #### OAUTHBEARER Example
139
+
140
+ ```typescript
141
+ const helper = KafkaProducerHelper.newInstance({
142
+ bootstrapBrokers: ['pkc-xxxxx.us-west-2.aws.confluent.cloud:9092'],
143
+ clientId: 'my-producer',
144
+ sasl: {
145
+ mechanism: 'OAUTHBEARER',
146
+ token: async () => {
147
+ const response = await fetch('https://auth.example.com/token', { method: 'POST' });
148
+ const { access_token } = await response.json();
149
+ return access_token;
150
+ },
151
+ },
152
+ tls: true,
67
153
  });
154
+ ```
68
155
 
69
- // Send batch across multiple topics
70
- await producer.sendBatch({
71
- topicMessages: [
72
- { topic: 'orders', messages: [{ key: 'o1', value: '...' }] },
73
- { topic: 'notifications', messages: [{ key: 'n1', value: '...' }] },
74
- ],
156
+ #### TLS Without SASL
157
+
158
+ ```typescript
159
+ const helper = KafkaProducerHelper.newInstance({
160
+ bootstrapBrokers: ['broker:9093'],
161
+ clientId: 'my-producer',
162
+ tls: {
163
+ ca: fs.readFileSync('/path/to/ca.pem'),
164
+ cert: fs.readFileSync('/path/to/client-cert.pem'),
165
+ key: fs.readFileSync('/path/to/client-key.pem'),
166
+ },
75
167
  });
168
+ ```
169
+
170
+ ---
76
171
 
77
- // Graceful shutdown
78
- await producer.close();
172
+ ## Serialization & Deserialization
173
+
174
+ `@platformatic/kafka`'s default wire format is `Buffer`. The helpers default generic types to `string` (matching common usage), but you must provide serializers/deserializers explicitly.
175
+
176
+ ### Built-in Serializers
177
+
178
+ | Export | Type | Description |
179
+ |--------|------|-------------|
180
+ | `stringSerializer` | `Serializer<string>` | `string → Buffer` (UTF-8) |
181
+ | `stringDeserializer` | `Deserializer<string>` | `Buffer → string` (UTF-8) |
182
+ | `jsonSerializer` | `Serializer<T>` | `object → Buffer` (JSON.stringify + UTF-8) |
183
+ | `jsonDeserializer` | `Deserializer<T>` | `Buffer → object` (UTF-8 + JSON.parse) |
184
+ | `stringSerializers` | `Serializers<string, string, string, string>` | All four positions as string |
185
+ | `stringDeserializers` | `Deserializers<string, string, string, string>` | All four positions as string |
186
+
187
+ ### Helper Functions
188
+
189
+ | Export | Signature | Description |
190
+ |--------|-----------|-------------|
191
+ | `serializersFrom(s)` | `<T>(s: Serializer<T>) => Serializers<T, T, T, T>` | Create full serializers from a single serializer |
192
+ | `deserializersFrom(d)` | `<T>(d: Deserializer<T>) => Deserializers<T, T, T, T>` | Create full deserializers from a single deserializer |
193
+
194
+ ### Serializers/Deserializers Interface
195
+
196
+ ```typescript
197
+ interface Serializers<Key, Value, HeaderKey, HeaderValue> {
198
+ key: SerializerWithHeaders<Key, HeaderKey, HeaderValue>;
199
+ value: SerializerWithHeaders<Value, HeaderKey, HeaderValue>;
200
+ headerKey: Serializer<HeaderKey>;
201
+ headerValue: Serializer<HeaderValue>;
202
+ }
203
+
204
+ interface Deserializers<Key, Value, HeaderKey, HeaderValue> {
205
+ key: DeserializerWithHeaders<Key, HeaderKey, HeaderValue>;
206
+ value: DeserializerWithHeaders<Value, HeaderKey, HeaderValue>;
207
+ headerKey: Deserializer<HeaderKey>;
208
+ headerValue: Deserializer<HeaderValue>;
209
+ }
79
210
  ```
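Conceptually, each serializer maps a typed value to a `Buffer` and each deserializer maps it back. A self-contained sketch of the string and JSON behaviors (hand-rolled here for illustration, not the library's actual implementations):

```typescript
// Hand-rolled equivalents of the string/JSON (de)serializer behavior (illustrative)
const toStringBuf = (data: string): Buffer => Buffer.from(data, 'utf-8');
const fromStringBuf = (buf: Buffer): string => buf.toString('utf-8');

const toJsonBuf = (data: unknown): Buffer => Buffer.from(JSON.stringify(data), 'utf-8');
const fromJsonBuf = (buf: Buffer): unknown => JSON.parse(buf.toString('utf-8'));

// Round-trips preserve the value
fromStringBuf(toStringBuf('order-123'));          // 'order-123'
fromJsonBuf(toJsonBuf({ id: 1, status: 'new' })); // { id: 1, status: 'new' }
```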
80
211
 
81
- ### IKafkaProducerOptions
212
+ ### String Serialization
82
213
 
83
- | Option | Type | Default | Description |
84
- |--------|------|---------|-------------|
85
- | `bootstrapBrokers` | `string[]` | -- | Kafka broker addresses (required) |
86
- | `identifier` | `string` | -- | Scoped logging identifier |
87
- | `clientId` | `string` | `'ignis-kafka'` | Kafka client ID |
88
- | `acks` | `number` | -- | Acknowledgment level (`KafkaAcks.NONE`, `LEADER`, `ALL`) |
89
- | `autocreateTopics` | `boolean` | -- | Auto-create topics on first produce |
90
- | `timeout` | `number` | -- | Connection timeout in ms |
91
- | `retries` | `number \| boolean` | -- | Retry configuration |
92
- | `retryDelay` | `number` | -- | Delay between retries in ms |
93
- | `serializers` | `Partial<Serializers>` | -- | Custom key/value/header serializers |
94
- | `onConnected` | `() => void` | -- | Broker connect callback |
95
- | `onDisconnected` | `() => void` | -- | Broker disconnect callback |
96
- | `onError` | `(opts: { error: Error }) => void` | -- | Error callback |
97
-
98
- ### Producer API
99
-
100
- | Method | Returns | Description |
101
- |--------|---------|-------------|
102
- | `send(opts)` | `Promise<void>` | Send messages. `opts: { messages: IKafkaProduceMessage[]; acks? }` |
103
- | `sendBatch(opts)` | `Promise<void>` | Send to multiple topics. `opts: { topicMessages: Array<{ topic; messages }> }` |
104
- | `getProducer()` | `Producer` | Access the underlying `@platformatic/kafka` Producer |
105
- | `close()` | `Promise<void>` | Gracefully close the producer connection |
106
- | `static newInstance(opts)` | `KafkaProducerHelper` | Factory method |
107
-
108
- ## Consumer
109
-
110
- The `KafkaConsumerHelper` provides a stream-based consumer with consumer group support, pause/resume, manual commit, and lag monitoring.
214
+ The simplest approach treats all keys, values, and headers as strings:
111
215
 
112
216
  ```typescript
113
- import { KafkaConsumerHelper } from '@venizia/ignis-helpers/kafka';
217
+ import { stringSerializers, stringDeserializers } from '@platformatic/kafka';
114
218
 
115
- const consumer = new KafkaConsumerHelper({
116
- identifier: 'order-consumer',
219
+ // Producer
220
+ const producer = KafkaProducerHelper.newInstance({
117
221
  bootstrapBrokers: ['localhost:9092'],
118
- groupId: 'order-processing-group',
119
- topics: ['orders'],
120
- mode: 'latest',
121
- autocommit: true,
122
- onMessage: async ({ message }) => {
123
- console.log(`Topic: ${message.topic}, Partition: ${message.partition}`);
124
- console.log(`Key: ${message.key}, Value: ${message.value}`);
125
- },
126
- onConnected: () => console.log('Consumer connected'),
127
- onGroupJoin: ({ groupId, memberId }) => {
128
- console.log(`Joined group ${groupId} as ${memberId}`);
129
- },
130
- onError: ({ error }) => console.error('Consumer error:', error),
222
+ clientId: 'my-producer',
223
+ serializers: stringSerializers,
131
224
  });
132
225
 
133
- // Start consuming
134
- await consumer.start();
135
-
136
- // Pause/resume
137
- consumer.pause();
138
- consumer.resume();
139
-
140
- // Graceful shutdown
141
- await consumer.close();
226
+ // Consumer
227
+ const consumer = KafkaConsumerHelper.newInstance({
228
+ bootstrapBrokers: ['localhost:9092'],
229
+ clientId: 'my-consumer',
230
+ groupId: 'my-group',
231
+ deserializers: stringDeserializers,
232
+ });
142
233
  ```
143
234
 
144
- ### IKafkaConsumerOptions
235
+ ### JSON Serialization
145
236
 
146
- | Option | Type | Default | Description |
147
- |--------|------|---------|-------------|
148
- | `bootstrapBrokers` | `string[]` | -- | Kafka broker addresses (required) |
149
- | `groupId` | `string` | -- | Consumer group ID (required) |
150
- | `topics` | `string[]` | -- | Topics to consume (required) |
151
- | `identifier` | `string` | -- | Scoped logging identifier |
152
- | `clientId` | `string` | `'ignis-kafka'` | Kafka client ID |
153
- | `mode` | `'latest' \| 'earliest' \| 'committed'` | `'latest'` | Offset reset strategy |
154
- | `autocommit` | `boolean \| number` | `true` | Auto-commit offsets (or interval in ms) |
155
- | `sessionTimeout` | `number` | `30000` | Session timeout in ms |
156
- | `heartbeatInterval` | `number` | `3000` | Heartbeat interval in ms |
157
- | `highWaterMark` | `number` | `1024` | Stream high water mark |
158
- | `maxWaitTime` | `number` | `5000` | Max wait time for fetch in ms |
159
- | `deserializers` | `Partial<Deserializers>` | -- | Custom key/value/header deserializers |
160
- | `onMessage` | `(opts: { message }) => ValueOrPromise<void>` | -- | Message handler |
161
- | `onConnected` | `() => void` | -- | Broker connect callback |
162
- | `onDisconnected` | `() => void` | -- | Broker disconnect callback |
163
- | `onGroupJoin` | `(opts: { groupId; memberId }) => void` | -- | Consumer group join callback |
164
- | `onGroupLeave` | `() => void` | -- | Consumer group leave callback |
165
- | `onRebalance` | `() => void` | -- | Group rebalance callback |
166
- | `onLag` | `(opts: { offsets }) => void` | -- | Consumer lag callback |
167
- | `onError` | `(opts: { error: Error }) => void` | -- | Error callback |
168
-
169
- ### Consumer API
170
-
171
- | Method | Returns | Description |
172
- |--------|---------|-------------|
173
- | `start()` | `Promise<void>` | Start consuming messages (fires async consume loop) |
174
- | `pause()` | `void` | Pause the message stream |
175
- | `resume()` | `void` | Resume the message stream |
176
- | `isPaused()` | `boolean` | Check if the stream is paused |
177
- | `isConsuming()` | `boolean` | Check if the consumer is running |
178
- | `commit(opts)` | `Promise<void>` | Manually commit offsets |
179
- | `startLagMonitoring(opts)` | `void` | Start lag monitoring. `opts: { interval: number }` |
180
- | `stopLagMonitoring()` | `void` | Stop lag monitoring |
181
- | `getConsumer()` | `Consumer` | Access the underlying `@platformatic/kafka` Consumer |
182
- | `close()` | `Promise<void>` | Abort consume loop, close stream and consumer |
183
- | `static newInstance(opts)` | `KafkaConsumerHelper` | Factory method |
184
-
185
- ### Manual Commit
186
-
187
- When `autocommit` is `false`, commit offsets explicitly:
237
+ For structured data, serialize objects as JSON:
188
238
 
189
239
  ```typescript
190
- const consumer = new KafkaConsumerHelper({
191
- // ...
192
- autocommit: false,
193
- onMessage: async ({ message }) => {
194
- await processMessage(message);
195
- // Commit after successful processing
196
- await consumer.commit({
197
- offsets: [{
198
- topic: message.topic,
199
- partition: message.partition,
200
- offset: message.offset,
201
- leaderEpoch: 0,
202
- }],
203
- });
204
- },
240
+ import {
241
+ jsonSerializer, jsonDeserializer,
242
+ stringSerializer, stringDeserializer,
243
+ serializersFrom, deserializersFrom,
244
+ } from '@platformatic/kafka';
245
+
246
+ // JSON values with string keys
247
+ const producer = KafkaProducerHelper.newInstance({
248
+ bootstrapBrokers: ['localhost:9092'],
249
+ clientId: 'my-producer',
250
+ serializers: { ...serializersFrom(jsonSerializer), key: stringSerializer },
251
+ });
252
+
253
+ const p = producer.getProducer();
254
+ await p.send({
255
+ messages: [{
256
+ topic: 'orders',
257
+ key: 'order-123', // string key
258
+ value: { id: '123', status: 'created', amount: 99 }, // object value → auto-serialized to JSON
259
+ }],
260
+ });
261
+
262
+ // Consumer with matching deserializers
263
+ const consumer = KafkaConsumerHelper.newInstance({
264
+ bootstrapBrokers: ['localhost:9092'],
265
+ clientId: 'my-consumer',
266
+ groupId: 'my-group',
267
+ deserializers: { ...deserializersFrom(jsonDeserializer), key: stringDeserializer },
205
268
  });
206
269
  ```
207
270
 
208
- ## Admin
271
+ ### Custom Serialization
209
272
 
210
- The `KafkaAdminHelper` provides topic, partition, consumer group, and config management.
273
+ For advanced use cases (Avro, Protobuf, MessagePack):
211
274
 
212
275
  ```typescript
213
- import { KafkaAdminHelper, KafkaConfigResourceTypes } from '@venizia/ignis-helpers/kafka';
276
+ import type { Serializer, Deserializer } from '@platformatic/kafka';
277
+ import * as msgpack from '@msgpack/msgpack';
278
+
279
+ const msgpackSerializer: Serializer<unknown> = (data) => {
280
+ if (data === undefined) return undefined;
281
+ return Buffer.from(msgpack.encode(data));
282
+ };
283
+
284
+ const msgpackDeserializer: Deserializer<unknown> = (data) => {
285
+ if (!data) return undefined;
286
+ return msgpack.decode(data);
287
+ };
214
288
 
215
- const admin = new KafkaAdminHelper({
216
- identifier: 'kafka-admin',
289
+ const producer = KafkaProducerHelper.newInstance<string, unknown, string, string>({
217
290
  bootstrapBrokers: ['localhost:9092'],
291
+ clientId: 'msgpack-producer',
292
+ serializers: {
293
+ key: stringSerializer,
294
+ value: msgpackSerializer,
295
+ headerKey: stringSerializer,
296
+ headerValue: stringSerializer,
297
+ },
218
298
  });
299
+ ```
219
300
 
220
- // Topic management
221
- await admin.createTopics({ topics: ['orders', 'notifications'], partitions: 3, replicas: 1 });
222
- const topics = await admin.listTopics();
223
- await admin.deleteTopics({ topics: ['old-topic'] });
301
+ ---
224
302
 
225
- // Partition management
226
- await admin.createPartitions({ topics: [{ name: 'orders', count: 6 }] });
303
+ ## Generic Type Parameters
227
304
 
228
- // Consumer group management
229
- const groups = await admin.listGroups();
230
- const groupDetails = await admin.describeGroups({ groups: ['order-processing-group'] });
231
- await admin.deleteGroups({ groups: ['stale-group'] });
305
+ All three helpers (and their option interfaces) support generic type parameters controlling the serialization types:
232
306
 
233
- // Config management
234
- const configs = await admin.describeConfigs({
235
- resources: [{
236
- resourceType: KafkaConfigResourceTypes.TOPIC,
237
- resourceName: 'orders',
238
- }],
239
- });
307
+ ```typescript
308
+ class KafkaProducerHelper<
309
+ KeyType = string,
310
+ ValueType = string,
311
+ HeaderKeyType = string,
312
+ HeaderValueType = string,
313
+ >
314
+ ```
240
315
 
241
- // Metadata
242
- const meta = await admin.metadata({ topics: ['orders'] });
316
+ | Parameter | Default | Description |
317
+ |-----------|---------|-------------|
318
+ | `KeyType` | `string` | Message key type after serialization/deserialization |
319
+ | `ValueType` | `string` | Message value type after serialization/deserialization |
320
+ | `HeaderKeyType` | `string` | Header key type |
321
+ | `HeaderValueType` | `string` | Header value type |
243
322
 
244
- // Cleanup
245
- await admin.close();
323
+ > [!NOTE]
324
+ > `@platformatic/kafka` defaults to `Buffer` for all four positions. The helpers default to `string` which is more common for application code. If you don't pass serializers, your messages will be sent/received as `Buffer`.
325
+
326
+ ```typescript
327
+ // Default: string types (most common)
328
+ const helper = KafkaProducerHelper.newInstance({ ... });
329
+
330
+ // Custom: string keys, JSON object values
331
+ const helper = KafkaProducerHelper.newInstance<string, MyEvent, string, string>({
332
+ serializers: { ...serializersFrom(jsonSerializer), key: stringSerializer },
333
+ ...
334
+ });
335
+
336
+ // Custom: Buffer keys, Buffer values (raw wire format)
337
+ const helper = KafkaProducerHelper.newInstance<Buffer, Buffer, Buffer, Buffer>({
338
+ // No serializers needed — @platformatic/kafka defaults to Buffer
339
+ ...
340
+ });
246
341
  ```
247
342
 
248
- ### Admin API
249
-
250
- | Method | Returns | Description |
251
- |--------|---------|-------------|
252
- | `createTopics(opts)` | `Promise<any>` | Create topics with partitions and replicas |
253
- | `deleteTopics(opts)` | `Promise<void>` | Delete topics |
254
- | `listTopics(opts?)` | `Promise<string[]>` | List topics (optionally include internals) |
255
- | `metadata(opts?)` | `Promise<any>` | Fetch cluster/topic metadata |
256
- | `listGroups(opts?)` | `Promise<any>` | List consumer groups (filter by state) |
257
- | `describeGroups(opts)` | `Promise<any>` | Describe consumer groups |
258
- | `deleteGroups(opts)` | `Promise<void>` | Delete consumer groups |
259
- | `listConsumerGroupOffsets(opts)` | `Promise<any>` | List offsets for consumer groups |
260
- | `alterConsumerGroupOffsets(opts)` | `Promise<void>` | Alter consumer group offsets |
261
- | `createPartitions(opts)` | `Promise<void>` | Create partitions for topics |
262
- | `describeConfigs(opts)` | `Promise<any>` | Describe resource configs |
263
- | `alterConfigs(opts)` | `Promise<void>` | Alter resource configs |
264
- | `getAdmin()` | `Admin` | Access the underlying `@platformatic/kafka` Admin |
265
- | `close()` | `Promise<void>` | Close the admin connection |
266
- | `static newInstance(opts)` | `KafkaAdminHelper` | Factory method |
343
+ ---
267
344
 
268
345
  ## Constants
269
346
 
270
347
  ### KafkaDefaults
271
348
 
272
- | Constant | Value | Description |
273
- |----------|-------|-------------|
274
- | `CLIENT_ID` | `'ignis-kafka'` | Default Kafka client ID |
275
- | `SESSION_TIMEOUT` | `30000` | Session timeout (ms) |
276
- | `HEARTBEAT_INTERVAL` | `3000` | Heartbeat interval (ms) |
277
- | `MAX_WAIT_TIME` | `5000` | Max fetch wait time (ms) |
278
- | `HIGH_WATER_MARK` | `1024` | Stream high water mark |
279
- | `AUTOCOMMIT_INTERVAL` | `100` | Auto-commit interval (ms) |
349
+ Centralized default values used by all three helpers.
350
+
351
+ ```typescript
352
+ import { KafkaDefaults } from '@venizia/ignis-helpers/kafka';
353
+ ```
354
+
355
+ | Constant | Value | Scope | Used By | Description |
356
+ |----------|-------|-------|---------|-------------|
357
+ | `RETRIES` | `3` | Shared | All helpers | Connection retry count |
358
+ | `RETRY_DELAY` | `1000` | Shared | All helpers | Retry delay in ms |
359
+ | `STRICT` | `true` | Producer | `KafkaProducerHelper` | Fail on unknown topics |
360
+ | `AUTOCREATE_TOPICS` | `false` | Producer | `KafkaProducerHelper` | Auto-create topics on produce |
361
+ | `AUTOCOMMIT` | `false` | Consumer | `KafkaConsumerHelper` | Auto-commit offsets |
362
+ | `SESSION_TIMEOUT` | `30000` | Consumer | `KafkaConsumerHelper` | Session timeout in ms |
363
+ | `HEARTBEAT_INTERVAL` | `3000` | Consumer | `KafkaConsumerHelper` | Heartbeat interval in ms |
364
+ | `HIGH_WATER_MARK` | `1024` | Consumer | `KafkaConsumerHelper` | Stream buffer size (messages) |
365
+ | `MIN_BYTES` | `1` | Consumer | `KafkaConsumerHelper` | Min bytes per fetch |
366
+ | `METADATA_MAX_AGE` | `300000` | Consumer | `KafkaConsumerHelper` | Metadata cache TTL in ms |
367
+ | `GROUP_PROTOCOL` | `'classic'` | Consumer | `KafkaConsumerHelper` | Default group protocol |
280
368
 
281
369
  ### KafkaAcks
282
370
 
283
- | Constant | Value | Description |
284
- |----------|-------|-------------|
285
- | `NONE` | `0` | No acknowledgment |
286
- | `LEADER` | `1` | Leader acknowledgment only |
287
- | `ALL` | `-1` | All replicas must acknowledge |
371
+ Producer acknowledgment levels.
372
+
373
+ ```typescript
374
+ import { KafkaAcks } from '@venizia/ignis-helpers/kafka';
375
+ ```
376
+
377
+ | Constant | Value | Description | Trade-off |
378
+ |----------|-------|-------------|-----------|
379
+ | `NONE` | `0` | No acknowledgment — fire-and-forget | Fastest, no durability guarantee |
380
+ | `LEADER` | `1` | Leader broker acknowledges | Fast, leader-durable |
381
+ | `ALL` | `-1` | All in-sync replicas acknowledge | Slowest, fully durable |
382
+
383
+ **Static methods:**
384
+
385
+ | Method | Signature | Description |
386
+ |--------|-----------|-------------|
387
+ | `isValid(ack)` | `(ack: number): boolean` | Check if value is a valid ack level |
388
+ | `SCHEME_SET` | `Set<number>` | Set of valid values: `{0, 1, -1}` |
389
+
390
+ ```typescript
391
+ KafkaAcks.ALL; // -1
392
+ KafkaAcks.isValid(-1); // true
393
+ KafkaAcks.isValid(2); // false
394
+ KafkaAcks.SCHEME_SET; // Set { 0, 1, -1 }
395
+ ```
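`KafkaAcks` follows a simple constant-class pattern. A self-contained re-creation of that pattern (illustrative, not the helper's actual source):

```typescript
// Illustrative re-creation of the KafkaAcks constant-class pattern
class Acks {
  static readonly NONE = 0;
  static readonly LEADER = 1;
  static readonly ALL = -1;
  static readonly SCHEME_SET = new Set<number>([Acks.NONE, Acks.LEADER, Acks.ALL]);

  static isValid(ack: number): boolean {
    return Acks.SCHEME_SET.has(ack);
  }
}

Acks.isValid(-1); // true
Acks.isValid(2);  // false
```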
396
+
397
+ ### KafkaGroupProtocol
288
398
 
289
- ### KafkaConfigResourceTypes
399
+ Consumer group protocol versions.
400
+
401
+ ```typescript
402
+ import { KafkaGroupProtocol } from '@venizia/ignis-helpers/kafka';
403
+ ```
290
404
 
291
405
  | Constant | Value | Description |
292
406
  |----------|-------|-------------|
293
- | `UNKNOWN` | `0` | Unknown resource type |
294
- | `TOPIC` | `2` | Topic resource |
295
- | `BROKER` | `4` | Broker resource |
296
- | `BROKER_LOGGER` | `8` | Broker logger resource |
407
+ | `CLASSIC` | `'classic'` | Classic consumer group protocol (default, all Kafka versions) |
408
+ | `CONSUMER` | `'consumer'` | New consumer group protocol — KIP-848 (Kafka 3.7+) |
409
+
410
+ **Static methods:**
411
+
412
+ | Method | Signature | Description |
413
+ |--------|-----------|-------------|
414
+ | `isValid(mode)` | `(mode: string): boolean` | Check if value is a valid protocol |
415
+ | `SCHEME_SET` | `Set<string>` | Set of valid values: `{'classic', 'consumer'}` |
416
+
417
+ ```typescript
418
+ KafkaGroupProtocol.CLASSIC; // 'classic'
419
+ KafkaGroupProtocol.isValid('classic'); // true
420
+ KafkaGroupProtocol.isValid('foo'); // false
421
+ ```
422
+
423
+ ### Derived Types
424
+
425
+ ```typescript
426
+ import type { TKafkaAcks, TKafkaGroupProtocol } from '@venizia/ignis-helpers/kafka';
427
+
428
+ // TKafkaAcks = 0 | 1 | -1
429
+ // TKafkaGroupProtocol = 'classic' | 'consumer'
430
+ ```
431
+
432
+ These union types are derived from the constant classes using `TConstValue<T>`.
433
+
434
+ ---
435
+
436
+ ## Compression
437
+
438
+ `@platformatic/kafka` supports five compression algorithms:
439
+
440
+ | Algorithm | Value | Description |
441
+ |-----------|-------|-------------|
442
+ | None | `'none'` | No compression (default) |
443
+ | GZIP | `'gzip'` | Good compression ratio, moderate CPU |
444
+ | Snappy | `'snappy'` | Fast compression, moderate ratio |
445
+ | LZ4 | `'lz4'` | Very fast, good for high-throughput |
446
+ | Zstandard | `'zstd'` | Best ratio, moderate CPU |
447
+
448
+ ```typescript
449
+ const helper = KafkaProducerHelper.newInstance({
450
+ bootstrapBrokers: ['localhost:9092'],
451
+ clientId: 'my-producer',
452
+ serializers: stringSerializers,
453
+ compression: 'zstd', // Applied to all messages by default
454
+ });
455
+
456
+ // Override per-send
457
+ const producer = helper.getProducer();
458
+ await producer.send({
459
+ messages: [{ topic: 'logs', key: 'l1', value: largePayload }],
460
+ compression: 'lz4',
461
+ });
462
+ ```
463
+
464
+ ---
465
+
466
+ ## Pages
467
+
468
+ - **[Producer](./producer)** — Producer helper setup, usage, and full `@platformatic/kafka` Producer API reference
469
+ - **[Consumer](./consumer)** — Consumer helper setup, usage, and full `@platformatic/kafka` Consumer API reference
470
+ - **[Admin](./admin)** — Admin helper setup, usage, and full `@platformatic/kafka` Admin API reference
471
+ - **[Examples & Troubleshooting](./examples)** — Complete examples, IoC integration, and troubleshooting guide
297
472
 
298
473
  ## See Also
299
474
 
300
475
  - **Other Helpers:**
301
- - [Queue Helper](../queue/) -- BullMQ, MQTT, and in-memory queues
302
- - [Redis Helper](../redis/) -- Redis connection management
476
+ - [Queue Helper](../queue/) — BullMQ, MQTT, and in-memory queues
477
+ - [Redis Helper](../redis/) — Redis connection management
303
478
 
304
479
  - **External Resources:**
305
- - [@platformatic/kafka](https://github.com/platformatic/kafka) -- Underlying Kafka client library
480
+ - [@platformatic/kafka](https://github.com/platformatic/kafka) — Underlying Kafka client library
481
+ - [Apache Kafka Documentation](https://kafka.apache.org/documentation/) — Official Kafka docs
482
+ - [KIP-848](https://cwiki.apache.org/confluence/display/KAFKA/KIP-848) — New consumer group protocol