rimless 2.9.0 → 3.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (51)
  1. checksums.yaml +4 -4
  2. data/Appraisals +2 -2
  3. data/CHANGELOG.md +66 -0
  4. data/Gemfile +0 -1
  5. data/README.md +64 -62
  6. data/Rakefile +13 -4
  7. data/UPGRADING.md +491 -0
  8. data/doc/upgrade-guide-sources/README.md +221 -0
  9. data/doc/upgrade-guide-sources/dep-avro_turf-1.20.md +23 -0
  10. data/doc/upgrade-guide-sources/dep-karafka-2.0.md +117 -0
  11. data/doc/upgrade-guide-sources/dep-waterdrop-2.8.md +30 -0
  12. data/gemfiles/rails_8.0.gemfile +1 -1
  13. data/gemfiles/rails_8.1.gemfile +1 -1
  14. data/lib/rimless/compatibility/.gitkeep +0 -0
  15. data/lib/rimless/configuration.rb +80 -6
  16. data/lib/rimless/consumer/app.rb +182 -0
  17. data/lib/rimless/{karafka → consumer}/avro_deserializer.rb +8 -6
  18. data/lib/rimless/consumer/base.rb +118 -0
  19. data/lib/rimless/consumer/job.rb +35 -0
  20. data/lib/rimless/consumer/job_bridge.rb +113 -0
  21. data/lib/rimless/extensions/avro_helpers.rb +83 -0
  22. data/lib/rimless/extensions/configuration_handling.rb +77 -0
  23. data/lib/rimless/extensions/consumer.rb +20 -0
  24. data/lib/rimless/extensions/dependencies.rb +84 -0
  25. data/lib/rimless/extensions/kafka_helpers.rb +46 -0
  26. data/lib/rimless/extensions/producer.rb +103 -0
  27. data/lib/rimless/initializers/compatibility.rb +3 -4
  28. data/lib/rimless/railtie.rb +7 -7
  29. data/lib/rimless/rspec/helpers.rb +53 -13
  30. data/lib/rimless/rspec/matchers.rb +14 -11
  31. data/lib/rimless/rspec.rb +13 -29
  32. data/lib/rimless/tasks/consumer.rake +18 -6
  33. data/lib/rimless/tasks/templates/application_consumer.rb +1 -1
  34. data/lib/rimless/tasks/templates/custom_consumer.rb +1 -1
  35. data/lib/rimless/tasks/templates/custom_consumer_spec.rb +5 -4
  36. data/lib/rimless/tasks/templates/karafka.rb +5 -4
  37. data/lib/rimless/version.rb +3 -1
  38. data/lib/rimless.rb +12 -14
  39. data/rimless.gemspec +7 -9
  40. metadata +38 -67
  41. data/lib/rimless/avro_helpers.rb +0 -81
  42. data/lib/rimless/base_consumer.rb +0 -30
  43. data/lib/rimless/compatibility/karafka_1_4.rb +0 -52
  44. data/lib/rimless/configuration_handling.rb +0 -82
  45. data/lib/rimless/consumer.rb +0 -209
  46. data/lib/rimless/consumer_job.rb +0 -10
  47. data/lib/rimless/dependencies.rb +0 -69
  48. data/lib/rimless/kafka_helpers.rb +0 -104
  49. data/lib/rimless/karafka/base64_interchanger.rb +0 -32
  50. data/lib/rimless/karafka/passthrough_mapper.rb +0 -29
  51. data/lib/rimless/tasks/stats.rake +0 -22
data/UPGRADING.md ADDED
@@ -0,0 +1,491 @@
# Upgrading from Rimless 2.x to 3.0

This guide covers all breaking changes and required migration steps when
upgrading from Rimless 2.9.x to 3.0.

## Table of Contents

- [Dependency Changes](#dependency-changes)
- [Configuration Changes](#configuration-changes)
  - [Kafka Brokers Format](#kafka-brokers-format)
  - [New Configuration Options](#new-configuration-options)
- [Consumer Changes](#consumer-changes)
  - [Setup (karafka.rb)](#setup-karafkarb)
  - [Consumer Classes](#consumer-classes)
  - [Application Consumer Base Class](#application-consumer-base-class)
  - [Offset Management](#offset-management)
  - [Railtie / Sidekiq Server Initialization](#railtie--sidekiq-server-initialization)
  - [Anonymous Consumer Classes](#anonymous-consumer-classes)
- [Producer Changes](#producer-changes)
- [Testing Changes](#testing-changes)
  - [Consumer Specs](#consumer-specs)
  - [Producer Specs / Custom WaterDrop Stubs](#producer-specs--custom-waterdrop-stubs)
- [Karafka Configuration (Advanced)](#karafka-configuration-advanced)

---

## Dependency Changes

Dependency        | 2.x            | 3.0       | Upgrading Guide
------------------|----------------|-----------|----------------
`karafka`         | `~> 1.4`       | `~> 2.5`  | [Guide](https://github.com/karafka/karafka/wiki/Upgrades-Karafka-2.0)
`karafka-testing` | `~> 1.4`       | `~> 2.5`  | [Guide](https://github.com/karafka/karafka-testing/blob/master/2.0-Upgrade.md)
`waterdrop`       | `~> 1.4`       | `~> 2.8`  | [Changelog](https://github.com/karafka/waterdrop/blob/master/CHANGELOG.md)
`avro_turf`       | `~> 0.11.0`    | `~> 1.20` | [Changelog](https://github.com/dasch/avro_turf/blob/master/CHANGELOG.md)
`activejob`       | _not required_ | `>= 8.0`  | _none needed_

**Removed dependencies:**

- `karafka-sidekiq-backend` — no longer used (Karafka 2 removed Sidekiq
  backend support)
- `sinatra` — no longer required for the fake schema registry in tests

The underlying Kafka driver changed from `ruby-kafka` to `librdkafka` (via
`karafka-rdkafka`). This is a native C extension — make sure your build
environment supports it.

Rimless 3.0 now depends on `activejob` (>= 8.0). If your application is a
Rails app, this is already included. For standalone apps, ensure `activejob`
is in your Gemfile.

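For a standalone (non-Rails) app, the Gemfile addition might look like the
following sketch (the version constraints shown are illustrative, not
prescriptive):

```ruby
# Gemfile — minimal sketch for a standalone app (constraints illustrative)
source 'https://rubygems.org'

gem 'rimless', '~> 3.0'
gem 'activejob', '>= 8.0' # required by Rimless 3.0, bundled with Rails
```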
## Configuration Changes

### Kafka Brokers Format

The `KAFKA_BROKERS` environment variable (and `config.kafka_brokers`) no longer
requires the `kafka://` protocol prefix. Plain `host:port` CSV is now expected.
The old format is still accepted for backwards compatibility, but you should
update it:

```diff
- KAFKA_BROKERS=kafka://broker1.example.com:9092,kafka://broker2.example.com:9092
+ KAFKA_BROKERS=broker1.example.com:9092,broker2.example.com:9092
```

Or in a Rails initializer:

```diff
 Rimless.configure do |conf|
-  conf.kafka_brokers = 'kafka://your.domain:9092,kafka://other.host:9092'
+  conf.kafka_brokers = 'your.domain:9092,other.host:9092'
   # Make sure it's NOT an array like ['host:port', 'host:port']
 end
```

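Because the old prefixed format is still accepted, the normalization is
conceptually just stripping the scheme from each entry. A plain-Ruby sketch of
that transformation (not the actual Rimless implementation; `normalize_brokers`
is a made-up name):

```ruby
# Sketch: normalize a 2.x-style broker list to the 3.0 host:port CSV format.
def normalize_brokers(csv)
  csv.split(',').map { |broker| broker.strip.sub(%r{\Akafka://}, '') }.join(',')
end
```

So `normalize_brokers('kafka://a:9092,kafka://b:9092')` yields
`'a:9092,b:9092'`, and already-plain input passes through unchanged.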
### New Configuration Options

Rimless 3.0 adds several new configuration options. All have sensible defaults
and are **optional**:

```ruby
Rimless.configure do |conf|
  # Customize the Karafka logger listener (set to false/nil to disable)
  conf.consumer_logger_listener = Karafka::Instrumentation::LoggerListener.new(
    # Karafka, when the logger level is set to INFO, produces logs each time it
    # polls data from an internal messages queue. This can be extensive, so you
    # can turn it off by setting this option to false. (Rimless defaults to
    # false)
    log_polling: false
  )

  # Custom job bridge class (Kafka messages → ActiveJob)
  conf.job_bridge_class = Rimless::Consumer::JobBridge

  # Custom consumer job class (processes enqueued Kafka messages)
  conf.consumer_job_class = Rimless::Consumer::Job

  # Custom Apache Avro deserializer class
  conf.avro_deserializer_class = Rimless::Consumer::AvroDeserializer

  # Fully customize the AvroTurf::Messaging instance
  conf.avro_configure = ->(config) { config.merge(connect_timeout: 5) }

  # Fully customize the WaterDrop::Producer instance
  # See: https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md
  conf.producer_configure = ->(config) {
    # The number of acknowledgements the leader broker must receive from ISR
    # brokers before responding to the request
    config.kafka[:'request.required.acks'] = 1
  }

  # Fully customize the Karafka::App instance
  conf.consumer_configure = ->(config) {
    config.kafka[:'max.poll.interval.ms'] = 300_000
  }
end
```

## Consumer Changes

### Setup (karafka.rb)

The `karafka.rb` boot file requires several changes:

**1. Remove `.boot!`** — Karafka 2 no longer requires an explicit boot call:

```diff
 Rimless.consumer.topics(
   { app: :your_app, name: :your_topic } => YourConsumer
-).boot!
+)
```

**2. Remove WaterDrop setup and listener subscriptions** — these are now
handled internally by Rimless:

```diff
 # This is not needed anymore, WaterDrop now directly uses the Rimless.logger
-Karafka.monitor.subscribe(WaterDrop::Instrumentation::LoggerListener.new)

 # Use config.producer_configure instead
-monitor.subscribe('app.initialized') do
-  WaterDrop.setup { |config| config.deliver = !Karafka.env.test? }
-end
```

**3. Remove `KarafkaApp.boot!`** if present (vanilla Karafka 1.x setups):

```diff
-KarafkaApp.boot!
```

**4. Update inline Karafka configuration** (if any) — Karafka 2 uses
librdkafka-style settings:

```diff
 Rimless.consumer.configure do |config|
-  config.kafka.start_from_beginning = false
-  config.kafka.heartbeat_interval = 10
+  # See: https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md
+  config.kafka[:'auto.offset.reset'] = 'latest'
 end
```

Below are the most significant configuration option renames between
Karafka 1.x and 2.x:

Root options ([`config.*`](https://github.com/karafka/karafka/blob/v2.5.5/lib/karafka/setup/config.rb)):

* `start_from_beginning` is now `initial_offset` and accepts either `'earliest'`
  or `'latest'`
* `ssl_ca_certs_from_system` is no longer needed, but the `kafka` option
  `security.protocol` needs to be set to `ssl`
* `batch_fetching` is no longer needed
* `batch_consuming` is no longer needed
* `serializer` is no longer needed because Responders have been removed from
  Karafka
* `topic_mapper` is no longer needed, as the concept of mapping topic names has
  been removed from Karafka
* `backend` is no longer needed because Karafka is now multi-threaded
* `manual_offset_management` now needs to be set on a per-topic basis

Kafka options ([`config.kafka.*`](https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md)):

* `kafka.seed_brokers` is now `bootstrap.servers`, without the protocol
  prefix
* `kafka.heartbeat_interval` is no longer needed
* SASL and SSL option changes are [described in their own
  section](https://github.com/karafka/karafka/wiki/Upgrades-Karafka-2.0#sasl-ssl-authentication)

See: https://github.com/karafka/karafka/wiki/Upgrades-Karafka-2.0

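For instance, per-topic manual offset management in Karafka 2 is declared in
the routing DSL. The sketch below shows vanilla Karafka for illustration only
(under Rimless the routing is normally wired via `Rimless.consumer.topics`, so
check whether your setup exposes the topic blocks directly):

```ruby
# Vanilla Karafka 2 routing sketch — manual_offset_management is now a
# per-topic flag instead of a global config option:
class KarafkaApp < Karafka::App
  routes.draw do
    topic :your_topic do
      consumer YourConsumer
      manual_offset_management true
    end
  end
end
```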
**Full `karafka.rb` example (after migration):**

```ruby
# frozen_string_literal: true

require 'rimless'

# Set up the topic-consumer routing table
Rimless.consumer.topics(
  { app: :your_app, name: :your_topic } => YourConsumer
)

# Optional: configure Karafka/librdkafka settings
# Rimless.consumer.configure do |config|
#   config.kafka[:'auto.offset.reset'] = 'latest'
# end
```

### Consumer Classes

All consumer-related constants have been reorganized under the
`Rimless::Consumer` namespace:

2.x Constant | 3.0 Constant
---|---
`Rimless::BaseConsumer` | `Rimless::Consumer::Base`
`Rimless::ConsumerApp` | `Rimless::Consumer::App`
`Rimless::ConsumerJob` | `Rimless::Consumer::Job`
`Rimless::Karafka::AvroDeserializer` | `Rimless::Consumer::AvroDeserializer`

**Removed (no replacement needed):**

2.x Constant | Reason
---|---
`Rimless::Karafka::Base64Interchanger` | Messages are now decoded within the Karafka process before being passed to ActiveJob. No binary interchanging is needed anymore.
`Rimless::Karafka::PassthroughMapper` | Karafka 2 removed the topic/consumer mapper concept entirely. This was previously a no-op (input equals output) for Rimless anyway.

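To locate affected code, a throwaway script along these lines can scan a
codebase for the renamed and removed constants (the function name, paths, and
reporting format are placeholders, adjust to taste):

```ruby
# Sketch: find Rimless 2.x constants that were renamed or removed in 3.0.
RENAMED = {
  'Rimless::BaseConsumer'              => 'Rimless::Consumer::Base',
  'Rimless::ConsumerApp'               => 'Rimless::Consumer::App',
  'Rimless::ConsumerJob'               => 'Rimless::Consumer::Job',
  'Rimless::Karafka::AvroDeserializer' => 'Rimless::Consumer::AvroDeserializer'
}.freeze
REMOVED = ['Rimless::Karafka::Base64Interchanger',
           'Rimless::Karafka::PassthroughMapper'].freeze

# Returns [file, line number, old constant, replacement hint] tuples
def rimless_constant_hits(root)
  hits = []
  Dir.glob(File.join(root, '**', '*.rb')).sort.each do |file|
    File.foreach(file).with_index(1) do |line, lineno|
      (RENAMED.keys + REMOVED).each do |const|
        next unless line.include?(const)

        replacement = RENAMED[const] || '(removed, delete this usage)'
        hits << [file, lineno, const, replacement]
      end
    end
  end
  hits
end
```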
### Application Consumer Base Class

Update your `ApplicationConsumer` (and any direct references):

```diff
-class ApplicationConsumer < Rimless::BaseConsumer
+class ApplicationConsumer < Rimless::Consumer::Base
 end
```

The consumer API is mostly unchanged. Key differences:

- `#consume` now iterates over `#messages` (a batch of one or more) instead of
  processing a single message. End-user event methods (`user_created`,
  `user_updated`, etc.) are still called once per message — this is handled
  internally.
- `#params_batch` is available as a compatibility alias for `#messages`.
- `#params` is available as a compatibility alias for `#message` (the current
  single message being processed).

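Conceptually, the per-message dispatch works as in the following simplified
plain-Ruby sketch (not the actual Rimless implementation; the payload shape,
class name, and the `user_created` handler are illustrative):

```ruby
# Sketch: a batch-based #consume that still invokes one event method per
# message, mirroring the behavior described above. Plain Ruby, no Karafka.
class SketchConsumer
  attr_reader :messages, :last_user
  alias params_batch messages # 2.x compatibility alias

  def initialize(messages)
    @messages = messages
  end

  def consume
    messages.each do |message|
      event = message.fetch(:event) # e.g. :user_created
      public_send(event, **message) if respond_to?(event)
    end
  end

  # Example end-user event method, called once per matching message
  def user_created(event:, user:)
    @last_user = user
  end
end
```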
### Offset Management

Rimless 3.0 marks each Kafka message as consumed (`mark_as_consumed`)
individually after it has been decoded and enqueued as an ActiveJob job. This
is a change from the previous `karafka-sidekiq-backend` behavior, which
relied on Karafka's automatic offset management (committing the entire batch
offset after `#consume` returned).

If you previously configured `manual_offset_management` in Karafka, or relied
on the batch-level commit behavior, be aware that Rimless now commits per
message. For custom offset strategies, provide your own `job_bridge_class` via
`Rimless.configuration.job_bridge_class`.

### Railtie / Sidekiq Server Initialization

Rimless 2.x initialized the Karafka consumer application inside the Sidekiq
server process. This is no longer done or needed. **Remove any manual Sidekiq
server initialization:**

```diff
-Sidekiq.configure_server { Rimless.consumer.initialize! }
```

Consumer jobs are now processed via ActiveJob. Your existing ActiveJob adapter
(Sidekiq, Solid Queue, etc.) will pick them up automatically.

### Anonymous Consumer Classes

Rimless 3.0 **no longer supports anonymous consumer classes** passed to
`Rimless.consumer.topics(...)`, e.g. `Class.new(Rimless::Consumer::Base)`.

In Rimless 2.x, the `karafka-sidekiq-backend` gem transferred binary
(marshalled) Kafka message payloads to Sidekiq, and Karafka was loaded within
the Sidekiq worker process to deserialize them. This allowed anonymous classes
(e.g. created via `Class.new(MyConsumer) { ... }`) because the Karafka routing
table — including the anonymous class reference — was available in the same
process.

In Rimless 3.0, messages are decoded within the Karafka process and then
enqueued as ActiveJob jobs. The consumer class name is serialized as a string
argument (`consumer.name`) and later resolved via `constantize` in the
ActiveJob worker process, where Karafka is **not** loaded. Anonymous classes
have no name and cannot be resolved this way.

**If you use anonymous consumer classes, you must rewrite them as named
classes.**

Example migration — before (anonymous class):

```ruby
# 2.x — anonymous class created at configuration time
def self.create(supported_events: [])
  Class.new(self) do
    supported_events.each do |event|
      class_eval <<-RUBY
        def payment_#{event}(payment:, **)
          # Your logic
        end
      RUBY
    end
  end
end

Rimless.consumer.topics(
  { app: :payment_api, name: :payments } => PaymentConsumer.create(
    supported_events: %i[created updated authorized]
  )
)
```

After (named class with configuration):

```ruby
# 3.0 — configure the named class directly
class PaymentConsumer < Rimless::Consumer::Base
  def method_missing(method_name, **args)
    # Your logic to detect relevant events
    # (method_name == message.payload[:event])
  end

  def respond_to_missing?(method_name, include_private = false)
    # Your logic to detect relevant events
    # (method_name == message.payload[:event])
  end

  def handle_payment_event(event, payment:, **)
    # Your logic
  end
end

Rimless.consumer.topics(
  { app: :payment_api, name: :payments } => PaymentConsumer
)
```

The `JobBridge.build` method will raise an `ArgumentError` if an anonymous
class is passed.

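The name-based round trip that rules out anonymous classes can be illustrated
in plain Ruby (a conceptual sketch, not the actual Rimless code;
`serialize_consumer` and `resolve_consumer` are made-up names):

```ruby
# Sketch: consumer classes travel to the worker process as a name string and
# come back via constant lookup. Anonymous classes have no name, so they
# cannot survive this round trip.
class NamedConsumer; end

def serialize_consumer(klass)
  name = klass.name
  raise ArgumentError, 'anonymous consumer classes are not supported' if name.nil?

  name
end

def resolve_consumer(name)
  # Rails uses String#constantize; Object.const_get is the plain-Ruby analog
  Object.const_get(name)
end
```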
## Producer Changes

The public producer API (`Rimless.message`, `Rimless.async_message`,
`Rimless.raw_message`, `Rimless.async_raw_message`) is **unchanged**.

Under the hood, WaterDrop 2.x replaced `WaterDrop::SyncProducer.call` /
`WaterDrop::AsyncProducer.call` with producer instances, available as
`Rimless.producer.produce_sync` / `Rimless.producer.produce_async`. If you
were calling WaterDrop directly (bypassing the Rimless helpers), update those
calls:

```diff
-WaterDrop::SyncProducer.call(encoded_data, topic: 'my.topic')
+Rimless.producer.produce_sync(payload: encoded_data, topic: 'my.topic')

-WaterDrop::AsyncProducer.call(encoded_data, topic: 'my.topic')
+Rimless.producer.produce_async(payload: encoded_data, topic: 'my.topic')
```

Note that WaterDrop 2.x uses keyword arguments (`payload:`, `topic:`, `key:`,
etc.) instead of positional arguments. All time-related configuration values of
WaterDrop are now in **milliseconds** (previously some were in seconds).

## Testing Changes

### Consumer Specs

**1. Replace `#karafka_consumer_for` with `#kafka_consumer_for`:**

Rimless now provides its own `#kafka_consumer_for` helper that wraps the
Karafka testing helper and automatically inlines the job bridge (so your
consumer logic executes synchronously in tests):

```diff
-let(:instance) { karafka_consumer_for(topic) }
+let(:instance) { kafka_consumer_for(topic) }
```

**2. Replace `#publish_for_karafka` with `karafka.produce`** (if you used the
Karafka testing helpers directly):

```diff
-publish_for_karafka(message)
+karafka.produce(message)
```

**3. Update message setup in consumer specs:**

The `kafka_message` helper now returns a `Karafka::Messages::Message`
instance double. Update how messages are injected:

```diff
-let(:params) { kafka_message(topic: topic, **payload) }
-before { allow(instance).to receive(:params).and_return(params) }
+let(:message) { kafka_message(topic: topic, **payload) }
+before { allow(instance).to receive(:messages).and_return([message]) }
```

**Full consumer spec example (after migration):**

```ruby
RSpec.describe YourConsumer do
  let(:topic) { Rimless.topic(app: :your_app, name: :your_topic) }
  let(:instance) { kafka_consumer_for(topic) }
  let(:action) { instance.consume }
  let(:message) { kafka_message(topic: topic, **payload) }

  before { allow(instance).to receive(:messages).and_return([message]) }

  context 'with user_created message' do
    let(:payload) do
      { event: :user_created, user: { name: 'John' } }
    end

    it 'processes the event' do
      # your expectations, +YourConsumer#user_created+ will be called
      action
    end
  end
end
```

### Producer Specs / Custom WaterDrop Stubs

If you had custom WaterDrop stubs in your test setup, update them:

```diff
-allow(WaterDrop::SyncProducer).to receive(:call)
-allow(WaterDrop::AsyncProducer).to receive(:call)
+allow(Rimless.producer).to receive(:produce_sync)
+allow(Rimless.producer).to receive(:produce_async)
```

The built-in `have_sent_kafka_message` RSpec matcher continues to work as
before — no changes needed for producer message expectations.

## Karafka Configuration (Advanced)

If you had custom Karafka configuration beyond what Rimless provides, note the
following Karafka 2 changes:

**Removed settings** (no longer applicable):

Setting | Reason
---|---
`config.backend` | Karafka 2 is multi-threaded; there is no Sidekiq backend
`config.batch_fetching` | Always enabled in Karafka 2
`config.batch_consuming` | Removed; tune `max_wait_time` / `max_messages` instead
`config.serializer` | Responders removed from Karafka
`config.topic_mapper` | Topic mapping concept removed
`config.consumer_mapper` | Consumer mapping concept removed

**Renamed settings:**

2.x | 3.0 (librdkafka)
---|---
`config.kafka.seed_brokers` | `config.kafka[:'bootstrap.servers']`
`config.start_from_beginning` | `config.kafka[:'auto.offset.reset']` (`'earliest'` or `'latest'`)
`config.kafka.heartbeat_interval` | No longer needed (handled by librdkafka)
`config.kafka.required_acks` | `config.kafka[:'request.required.acks']`

**Consumer group behavior change:** Karafka 2 builds a single consumer group
for all topics (instead of one per topic in 1.x). This is now the default and
matches Rimless conventions.

**Latency tuning:** Batch fetching is always on (just as it was configured by
Rimless in the past). To optimize for low latency or high throughput,
configure:

```ruby
Rimless.consumer.configure do |config|
  # Lower values = lower latency, higher values = higher throughput
  config.max_wait_time = 500 # ms to wait for messages (default: 1_000)
  config.max_messages = 50   # max messages per batch (default: 100)
  # Enable end-of-partition notifications if needed
  config.kafka[:'enable.partition.eof'] = true
end
```

See the [Karafka latency and throughput guide](https://karafka.io/docs/Latency-and-Throughput/),
the [librdkafka configuration
reference](https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md),
and the [Karafka top-level
configuration](https://github.com/karafka/karafka/blob/v2.5.5/lib/karafka/setup/config.rb)
for all available options.

---

The raw research sources for this guide are located at
[`doc/upgrade-guide-sources/`](./doc/upgrade-guide-sources).