rimless 2.8.0 → 3.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/Appraisals +2 -2
- data/CHANGELOG.md +71 -0
- data/Gemfile +0 -1
- data/README.md +64 -62
- data/Rakefile +13 -4
- data/UPGRADING.md +491 -0
- data/doc/upgrade-guide-sources/README.md +221 -0
- data/doc/upgrade-guide-sources/dep-avro_turf-1.20.md +23 -0
- data/doc/upgrade-guide-sources/dep-karafka-2.0.md +117 -0
- data/doc/upgrade-guide-sources/dep-waterdrop-2.8.md +30 -0
- data/gemfiles/rails_8.0.gemfile +1 -1
- data/gemfiles/rails_8.1.gemfile +1 -1
- data/lib/rimless/compatibility/.gitkeep +0 -0
- data/lib/rimless/configuration.rb +80 -6
- data/lib/rimless/consumer/app.rb +182 -0
- data/lib/rimless/{karafka → consumer}/avro_deserializer.rb +8 -6
- data/lib/rimless/consumer/base.rb +118 -0
- data/lib/rimless/consumer/job.rb +35 -0
- data/lib/rimless/consumer/job_bridge.rb +113 -0
- data/lib/rimless/extensions/avro_helpers.rb +83 -0
- data/lib/rimless/extensions/configuration_handling.rb +77 -0
- data/lib/rimless/extensions/consumer.rb +20 -0
- data/lib/rimless/extensions/dependencies.rb +84 -0
- data/lib/rimless/extensions/kafka_helpers.rb +46 -0
- data/lib/rimless/extensions/producer.rb +103 -0
- data/lib/rimless/initializers/compatibility.rb +3 -4
- data/lib/rimless/railtie.rb +7 -7
- data/lib/rimless/rspec/helpers.rb +53 -13
- data/lib/rimless/rspec/matchers.rb +14 -11
- data/lib/rimless/rspec.rb +13 -29
- data/lib/rimless/tasks/consumer.rake +18 -6
- data/lib/rimless/tasks/templates/application_consumer.rb +1 -1
- data/lib/rimless/tasks/templates/custom_consumer.rb +1 -1
- data/lib/rimless/tasks/templates/custom_consumer_spec.rb +5 -4
- data/lib/rimless/tasks/templates/karafka.rb +5 -4
- data/lib/rimless/version.rb +3 -1
- data/lib/rimless.rb +12 -14
- data/rimless.gemspec +8 -10
- metadata +43 -72
- data/lib/rimless/avro_helpers.rb +0 -81
- data/lib/rimless/base_consumer.rb +0 -30
- data/lib/rimless/compatibility/karafka_1_4.rb +0 -52
- data/lib/rimless/configuration_handling.rb +0 -82
- data/lib/rimless/consumer.rb +0 -209
- data/lib/rimless/consumer_job.rb +0 -10
- data/lib/rimless/dependencies.rb +0 -69
- data/lib/rimless/kafka_helpers.rb +0 -104
- data/lib/rimless/karafka/base64_interchanger.rb +0 -32
- data/lib/rimless/karafka/passthrough_mapper.rb +0 -29
- data/lib/rimless/tasks/stats.rake +0 -22
data/doc/upgrade-guide-sources/README.md
ADDED

@@ -0,0 +1,221 @@
# Top Level Overview

- [Dependency Changes](#dependency-changes)
- [Structural Changes](#structural-changes)
  - [Removed](#removed)
  - [Moved, API unchanged, constants changed](#moved-api-unchanged-constants-changed)
  - [Moved, API mostly unchanged, constants changed](#moved-api-mostly-unchanged-constants-changed)
  - [Moved, API unchanged](#moved-api-unchanged)
  - [Logic changed](#logic-changed)
- [Configuration Changes](#configuration-changes)
  - [Changed](#changed)
  - [New](#new)
- [Producer (WaterDrop)](#producer-waterdrop)
- [Consumer Setup (Karafka)](#consumer-setup-karafka)
  - [Karafka 2 (New)](#karafka-2-new)
    - [Phase 1: Boot](#phase-1-boot)
    - [Phase 2: Message Consumption (batch fetching, single consumption)](#phase-2-message-consumption-batch-fetching-single-consumption)
  - [Karafka 1 (Old)](#karafka-1-old)
    - [Phase 1: Boot](#phase-1-boot-1)
    - [Phase 2: Message Consumption (batch fetching, single consumption)](#phase-2-message-consumption-batch-fetching-single-consumption-1)
- [Testing](#testing)

## Dependency Changes

* Migrated the `avro_turf` gem from `~> 0.11.0` to `~> 1.20`
* Migrated the `waterdrop` gem from `~> 1.4` to `~> 2.8`
* Migrated the `karafka` gem from `~> 1.4` to `~> 2.5`
* Switched from Sidekiq to ActiveJob (and added the dependency)

## Structural Changes

### Removed

* lib/rimless/karafka/base64_interchanger.rb
  `Rimless::Karafka::Base64Interchanger` (no interchanging is needed anymore,
  as we no longer move the Avro/binary encoded Kafka message payload from
  Karafka to Sidekiq — instead we decode the message payload within the
  Karafka process, move the decoded data to ActiveJob, and use the regular
  ActiveJob argument serialization)
* lib/rimless/karafka/passthrough_mapper.rb
  `Rimless::Karafka::PassthroughMapper` (Karafka removed the consumer/topic
  mapper concepts completely — for Rimless this was effectively a no-op
  anyway, as its passthrough mapper just kept inputs equal to outputs for
  these names)
* lib/rimless/compatibility/karafka_1_4.rb (dropped Karafka 1.x support)

### Moved, API unchanged, constants changed

* lib/rimless/karafka/avro_deserializer.rb `Rimless::Karafka::AvroDeserializer`
  -> lib/rimless/consumer/avro_deserializer.rb
  `Rimless::Consumer::AvroDeserializer`

### Moved, API mostly unchanged, constants changed

* lib/rimless/consumer.rb `Rimless::ConsumerApp` -> lib/rimless/consumer/app.rb
  `Rimless::Consumer::App` (functionality is kept, but some methods were
  removed)
* lib/rimless/consumer_job.rb `Rimless::ConsumerJob` ->
  lib/rimless/consumer/job.rb `Rimless::Consumer::Job` (Sidekiq -> ActiveJob)
* lib/rimless/base_consumer.rb `Rimless::BaseConsumer` ->
  lib/rimless/consumer/base.rb `Rimless::Consumer::Base` (all
  functionality/methods kept, API extended, `#consume` now returns the
  `#messages` array instead of the result of the dispatched event method)

### Moved, API unchanged

* lib/rimless/avro_helpers.rb -> lib/rimless/extensions/avro_helpers.rb
* lib/rimless/configuration_handling.rb ->
  lib/rimless/extensions/configuration_handling.rb
* lib/rimless/kafka_helpers.rb -> lib/rimless/extensions/kafka_helpers.rb
* lib/rimless/dependencies.rb -> lib/rimless/extensions/dependencies.rb

### Logic changed

* lib/rimless/railtie.rb (Karafka is no longer initialized within a Sidekiq
  server context; this was needed in the past for the encoded/binary Kafka
  message payload interchanging, as the data was actually parsed within the
  Sidekiq process)

## Configuration Changes

### Changed

* `KAFKA_BROKERS` (env var) / `config.kafka_brokers` (format change — no
  protocol anymore, just a host:port CSV; old format:
  `kafka://your.domain:9092,kafka..`, new format: `your.domain:9092,host..`) —
  the old format is still supported for compatibility
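The broker list normalization described above can be sketched in plain Ruby, mirroring the `uri.split('://', 2)` logic found in `lib/rimless/configuration.rb` later in this diff:

```ruby
# Normalize a KAFKA_BROKERS value: strip an optional protocol prefix
# (the old `kafka://` format) and keep a plain host:port CSV
def normalize_brokers(value)
  value.split(',').map { |uri| uri.split('://', 2).last }.join(',')
end

normalize_brokers('kafka://your.domain:9092,kafka://other.host:9092')
# => "your.domain:9092,other.host:9092"
normalize_brokers('your.domain:9092') # the new format passes through
# => "your.domain:9092"
```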
### New

* `config.consumer_logger_listener` (allows configuring the Karafka logging,
  or providing a custom solution)
* `config.job_bridge_class` (allows configuring a custom job bridge class
  that takes care of receiving Kafka messages and producing/enqueuing
  ActiveJob jobs)
* `config.consumer_job_class` (allows configuring a custom job class that
  processes the enqueued Kafka messages produced by the job bridge)
* `config.avro_deserializer_class` (allows configuring a custom Apache Avro
  deserializer class that may implement additional parsing/processing, for
  example)
* `config.avro_configure` (allows users to fully customize the
  `AvroTurf::Messaging` instance)
* `config.producer_configure` (allows users to fully customize the
  `WaterDrop::Producer` instance)
* `config.consumer_configure` (allows users to fully customize the
  `Karafka::App` instance)
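Assuming the usual `Rimless.configure` block, the new options can be set together in an initializer. The accessor names come from the list above, while the concrete values and the `MyKafkaJob` class are purely illustrative:

```ruby
# config/initializers/rimless.rb (illustrative values)
Rimless.configure do |config|
  # Disable the default Karafka logger listener entirely
  config.consumer_logger_listener = nil

  # Process bridged Kafka messages with a custom ActiveJob class
  config.consumer_job_class = MyKafkaJob

  # Tweak the AvroTurf::Messaging parameters hash before instantiation
  config.avro_configure = ->(params) { params.merge(connect_timeout: 5) }

  # Tweak the WaterDrop producer (librdkafka settings live under +kafka+)
  config.producer_configure = lambda do |producer_config|
    producer_config.kafka[:'request.required.acks'] = -1
  end
end
```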
## Producer (WaterDrop)

* No breaking changes, as we wrap it with our Kafka helpers (e.g.
  `Rimless.message` and the like — their API stayed stable)

## Consumer Setup (Karafka)

<table>
<tr>
<td valign="top">

### Karafka 2 (New)

#### Phase 1: Boot

* `bundle exec karafka server`
* Karafka: load some Karafka (server) defaults
* Karafka: require `rails` — if available
* Karafka: require `/app/karafka.rb`
* `/app/karafka.rb`: require `rimless`
* `/app/karafka.rb`: `Rimless.consumer.topics(..)`
* Rimless: `Rimless.consumer -> Rimless::Consumer::App.new` — this configures
  Karafka (including logging and code reload)
* Karafka: the Karafka server takes over
  => (set up consumer groups, start listening for messages)

<br><br><br><br><br><br>

#### Phase 2: Message Consumption (batch fetching, single consumption)

* Karafka: receives message(s) on a topic (synced by consumer group)
  => just one Karafka server process handles a single message, per partition
  (no double processing)
* Karafka: routes the message(s) of the topic to the Rimless "job bridge"
  consumer (`Rimless::Consumer::JobBridge`), then all messages of the batch
  are processed (lazily deserialized) and enqueued as an ActiveJob
  (`Rimless::Consumer::Job`) — while the decoded message payload is passed as
  job parameters and serialized/deserialized by ActiveJob (the job execution
  may then be concurrent via Sidekiq or another ActiveJob adapter)
  => the Kafka message now leaves the Karafka server process

* ActiveJob: `Rimless::Consumer::Job` is picked up and executed
* Rimless: a `Rimless::Consumer::Base` child class is looked up by the
  `consumer` parameter (a class inside `/app/consumers`) and instantiated for
  the job context (hydrating consumer metadata, the message batch containing
  the single message, etc.)
* Rimless: `Rimless::Consumer::Base` unpacks the message `event` (e.g.
  `user_updated`) and dispatches it as a method on the child consumer with
  the remaining event parameters as arguments
* App: the `/app/consumers` class kicks in and runs the business application
  logic
  => e.g. `IdentityApiConsumer.user_updated(user:, **_)`

</td>
<td valign="top">

### Karafka 1 (Old)

#### Phase 1: Boot

* `bundle exec karafka server`
* Karafka: load some Karafka (server) defaults
* Karafka: require `/app/karafka.rb`
* `/app/karafka.rb`: require `rimless`
* Rimless: require `railtie` — set up the Sidekiq server part
* `/app/karafka.rb`: `Rimless.consumer.topics(..).boot!`
* Rimless: `Rimless.consumer -> ConsumerApp.initialize!`
  * `initialize_rails!`
  * `initialize_monitors!`
  * `initialize_karafka!`
  * `initialize_logger!`
  * `initialize_code_reload!`
* Karafka: the Karafka server takes over
  => (set up consumer groups, start listening for messages)

#### Phase 2: Message Consumption (batch fetching, single consumption)

* Karafka: receives a message on a topic (synced by consumer group)
  => just one Karafka server process handles a single message, per partition
  (no double processing)
* Karafka: run `Rimless::Karafka::PassthroughMapper` for routing (a no-op)
* Karafka: deserialize the message payload with
  `Rimless::Karafka::AvroDeserializer`
* Karafka: the decoded message is passed into `Karafka::Backends::Sidekiq`
  => (karafka-sidekiq-backend gem)
* karafka-sidekiq-backend: wrap the message payload with
  `Rimless::Karafka::Base64Interchanger`
  => (Ruby object marshalling + Base64 encoding for Valkey/Redis transport,
  to overcome binary encoding issues)
  => quite a high size overhead on Valkey/Redis
* karafka-sidekiq-backend: enqueue `Rimless::ConsumerJob` with the wrapped
  message payload
  => the Kafka message now leaves the Karafka server process

* Sidekiq: `Rimless::ConsumerJob` is picked up and executed
* karafka-sidekiq-backend: a `Rimless::BaseConsumer` class is looked up for
  the message (a child class inside `/app/consumers`)
* Rimless: `Rimless::BaseConsumer` unpacks the message `event` (e.g.
  `user_updated`) and dispatches it as a method on the child consumer with
  the remaining event parameters as arguments
* App: the `/app/consumers` class kicks in and runs the business application
  logic
  => e.g. `IdentityApiConsumer.user_updated(user:, **_)`

</td>
</tr>
</table>
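Both flows end in the same kind of application-side consumer class. A minimal sketch, assuming the `ApplicationConsumer` base class generated by the Rimless templates; the `IdentityApiConsumer` name and payload shape are purely illustrative:

```ruby
# /app/consumers/identity_api_consumer.rb (illustrative)
class IdentityApiConsumer < ApplicationConsumer
  # Dispatched by Rimless when a message with `event: "user_updated"`
  # arrives; the remaining event parameters come in as keyword arguments
  def user_updated(user:, **_rest)
    Rails.logger.info("User #{user[:id]} was updated upstream")
  end
end
```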
## Testing

See: https://github.com/karafka/karafka-testing/blob/master/2.0-Upgrade.md

* Replace `#karafka_consumer_for` in your specs with `#kafka_consumer_for`
  (provided and augmented by Rimless to skip the job enqueuing and instead
  perform the wrapped consumer job directly)
* Replace `#publish_for_karafka` in your specs with `karafka.produce` (in
  case you did not use the Rimless message producing helpers)
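In spec terms, the two replacements above look roughly like this; the topic name and `payload` are illustrative, and the exact argument shapes may differ in your setup:

```ruby
# Before (rimless 2.x / Karafka 1):
let(:consumer) { karafka_consumer_for(:users) }
publish_for_karafka(payload, topic: :users)

# After (rimless 3.x / Karafka 2):
let(:consumer) { kafka_consumer_for(:users) }
karafka.produce(payload)
```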
data/doc/upgrade-guide-sources/dep-avro_turf-1.20.md
ADDED

@@ -0,0 +1,23 @@
# AvroTurf RubyGem (relevant changes)

* See: https://github.com/dasch/avro_turf/blob/master/CHANGELOG.md
* Migrated the `avro_turf` gem from `~> 0.11.0` to `~> 1.20`

---

- [Important](#important)
- [Minor](#minor)

## Important

* The `excon` dependency was upgraded to `>= 0.104, < 2`
* Removed `sinatra` as a development dependency (our Rimless gem dropped the
  `sinatra` gem dependency, too)
* Stopped caching nested sub-schemas

## Minor

* Added compatibility with Avro v1.12.x
* Added a `resolv_resolver` parameter to `AvroTurf::Messaging` to make use of
  custom domain name resolvers and their options, for example `nameserver`
  and `timeouts`
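A hedged sketch of the new parameter: the registry URL mirrors the Rimless default elsewhere in this diff, the resolver settings are standard `Resolv::DNS` options, and the nameserver address is illustrative:

```ruby
require 'resolv'
require 'avro_turf/messaging'

# Use a custom DNS resolver (with its own nameserver and timeouts)
# for the schema registry lookups
resolver = Resolv::DNS.new(nameserver: ['10.0.0.53'])
resolver.timeouts = 2

avro = AvroTurf::Messaging.new(
  registry_url: 'http://schema-registry.message-bus.local',
  resolv_resolver: resolver
)
```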
data/doc/upgrade-guide-sources/dep-karafka-2.0.md
ADDED

@@ -0,0 +1,117 @@
# Karafka RubyGem (relevant changes)

* See: https://github.com/karafka/karafka/blob/master/CHANGELOG.md
* See: https://github.com/karafka/karafka/wiki/Upgrades-Karafka-2.0
* Migrated the `karafka` gem from `~> 1.4` to `~> 2.5`

---

- [Important (Structural)](#important-structural)
- [Important (Logical)](#important-logical)
- [Important (Configurations)](#important-configurations)
- [Important (End-user code changes)](#important-end-user-code-changes)
- [Minor](#minor)

## Important (Structural)

* Removed the topic mapper concept completely
* Removed support for using the Sidekiq backend due to the introduction of
  multi-threading
  * Removed the now incompatible `karafka-sidekiq-backend` gem
  * If you used the Sidekiq backend, you have two options:
    * Leverage Karafka's multi-threading capabilities
    * Pipe the jobs to Sidekiq yourself (this is what we do with the Rimless
      gem now)
* Removed the Responders concept in favor of WaterDrop producer usage
* Removed all callbacks completely in favor of the finalizer method `#shutdown`
* Removed the single message consumption mode in favor of documentation on
  how to do it easily yourself (see:
  https://github.com/karafka/karafka/wiki/Consuming-messages#consuming-messages-one-at-a-time)
  * In the past, Rimless configured `config.batch_fetching = true` and
    `config.batch_consuming = false`, resulting in single message processing
    within the Karafka process, but each Kafka message was enqueued as a
    Sidekiq worker/job — so message consumption was always concurrent. Batch
    fetching is now always done by Karafka; adjust `config.max_wait_time` or
    `config.max_messages` to optimize for latency or throughput (also check
    `config.kafka[:'enable.partition.eof'] = true`, see:
    https://karafka.io/docs/Latency-and-Throughput/).
* Renamed `Karafka::Params::BatchMetadata` to
  `Karafka::Messages::BatchMetadata`
* Renamed `Karafka::Params::Params` to `Karafka::Messages::Message`
* Renamed `#params_batch` in consumers to `#messages` (Rimless adds a
  compatibility delegation for the old `#params_batch`)
* Renamed `Karafka::Params::Metadata` to `Karafka::Messages::Metadata`
* Renamed `Karafka::Fetcher` to `Karafka::Runner` and aligned the
  notification key names
* Renamed `Karafka::Instrumentation::StdoutListener` to
  `Karafka::Instrumentation::LoggerListener`
* Renamed `Karafka::Serializers::JSON::Deserializer` to
  `Karafka::Deserializers::Payload`

## Important (Logical)

* Changed how the (0.5) routing style behaves. It now builds a single
  consumer group instead of one per topic (consumer groups: 2.0 uses one for
  all topics, 1.4 used one per topic)
* Karafka 2.0 introduces seamless Ruby on Rails integration via
  `Rails::Railtie` without needing extra configuration (this is reflected in
  the Rimless gem, as we no longer initialize the Rails application ourselves)

## Important (Configurations)

* Karafka 2.0 is powered by librdkafka. Rimless allows configuration via
  `Rimless.configuration.consumer_configure`, and the configuration is split
  into Karafka settings (root level, see:
  https://github.com/karafka/karafka/blob/v2.5.5/lib/karafka/setup/config.rb)
  and Kafka settings (`config.kafka`, see:
  https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md)
* Below you can find some of the most significant naming changes in the
  configuration options:
  * Root options:
    * `start_from_beginning` is now `initial_offset` and accepts either
      `'earliest'` or `'latest'`
    * `ssl_ca_certs_from_system` is no longer needed, but the `kafka`
      option `security.protocol` needs to be set to `ssl`
    * `batch_fetching` is no longer needed
    * `batch_consuming` is no longer needed
    * `serializer` is no longer needed because Responders have been removed
      from Karafka
    * `topic_mapper` is no longer needed, as the concept of mapping topic
      names has been removed from Karafka
    * `backend` is no longer needed because Karafka is now multi-threaded
    * `manual_offset_management` now needs to be set on a per-topic basis
  * Kafka options:
    * `kafka.seed_brokers` is now `bootstrap.servers`, without the protocol
      definition
    * `kafka.heartbeat_interval` is no longer needed
    * SASL and SSL option changes are described in their own section
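Applied to Rimless, the renames above translate into a `consumer_configure` block along these lines (all values illustrative):

```ruby
# config/initializers/rimless.rb (illustrative values)
Rimless.configuration.consumer_configure = lambda do |config|
  # `start_from_beginning` became `initial_offset`
  config.initial_offset = 'earliest'

  # `ssl_ca_certs_from_system` is gone; set the protocol instead
  config.kafka[:'security.protocol'] = 'ssl'

  # `kafka.seed_brokers` became `bootstrap.servers` (no protocol prefix)
  config.kafka[:'bootstrap.servers'] = 'your.domain:9092'
end
```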
## Important (End-user code changes)

* Remove the WaterDrop setup code from your `karafka.rb`:

  ```ruby
  # This can be safely removed
  monitor.subscribe('app.initialized') do
    WaterDrop.setup { |config| config.deliver = !Karafka.env.test? }
  end
  ```
* Remove direct WaterDrop listener references from your `karafka.rb`:

  ```ruby
  # This can be safely removed
  Karafka.monitor.subscribe(WaterDrop::Instrumentation::LoggerListener.new)
  ```
* Remove the `KarafkaApp.boot!` call from the end of `karafka.rb`:

  ```ruby
  # Remove this
  KarafkaApp.boot!
  # or in the case of Rimless:
  Rimless.consumer.topics(...).boot! # just the `.boot!` call
  ```
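After those removals, a Rimless-based `karafka.rb` shrinks to essentially the require plus the topic registration (topic names illustrative):

```ruby
# /app/karafka.rb — after the upgrade
require 'rimless'

# Configures Karafka (logging, code reload) and registers the topics;
# note: no WaterDrop setup, no listener subscription and no `.boot!`
Rimless.consumer.topics(:users, :payments)
```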
## Minor

* No `dry-*` gems are used as dependencies anymore
* Added `KARAFKA_REQUIRE_RAILS` to disable the default Rails require, in
  order to run Karafka without Rails despite having Rails in the Gemfile
* Allow running Karafka CLI commands on a boot-file-less Rails setup, where
  the configuration is done in initializers
data/doc/upgrade-guide-sources/dep-waterdrop-2.8.md
ADDED

@@ -0,0 +1,30 @@
# WaterDrop RubyGem (relevant changes)

* See: https://github.com/karafka/waterdrop/blob/master/CHANGELOG.md
* Migrated the `waterdrop` gem from `~> 1.4` to `~> 2.8`

---

- [Important](#important)
- [Minor](#minor)

## Important

* Replaced `ruby-kafka` with `rdkafka` (karafka-rdkafka, a native gem/lib)
* The new underlying Kafka library has different/renamed options (see:
  https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md)
* No `dry-*` gems are used as dependencies anymore
* Complete redesign of the API (this is wrapped by the Rimless helpers, which
  remained stable)
* All time-related values are now configured in milliseconds, instead of some
  being in seconds and some in milliseconds

## Minor

* Added support for sending tombstone messages
* Changed the auto-generated message ID from `SecureRandom#uuid` to
  `SecureRandom#hex(6)`
* Introduced transactions support
* Added support for producing messages with arrays of strings in headers
  (KIP-82)
* Added `WaterDrop::ConnectionPool` for efficient connection pooling using
  the proven `connection_pool` gem
data/gemfiles/rails_8.0.gemfile
CHANGED

@@ -7,7 +7,6 @@ gem "bundler", ">= 2.6", "< 5"
 gem "countless", "~> 2.2"
 gem "factory_bot", "~> 6.2"
 gem "guard-rspec", "~> 4.7"
-gem "railties", "~> 8.0.0"
 gem "rake", "~> 13.0"
 gem "redcarpet", "~> 3.5"
 gem "rspec", "~> 3.12"
@@ -19,6 +18,7 @@ gem "timecop", ">= 0.9.6"
 gem "vcr", "~> 6.0"
 gem "yard", ">= 0.9.28"
 gem "yard-activesupport-concern", ">= 0.0.1"
+gem "activejob", "~> 8.0.0"
 gem "activesupport", "~> 8.0.0"
 
 gemspec path: "../"
data/gemfiles/rails_8.1.gemfile
CHANGED

@@ -7,7 +7,6 @@ gem "bundler", ">= 2.6", "< 5"
 gem "countless", "~> 2.2"
 gem "factory_bot", "~> 6.2"
 gem "guard-rspec", "~> 4.7"
-gem "railties", "~> 8.1.0"
 gem "rake", "~> 13.0"
 gem "redcarpet", "~> 3.5"
 gem "rspec", "~> 3.12"
@@ -19,6 +18,7 @@ gem "timecop", ">= 0.9.6"
 gem "vcr", "~> 6.0"
 gem "yard", ">= 0.9.28"
 gem "yard-activesupport-concern", ">= 0.0.1"
+gem "activejob", "~> 8.1.0"
 gem "activesupport", "~> 8.1.0"
 
 gemspec path: "../"
data/lib/rimless/compatibility/.gitkeep
File without changes
data/lib/rimless/configuration.rb
CHANGED

@@ -71,7 +71,18 @@ module Rimless
 
     # At least one broker of the Apache Kafka cluster
     config_accessor(:kafka_brokers) do
-      ENV.fetch('KAFKA_BROKERS', 'kafka://message-bus.local:9092')
+      ENV.fetch('KAFKA_BROKERS', 'kafka://message-bus.local:9092')
+         .split(',').map { |uri| uri.split('://', 2).last }.join(',')
+    end
+
+    # A custom writer for the kafka brokers configuration.
+    #
+    # @param val [String, Array<String>] the new kafka brokers list
+    def kafka_brokers=(val)
+      self[:kafka_brokers] =
+        Array(val).join(',').split(',')
+                  .map { |uri| uri.split('://', 2).last }
+                  .join(',')
     end
 
     # The source Apache Avro schema files location (templates)
@@ -96,9 +107,39 @@ module Rimless
       'http://schema-registry.message-bus.local')
     end
 
-    #
+    # This configuration allows users to configure a customized logger listener
+    # (which is bound to +Rimless.logger+). When configured to a falsy value
+    # (eg. +false+, or +nil+), no listener is installed by Rimless to Karafka.
+    config_accessor(:consumer_logger_listener) do
+      Karafka::Instrumentation::LoggerListener.new(
+        log_polling: false
+      )
+    end
+
+    # This setting allows users to configure a custom job bridge class, which
+    # takes care of receiving Kafka messages and producing/enqueuing ActiveJob
+    # jobs. The configured class must be +Karafka::BaseConsumer+ compatible,
+    # for Karafka.
+    config_accessor(:job_bridge_class) { Rimless::Consumer::JobBridge }
+
+    # This configuration allows choosing a different consumer job class,
+    # enqueued by the +job_bridge_class+. This allows fully customized
+    # handling in user applications.
+    config_accessor(:consumer_job_class) { Rimless::Consumer::Job }
+
+    # This configuration allows choosing the default Apache Avro deserializer
+    # class, which is used by the Karafka consumer while using the
+    # +Rimless.consumer.topics+ helper.
+    config_accessor(:avro_deserializer_class) do
+      Rimless::Consumer::AvroDeserializer
+    end
+
+    # The ActiveJob job queue to use for consuming jobs
     config_accessor(:consumer_job_queue) do
-      ENV.fetch(
+      ENV.fetch(
+        'KAFKA_JOB_QUEUE',
+        ENV.fetch('KAFKA_SIDEKIQ_JOB_QUEUE', 'default')
+      ).to_sym
     end
 
     # A custom writer for the consumer job queue name.
@@ -106,10 +147,43 @@ module Rimless
     # @param val [String, Symbol] the new job queue name
     def consumer_job_queue=(val)
       self[:consumer_job_queue] = val.to_sym
+
       # Refresh the consumer job queue
-
-        queue: Rimless.configuration.consumer_job_queue
-      )
+      consumer_job_class.queue_as(val)
     end
+
+    # This configuration block allows users to fully customize the
+    # +AvroTurf::Messaging+ instance. The Rimless default parameters hash is
+    # injected as argument to the configured block. The result of the block is
+    # then used to instantiate +AvroTurf::Messaging+, use it like this:
+    #
+    #   ->(config) { config.merge(connect_timeout: 5) }
+    #
+    # See: https://bit.ly/4r0mDnw
+    config_accessor(:avro_configure) { ->(config) { config } }
+
+    # This configuration block allows users to fully customize the
+    # +WaterDrop::Producer+ instance (accessible as +Rimless.producer+). The
+    # Rimless settings are already applied when the block is called. The
+    # +WaterDrop::Config+ is then injected as argument to the given block, and
+    # can be used regularly, like this:
+    #
+    #   ->(config) { config.kafka[:'request.required.acks'] = -1 }
+    #
+    # See: https://bit.ly/4r5Uprv (+config.*+ root level WaterDrop settings)
+    # See: https://bit.ly/3OtIfeu (+config.kafka+ settings)
+    config_accessor(:producer_configure) { ->(config) { config } }
+
+    # This configuration block allows users to fully customize the
+    # +Karafka::App+ instance (accessible as +Rimless.consumer+). The Rimless
+    # settings are already applied when the block is called. The
+    # +Karafka::Setup::Config+ is then injected as argument to the given block,
+    # and can be used regularly, like this:
+    #
+    #   ->(config) { config.kafka[:'request.required.acks'] = -1 }
+    #
+    # See: https://bit.ly/3MAF6Jk (+config.*+ root level Karafka settings)
+    # See: https://bit.ly/3OtIfeu (+config.kafka+ settings)
+    config_accessor(:consumer_configure) { ->(config) { config } }
   end
 end
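The new `kafka_brokers=` writer in the diff above accepts both strings and arrays and normalizes either broker format; its behavior can be reproduced standalone:

```ruby
# Standalone sketch of the kafka_brokers= normalization shown above
def coerce_brokers(val)
  Array(val).join(',').split(',')
            .map { |uri| uri.split('://', 2).last }
            .join(',')
end

coerce_brokers(['kafka://a.local:9092', 'b.local:9092'])
# => "a.local:9092,b.local:9092"
coerce_brokers('kafka://a.local:9092,kafka://b.local:9092')
# => "a.local:9092,b.local:9092"
```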