waterdrop 2.4.2 → 2.4.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: ba9d9c3c27b55980d16b4bca8266ec70864e53fc00e8f9823bc6407704105638
- data.tar.gz: '017907f6d33941b52bb8943befe6981586083de52739cf12937ceae3dececd79'
+ metadata.gz: 065c53c0e711cfe49e718a321ce64740b68a45bd6ac477c9eaf2a79876de8df7
+ data.tar.gz: e77179ca5162c4862369838197a62063b5127b4d3a72b7ab08f85f930cee6ea6
  SHA512:
- metadata.gz: e3d60772d7a278eed00a63ee08690a3897c45d55a74f2dd796353f528f8b6ae0ec93a8f79c8b6c717ad3435f9f3661b811ba278cd35355a2079b0194a6dc56c6
- data.tar.gz: 721694ab354d11419dc21f2911cbea57f4d89160067ec6b69687741471f498535779e4ee43ece02ac7697a070dfc3d5c6dc2d9119b811fd4b84ecb9703ad89a2
+ metadata.gz: 76af1a5f87a5f31bda8b9f5f111173e1992f6d5ddc85fa2b43d5e1fbd45b93e67601691cbf13b3b2a2e8eb49bc04196b84c5eb81c5b0bd6e11bbe014220fe821
+ data.tar.gz: 27c667d0c93f6948a85d9411fd1dfa46bd38ebb36f7c6907785658bb58ae3001b8d2b81bcb1a9faac64acc5248bc1300a3530736cdd7193b155e01eca4eefbbe
checksums.yaml.gz.sig CHANGED
Binary file
data/.github/FUNDING.yml ADDED
@@ -0,0 +1 @@
+ custom: ['https://karafka.io/#become-pro']
data/.github/workflows/ci.yml CHANGED
@@ -23,13 +23,15 @@ jobs:
  - ruby: '3.1'
  coverage: 'true'
  steps:
- - uses: actions/checkout@v2
+ - uses: actions/checkout@v3
  - name: Install package dependencies
  run: "[ -e $APT_DEPS ] || sudo apt-get install -y --no-install-recommends $APT_DEPS"
  - name: Set up Ruby
  uses: ruby/setup-ruby@v1
  with:
  ruby-version: ${{matrix.ruby}}
+ - name: Remove libzstd-dev to check no supported compressions
+ run: sudo apt-get -y remove libzstd-dev
  - name: Run Kafka with docker-compose
  # We need to give Kafka enough time to start and create all the needed topics, etc
  # If anyone has a better idea on how to do it smart and easily, please contact me
@@ -54,7 +56,7 @@ jobs:
  strategy:
  fail-fast: false
  steps:
- - uses: actions/checkout@v2
+ - uses: actions/checkout@v3
  with:
  fetch-depth: 0
  - name: Set up Ruby
@@ -73,7 +75,7 @@ jobs:
  strategy:
  fail-fast: false
  steps:
- - uses: actions/checkout@v2
+ - uses: actions/checkout@v3
  with:
  fetch-depth: 0
  - name: Run Coditsu
data/.ruby-version CHANGED
@@ -1 +1 @@
- 3.1.2
+ 3.1.3
data/CHANGELOG.md CHANGED
@@ -1,5 +1,11 @@
  # WaterDrop changelog
 
+ ## 2.4.3 (2022-12-07)
+ - Support for librdkafka 0.13
+ - Update Github Actions
+ - Change auto-generated id from `SecureRandom#uuid` to `SecureRandom#hex(6)`
+ - Remove shared components that were moved to `karafka-core` from WaterDrop
+
  ## 2.4.2 (2022-09-29)
  - Allow sending tombstone messages (#267)
 
data/Gemfile.lock CHANGED
@@ -1,9 +1,8 @@
  PATH
  remote: .
  specs:
- waterdrop (2.4.2)
- karafka-core (>= 2.0.2, < 3.0.0)
- rdkafka (>= 0.10)
+ waterdrop (2.4.3)
+ karafka-core (>= 2.0.6, < 3.0.0)
  zeitwerk (~> 2.3)
 
  GEM
@@ -23,8 +22,9 @@ GEM
  ffi (1.15.5)
  i18n (1.12.0)
  concurrent-ruby (~> 1.0)
- karafka-core (2.0.2)
+ karafka-core (2.0.6)
  concurrent-ruby (>= 1.1)
+ rdkafka (>= 0.12)
  mini_portile2 (2.8.0)
  minitest (5.16.3)
  rake (13.0.6)
@@ -32,19 +32,19 @@ GEM
  ffi (~> 1.15)
  mini_portile2 (~> 2.6)
  rake (> 12)
- rspec (3.11.0)
- rspec-core (~> 3.11.0)
- rspec-expectations (~> 3.11.0)
- rspec-mocks (~> 3.11.0)
- rspec-core (3.11.0)
- rspec-support (~> 3.11.0)
- rspec-expectations (3.11.1)
+ rspec (3.12.0)
+ rspec-core (~> 3.12.0)
+ rspec-expectations (~> 3.12.0)
+ rspec-mocks (~> 3.12.0)
+ rspec-core (3.12.0)
+ rspec-support (~> 3.12.0)
+ rspec-expectations (3.12.0)
  diff-lcs (>= 1.2.0, < 2.0)
- rspec-support (~> 3.11.0)
- rspec-mocks (3.11.1)
+ rspec-support (~> 3.12.0)
+ rspec-mocks (3.12.0)
  diff-lcs (>= 1.2.0, < 2.0)
- rspec-support (~> 3.11.0)
- rspec-support (3.11.1)
+ rspec-support (~> 3.12.0)
+ rspec-support (3.12.0)
  simplecov (0.21.2)
  docile (~> 1.1)
  simplecov-html (~> 0.11)
@@ -53,7 +53,7 @@ GEM
  simplecov_json_formatter (0.1.4)
  tzinfo (2.0.5)
  concurrent-ruby (~> 1.0)
- zeitwerk (2.6.0)
+ zeitwerk (2.6.6)
 
  PLATFORMS
  arm64-darwin
@@ -67,4 +67,4 @@ DEPENDENCIES
  waterdrop!
 
  BUNDLED WITH
- 2.3.22
+ 2.3.26
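As the lockfile above shows, rdkafka is no longer a direct dependency of waterdrop; it now arrives transitively through `karafka-core` (>= 2.0.6). A minimal Gemfile sketch (the version pin is illustrative):

```ruby
# Gemfile — rdkafka no longer needs to be listed alongside waterdrop;
# karafka-core >= 2.0.6 brings it in as its own dependency.
source 'https://rubygems.org'

gem 'waterdrop', '~> 2.4.3'
```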
data/README.md CHANGED
@@ -1,11 +1,5 @@
  # WaterDrop
 
- **Note**: Documentation presented here refers to WaterDrop `2.x`.
-
- WaterDrop `2.x` works with Karafka `2.*` and aims to either work as a standalone producer or as a part of the Karafka `2.*`.
-
- Please refer to [this](https://github.com/karafka/waterdrop/tree/1.4) branch and its documentation for details about WaterDrop `1.*` usage.
-
  [![Build Status](https://github.com/karafka/waterdrop/workflows/ci/badge.svg)](https://github.com/karafka/waterdrop/actions?query=workflow%3Aci)
  [![Gem Version](https://badge.fury.io/rb/waterdrop.svg)](http://badge.fury.io/rb/waterdrop)
  [![Join the chat at https://slack.karafka.io](https://raw.githubusercontent.com/karafka/misc/master/slack.svg)](https://slack.karafka.io)
@@ -35,6 +29,7 @@ It:
  * [Buffering](#buffering)
  + [Using WaterDrop to buffer messages based on the application logic](#using-waterdrop-to-buffer-messages-based-on-the-application-logic)
  + [Using WaterDrop with rdkafka buffers to achieve periodic auto-flushing](#using-waterdrop-with-rdkafka-buffers-to-achieve-periodic-auto-flushing)
+ * [Compression](#compression)
  - [Instrumentation](#instrumentation)
  * [Usage statistics](#usage-statistics)
  * [Error notifications](#error-notifications)
@@ -161,9 +156,14 @@ Keep in mind, that message you want to send should be either binary or stringifi
 
  ### Using WaterDrop across the application and with Ruby on Rails
 
- If you plan to both produce and consume messages using Kafka, you should install and use [Karafka](https://github.com/karafka/karafka). It integrates automatically with Ruby on Rails applications and auto-configures WaterDrop producer to make it accessible via `Karafka#producer` method.
+ If you plan to both produce and consume messages using Kafka, you should install and use [Karafka](https://github.com/karafka/karafka). It integrates automatically with Ruby on Rails applications and auto-configures WaterDrop producer to make it accessible via `Karafka#producer` method:
+
+ ```ruby
+ event = Events.last
+ Karafka.producer.produce_async(topic: 'events', payload: event.to_json)
+ ```
 
- If you want to only produce messages from within your application, since WaterDrop is thread-safe you can create a single instance in an initializer like so:
+ If you want to only produce messages from within your application without consuming with Karafka, since WaterDrop is thread-safe you can create a single instance in an initializer like so:
 
  ```ruby
  KAFKA_PRODUCER = WaterDrop::Producer.new
@@ -200,6 +200,35 @@ WaterDrop producers support buffering messages in their internal buffers and on
 
  This means that depending on your use case, you can achieve both granular buffering and flushing control when needed with context awareness and periodic and size-based flushing functionalities.
 
+ ### Compression
+
+ WaterDrop supports following compression types:
+
+ - `gzip`
+ - `zstd`
+ - `lz4`
+ - `snappy`
+
+ To use compression, set `kafka` scope `compression.codec` setting. You can also optionally indicate the `compression.level`:
+
+ ```ruby
+ producer = WaterDrop::Producer.new
+
+ producer.setup do |config|
+   config.kafka = {
+     'bootstrap.servers': 'localhost:9092',
+     'compression.codec': 'gzip',
+     'compression.level': 6
+   }
+ end
+ ```
+
+ **Note**: In order to use `zstd`, you need to install `libzstd-dev`:
+
+ ```bash
+ apt-get install -y libzstd-dev
+ ```
+
  #### Using WaterDrop to buffer messages based on the application logic
 
  ```ruby
data/config/errors.yml CHANGED
@@ -17,7 +17,7 @@ en:
  topic_format: 'does not match the topic allowed format'
  partition_key_format: must be a non-empty string
  timestamp_format: must be either time or integer
- payload_format: must be string
+ payload_format: must be string or nil
  headers_format: must be a hash
  key_format: must be a non-empty string
  payload_max_size: is more than `max_payload_size` config value
data/lib/waterdrop/config.rb CHANGED
@@ -22,7 +22,7 @@ module WaterDrop
  setting(
  :id,
  default: false,
- constructor: ->(id) { id || SecureRandom.uuid }
+ constructor: ->(id) { id || SecureRandom.hex(6) }
  )
  # option [Instance] logger that we want to use
  # @note Due to how rdkafka works, this setting is global for all the producers
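For context on the constructor change above, a quick irb sketch of the two generators (the output values are illustrative):

```ruby
require 'securerandom'

# Old default producer id: a 36-character UUID
SecureRandom.uuid    # => "a9c8d2e0-6a3f-4c2b-9a1e-0f4b8c7d6e5a"

# New default producer id: 6 random bytes rendered as a 12-character hex string
SecureRandom.hex(6)  # => "52750b30ffbc"
```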
data/lib/waterdrop/contracts/message.rb CHANGED
@@ -34,14 +34,14 @@ module WaterDrop
  optional(:timestamp) { |val| val.nil? || (val.is_a?(Time) || val.is_a?(Integer)) }
  optional(:headers) { |val| val.nil? || val.is_a?(Hash) }
 
- virtual do |config, errors|
+ virtual do |message, errors|
  next true unless errors.empty?
- next true unless config.key?(:headers)
- next true if config[:headers].nil?
+ next true unless message.key?(:headers)
+ next true if message[:headers].nil?
 
  errors = []
 
- config.fetch(:headers).each do |key, value|
+ message.fetch(:headers).each do |key, value|
  errors << [%i[headers], :invalid_key_type] unless key.is_a?(String)
  errors << [%i[headers], :invalid_value_type] unless value.is_a?(String)
  end
@@ -49,10 +49,10 @@ module WaterDrop
  errors
  end
 
- virtual do |config, errors, validator|
+ virtual do |message, errors, validator|
  next true unless errors.empty?
- next if config[:payload].nil? # tombstone payload
- next true if config[:payload].bytesize <= validator.max_payload_size
+ next if message[:payload].nil? # tombstone payload
+ next true if message[:payload].bytesize <= validator.max_payload_size
 
  [[%i[payload], :max_size]]
  end
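The `# tombstone payload` branch above lets `nil` payloads pass validation, matching the relaxed `payload_format` message in `errors.yml`. A minimal usage sketch, assuming an already configured `producer` and an illustrative topic/key:

```ruby
# A tombstone: a keyed message with a nil payload, typically used to mark
# a key as deleted on a log-compacted topic.
producer.produce_sync(topic: 'users-states', key: 'user-42', payload: nil)
```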
data/lib/waterdrop/patches/rdkafka/producer.rb CHANGED
@@ -8,14 +8,22 @@ module WaterDrop
  # Rdkafka::Producer patches
  module Producer
  # Adds a method that allows us to get the native kafka producer name
+ #
+ # In between rdkafka versions, there are internal changes that force us to add some extra
+ # magic to support all the versions.
+ #
  # @return [String] producer instance name
  def name
  unless @_native
  version = ::Gem::Version.new(::Rdkafka::VERSION)
- change = ::Gem::Version.new('0.12.0')
- # 0.12.0 changed how the native producer client reference works.
- # This code supports both older and newer versions of rdkafka
- @_native = version >= change ? @client.native : @native_kafka
+
+ if version < ::Gem::Version.new('0.12.0')
+ @_native = @native_kafka
+ elsif version < ::Gem::Version.new('0.13.0.beta.1')
+ @_native = @client.native
+ else
+ @_native = @native_kafka.inner
+ end
  end
 
  ::Rdkafka::Bindings.rd_kafka_name(@_native)
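The version gating above relies on `Gem::Version` ordering, where prerelease versions sort before the corresponding release, so the `0.13.0.beta.1` boundary also covers the released 0.13.x line. A quick sketch:

```ruby
# Prereleases compare lower than the matching release, which is why the beta
# tag works as the cut-off for the 0.13 handling branch.
Gem::Version.new('0.11.0')        < Gem::Version.new('0.12.0')         # => true
Gem::Version.new('0.12.1')        < Gem::Version.new('0.13.0.beta.1')  # => true
Gem::Version.new('0.13.0.beta.1') < Gem::Version.new('0.13.0')         # => true
```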
data/lib/waterdrop/producer.rb CHANGED
@@ -82,13 +82,13 @@ module WaterDrop
  @client = Builder.new.call(self, @config)
 
  # Register statistics runner for this particular type of callbacks
- ::WaterDrop::Instrumentation.statistics_callbacks.add(
+ ::Karafka::Core::Instrumentation.statistics_callbacks.add(
  @id,
  Instrumentation::Callbacks::Statistics.new(@id, @client.name, @config.monitor)
  )
 
  # Register error tracking callback
- ::WaterDrop::Instrumentation.error_callbacks.add(
+ ::Karafka::Core::Instrumentation.error_callbacks.add(
  @id,
  Instrumentation::Callbacks::Error.new(@id, @client.name, @config.monitor)
  )
@@ -125,8 +125,8 @@ module WaterDrop
  client.close if @client
 
  # Remove callbacks runners that were registered
- ::WaterDrop::Instrumentation.statistics_callbacks.delete(@id)
- ::WaterDrop::Instrumentation.error_callbacks.delete(@id)
+ ::Karafka::Core::Instrumentation.statistics_callbacks.delete(@id)
+ ::Karafka::Core::Instrumentation.error_callbacks.delete(@id)
 
  @status.closed!
  end
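The registration and cleanup above now go through the shared callbacks managers that moved to `karafka-core`. They keep the same add/delete-by-id interface the removed `WaterDrop::Instrumentation` managers had; a sketch with an illustrative id and callable:

```ruby
# Registering an additional error callback alongside WaterDrop's own; the
# manager fans a single rdkafka hook out to every registered callable.
id = 'my-error-logger'
::Karafka::Core::Instrumentation.error_callbacks.add(id, ->(*args) { pp args })

# ...and removing it again once it is no longer needed:
::Karafka::Core::Instrumentation.error_callbacks.delete(id)
```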
data/lib/waterdrop/version.rb CHANGED
@@ -3,5 +3,5 @@
  # WaterDrop library
  module WaterDrop
  # Current WaterDrop version
- VERSION = '2.4.2'
+ VERSION = '2.4.3'
  end
data/lib/waterdrop.rb CHANGED
@@ -3,12 +3,11 @@
  # External components
  # delegate should be removed because we don't need it, we just add it because of ruby-kafka
  %w[
- karafka-core
  forwardable
- rdkafka
  json
  zeitwerk
  securerandom
+ karafka-core
  ].each { |lib| require lib }
 
  # WaterDrop library
@@ -27,9 +26,3 @@ loader.inflector.inflect('waterdrop' => 'WaterDrop')
  loader.ignore("#{__dir__}/waterdrop/instrumentation/vendors/**/*.rb")
  loader.setup
  loader.eager_load
-
- # Rdkafka uses a single global callback for things. We bypass that by injecting a manager for
- # each callback type. Callback manager allows us to register more than one callback
- # @note Those managers are also used by Karafka for consumer related statistics
- Rdkafka::Config.statistics_callback = WaterDrop::Instrumentation.statistics_callbacks
- Rdkafka::Config.error_callback = WaterDrop::Instrumentation.error_callbacks
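With the trimmed require list above, a single `require 'waterdrop'` should still bring everything in, since `karafka-core` now loads rdkafka and takes over the callback wiring that used to live here. A sketch (return values are illustrative):

```ruby
require 'waterdrop'

WaterDrop::VERSION # => "2.4.3"
defined?(Rdkafka)  # => "constant" (loaded through karafka-core)
```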
data/waterdrop.gemspec CHANGED
@@ -16,8 +16,7 @@ Gem::Specification.new do |spec|
  spec.description = spec.summary
  spec.license = 'MIT'
 
- spec.add_dependency 'karafka-core', '>= 2.0.2', '< 3.0.0'
- spec.add_dependency 'rdkafka', '>= 0.10'
+ spec.add_dependency 'karafka-core', '>= 2.0.6', '< 3.0.0'
  spec.add_dependency 'zeitwerk', '~> 2.3'
 
  spec.required_ruby_version = '>= 2.7'
data.tar.gz.sig CHANGED
Binary file
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: waterdrop
  version: !ruby/object:Gem::Version
- version: 2.4.2
+ version: 2.4.3
  platform: ruby
  authors:
  - Maciej Mensfeld
@@ -35,7 +35,7 @@ cert_chain:
  Qf04B9ceLUaC4fPVEz10FyobjaFoY4i32xRto3XnrzeAgfEe4swLq8bQsR3w/EF3
  MGU0FeSV2Yj7Xc2x/7BzLK8xQn5l7Yy75iPF+KP3vVmDHnNl
  -----END CERTIFICATE-----
- date: 2022-09-29 00:00:00.000000000 Z
+ date: 2022-12-07 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: karafka-core
@@ -43,7 +43,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 2.0.2
+ version: 2.0.6
  - - "<"
  - !ruby/object:Gem::Version
  version: 3.0.0
@@ -53,24 +53,10 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 2.0.2
+ version: 2.0.6
  - - "<"
  - !ruby/object:Gem::Version
  version: 3.0.0
- - !ruby/object:Gem::Dependency
- name: rdkafka
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0.10'
- type: :runtime
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0.10'
  - !ruby/object:Gem::Dependency
  name: zeitwerk
  requirement: !ruby/object:Gem::Requirement
@@ -94,6 +80,7 @@ extra_rdoc_files: []
  files:
  - ".coditsu/ci.yml"
  - ".diffend.yml"
+ - ".github/FUNDING.yml"
  - ".github/workflows/ci.yml"
  - ".gitignore"
  - ".rspec"
@@ -113,17 +100,14 @@ files:
  - lib/waterdrop/contracts/config.rb
  - lib/waterdrop/contracts/message.rb
  - lib/waterdrop/errors.rb
- - lib/waterdrop/instrumentation.rb
  - lib/waterdrop/instrumentation/callbacks/delivery.rb
  - lib/waterdrop/instrumentation/callbacks/error.rb
  - lib/waterdrop/instrumentation/callbacks/statistics.rb
- - lib/waterdrop/instrumentation/callbacks_manager.rb
  - lib/waterdrop/instrumentation/logger_listener.rb
  - lib/waterdrop/instrumentation/monitor.rb
  - lib/waterdrop/instrumentation/notifications.rb
  - lib/waterdrop/instrumentation/vendors/datadog/dashboard.json
  - lib/waterdrop/instrumentation/vendors/datadog/listener.rb
- - lib/waterdrop/patches/rdkafka/bindings.rb
  - lib/waterdrop/patches/rdkafka/producer.rb
  - lib/waterdrop/producer.rb
  - lib/waterdrop/producer/async.rb
@@ -156,7 +140,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.3.7
+ rubygems_version: 3.3.26
  signing_key:
  specification_version: 4
  summary: Kafka messaging made easy!
metadata.gz.sig CHANGED
Binary file
data/lib/waterdrop/instrumentation/callbacks_manager.rb DELETED
@@ -1,39 +0,0 @@
- # frozen_string_literal: true
-
- module WaterDrop
- module Instrumentation
- # This manager allows us to register multiple callbacks into a hook that is suppose to support
- # a single callback
- class CallbacksManager
- # @return [::WaterDrop::Instrumentation::CallbacksManager]
- def initialize
- @callbacks = Concurrent::Hash.new
- end
-
- # Invokes all the callbacks registered one after another
- #
- # @param args [Object] any args that should go to the callbacks
- # @note We do not use `#each_value` here on purpose. With it being used, we cannot dispatch
- # callbacks and add new at the same time. Since we don't know when and in what thread
- # things are going to be added to the manager, we need to extract values into an array and
- # run it. That way we can add new things the same time.
- def call(*args)
- @callbacks.values.each { |callback| callback.call(*args) }
- end
-
- # Adds a callback to the manager
- #
- # @param id [String] id of the callback (used when deleting it)
- # @param callable [#call] object that responds to a `#call` method
- def add(id, callable)
- @callbacks[id] = callable
- end
-
- # Removes the callback from the manager
- # @param id [String] id of the callback we want to remove
- def delete(id)
- @callbacks.delete(id)
- end
- end
- end
- end
data/lib/waterdrop/instrumentation.rb DELETED
@@ -1,20 +0,0 @@
- # frozen_string_literal: true
-
- module WaterDrop
- # Namespace for all the things related with WaterDrop instrumentation process
- module Instrumentation
- class << self
- # Builds a manager for statistics callbacks
- # @return [WaterDrop::CallbacksManager]
- def statistics_callbacks
- @statistics_callbacks ||= CallbacksManager.new
- end
-
- # Builds a manager for error callbacks
- # @return [WaterDrop::CallbacksManager]
- def error_callbacks
- @error_callbacks ||= CallbacksManager.new
- end
- end
- end
- end
data/lib/waterdrop/patches/rdkafka/bindings.rb DELETED
@@ -1,42 +0,0 @@
- # frozen_string_literal: true
-
- module WaterDrop
- module Patches
- module Rdkafka
- # Extends `Rdkafka::Bindings` with some extra methods and updates callbacks that we intend
- # to work with in a bit different way than rdkafka itself
- module Bindings
- class << self
- # Add extra methods that we need
- # @param mod [::Rdkafka::Bindings] rdkafka bindings module
- def included(mod)
- mod.attach_function :rd_kafka_name, [:pointer], :string
-
- # Default rdkafka setup for errors doest not propagate client details, thus it always
- # publishes all the stuff for all rdkafka instances. We change that by providing
- # function that fetches the instance name, allowing us to have better notifications
- mod.send(:remove_const, :ErrorCallback)
- mod.const_set(:ErrorCallback, build_error_callback)
- end
-
- # @return [FFI::Function] overwritten callback function
- def build_error_callback
- FFI::Function.new(
- :void, %i[pointer int string pointer]
- ) do |client_prr, err_code, reason, _opaque|
- return nil unless ::Rdkafka::Config.error_callback
-
- name = ::Rdkafka::Bindings.rd_kafka_name(client_prr)
-
- error = ::Rdkafka::RdkafkaError.new(err_code, broker_message: reason)
-
- ::Rdkafka::Config.error_callback.call(name, error)
- end
- end
- end
- end
- end
- end
- end
-
- ::Rdkafka::Bindings.include(::WaterDrop::Patches::Rdkafka::Bindings)