rdkafka 0.16.1 → 0.18.0
- checksums.yaml +4 -4
- checksums.yaml.gz.sig +0 -0
- data/.ruby-version +1 -1
- data/CHANGELOG.md +26 -0
- data/README.md +11 -9
- data/certs/cert.pem +26 -0
- data/dist/{librdkafka_2.3.0.tar.gz → librdkafka_2.5.0.tar.gz} +0 -0
- data/dist/patches/rdkafka_sticky_assignor.c.patch +26 -0
- data/docker-compose.yml +1 -1
- data/ext/README.md +5 -4
- data/ext/Rakefile +41 -1
- data/lib/rdkafka/abstract_handle.rb +1 -7
- data/lib/rdkafka/bindings.rb +1 -1
- data/lib/rdkafka/callbacks.rb +39 -15
- data/lib/rdkafka/config.rb +2 -0
- data/lib/rdkafka/consumer.rb +16 -3
- data/lib/rdkafka/producer.rb +1 -1
- data/lib/rdkafka/version.rb +3 -3
- data/rdkafka.gemspec +1 -1
- data/spec/rdkafka/abstract_handle_spec.rb +0 -9
- data/spec/rdkafka/admin_spec.rb +15 -1
- data/spec/rdkafka/consumer_spec.rb +89 -0
- data.tar.gz.sig +0 -0
- metadata +26 -25
- metadata.gz.sig +0 -0
- data/certs/cert_chain.pem +0 -26
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 9c9dad3e9f793616387a5665ecba20271fe431755919995bef1921d595af20c1
+  data.tar.gz: ffa1c227773b625195f48c0fd3e0cdbf386f0bdad9438a41e472fe16acf4c856
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: bbc22c2a6d032bbd0485670062f42d3aa7bd4ec60119e76e669ec2f133f68b21e6529d7674e191b05a3b298ac0935b7e34d1166292486475ab5d70d5665fac9d
+  data.tar.gz: 77f248a1373d8220b142b4d632063f692918263a6535fa0f28238e3dcb1f5e63ed8b8b6a56bbffd1c9c5c99ca79ae6d2eaf59dd223ac58788f6144cbc36e8c6d
checksums.yaml.gz.sig
CHANGED
Binary file
data/.ruby-version
CHANGED
@@ -1 +1 @@
-3.3.
+3.3.4
data/CHANGELOG.md
CHANGED
@@ -1,5 +1,19 @@
 # Rdkafka Changelog

+## 0.18.0 (2024-09-02)
+- [Enhancement] Update `librdkafka` to `2.5.0`
+- [Enhancement] Do not release GVL on `rd_kafka_name` (ferrous26)
+- [Patch] Patch cooperative-sticky assignments in librdkafka.
+- [Fix] Mitigate a case where FFI would not restart the background events callback dispatcher in forks
+- [Fix] Fix unused variable reference in producer (lucasmvnascimento)
+
+## 0.17.0 (2024-08-03)
+- [Feature] Add `#seek_by` to be able to seek for a message by topic, partition and offset (zinahia)
+- [Enhancement] Update `librdkafka` to `2.4.0`
+- [Enhancement] Support ability to release patches to librdkafka.
+- [Change] Remove old producer timeout API warnings.
+- [Fix] Switch to local release of librdkafka to mitigate its unavailability.
+
 ## 0.16.1 (2024-07-10)
 - [Fix] Switch to local release of librdkafka to mitigate its unavailability.

@@ -21,6 +35,9 @@
 - [Fix] Background logger stops working after forking causing memory leaks (mensfeld)
 - [Fix] Fix bogus case/when syntax. Levels 1, 2, and 6 previously defaulted to UNKNOWN (jjowdy)

+## 0.15.2 (2024-07-10)
+- [Fix] Switch to local release of librdkafka to mitigate its unavailability.
+
 ## 0.15.1 (2024-01-30)
 - [Enhancement] Provide support for Nix OS (alexandriainfantino)
 - [Enhancement] Replace `rd_kafka_offset_store` with `rd_kafka_offsets_store` (mensfeld)

@@ -46,6 +63,9 @@
 - [Enhancement] Increase the `#lag` and `#query_watermark_offsets` default timeouts from 100ms to 1000ms. This will compensate for network glitches and remote clusters operations (mensfeld)
 - [Change] Use `SecureRandom.uuid` instead of `random` for test consumer groups (mensfeld)

+## 0.14.1 (2024-07-10)
+- [Fix] Switch to local release of librdkafka to mitigate its unavailability.
+
 ## 0.14.0 (2023-11-21)
 - [Enhancement] Add `raise_response_error` flag to the `Rdkafka::AbstractHandle`.
 - [Enhancement] Allow for setting `statistics_callback` as nil to reset predefined settings configured by a different gem (mensfeld)

@@ -63,6 +83,9 @@
 - [Change] Update Kafka Docker with Confluent KRaft (mensfeld)
 - [Change] Update librdkafka repo reference from edenhill to confluentinc (mensfeld)

+## 0.13.1 (2024-07-10)
+- [Fix] Switch to local release of librdkafka to mitigate its unavailability.
+
 ## 0.13.0 (2023-07-24)
 - Support cooperative sticky partition assignment in the rebalance callback (methodmissing)
 - Support both string and symbol header keys (ColinDKelley)

@@ -79,6 +102,9 @@
 - Make metadata request timeout configurable (mensfeld)
 - call_on_partitions_assigned and call_on_partitions_revoked only get a tpl passed in (thijsc)

+## 0.12.1 (2024-07-11)
+- [Fix] Switch to local release of librdkafka to mitigate its unavailability.
+
 ## 0.12.0 (2022-06-17)
 - Bumps librdkafka to 1.9.0
 - Fix crash on empty partition key (mensfeld)
data/README.md
CHANGED
@@ -161,12 +161,14 @@ bundle exec rake produce_messages

 ## Versions

-| rdkafka-ruby | librdkafka |
-|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| rdkafka-ruby | librdkafka | patches |
+|-|-|-|
+| 0.18.0 (2024-09-02) | 2.5.0 (2024-06-10) | yes |
+| 0.17.0 (2024-08-03) | 2.4.0 (2024-05-07) | no |
+| 0.16.0 (2024-06-13) | 2.3.0 (2023-10-25) | no |
+| 0.15.0 (2023-12-03) | 2.3.0 (2023-10-25) | no |
+| 0.14.0 (2023-11-21) | 2.2.0 (2023-07-12) | no |
+| 0.13.0 (2023-07-24) | 2.0.2 (2023-01-20) | no |
+| 0.12.0 (2022-06-17) | 1.9.0 (2022-06-16) | no |
+| 0.11.0 (2021-11-17) | 1.8.2 (2021-10-18) | no |
+| 0.10.0 (2021-09-07) | 1.5.0 (2020-07-20) | no |
data/certs/cert.pem
ADDED
@@ -0,0 +1,26 @@
+-----BEGIN CERTIFICATE-----
+MIIEcDCCAtigAwIBAgIBATANBgkqhkiG9w0BAQsFADA/MRAwDgYDVQQDDAdjb250
+YWN0MRcwFQYKCZImiZPyLGQBGRYHa2FyYWZrYTESMBAGCgmSJomT8ixkARkWAmlv
+MB4XDTI0MDgyMzEwMTkyMFoXDTQ5MDgxNzEwMTkyMFowPzEQMA4GA1UEAwwHY29u
+dGFjdDEXMBUGCgmSJomT8ixkARkWB2thcmFma2ExEjAQBgoJkiaJk/IsZAEZFgJp
+bzCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKjLhLjQqUlNayxkXnO+
+PsmCDs/KFIzhrsYMfLZRZNaWmzV3ujljMOdDjd4snM2X06C41iVdQPWjpe3j8vVe
+ZXEWR/twSbOP6Eeg8WVH2wCOo0x5i7yhVn4UBLH4JpfEMCbemVcWQ9ry9OMg4WpH
+Uu4dRwxFV7hzCz3p0QfNLRI4miAxnGWcnlD98IJRjBAksTuR1Llj0vbOrDGsL9ZT
+JeXP2gdRLd8SqzAFJEWrbeTBCBU7gfSh3oMg5SVDLjaqf7Kz5wC/8bDZydzanOxB
+T6CDXPsCnllmvTNx2ei2T5rGYJOzJeNTmJLLK6hJWUlAvaQSvCwZRvFJ0tVGLEoS
+flqSr6uGyyl1eMUsNmsH4BqPEYcAV6P2PKTv2vUR8AP0raDvZ3xL1TKvfRb8xRpo
+vPopCGlY5XBWEc6QERHfVLTIVsjnls2/Ujj4h8/TSfqqYnaHKefIMLbuD/tquMjD
+iWQsW2qStBV0T+U7FijKxVfrfqZP7GxQmDAc9o1iiyAa3QIDAQABo3cwdTAJBgNV
+HRMEAjAAMAsGA1UdDwQEAwIEsDAdBgNVHQ4EFgQU3O4dTXmvE7YpAkszGzR9DdL9
+sbEwHQYDVR0RBBYwFIESY29udGFjdEBrYXJhZmthLmlvMB0GA1UdEgQWMBSBEmNv
+bnRhY3RAa2FyYWZrYS5pbzANBgkqhkiG9w0BAQsFAAOCAYEAVKTfoLXn7mqdSxIR
+eqxcR6Huudg1jes81s1+X0uiRTR3hxxKZ3Y82cPsee9zYWyBrN8TA4KA0WILTru7
+Ygxvzha0SRPsSiaKLmgOJ+61ebI4+bOORzIJLpD6GxCxu1r7MI4+0r1u1xe0EWi8
+agkVo1k4Vi8cKMLm6Gl9b3wG9zQBw6fcgKwmpjKiNnOLP+OytzUANrIUJjoq6oal
+TC+f/Uc0TLaRqUaW/bejxzDWWHoM3SU6aoLPuerglzp9zZVzihXwx3jPLUVKDFpF
+Rl2lcBDxlpYGueGo0/oNzGJAAy6js8jhtHC9+19PD53vk7wHtFTZ/0ugDQYnwQ+x
+oml2fAAuVWpTBCgOVFe6XCQpMKopzoxQ1PjKztW2KYxgJdIBX87SnL3aWuBQmhRd
+i9zWxov0mr44TWegTVeypcWGd/0nxu1+QHVNHJrpqlPBRvwQsUm7fwmRInGpcaB8
+ap8wNYvryYzrzvzUxIVFBVM5PacgkFqRmolCa8I7tdKQN+R1
+-----END CERTIFICATE-----
data/dist/{librdkafka_2.3.0.tar.gz → librdkafka_2.5.0.tar.gz}
RENAMED
Binary file

data/dist/patches/rdkafka_sticky_assignor.c.patch
ADDED
@@ -0,0 +1,26 @@
+# This patch is released under the 2-clause BSD license, same as librdkafka
+# Fixes: https://github.com/confluentinc/librdkafka/issues/4783
+#
+--- librdkafka_2.5.0/src/rdkafka_sticky_assignor.c	2024-07-08 09:47:43.000000000 +0200
++++ librdkafka_2.5.0/src/rdkafka_sticky_assignor.c	2024-07-30 09:44:38.529759640 +0200
+@@ -769,7 +769,7 @@
+         const rd_kafka_topic_partition_list_t *partitions;
+         const char *consumer;
+         const rd_map_elem_t *elem;
+-        int i;
++        int i, j;
+
+        /* The assignment is balanced if minimum and maximum numbers of
+         * partitions assigned to consumers differ by at most one. */
+@@ -836,9 +836,9 @@
+
+                /* Otherwise make sure it can't get any more partitions */
+
+-                for (i = 0; i < potentialTopicPartitions->cnt; i++) {
++                for (j = 0; j < potentialTopicPartitions->cnt; j++) {
+                         const rd_kafka_topic_partition_t *partition =
+-                            &potentialTopicPartitions->elems[i];
++                            &potentialTopicPartitions->elems[j];
+                         const char *otherConsumer;
+                         int otherConsumerPartitionCount;
+
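The patch above renames the inner loop counter from `i` to `j` because the inner scan reused the outer loop's index, corrupting the outer iteration. The failure mode is easy to reproduce in any language where loop counters share the enclosing scope; below is a minimal Ruby sketch of the same bug class (the function and its inputs are illustrative, unrelated to librdkafka's actual data structures):

```ruby
# C-style while loops share their counter with the enclosing scope, so an
# inner scan that reuses `i` clobbers the outer loop's progress. A dedicated
# `j` (as in the patch) leaves the outer counter untouched.
def count_outer_iterations(outer_len, inner_len, reuse_index:)
  outer_iterations = 0
  i = 0
  while i < outer_len
    outer_iterations += 1
    if reuse_index
      # Buggy variant: the inner scan overwrites the outer `i`
      i = 0
      i += 1 while i < inner_len
    else
      # Fixed variant: a separate counter, as introduced by the patch
      j = 0
      j += 1 while j < inner_len
    end
    i += 1
  end
  outer_iterations
end

count_outer_iterations(5, 10, reuse_index: false) # => 5 outer iterations
count_outer_iterations(5, 10, reuse_index: true)  # => 1, the outer loop exits early
```

With the shared counter, the inner scan leaves `i` at `inner_len`, so the outer loop either terminates early or never terminates, which is exactly why the assignor misjudged whether an assignment was balanced.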
data/docker-compose.yml
CHANGED
data/ext/README.md
CHANGED
@@ -1,7 +1,7 @@
 # Ext

-This gem depends on the `librdkafka` C library. It is downloaded
-
+This gem depends on the `librdkafka` C library. It is downloaded, stored in
+`dist/` directory, and checked into source control.

 To update the `librdkafka` version follow the following steps:

@@ -9,8 +9,9 @@ To update the `librdkafka` version follow the following steps:
   version number and asset checksum for `tar.gz`.
 * Change the version in `lib/rdkafka/version.rb`
 * Change the `sha256` in `lib/rdkafka/version.rb`
-* Run `bundle exec rake` in the `ext` directory to download
-  the
+* Run `bundle exec rake dist:download` in the `ext` directory to download the
+  new release and place it in the `dist/` for you
+* Run `bundle exec rake` in the `ext` directory to build the new version
 * Run `docker-compose pull` in the main gem directory to ensure the docker
   images used by the tests and run `docker-compose up`
 * Finally, run `bundle exec rspec` in the main gem directory to execute
data/ext/Rakefile
CHANGED
@@ -1,6 +1,7 @@
 # frozen_string_literal: true

 require File.expand_path('../../lib/rdkafka/version', __FILE__)
+require "digest"
 require "fileutils"
 require "open-uri"

@@ -30,6 +31,8 @@ task :default => :clean do
   }
   recipe.configure_options = ["--host=#{recipe.host}"]

+  recipe.patch_files = Dir[File.join(releases, 'patches', "*.patch")].sort
+
   # Disable using libc regex engine in favor of the embedded one
   # The default regex engine of librdkafka does not always work exactly as most of the users
   # would expect, hence this flag allows for changing it to the other one

@@ -71,6 +74,42 @@ task :clean do
   FileUtils.rm_rf File.join(File.dirname(__FILE__), "tmp")
 end

+namespace :dist do
+  task :dir do
+    ENV["RDKAFKA_DIST_PATH"] ||= File.expand_path(File.join(File.dirname(__FILE__), '..', 'dist'))
+  end
+
+  task :file => "dist:dir" do
+    ENV["RDKAFKA_DIST_FILE"] ||= File.join(ENV["RDKAFKA_DIST_PATH"], "librdkafka_#{Rdkafka::LIBRDKAFKA_VERSION}.tar.gz")
+  end
+
+  task :clean => "dist:file" do
+    Dir.glob(File.join("#{ENV['RDKAFKA_DIST_PATH']}", "*")).each do |filename|
+      next if filename.include? ENV["RDKAFKA_DIST_FILE"]
+
+      FileUtils.rm_rf filename
+    end
+  end
+
+  task :download => "dist:file" do
+    version = Rdkafka::LIBRDKAFKA_VERSION
+    librdkafka_download = "https://codeload.github.com/confluentinc/librdkafka/tar.gz/v#{version}"
+
+    URI.open(librdkafka_download) do |file|
+      filename = ENV["RDKAFKA_DIST_FILE"]
+      data = file.read
+
+      if Digest::SHA256.hexdigest(data) != Rdkafka::LIBRDKAFKA_SOURCE_SHA256
+        raise "SHA256 does not match downloaded file"
+      end
+
+      File.write(filename, data)
+    end
+  end
+
+  task :update => %w[dist:download dist:clean]
+end
+
 namespace :build do
   desc "Build librdkafka at the given git sha or tag"
   task :git, [:ref] do |task, args|

@@ -78,8 +117,9 @@ namespace :build do
     version = "git-#{ref}"

     recipe = MiniPortile.new("librdkafka", version)
-    recipe.files << "https://github.com/
+    recipe.files << "https://github.com/confluentinc/librdkafka/archive/#{ref}.tar.gz"
     recipe.configure_options = ["--host=#{recipe.host}","--enable-static", "--enable-zstd"]
+    recipe.patch_files = Dir[File.join(releases, 'patches', "*.patch")].sort
     recipe.cook

     ext = recipe.host.include?("darwin") ? "dylib" : "so"
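The new `dist:download` task guards the vendoring flow by hashing the downloaded tarball and comparing it against the checksum pinned in `version.rb` before anything is written to disk. That check can be exercised in isolation; the helper name and sample data below are illustrative, not part of the gem:

```ruby
require "digest"

# Illustrative stand-in for the check inside the dist:download task:
# hash the downloaded bytes and refuse to persist them on a mismatch.
def verify_sha256!(data, expected_sha256)
  actual = Digest::SHA256.hexdigest(data)
  raise "SHA256 does not match downloaded file" unless actual == expected_sha256

  data
end

pinned = Digest::SHA256.hexdigest("pretend tarball bytes")

verify_sha256!("pretend tarball bytes", pinned) # returns the data unchanged

begin
  verify_sha256!("tampered bytes", pinned)
rescue RuntimeError => e
  puts e.message # rejected before any File.write would run
end
```

Hashing the full payload before writing means a truncated or tampered download never replaces the checked-in archive.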
data/lib/rdkafka/abstract_handle.rb
CHANGED
@@ -16,9 +16,6 @@ module Rdkafka
   REGISTRY = {}
   # Default wait timeout is 31 years
   MAX_WAIT_TIMEOUT_FOREVER = 10_000_000_000
-  # Deprecation message for wait_timeout argument in wait method
-  WAIT_TIMEOUT_DEPRECATION_MESSAGE = "The 'wait_timeout' argument is deprecated and will be removed in future versions without replacement. " \
-    "We don't rely on it's value anymore. Please refactor your code to remove references to it."

   private_constant :MAX_WAIT_TIMEOUT_FOREVER

@@ -59,16 +56,13 @@ module Rdkafka
   #
   # @param max_wait_timeout [Numeric, nil] Amount of time to wait before timing out.
   #   If this is nil we will wait forever
-  # @param wait_timeout [nil] deprecated
   # @param raise_response_error [Boolean] should we raise error when waiting finishes
   #
   # @return [Object] Operation-specific result
   #
   # @raise [RdkafkaError] When the operation failed
   # @raise [WaitTimeoutError] When the timeout has been reached and the handle is still pending
-  def wait(max_wait_timeout: 60,
-    Kernel.warn(WAIT_TIMEOUT_DEPRECATION_MESSAGE) unless wait_timeout.nil?
-
+  def wait(max_wait_timeout: 60, raise_response_error: true)
     timeout = max_wait_timeout ? monotonic_now + max_wait_timeout : MAX_WAIT_TIMEOUT_FOREVER

     @mutex.synchronize do
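The simplified `wait` computes an absolute monotonic deadline up front, falling back to an effectively-infinite sentinel when `max_wait_timeout` is nil. The deadline pattern can be sketched on its own; the helper names below are illustrative, not the gem's internals:

```ruby
# Effectively "forever": the same ~31 year sentinel used above.
MAX_WAIT_TIMEOUT_FOREVER = 10_000_000_000

# Monotonic clock: immune to wall-clock adjustments, so the deadline
# cannot jump backwards or forwards under NTP corrections.
def monotonic_now
  Process.clock_gettime(Process::CLOCK_MONOTONIC)
end

# Absolute deadline for a wait loop; nil budget means "wait forever".
def wait_deadline(max_wait_timeout)
  max_wait_timeout ? monotonic_now + max_wait_timeout : MAX_WAIT_TIMEOUT_FOREVER
end

def timed_out?(deadline)
  monotonic_now >= deadline
end

d = wait_deadline(0.05)
sleep(0.06)
timed_out?(d)                  # => true, the 50ms budget elapsed
timed_out?(wait_deadline(nil)) # => false for decades
```

Computing the deadline once keeps the subsequent polling loop honest: repeated wakeups compare against a fixed point in time rather than re-extending the budget on every iteration.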
data/lib/rdkafka/bindings.rb
CHANGED
@@ -38,7 +38,7 @@ module Rdkafka

   # Metadata

-  attach_function :rd_kafka_name, [:pointer], :string, blocking: true
+  attach_function :rd_kafka_name, [:pointer], :string
   attach_function :rd_kafka_memberid, [:pointer], :string, blocking: true
   attach_function :rd_kafka_clusterid, [:pointer], :string, blocking: true
   attach_function :rd_kafka_metadata, [:pointer, :int, :pointer, :pointer, :int], :int, blocking: true
data/lib/rdkafka/callbacks.rb
CHANGED
@@ -2,7 +2,6 @@

 module Rdkafka
   module Callbacks
-
     # Extracts attributes of a rd_kafka_topic_result_t
     #
     # @private

@@ -149,13 +148,6 @@ module Rdkafka
       end
     end

-    # FFI Function used for Create Topic and Delete Topic callbacks
-    BackgroundEventCallbackFunction = FFI::Function.new(
-      :void, [:pointer, :pointer, :pointer]
-    ) do |client_ptr, event_ptr, opaque_ptr|
-      BackgroundEventCallback.call(client_ptr, event_ptr, opaque_ptr)
-    end
-
     # @private
     class BackgroundEventCallback
       def self.call(_, event_ptr, _)

@@ -348,13 +340,6 @@ module Rdkafka
       end
     end

-    # FFI Function used for Message Delivery callbacks
-    DeliveryCallbackFunction = FFI::Function.new(
-      :void, [:pointer, :pointer, :pointer]
-    ) do |client_ptr, message_ptr, opaque_ptr|
-      DeliveryCallback.call(client_ptr, message_ptr, opaque_ptr)
-    end
-
     # @private
     class DeliveryCallback
       def self.call(_, message_ptr, opaque_ptr)

@@ -387,5 +372,44 @@ module Rdkafka
         end
       end
     end
+
+    @@mutex = Mutex.new
+    @@current_pid = nil
+
+    class << self
+      # Defines or recreates after fork callbacks that require FFI thread so the callback thread
+      # is always correctly initialized
+      #
+      # @see https://github.com/ffi/ffi/issues/1114
+      def ensure_ffi_running
+        @@mutex.synchronize do
+          return if @@current_pid == ::Process.pid
+
+          if const_defined?(:BackgroundEventCallbackFunction, false)
+            send(:remove_const, :BackgroundEventCallbackFunction)
+            send(:remove_const, :DeliveryCallbackFunction)
+          end
+
+          # FFI Function used for Create Topic and Delete Topic callbacks
+          background_event_callback_function = FFI::Function.new(
+            :void, [:pointer, :pointer, :pointer]
+          ) do |client_ptr, event_ptr, opaque_ptr|
+            BackgroundEventCallback.call(client_ptr, event_ptr, opaque_ptr)
+          end
+
+          # FFI Function used for Message Delivery callbacks
+          delivery_callback_function = FFI::Function.new(
+            :void, [:pointer, :pointer, :pointer]
+          ) do |client_ptr, message_ptr, opaque_ptr|
+            DeliveryCallback.call(client_ptr, message_ptr, opaque_ptr)
+          end
+
+          const_set(:BackgroundEventCallbackFunction, background_event_callback_function)
+          const_set(:DeliveryCallbackFunction, delivery_callback_function)
+
+          @@current_pid = ::Process.pid
+        end
+      end
+    end
   end
 end
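The diff above moves the FFI callback functions from load-time constants into `ensure_ffi_running`, which caches the PID so that each forked child rebuilds its callbacks exactly once instead of reusing the parent's dead dispatcher thread. The same PID-guarded re-initialization pattern can be sketched generically; the `ProcessLocal` class below is a hypothetical stand-in, not rdkafka's code:

```ruby
# PID-guarded re-initialization, mirroring the shape of ensure_ffi_running:
# a forked child observes a different Process.pid, so its first access
# rebuilds the process-wide resource instead of reusing the parent's copy.
class ProcessLocal
  def initialize(&builder)
    @mutex = Mutex.new
    @builder = builder
    @pid = nil
    @resource = nil
  end

  def fetch
    @mutex.synchronize do
      if @pid != ::Process.pid
        @resource = @builder.call # first use, or first use after fork
        @pid = ::Process.pid
      end

      @resource
    end
  end
end

builds = 0
holder = ProcessLocal.new { builds += 1 }
holder.fetch
holder.fetch
builds # => 1: the builder ran once for this process
```

The mutex matters as much as the PID check: without it, two threads racing through the first post-fork access could both rebuild the resource.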
data/lib/rdkafka/config.rb
CHANGED
data/lib/rdkafka/consumer.rb
CHANGED
@@ -423,6 +423,19 @@ module Rdkafka
   # @return [nil]
   # @raise [RdkafkaError] When seeking fails
   def seek(message)
+    seek_by(message.topic, message.partition, message.offset)
+  end
+
+  # Seek to a particular message by providing the topic, partition and offset.
+  # The next poll on the topic/partition will return the
+  # message at the given offset.
+  #
+  # @param topic [String] The topic in which to seek
+  # @param partition [Integer] The partition number to seek
+  # @param offset [Integer] The partition offset to seek
+  # @return [nil]
+  # @raise [RdkafkaError] When seeking fails
+  def seek_by(topic, partition, offset)
     closed_consumer_check(__method__)

     # rd_kafka_offset_store is one of the few calls that does not support

@@ -430,14 +443,14 @@ module Rdkafka
     native_topic = @native_kafka.with_inner do |inner|
       Rdkafka::Bindings.rd_kafka_topic_new(
         inner,
-        message.topic,
+        topic,
         nil
       )
     end
     response = Rdkafka::Bindings.rd_kafka_seek(
       native_topic,
-      message.partition,
-      message.offset,
+      partition,
+      offset,
       0 # timeout
     )
     if response != 0
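With this change, `#seek` becomes a thin wrapper over the new `#seek_by`, which takes the coordinates directly, so callers no longer need a message object in hand to reposition a consumer. A minimal usage sketch, assuming a reachable broker (the bootstrap address, group id, and topic name are illustrative):

```ruby
config = Rdkafka::Config.new(
  :"bootstrap.servers" => "localhost:9092",
  :"group.id" => "seek-by-example"
)
consumer = config.consumer
consumer.subscribe("consume_test_topic")

if (message = consumer.poll(1000))
  # Rewind: the next poll on this topic/partition returns the same message
  consumer.seek_by(message.topic, message.partition, message.offset)

  # Or skip ahead by passing a larger offset, e.g.:
  # consumer.seek_by(message.topic, message.partition, message.offset + 2)
end
```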
data/lib/rdkafka/producer.rb
CHANGED
@@ -298,7 +298,7 @@ module Rdkafka
       partitioner_name = @topics_configs.dig(topic, topic_config_hash, :partitioner) || @partitioner_name

       # If the topic is not present, set to -1
-      partition = Rdkafka::Bindings.partitioner(partition_key, partition_count,
+      partition = Rdkafka::Bindings.partitioner(partition_key, partition_count, partitioner_name) if partition_count.positive?
     end

     # If partition is nil, use -1 to let librdafka set the partition randomly or
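The fixed line only invokes the partitioner when the partition count is known to be positive; a count of zero means the topic's metadata is not available yet, and dividing a key hash by zero partitions would fail. The guard can be sketched with a stand-in hash partitioner (rdkafka's real partitioners live inside librdkafka, so the function below is purely illustrative):

```ruby
require "zlib"

# Illustrative stand-in for a key partitioner: stable CRC32 of the key
# modulo the partition count, guarded against missing metadata.
def pick_partition(partition_key, partition_count)
  # With no metadata (count <= 0), leave the partition unset (-1) so the
  # client-side default assignment applies instead.
  return -1 unless partition_count.positive?

  Zlib.crc32(partition_key) % partition_count
end

pick_partition("user-42", 12) # deterministic value in 0..11
pick_partition("user-42", 0)  # => -1, metadata not available yet
```

The same key always maps to the same partition for a given count, which is the property key-based partitioning relies on for ordering.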
data/lib/rdkafka/version.rb
CHANGED
@@ -1,7 +1,7 @@
 # frozen_string_literal: true

 module Rdkafka
-  VERSION = "0.
-  LIBRDKAFKA_VERSION = "2.
-  LIBRDKAFKA_SOURCE_SHA256 = "
+  VERSION = "0.18.0"
+  LIBRDKAFKA_VERSION = "2.5.0"
+  LIBRDKAFKA_SOURCE_SHA256 = "3dc62de731fd516dfb1032861d9a580d4d0b5b0856beb0f185d06df8e6c26259"
 end
data/rdkafka.gemspec
CHANGED
@@ -17,7 +17,7 @@ Gem::Specification.new do |gem|
   gem.version = Rdkafka::VERSION
   gem.required_ruby_version = '>= 3.0'
   gem.extensions = %w(ext/Rakefile)
-  gem.cert_chain = %w[certs/
+  gem.cert_chain = %w[certs/cert.pem]

   if $PROGRAM_NAME.end_with?('gem')
     gem.signing_key = File.expand_path('~/.ssh/gem-private_key.pem')
data/spec/rdkafka/abstract_handle_spec.rb
CHANGED
@@ -80,7 +80,6 @@ describe Rdkafka::AbstractHandle do
     let(:pending_handle) { true }

     it "should wait until the timeout and then raise an error" do
-      expect(Kernel).not_to receive(:warn)
       expect {
         subject.wait(max_wait_timeout: 0.1)
       }.to raise_error Rdkafka::AbstractHandle::WaitTimeoutError, /test_operation/

@@ -90,22 +89,15 @@ describe Rdkafka::AbstractHandle do
   context 'when pending_handle false' do
     let(:pending_handle) { false }

-    it 'should show a deprecation warning when wait_timeout is set' do
-      expect(Kernel).to receive(:warn).with(Rdkafka::AbstractHandle::WAIT_TIMEOUT_DEPRECATION_MESSAGE)
-      subject.wait(wait_timeout: 0.1)
-    end
-
     context "without error" do
       let(:result) { 1 }

       it "should return a result" do
-        expect(Kernel).not_to receive(:warn)
         wait_result = subject.wait
         expect(wait_result).to eq(result)
       end

       it "should wait without a timeout" do
-        expect(Kernel).not_to receive(:warn)
         wait_result = subject.wait(max_wait_timeout: nil)
         expect(wait_result).to eq(result)
       end

@@ -115,7 +107,6 @@ describe Rdkafka::AbstractHandle do
       let(:response) { 20 }

       it "should raise an rdkafka error" do
-        expect(Kernel).not_to receive(:warn)
         expect {
           subject.wait
         }.to raise_error Rdkafka::RdkafkaError
data/spec/rdkafka/admin_spec.rb
CHANGED
@@ -34,7 +34,7 @@ describe Rdkafka::Admin do
   describe '#describe_errors' do
     let(:errors) { admin.class.describe_errors }

-    it { expect(errors.size).to eq(
+    it { expect(errors.size).to eq(170) }
     it { expect(errors[-184]).to eq(code: -184, description: 'Local: Queue full', name: '_QUEUE_FULL') }
     it { expect(errors[21]).to eq(code: 21, description: 'Broker: Invalid required acks value', name: 'INVALID_REQUIRED_ACKS') }
   end

@@ -737,4 +737,18 @@ expect(ex.broker_message).to match(/Topic name.*is invalid: .* contains one or m
       end
     end
   end
+
+  context "when operating from a fork" do
+    # @see https://github.com/ffi/ffi/issues/1114
+    it 'expect to be able to create topics and run other admin operations without hanging' do
+      # If the FFI issue is not mitigated, this will hang forever
+      pid = fork do
+        admin
+          .create_topic(topic_name, topic_partition_count, topic_replication_factor)
+          .wait
+      end
+
+      Process.wait(pid)
+    end
+  end
 end
data/spec/rdkafka/consumer_spec.rb
CHANGED
@@ -258,6 +258,95 @@ describe Rdkafka::Consumer do
     end
   end

+  describe "#seek_by" do
+    let(:topic) { "consume_test_topic" }
+    let(:partition) { 0 }
+    let(:offset) { 0 }
+
+    it "should raise an error when seeking fails" do
+      expect(Rdkafka::Bindings).to receive(:rd_kafka_seek).and_return(20)
+      expect {
+        consumer.seek_by(topic, partition, offset)
+      }.to raise_error Rdkafka::RdkafkaError
+    end
+
+    context "subscription" do
+      let(:timeout) { 1000 }
+
+      before do
+        consumer.subscribe(topic)
+
+        # 1. partitions are assigned
+        wait_for_assignment(consumer)
+        expect(consumer.assignment).not_to be_empty
+
+        # 2. eat unrelated messages
+        while(consumer.poll(timeout)) do; end
+      end
+
+      after { consumer.unsubscribe }
+
+      def send_one_message(val)
+        producer.produce(
+          topic: topic,
+          payload: "payload #{val}",
+          key: "key 1",
+          partition: 0
+        ).wait
+      end
+
+      it "works when a partition is paused" do
+        # 3. get reference message
+        send_one_message(:a)
+        message1 = consumer.poll(timeout)
+        expect(message1&.payload).to eq "payload a"
+
+        # 4. pause the subscription
+        tpl = Rdkafka::Consumer::TopicPartitionList.new
+        tpl.add_topic(topic, 1)
+        consumer.pause(tpl)
+
+        # 5. seek by the previous message fields
+        consumer.seek_by(message1.topic, message1.partition, message1.offset)
+
+        # 6. resume the subscription
+        tpl = Rdkafka::Consumer::TopicPartitionList.new
+        tpl.add_topic(topic, 1)
+        consumer.resume(tpl)
+
+        # 7. ensure same message is read again
+        message2 = consumer.poll(timeout)
+
+        # This is needed because `enable.auto.offset.store` is true but when running in CI that
+        # is overloaded, offset store lags
+        sleep(2)
+
+        consumer.commit
+        expect(message1.offset).to eq message2.offset
+        expect(message1.payload).to eq message2.payload
+      end
+
+      it "allows skipping messages" do
+        # 3. send messages
+        send_one_message(:a)
+        send_one_message(:b)
+        send_one_message(:c)
+
+        # 4. get reference message
+        message = consumer.poll(timeout)
+        expect(message&.payload).to eq "payload a"
+
+        # 5. seek over one message
+        consumer.seek_by(message.topic, message.partition, message.offset + 2)
+
+        # 6. ensure that only one message is available
+        records = consumer.poll(timeout)
+        expect(records&.payload).to eq "payload c"
+        records = consumer.poll(timeout)
+        expect(records).to be_nil
+      end
+    end
+  end
+
   describe "#assign and #assignment" do
     it "should return an empty assignment if nothing is assigned" do
       expect(consumer.assignment).to be_empty
data.tar.gz.sig
CHANGED
Binary file
metadata
CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: rdkafka
 version: !ruby/object:Gem::Version
-  version: 0.
+  version: 0.18.0
 platform: ruby
 authors:
 - Thijs Cadier

@@ -13,30 +13,30 @@ cert_chain:
   -----BEGIN CERTIFICATE-----
   MIIEcDCCAtigAwIBAgIBATANBgkqhkiG9w0BAQsFADA/MRAwDgYDVQQDDAdjb250
   YWN0MRcwFQYKCZImiZPyLGQBGRYHa2FyYWZrYTESMBAGCgmSJomT8ixkARkWAmlv
-
+  MB4XDTI0MDgyMzEwMTkyMFoXDTQ5MDgxNzEwMTkyMFowPzEQMA4GA1UEAwwHY29u
   dGFjdDEXMBUGCgmSJomT8ixkARkWB2thcmFma2ExEjAQBgoJkiaJk/IsZAEZFgJp
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+  bzCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKjLhLjQqUlNayxkXnO+
+  PsmCDs/KFIzhrsYMfLZRZNaWmzV3ujljMOdDjd4snM2X06C41iVdQPWjpe3j8vVe
+  ZXEWR/twSbOP6Eeg8WVH2wCOo0x5i7yhVn4UBLH4JpfEMCbemVcWQ9ry9OMg4WpH
+  Uu4dRwxFV7hzCz3p0QfNLRI4miAxnGWcnlD98IJRjBAksTuR1Llj0vbOrDGsL9ZT
+  JeXP2gdRLd8SqzAFJEWrbeTBCBU7gfSh3oMg5SVDLjaqf7Kz5wC/8bDZydzanOxB
+  T6CDXPsCnllmvTNx2ei2T5rGYJOzJeNTmJLLK6hJWUlAvaQSvCwZRvFJ0tVGLEoS
+  flqSr6uGyyl1eMUsNmsH4BqPEYcAV6P2PKTv2vUR8AP0raDvZ3xL1TKvfRb8xRpo
+  vPopCGlY5XBWEc6QERHfVLTIVsjnls2/Ujj4h8/TSfqqYnaHKefIMLbuD/tquMjD
+  iWQsW2qStBV0T+U7FijKxVfrfqZP7GxQmDAc9o1iiyAa3QIDAQABo3cwdTAJBgNV
+  HRMEAjAAMAsGA1UdDwQEAwIEsDAdBgNVHQ4EFgQU3O4dTXmvE7YpAkszGzR9DdL9
+  sbEwHQYDVR0RBBYwFIESY29udGFjdEBrYXJhZmthLmlvMB0GA1UdEgQWMBSBEmNv
+  bnRhY3RAa2FyYWZrYS5pbzANBgkqhkiG9w0BAQsFAAOCAYEAVKTfoLXn7mqdSxIR
+  eqxcR6Huudg1jes81s1+X0uiRTR3hxxKZ3Y82cPsee9zYWyBrN8TA4KA0WILTru7
+  Ygxvzha0SRPsSiaKLmgOJ+61ebI4+bOORzIJLpD6GxCxu1r7MI4+0r1u1xe0EWi8
+  agkVo1k4Vi8cKMLm6Gl9b3wG9zQBw6fcgKwmpjKiNnOLP+OytzUANrIUJjoq6oal
+  TC+f/Uc0TLaRqUaW/bejxzDWWHoM3SU6aoLPuerglzp9zZVzihXwx3jPLUVKDFpF
+  Rl2lcBDxlpYGueGo0/oNzGJAAy6js8jhtHC9+19PD53vk7wHtFTZ/0ugDQYnwQ+x
+  oml2fAAuVWpTBCgOVFe6XCQpMKopzoxQ1PjKztW2KYxgJdIBX87SnL3aWuBQmhRd
+  i9zWxov0mr44TWegTVeypcWGd/0nxu1+QHVNHJrpqlPBRvwQsUm7fwmRInGpcaB8
+  ap8wNYvryYzrzvzUxIVFBVM5PacgkFqRmolCa8I7tdKQN+R1
   -----END CERTIFICATE-----
-date: 2024-
+date: 2024-09-02 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: ffi

@@ -185,8 +185,9 @@ files:
 - MIT-LICENSE
 - README.md
 - Rakefile
-- certs/
-- dist/librdkafka_2.
+- certs/cert.pem
+- dist/librdkafka_2.5.0.tar.gz
+- dist/patches/rdkafka_sticky_assignor.c.patch
 - docker-compose.yml
 - ext/README.md
 - ext/Rakefile
metadata.gz.sig
CHANGED
Binary file
data/certs/cert_chain.pem
DELETED
@@ -1,26 +0,0 @@
------BEGIN CERTIFICATE-----
-MIIEcDCCAtigAwIBAgIBATANBgkqhkiG9w0BAQsFADA/MRAwDgYDVQQDDAdjb250
-YWN0MRcwFQYKCZImiZPyLGQBGRYHa2FyYWZrYTESMBAGCgmSJomT8ixkARkWAmlv
-MB4XDTIzMDgyMTA3MjU1NFoXDTI0MDgyMDA3MjU1NFowPzEQMA4GA1UEAwwHY29u
-dGFjdDEXMBUGCgmSJomT8ixkARkWB2thcmFma2ExEjAQBgoJkiaJk/IsZAEZFgJp
-bzCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAOuZpyQKEwsTG9plLat7
-8bUaNuNBEnouTsNMr6X+XTgvyrAxTuocdsyP1sNCjdS1B8RiiDH1/Nt9qpvlBWon
-sdJ1SYhaWNVfqiYStTDnCx3PRMmHRdD4KqUWKpN6VpZ1O/Zu+9Mw0COmvXgZuuO9
-wMSJkXRo6dTCfMedLAIxjMeBIxtoLR2e6Jm6MR8+8WYYVWrO9kSOOt5eKQLBY7aK
-b/Dc40EcJKPg3Z30Pia1M9ZyRlb6SOj6SKpHRqc7vbVQxjEw6Jjal1lZ49m3YZMd
-ArMAs9lQZNdSw5/UX6HWWURLowg6k10RnhTUtYyzO9BFev0JFJftHnmuk8vtb+SD
-5VPmjFXg2VOcw0B7FtG75Vackk8QKfgVe3nSPhVpew2CSPlbJzH80wChbr19+e3+
-YGr1tOiaJrL6c+PNmb0F31NXMKpj/r+n15HwlTMRxQrzFcgjBlxf2XFGnPQXHhBm
-kp1OFnEq4GG9sON4glRldkwzi/f/fGcZmo5fm3d+0ZdNgwIDAQABo3cwdTAJBgNV
-HRMEAjAAMAsGA1UdDwQEAwIEsDAdBgNVHQ4EFgQUPVH5+dLA80A1kJ2Uz5iGwfOa
-1+swHQYDVR0RBBYwFIESY29udGFjdEBrYXJhZmthLmlvMB0GA1UdEgQWMBSBEmNv
-bnRhY3RAa2FyYWZrYS5pbzANBgkqhkiG9w0BAQsFAAOCAYEAnpa0jcN7JzREHMTQ
-bfZ+xcvlrzuROMY6A3zIZmQgbnoZZNuX4cMRrT1p1HuwXpxdpHPw7dDjYqWw3+1h
-3mXLeMuk7amjQpYoSWU/OIZMhIsARra22UN8qkkUlUj3AwTaChVKN/bPJOM2DzfU
-kz9vUgLeYYFfQbZqeI6SsM7ltilRV4W8D9yNUQQvOxCFxtLOetJ00fC/E7zMUzbK
-IBwYFQYsbI6XQzgAIPW6nGSYKgRhkfpmquXSNKZRIQ4V6bFrufa+DzD0bt2ZA3ah
-fMmJguyb5L2Gf1zpDXzFSPMG7YQFLzwYz1zZZvOU7/UCpQsHpID/YxqDp4+Dgb+Y
-qma0whX8UG/gXFV2pYWpYOfpatvahwi+A1TwPQsuZwkkhi1OyF1At3RY+hjSXyav
-AnG1dJU+yL2BK7vaVytLTstJME5mepSZ46qqIJXMuWob/YPDmVaBF39TDSG9e34s
-msG3BiCqgOgHAnL23+CN3Rt8MsuRfEtoTKpJVcCfoEoNHOkc
------END CERTIFICATE-----