rdkafka 0.6.0 → 0.7.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 0b248aee9a260de3d8bb56efba9255f21d3bfcb6c269a1ab15beac7ae2521b6c
- data.tar.gz: 2d1199d5717a832674f659ac1d5e69d2a659845e9c8e5509428270a4a8390ae1
+ metadata.gz: 38ccc62319eff24c7def2b8cebced2a8551b28a1651e77348c868ce189a7d418
+ data.tar.gz: e23e2cad27a95e2bf51c2428e4d2246deae903f8e0d7800b086b35c5b262fc65
  SHA512:
- metadata.gz: 62afc350fa40f5e2117e7791927fe5ad7fcb3cb7ba13cdbba2f450dc6537d41e37668a5997fccbb260abe3fbb09e7a73c0cb5d5059b94837d5097f373c27eb75
- data.tar.gz: 1723633bcce6569992a2368df38c2d59b217dd94af9e8afab9932ca085e9ae0fbc3661ee8fbc370d31f3985204b38b3bbe15bbdfa11ca2be61c69b4c938a40f3
+ metadata.gz: 5faa270613e25d64f2b5d955a637b97b8315e7e365843b0e5a10d34c6340a795ebcf8ed2cb9dba2ae3a7d6839e464271d979760d351a6a8777224c31cdb262b6
+ data.tar.gz: d209464e6fe089945638063caa4ec6ab80d7c0d171678c43e320314cdc2ff0f980806b92941d0529b8ec93d598e5a637673fe3d97df4755503d2b484cc8f670b
@@ -11,7 +11,6 @@ env:
  - KAFKA_HEAP_OPTS="-Xmx512m -Xms512m"

  rvm:
- - 2.3
  - 2.4
  - 2.5
  - 2.6
@@ -1,3 +1,7 @@
+ # 0.7.0
+ * Bump librdkafka to 1.2.0 (by rob-as)
+ * Allow customizing the wait time for delivery report availability (by mensfeld)
+
  # 0.6.0
  * Bump librdkafka to 1.1.0 (by Chris Gaffney)
  * Implement seek (by breunigs)
@@ -7,6 +11,7 @@
  * Add cluster and member information (by dmexe)
  * Support message headers for consumer & producer (by dmexe)
  * Add consumer rebalance listener (by dmexe)
+ * Implement pause/resume partitions (by dmexe)

  # 0.4.2
  * Delivery callback for producer
data/README.md CHANGED
@@ -8,7 +8,11 @@
  The `rdkafka` gem is a modern Kafka client library for Ruby based on
  [librdkafka](https://github.com/edenhill/librdkafka/).
  It wraps the production-ready C client using the [ffi](https://github.com/ffi/ffi)
- gem and targets Kafka 1.0+ and Ruby 2.3+.
+ gem and targets Kafka 1.0+ and Ruby 2.4+.
+
+ `rdkafka` was written because we needed a reliable Ruby client for
+ Kafka that supports modern Kafka at [AppSignal](https://appsignal.com).
+ We run it in production on very high traffic systems.

  This gem only provides a high-level Kafka consumer. If you are running
  an older version of Kafka and/or need the legacy simple consumer we
@@ -68,13 +72,6 @@ end
  delivery_handles.each(&:wait)
  ```

- ## Known issues
-
- When using forked process such as when using Unicorn you currently need
- to make sure that you create rdkafka instances after forking. Otherwise
- they will not work and crash your Ruby process when they are garbage
- collected. See https://github.com/appsignal/rdkafka-ruby/issues/19
-
  ## Development

  A Docker Compose file is included to run Kafka and Zookeeper. To run
@@ -67,7 +67,7 @@ module Rdkafka

  # Returns a new config with the provided options which are merged with {DEFAULT_CONFIG}.
  #
- # @param config_hash [Hash<String,Symbol => String>] The config options for rdkafka
+ # @param config_hash [Hash{String,Symbol => String}] The config options for rdkafka
  #
  # @return [Config]
  def initialize(config_hash = {})
@@ -155,7 +155,7 @@ module Rdkafka

  private

- # This method is only intented to be used to create a client,
+ # This method is only intended to be used to create a client,
  # using it in another way will leak memory.
  def native_config(opaque=nil)
  Rdkafka::Bindings.rd_kafka_conf_new.tap do |config|
@@ -318,7 +318,7 @@ module Rdkafka
  # @param list [TopicPartitionList,nil] The topic with partitions to commit
  # @param async [Boolean] Whether to commit async or wait for the commit to finish
  #
- # @raise [RdkafkaError] When comitting fails
+ # @raise [RdkafkaError] When committing fails
  #
  # @return [nil]
  def commit(list=nil, async=false)
@@ -367,16 +367,20 @@ module Rdkafka
  # Poll for new messages and yield for each received one. Iteration
  # will end when the consumer is closed.
  #
+ # If `enable.partition.eof` is turned on in the config this will raise an
+ # error when an eof is reached, so you probably want to disable that when
+ # using this method of iteration.
+ #
  # @raise [RdkafkaError] When polling fails
  #
  # @yieldparam message [Message] Received message
  #
  # @return [nil]
- def each(&block)
+ def each
  loop do
  message = poll(250)
  if message
- block.call(message)
+ yield(message)
  else
  if @closing
  break
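The `each` change above swaps `block.call(message)` for `yield(message)`, which avoids materializing the block as a `Proc` object on every iteration. A self-contained sketch of the same polling loop; the `FakeConsumer` and its in-memory queue are stand-ins for the real librdkafka poll, not part of rdkafka:

```ruby
# Poll in a loop and yield each message; stop once the queue drains and
# the consumer has been marked as closing.
class FakeConsumer
  def initialize(messages)
    @messages = messages
    @closing = false
  end

  # Stand-in for the real poll(timeout_ms): returns the next message or nil.
  def poll(_timeout_ms)
    @messages.shift
  end

  # In rdkafka, close is typically called from another thread; here it just
  # flips the flag so iteration ends when poll returns nil.
  def close
    @closing = true
  end

  def each
    loop do
      message = poll(250)
      if message
        yield(message)
      else
        break if @closing
      end
    end
  end
end

consumer = FakeConsumer.new(%w[a b c])
consumer.close # mark as closing so iteration ends once drained
received = []
consumer.each { |m| received << m }
# received == ["a", "b", "c"]
```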
@@ -11,7 +11,7 @@ module Rdkafka
  attr_reader :offset

  # Partition's error code
- # @retuen [Integer]
+ # @return [Integer]
  attr_reader :err

  # @private
@@ -4,7 +4,7 @@ module Rdkafka
  class TopicPartitionList
  # Create a topic partition list.
  #
- # @param data [Hash<String => [nil,Partition]>] The topic and partion data or nil to create an empty list
+ # @param data [Hash{String => nil,Partition}] The topic and partition data or nil to create an empty list
  #
  # @return [TopicPartitionList]
  def initialize(data=nil)
@@ -71,7 +71,7 @@ module Rdkafka

  # Return a `Hash` with the topics as keys and an array of partition information as the value if present.
  #
- # @return [Hash<String, [Array<Partition>, nil]>]
+ # @return [Hash{String => Array<Partition>,nil}]
  def to_h
  @data
  end
@@ -41,7 +41,7 @@ module Rdkafka
  end
  end

- # Error with potic partition list returned by the underlying rdkafka library.
+ # Error with topic partition list returned by the underlying rdkafka library.
  class RdkafkaTopicPartitionListError < RdkafkaError
  # @return [TopicPartitionList]
  attr_reader :topic_partition_list
@@ -2,7 +2,10 @@ module Rdkafka
  # A producer for Kafka messages. To create a producer set up a {Config} and call {Config#producer producer} on that.
  class Producer
  # @private
- @delivery_callback = nil
+ # Returns the current delivery callback, by default this is nil.
+ #
+ # @return [Proc, nil]
+ attr_reader :delivery_callback

  # @private
  def initialize(native_kafka)
@@ -32,13 +35,6 @@
  @delivery_callback = callback
  end

- # Returns the current delivery callback, by default this is nil.
- #
- # @return [Proc, nil]
- def delivery_callback
- @delivery_callback
- end
-
  # Close this producer and wait for the internal poll queue to empty.
  def close
  # Indicate to polling thread that we're closing
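The hand-written `delivery_callback` getter is dropped in favor of `attr_reader`, which defines an equivalent reader method. A plain-Ruby sketch of the pattern; `DemoProducer` and its writer are illustrative stand-ins, not rdkafka's actual class:

```ruby
class DemoProducer
  # Generates a delivery_callback reader returning @delivery_callback
  # (nil until a callback has been stored).
  attr_reader :delivery_callback

  # Store a callback to be invoked for each delivery report (hypothetical
  # writer, mirroring the validation style of the gem's setter).
  def delivery_callback=(callback)
    raise TypeError, "Callback has to be callable" unless callback.respond_to?(:call)
    @delivery_callback = callback
  end
end

producer = DemoProducer.new
before = producer.delivery_callback # nil by default
producer.delivery_callback = ->(report) { report }
after = producer.delivery_callback # the stored Proc
```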
@@ -50,7 +46,7 @@
  # Produces a message to a Kafka topic. The message is added to rdkafka's queue, call {DeliveryHandle#wait wait} on the returned delivery handle to make sure it is delivered.
  #
  # When no partition is specified the underlying Kafka library picks a partition based on the key. If no key is specified, a random partition will be used.
- # When a timestamp is provided this is used instead of the autogenerated timestamp.
+ # When a timestamp is provided this is used instead of the auto-generated timestamp.
  #
  # @param topic [String] The topic to produce to
  # @param payload [String,nil] The message's payload
@@ -133,7 +129,7 @@
  *args
  )

- # Raise error if the produce call was not successfull
+ # Raise error if the produce call was not successful
  if response != 0
  DeliveryHandle.remove(delivery_handle.to_ptr.address)
  raise RdkafkaError.new(response)
@@ -10,6 +10,10 @@ module Rdkafka

  REGISTRY = {}

+ CURRENT_TIME = -> { Process.clock_gettime(Process::CLOCK_MONOTONIC) }.freeze
+
+ private_constant :CURRENT_TIME
+
  def self.register(address, handle)
  REGISTRY[address] = handle
  end
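The new `CURRENT_TIME` lambda reads the monotonic clock instead of `Time.now`, so deadlines keep ticking steadily even if the wall clock jumps (NTP adjustment, manual change). A minimal illustration of the deadline pattern:

```ruby
# Build a deadline from the monotonic clock and check it the same way
# DeliveryHandle#wait does: "timed out" means deadline <= now.
current_time = -> { Process.clock_gettime(Process::CLOCK_MONOTONIC) }

deadline = current_time.call + 0.05            # 50 ms from now
expired_before = deadline <= current_time.call # false: window still open
sleep 0.06
expired_after = deadline <= current_time.call  # true: window has passed
```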
@@ -29,25 +33,25 @@
  # If there is a timeout this does not mean the message is not delivered, rdkafka might still be working on delivering the message.
  # In this case it is possible to call wait again.
  #
- # @param timeout_in_seconds [Integer, nil] Number of seconds to wait before timing out. If this is nil it does not time out.
+ # @param max_wait_timeout [Numeric, nil] Amount of time to wait before timing out. If this is nil it does not time out.
+ # @param wait_timeout [Numeric] Amount of time we should wait before we recheck if there is a delivery report available
  #
  # @raise [RdkafkaError] When delivering the message failed
  # @raise [WaitTimeoutError] When the timeout has been reached and the handle is still pending
  #
  # @return [DeliveryReport]
- def wait(timeout_in_seconds=60)
- timeout = if timeout_in_seconds
- Time.now.to_i + timeout_in_seconds
+ def wait(max_wait_timeout: 60, wait_timeout: 0.1)
+ timeout = if max_wait_timeout
+ CURRENT_TIME.call + max_wait_timeout
  else
  nil
  end
  loop do
  if pending?
- if timeout && timeout <= Time.now.to_i
- raise WaitTimeoutError.new("Waiting for delivery timed out after #{timeout_in_seconds} seconds")
+ if timeout && timeout <= CURRENT_TIME.call
+ raise WaitTimeoutError.new("Waiting for delivery timed out after #{max_wait_timeout} seconds")
  end
- sleep 0.1
- next
+ sleep wait_timeout
  elsif self[:response] != 0
  raise RdkafkaError.new(self[:response])
  else
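The reworked `wait` takes keyword arguments: `max_wait_timeout` bounds the whole wait, and the new `wait_timeout` controls how often the delivery report is rechecked. A self-contained sketch of the same loop shape; `FakeHandle` is a stand-in (the real `DeliveryHandle` checks FFI struct fields and returns a `DeliveryReport`):

```ruby
class FakeHandle
  WaitTimeoutError = Class.new(StandardError)

  def initialize(pending:)
    @pending = pending
  end

  def pending?
    @pending
  end

  # Same loop shape as the new DeliveryHandle#wait: keyword arguments,
  # a monotonic-clock deadline, and a configurable recheck interval.
  def wait(max_wait_timeout: 60, wait_timeout: 0.1)
    now = -> { Process.clock_gettime(Process::CLOCK_MONOTONIC) }
    deadline = max_wait_timeout ? now.call + max_wait_timeout : nil
    loop do
      if pending?
        if deadline && deadline <= now.call
          raise WaitTimeoutError,
                "Waiting for delivery timed out after #{max_wait_timeout} seconds"
        end
        sleep wait_timeout
      else
        return :report # the real method builds a DeliveryReport here
      end
    end
  end
end

# A handle that never completes raises once the deadline passes:
begin
  FakeHandle.new(pending: true).wait(max_wait_timeout: 0.05, wait_timeout: 0.01)
  timed_out = false
rescue FakeHandle::WaitTimeoutError
  timed_out = true
end

# A completed handle returns immediately, even with no deadline:
result = FakeHandle.new(pending: false).wait(max_wait_timeout: nil)
```

As the doc comment notes, a timeout does not mean delivery failed; calling `wait` again on the same handle is valid.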
@@ -1,6 +1,6 @@
  module Rdkafka
  class Producer
- # Delivery report for a succesfully produced message.
+ # Delivery report for a successfully produced message.
  class DeliveryReport
  # The partition this message was produced to.
  # @return [Integer]
@@ -1,5 +1,5 @@
  module Rdkafka
- VERSION = "0.6.0"
- LIBRDKAFKA_VERSION = "1.1.0"
- LIBRDKAFKA_SOURCE_SHA256 = "123b47404c16bcde194b4bd1221c21fdce832ad12912bd8074f88f64b2b86f2b"
+ VERSION = "0.7.0"
+ LIBRDKAFKA_VERSION = "1.2.0"
+ LIBRDKAFKA_SOURCE_SHA256 = "eedde1c96104e4ac2d22a4230e34f35dd60d53976ae2563e3dd7c27190a96859"
  end
@@ -4,7 +4,7 @@ Gem::Specification.new do |gem|
  gem.authors = ['Thijs Cadier']
  gem.email = ["thijs@appsignal.com"]
  gem.description = "Modern Kafka client library for Ruby based on librdkafka"
- gem.summary = "Kafka client library wrapping librdkafka using the ffi gem and futures from concurrent-ruby for Kafka 0.10+"
+ gem.summary = "The rdkafka gem is a modern Kafka client library for Ruby based on librdkafka. It wraps the production-ready C client using the ffi gem and targets Kafka 1.0+ and Ruby 2.4+."
  gem.license = 'MIT'
  gem.homepage = 'https://github.com/thijsc/rdkafka-ruby'

@@ -14,7 +14,7 @@ Gem::Specification.new do |gem|
  gem.name = 'rdkafka'
  gem.require_paths = ['lib']
  gem.version = Rdkafka::VERSION
- gem.required_ruby_version = '>= 2.1'
+ gem.required_ruby_version = '>= 2.4'
  gem.extensions = %w(ext/Rakefile)

  gem.add_dependency 'ffi', '~> 1.9'
@@ -85,7 +85,7 @@ describe Rdkafka::Consumer do
  tpl.add_topic("consume_test_topic", (0..2))
  consumer.resume(tpl)

- # 8. ensure that message is successfuly consumed
+ # 8. ensure that message is successfully consumed
  records = consumer.poll(timeout)
  expect(records).not_to be_nil
  consumer.commit
@@ -46,7 +46,7 @@ describe Rdkafka::Producer::DeliveryHandle do

  it "should wait until the timeout and then raise an error" do
  expect {
- subject.wait(0.1)
+ subject.wait(max_wait_timeout: 0.1)
  }.to raise_error Rdkafka::Producer::DeliveryHandle::WaitTimeoutError
  end

@@ -61,7 +61,7 @@ describe Rdkafka::Producer::DeliveryHandle do
  end

  it "should wait without a timeout" do
- report = subject.wait(nil)
+ report = subject.wait(max_wait_timeout: nil)

  expect(report.partition).to eq(2)
  expect(report.offset).to eq(100)
@@ -42,7 +42,7 @@ describe Rdkafka::Producer do
  )

  # Wait for it to be delivered
- handle.wait(5)
+ handle.wait(max_wait_timeout: 5)

  # Callback should have been called
  expect(@callback_called).to be true
@@ -70,7 +70,7 @@ describe Rdkafka::Producer do
  expect(handle.pending?).to be true

  # Check delivery handle and report
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)
  expect(handle.pending?).to be false
  expect(report).not_to be_nil
  expect(report.partition).to eq 1
@@ -100,7 +100,7 @@ describe Rdkafka::Producer do
  key: "key partition",
  partition: 1
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -117,7 +117,7 @@ describe Rdkafka::Producer do
  payload: "Τη γλώσσα μου έδωσαν ελληνική",
  key: "key utf8"
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -149,7 +149,7 @@ describe Rdkafka::Producer do
  key: "key timestamp",
  timestamp: 1505069646252
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -169,7 +169,7 @@ describe Rdkafka::Producer do
  key: "key timestamp",
  timestamp: Time.at(1505069646, 353_000)
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -188,7 +188,7 @@ describe Rdkafka::Producer do
  topic: "produce_test_topic",
  payload: "payload no key"
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -205,7 +205,7 @@ describe Rdkafka::Producer do
  topic: "produce_test_topic",
  key: "key no payload"
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -224,7 +224,7 @@ describe Rdkafka::Producer do
  key: "key headers",
  headers: { foo: :bar, baz: :foobar }
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -246,7 +246,7 @@ describe Rdkafka::Producer do
  key: "key headers",
  headers: {}
  )
- report = handle.wait(5)
+ report = handle.wait(max_wait_timeout: 5)

  # Consume message and verify its content
  message = wait_for_message(
@@ -280,55 +280,49 @@ describe Rdkafka::Producer do
  end
  end

- # TODO this spec crashes if you create and use the producer before
- # forking like so:
- #
- # @producer = producer
- #
- # This will be added as part of https://github.com/appsignal/rdkafka-ruby/issues/19
- #it "should produce a message in a forked process" do
- # # Fork, produce a message, send the report of a pipe and
- # # wait for it in the main process.
-
- # reader, writer = IO.pipe
-
- # fork do
- # reader.close
-
- # handle = producer.produce(
- # topic: "produce_test_topic",
- # payload: "payload",
- # key: "key"
- # )
-
- # report = handle.wait(5)
- # producer.close
-
- # report_json = JSON.generate(
- # "partition" => report.partition,
- # "offset" => report.offset
- # )
-
- # writer.write(report_json)
- # end
-
- # writer.close
-
- # report_hash = JSON.parse(reader.read)
- # report = Rdkafka::Producer::DeliveryReport.new(
- # report_hash["partition"],
- # report_hash["offset"]
- # )
-
- # # Consume message and verify it's content
- # message = wait_for_message(
- # topic: "produce_test_topic",
- # delivery_report: report
- # )
- # expect(message.partition).to eq 1
- # expect(message.payload).to eq "payload"
- # expect(message.key).to eq "key"
- #end
+ it "should produce a message in a forked process" do
+ # Fork, produce a message, send the report over a pipe and
+ # wait for and check the message in the main process.
+
+ reader, writer = IO.pipe
+
+ fork do
+ reader.close
+
+ handle = producer.produce(
+ topic: "produce_test_topic",
+ payload: "payload-forked",
+ key: "key-forked"
+ )
+
+ report = handle.wait(max_wait_timeout: 5)
+ producer.close
+
+ report_json = JSON.generate(
+ "partition" => report.partition,
+ "offset" => report.offset
+ )
+
+ writer.write(report_json)
+ end
+
+ writer.close
+
+ report_hash = JSON.parse(reader.read)
+ report = Rdkafka::Producer::DeliveryReport.new(
+ report_hash["partition"],
+ report_hash["offset"]
+ )
+
+ # Consume message and verify its content
+ message = wait_for_message(
+ topic: "produce_test_topic",
+ delivery_report: report
+ )
+ expect(message.partition).to eq 0
+ expect(message.payload).to eq "payload-forked"
+ expect(message.key).to eq "key-forked"
+ end

  it "should raise an error when producing fails" do
  expect(Rdkafka::Bindings).to receive(:rd_kafka_producev).and_return(20)
@@ -348,10 +342,10 @@ describe Rdkafka::Producer do
  key: "key timeout"
  )
  expect {
- handle.wait(0)
+ handle.wait(max_wait_timeout: 0)
  }.to raise_error Rdkafka::Producer::DeliveryHandle::WaitTimeoutError

  # Waiting a second time should work
- handle.wait(5)
+ handle.wait(max_wait_timeout: 5)
  end
  end
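The previously commented-out fork spec is restored above. Its parent/child plumbing (JSON over an `IO.pipe`) works like this stripped-down version, with a plain hash standing in for the Kafka delivery report:

```ruby
require "json"

reader, writer = IO.pipe

pid = fork do
  reader.close # child only writes
  writer.write(JSON.generate("partition" => 0, "offset" => 42))
  writer.close # signal EOF to the parent
end

writer.close # parent only reads
report = JSON.parse(reader.read) # blocks until the child closes its end
Process.wait(pid)
# report => {"partition"=>0, "offset"=>42}
```

Closing the unused pipe ends in each process matters: the parent's `reader.read` only returns once every writer end is closed.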
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: rdkafka
  version: !ruby/object:Gem::Version
- version: 0.6.0
+ version: 0.7.0
  platform: ruby
  authors:
  - Thijs Cadier
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2019-07-23 00:00:00.000000000 Z
+ date: 2019-09-21 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: ffi
@@ -164,7 +164,7 @@ required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: '2.1'
+ version: '2.4'
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
@@ -172,11 +172,12 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.7.6
+ rubygems_version: 2.7.6.2
  signing_key:
  specification_version: 4
- summary: Kafka client library wrapping librdkafka using the ffi gem and futures from
- concurrent-ruby for Kafka 0.10+
+ summary: The rdkafka gem is a modern Kafka client library for Ruby based on librdkafka.
+ It wraps the production-ready C client using the ffi gem and targets Kafka 1.0+
+ and Ruby 2.4+.
  test_files:
  - spec/rdkafka/bindings_spec.rb
  - spec/rdkafka/config_spec.rb