delivery_boy 0.2.7 → 1.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: dbf511c98ca6b6d56175d0624aa499f73e431139ca93de2591b8a490af633bc2
- data.tar.gz: dc7359238540968814b9b13e79fe7f95f80b9f0b441d7e1b2d14575900ea2c48
+ metadata.gz: a12782da215edcc91544e184fd5804b2c21fb4a4be180f69addd0e2f7787884e
+ data.tar.gz: 6de9279ba3441524aceef6091a4feaf57d67a8623499779877b39ed9a11edad9
  SHA512:
- metadata.gz: 812bf9fc1020bc846ebfbc4853847c6d7cf28649d30e1d5d242e636dd2ce50a87dff8347653afb517b0108d45576358a729d10763054ee93e09084375f5f8462
- data.tar.gz: d6b30c5fd5ddd38287f8c3fef45cdcc64fe31f0e7f5b92977b1fb8dd338bad2176f487cbff3a547edf2ae1a7a493bfec2f26a2cf8ae3a647d07c9d1e410879d8
+ metadata.gz: 6787bea888f6a2db4bb0d66eaad3c2234c9b6e4482c81ec69e43c27960b4c4a191ec553c0a24c87a46c5067811e2b9eb85265b34fc9457692465dfa8d902c5af
+ data.tar.gz: e10d3020dbaedfe5608fab51e6e1d343341919aab23b905766cdc386e2b844d5177d464fba93d69b79cc6d3704a5ad703ca97464f5754a363a9d362fecec2559
data/.circleci/config.yml ADDED
@@ -0,0 +1,33 @@
+ version: 2
+ jobs:
+   build:
+     docker:
+       - image: circleci/ruby:2.5.1-node
+         environment:
+           LOG_LEVEL: DEBUG
+       - image: wurstmeister/zookeeper
+       - image: wurstmeister/kafka:2.11-2.0.0
+         environment:
+           KAFKA_ADVERTISED_HOST_NAME: localhost
+           KAFKA_ADVERTISED_PORT: 9092
+           KAFKA_PORT: 9092
+           KAFKA_ZOOKEEPER_CONNECT: localhost:2181
+           KAFKA_DELETE_TOPIC_ENABLE: true
+       - image: wurstmeister/kafka:2.11-2.0.0
+         environment:
+           KAFKA_ADVERTISED_HOST_NAME: localhost
+           KAFKA_ADVERTISED_PORT: 9093
+           KAFKA_PORT: 9093
+           KAFKA_ZOOKEEPER_CONNECT: localhost:2181
+           KAFKA_DELETE_TOPIC_ENABLE: true
+       - image: wurstmeister/kafka:2.11-2.0.0
+         environment:
+           KAFKA_ADVERTISED_HOST_NAME: localhost
+           KAFKA_ADVERTISED_PORT: 9094
+           KAFKA_PORT: 9094
+           KAFKA_ZOOKEEPER_CONNECT: localhost:2181
+           KAFKA_DELETE_TOPIC_ENABLE: true
+     steps:
+       - checkout
+       - run: bundle install --path vendor/bundle
+       - run: bundle exec rspec
data/.gitignore CHANGED
@@ -9,4 +9,4 @@
  /tmp/
 
  # rspec failure tracking
- .rspec_status
+ .rspec_status
data/CHANGELOG CHANGED
@@ -2,6 +2,31 @@
 
  ## Unreleased
 
+ ## v1.1.0
+
+ * Support for `ssl_client_cert_key_password` in the configuration (#52)
+ * Add `DeliveryBoy.buffer_size` to return the number of messages in the buffer
+ * Add `DeliveryBoy::Fake#clear_buffer` and `DeliveryBoy::Fake#buffer_size` to
+   support the public API when using the test helper.
+ * Support for `sasl_oauth_token_provider` in the configuration. (#55)
+
+ ## v1.0.1
+
+ * Require ruby-kafka v1.0 or higher.
+
+ ## v1.0.0
+
+ * Add `DeliveryBoy.clear_buffer` method.
+ * Support exactly once delivery and transactional messaging (#50)
+ * Check that Rails::Railtie is defined (#48)
+
+ ## v0.2.8
+
+ * Support `log_level` config option.
+ * Support for ssl_verify_hostname in the configuration (#44)
+ * Upgrade dependency on KingKonf.
+ * Allow configuring `sasl_over_ssl`.
+
  ## v0.2.7
 
  * Support for ssl_ca_certs_from_system #18
data/README.md CHANGED
@@ -59,7 +59,34 @@ end
 
  In addition to improving response time, delivering messages asynchronously also protects your application against Kafka availability issues -- if messages cannot be delivered, they'll be buffered for later and retried automatically.
 
- Both `deliver` and `deliver_async` take the following options:
+ A third method is to produce messages first (without delivering the messages to Kafka yet), and deliver them synchronously later.
+
+ ```ruby
+ # app/controllers/comments_controller.rb
+ class CommentsController < ApplicationController
+   def create
+     @comment = Comment.create!(params)
+
+     event = {
+       name: "comment_created",
+       data: {
+         comment_id: @comment.id,
+         user_id: current_user.id
+       }
+     }
+
+     # This will queue the two messages in the internal buffer.
+     DeliveryBoy.produce(@comment.to_json, topic: "comments")
+     DeliveryBoy.produce(event.to_json, topic: "activity")
+
+     # This will deliver all messages in the buffer to Kafka.
+     # This call is blocking.
+     DeliveryBoy.deliver_messages
+   end
+ end
+ ```
+
+ The methods `deliver`, `deliver_async` and `produce` take the following options:
 
  * `topic` – the Kafka topic that should be written to (required).
  * `key` – the key that should be set on the Kafka message (optional).
@@ -103,6 +130,10 @@ A list of Kafka brokers that should be used to initialize the client. Defaults t
 
  This is how the client will identify itself to the Kafka brokers. Default is `delivery_boy`.
 
+ ##### `log_level`
+
+ The log level for the logger.
+
  #### Message delivery
 
  ##### `delivery_interval`
@@ -189,11 +220,15 @@ A PEM encoded client cert to use with an SSL connection. Must be used in combina
 
  A PEM encoded client cert key to use with an SSL connection. Must be used in combination with `ssl_client_cert`.
 
+ ##### `ssl_client_cert_key_password`
+
+ The password required to read the `ssl_client_cert_key`. Must be used in combination with `ssl_client_cert_key`.
+
  #### SASL Authentication and authorization
 
  See [ruby-kafka](https://github.com/zendesk/ruby-kafka#authentication-using-sasl) for more information.
 
- Use either `sasl_gssapi_*` _or_ `sasl_plain_*`, not both.
+ Use one of the `GSSAPI`, `PLAIN` _or_ `OAUTHBEARER` mechanisms.
 
  ##### `sasl_gssapi_principal`
@@ -215,6 +250,25 @@ The username used to authenticate.
 
  The password used to authenticate.
 
+ ##### `sasl_oauth_token_provider`
+
+ An instance of a class which implements the `token` method,
+ as described in [ruby-kafka](https://github.com/zendesk/ruby-kafka/tree/c3e90bc355fad1e27b9af1048966ff08d3d5735b#oauthbearer).
+
+ ```ruby
+ class TokenProvider
+   def token
+     "oauth-token"
+   end
+ end
+
+ DeliveryBoy.configure do |config|
+   config.sasl_oauth_token_provider = TokenProvider.new
+   config.ssl_ca_certs_from_system = true
+ end
+ ```
+
  ### Testing
 
  DeliveryBoy provides a test mode out of the box. When this mode is enabled, messages will be stored in memory rather than being sent to Kafka. If you use RSpec, enabling test mode is as easy as adding this to your spec helper:
@@ -231,17 +285,17 @@ describe PostsController do
    describe "#show" do
      it "emits an event to Kafka" do
        post = Post.create!(body: "hello")
-
+
        get :show, id: post.id
-
+
        # Use this API to extract all messages written to a Kafka topic.
        messages = DeliveryBoy.testing.messages_for("post_views")
-
+
        expect(messages.count).to eq 1
-
+
        # In addition to #value, you can also pull out #key and #partition_key.
        event = JSON.parse(messages.first.value)
-
+
        expect(event["post_id"]).to eq post.id
      end
    end
data/delivery_boy.gemspec CHANGED
@@ -20,8 +20,8 @@ Gem::Specification.new do |spec|
 
    spec.require_paths = ["lib"]
 
-   spec.add_runtime_dependency "ruby-kafka", "~> 0.5"
-   spec.add_runtime_dependency "king_konf", "~> 0.2"
+   spec.add_runtime_dependency "ruby-kafka", "~> 1.0"
+   spec.add_runtime_dependency "king_konf", "~> 1.0"
 
    spec.add_development_dependency "bundler", "~> 1.15"
    spec.add_development_dependency "rake", "~> 10.0"
data/lib/delivery_boy.rb CHANGED
@@ -4,7 +4,8 @@ require "delivery_boy/version"
  require "delivery_boy/instance"
  require "delivery_boy/fake"
  require "delivery_boy/config"
- require "delivery_boy/railtie" if defined?(Rails)
+ require "delivery_boy/config_error"
+ require "delivery_boy/railtie" if defined?(Rails::Railtie)
 
  module DeliveryBoy
    class << self
@@ -47,6 +48,49 @@ module DeliveryBoy
      instance.deliver_async!(value, topic: topic, **options)
    end
 
+   # Like {.produce!}, but handles +Kafka::BufferOverflow+ errors
+   # by logging them and just going on with normal business.
+   #
+   # @return [nil]
+   def produce(value, topic:, **options)
+     produce!(value, topic: topic, **options)
+   rescue Kafka::BufferOverflow
+     logger.error "Message for `#{topic}` dropped due to buffer overflow"
+   end
+
+   # Appends the given message to the producer buffer but does not send it until {.deliver_messages} is called.
+   #
+   # @param value [String] the message value.
+   # @param topic [String] the topic that the message should be written to.
+   # @param key [String, nil] the message key.
+   # @param partition [Integer, nil] the topic partition that the message should
+   #   be written to.
+   # @param partition_key [String, nil] a key used to deterministically assign
+   #   a partition to the message.
+   # @return [nil]
+   # @raise [Kafka::BufferOverflow] if the producer's buffer is full.
+   def produce!(value, topic:, **options)
+     instance.produce(value, topic: topic, **options)
+   end
+
+   # Delivers the items currently in the producer buffer.
+   #
+   # @return [nil]
+   # @raise [Kafka::DeliveryFailed] if delivery failed for some reason.
+   def deliver_messages
+     instance.deliver_messages
+   end
+
+   # Clear any buffered messages generated by {.produce} or {.produce!} methods.
+   def clear_buffer
+     instance.clear_buffer
+   end
+
+   # Return the number of messages in the buffer
+   def buffer_size
+     instance.buffer_size
+   end
+
    # Shut down DeliveryBoy.
    #
    # Automatically called when the process exits.
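The `produce`/`produce!` pair above follows a common drop-versus-raise pattern: the bang variant raises on overflow, the non-bang variant logs the overflow and drops the message. A minimal, self-contained sketch of that pattern (the `BoundedBuffer` class and its hard size limit are illustrative stand-ins, not part of DeliveryBoy):

```ruby
require "logger"

# Illustrative stand-in for Kafka::BufferOverflow.
class BufferOverflow < StandardError; end

# A tiny buffer with a hard size limit, mirroring the produce/produce! split.
class BoundedBuffer
  def initialize(max_size:, logger: Logger.new($stdout))
    @messages = []
    @max_size = max_size
    @logger = logger
  end

  # Raises BufferOverflow when the buffer is full.
  def produce!(value, topic:)
    raise BufferOverflow if @messages.size >= @max_size
    @messages << [topic, value]
    nil
  end

  # Rescues the overflow, logs it, and carries on with normal business.
  def produce(value, topic:)
    produce!(value, topic: topic)
  rescue BufferOverflow
    @logger.error "Message for `#{topic}` dropped due to buffer overflow"
  end

  def buffer_size
    @messages.size
  end
end
```

The practical consequence of the split: callers that can tolerate message loss use `produce` and keep serving requests, while callers that must not lose messages use `produce!` and handle the overflow themselves.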
@@ -60,7 +104,11 @@ module DeliveryBoy
    #
    # @return [Logger]
    def logger
-     @logger ||= Logger.new($stdout)
+     @logger ||= Logger.new($stdout).tap do |logger|
+       if config.log_level
+         logger.level = Object.const_get("Logger::#{config.log_level.upcase}")
+       end
+     end
    end
 
    attr_writer :logger
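The log-level lookup above turns a config string into a `Logger` severity constant via `Object.const_get`. A small sketch of that resolution step in isolation (the `resolve_log_level` helper name is ours, not DeliveryBoy's):

```ruby
require "logger"

# Maps a level name such as "debug" or "WARN" to the corresponding
# Logger constant, the same lookup the logger method performs.
def resolve_log_level(level)
  Object.const_get("Logger::#{level.upcase}")
end

logger = Logger.new($stdout)
logger.level = resolve_log_level("warn")
```

An unknown level name raises `NameError` from `const_get`, so a misspelled `log_level` value fails loudly the first time the logger is touched rather than being silently ignored.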
data/lib/delivery_boy/config.rb CHANGED
@@ -7,6 +7,7 @@ module DeliveryBoy
    # Basic
    list :brokers, items: :string, sep: ",", default: ["localhost:9092"]
    string :client_id, default: "delivery_boy"
+   string :log_level, default: nil
 
    # Buffering
    integer :max_buffer_bytesize, default: 10_000_000
@@ -24,6 +25,9 @@ module DeliveryBoy
    integer :max_retries, default: 2
    integer :required_acks, default: -1
    integer :retry_backoff, default: 1
+   boolean :idempotent, default: false
+   boolean :transactional, default: false
+   integer :transactional_timeout, default: 60
 
    # Compression
    integer :compression_threshold, default: 1
@@ -34,7 +38,9 @@ module DeliveryBoy
    string :ssl_ca_cert_file_path
    string :ssl_client_cert, default: nil
    string :ssl_client_cert_key, default: nil
+   string :ssl_client_cert_key_password, default: nil
    boolean :ssl_ca_certs_from_system, default: false
+   boolean :ssl_verify_hostname, default: true
 
    # SASL authentication
    string :sasl_gssapi_principal
@@ -45,6 +51,10 @@ module DeliveryBoy
    string :sasl_scram_username
    string :sasl_scram_password
    string :sasl_scram_mechanism
+   boolean :sasl_over_ssl, default: true
+
+   # SASL OAUTHBEARER
+   attr_accessor :sasl_oauth_token_provider
 
    # Datadog monitoring
    boolean :datadog_enabled
data/lib/delivery_boy/fake.rb CHANGED
@@ -2,7 +2,7 @@ module DeliveryBoy
 
  # A fake implementation that is useful for testing.
  class Fake
-   FakeMessage = Struct.new(:value, :topic, :key, :offset, :partition, :partition_key, :create_time) do
+   FakeMessage = Struct.new(:value, :topic, :key, :headers, :offset, :partition, :partition_key, :create_time) do
      def bytesize
        key.to_s.bytesize + value.to_s.bytesize
      end
@@ -10,31 +10,74 @@ module DeliveryBoy
 
    def initialize
      @messages = Hash.new {|h, k| h[k] = [] }
+     @buffer = Hash.new {|h, k| h[k] = [] }
+     @delivery_lock = Mutex.new
    end
 
-   def deliver(value, topic:, key: nil, partition: nil, partition_key: nil, create_time: Time.now)
-     offset = @messages[topic].count
-     message = FakeMessage.new(value, topic, key, offset, partition, partition_key, create_time)
+   def deliver(value, topic:, key: nil, headers: {}, partition: nil, partition_key: nil, create_time: Time.now)
+     @delivery_lock.synchronize do
+       offset = @messages[topic].count
+       message = FakeMessage.new(value, topic, key, headers, offset, partition, partition_key, create_time)
 
-     @messages[topic] << message
+       @messages[topic] << message
+     end
 
      nil
    end
 
    alias deliver_async! deliver
 
+   def produce(value, topic:, key: nil, headers: {}, partition: nil, partition_key: nil, create_time: Time.now)
+     @delivery_lock.synchronize do
+       offset = @buffer[topic].count
+       message = FakeMessage.new(value, topic, key, headers, offset, partition, partition_key, create_time)
+
+       @buffer[topic] << message
+     end
+
+     nil
+   end
+
+   def deliver_messages
+     @delivery_lock.synchronize do
+       @buffer.each do |topic, messages|
+         @messages[topic].push(*messages)
+       end
+       @buffer.clear
+     end
+   end
+
    def shutdown
      clear
    end
 
+   def clear_buffer
+     @delivery_lock.synchronize do
+       @buffer.clear
+     end
+   end
+
+   def buffer_size
+     @delivery_lock.synchronize do
+       @buffer.values.flatten.size
+     end
+   end
+
    # Clear all messages stored in memory.
    def clear
-     @messages.clear
+     @delivery_lock.synchronize do
+       @messages.clear
+       @buffer.clear
+     end
    end
 
    # Return all messages written to the specified topic.
    def messages_for(topic)
-     @messages[topic]
+     @delivery_lock.synchronize do
+       # Return a clone so that the list of messages can be traversed
+       # without worrying about a concurrent modification
+       @messages[topic].clone
+     end
    end
  end
end
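The buffering semantics the updated Fake implements (produce into a per-topic buffer under a mutex, then move everything to the delivered messages on `deliver_messages`) can be sketched in isolation. `FakeBuffer` below is a simplified stand-in that stores raw values instead of `FakeMessage` structs:

```ruby
# Simplified stand-in for DeliveryBoy::Fake's buffering: messages are
# produced into a per-topic buffer and only become "delivered" once
# deliver_messages moves them over. All access goes through one Mutex.
class FakeBuffer
  def initialize
    @messages = Hash.new { |h, k| h[k] = [] }
    @buffer   = Hash.new { |h, k| h[k] = [] }
    @lock     = Mutex.new
  end

  def produce(value, topic:)
    @lock.synchronize { @buffer[topic] << value }
    nil
  end

  # Moves every buffered message into the delivered set, per topic.
  def deliver_messages
    @lock.synchronize do
      @buffer.each { |topic, msgs| @messages[topic].push(*msgs) }
      @buffer.clear
    end
  end

  def buffer_size
    @lock.synchronize { @buffer.values.flatten.size }
  end

  def clear_buffer
    @lock.synchronize { @buffer.clear }
  end

  # Clone so callers can iterate without holding the lock.
  def messages_for(topic)
    @lock.synchronize { @messages[topic].clone }
  end
end
```

The clone in `messages_for` is the same design choice the real Fake makes: handing back a snapshot means test code can traverse the list while another thread keeps delivering.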
data/lib/delivery_boy/instance.rb CHANGED
@@ -6,6 +6,7 @@ module DeliveryBoy
    def initialize(config, logger)
      @config = config
      @logger = logger
+     @async_producer = nil
    end
 
    def deliver(value, topic:, **options)
@@ -13,7 +14,7 @@ module DeliveryBoy
      sync_producer.deliver_messages
    rescue
      # Make sure to clear any buffered messages if there's an error.
-     sync_producer.clear_buffer
+     clear_buffer
 
      raise
    end
@@ -27,6 +28,22 @@ module DeliveryBoy
      async_producer.shutdown if async_producer?
    end
 
+   def produce(value, topic:, **options)
+     sync_producer.produce(value, topic: topic, **options)
+   end
+
+   def deliver_messages
+     sync_producer.deliver_messages
+   end
+
+   def clear_buffer
+     sync_producer.clear_buffer
+   end
+
+   def buffer_size
+     sync_producer.buffer_size
+   end
+
    private
 
    attr_reader :config, :logger
@@ -67,7 +84,9 @@ module DeliveryBoy
      ssl_ca_cert_file_path: config.ssl_ca_cert_file_path,
      ssl_client_cert: config.ssl_client_cert,
      ssl_client_cert_key: config.ssl_client_cert_key,
+     ssl_client_cert_key_password: config.ssl_client_cert_key_password,
      ssl_ca_certs_from_system: config.ssl_ca_certs_from_system,
+     ssl_verify_hostname: config.ssl_verify_hostname,
      sasl_gssapi_principal: config.sasl_gssapi_principal,
      sasl_gssapi_keytab: config.sasl_gssapi_keytab,
      sasl_plain_authzid: config.sasl_plain_authzid,
@@ -76,6 +95,8 @@ module DeliveryBoy
      sasl_scram_username: config.sasl_scram_username,
      sasl_scram_password: config.sasl_scram_password,
      sasl_scram_mechanism: config.sasl_scram_mechanism,
+     sasl_over_ssl: config.sasl_over_ssl,
+     sasl_oauth_token_provider: config.sasl_oauth_token_provider
    )
  end
 
@@ -90,6 +111,9 @@ module DeliveryBoy
      max_buffer_bytesize: config.max_buffer_bytesize,
      compression_codec: (config.compression_codec.to_sym if config.compression_codec),
      compression_threshold: config.compression_threshold,
+     idempotent: config.idempotent,
+     transactional: config.transactional,
+     transactional_timeout: config.transactional_timeout,
    }
  end
end
data/lib/delivery_boy/version.rb CHANGED
@@ -1,3 +1,3 @@
  module DeliveryBoy
-   VERSION = "0.2.7"
+   VERSION = "1.1.0"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: delivery_boy
  version: !ruby/object:Gem::Version
-   version: 0.2.7
+   version: 1.1.0
  platform: ruby
  authors:
  - Daniel Schierbeck
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-06-19 00:00:00.000000000 Z
+ date: 2021-01-21 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: ruby-kafka
@@ -16,28 +16,28 @@ dependencies:
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: '0.5'
+         version: '1.0'
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: '0.5'
+         version: '1.0'
  - !ruby/object:Gem::Dependency
    name: king_konf
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: '0.2'
+         version: '1.0'
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: '0.2'
+         version: '1.0'
  - !ruby/object:Gem::Dependency
    name: bundler
    requirement: !ruby/object:Gem::Requirement
@@ -87,6 +87,7 @@ executables: []
  extensions: []
  extra_rdoc_files: []
  files:
+ - ".circleci/config.yml"
  - ".gitignore"
  - ".rspec"
  - ".travis.yml"
@@ -97,7 +98,6 @@ files:
  - Rakefile
  - bin/console
  - bin/setup
- - circle.yml
  - delivery_boy.gemspec
  - examples/async.rb
  - examples/sync.rb
@@ -130,8 +130,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
    - !ruby/object:Gem::Version
      version: '0'
  requirements: []
- rubyforge_project:
- rubygems_version: 2.7.6
+ rubygems_version: 3.1.2
  signing_key:
  specification_version: 4
  summary: A simple way to produce messages to Kafka from Ruby applications
data/circle.yml DELETED
@@ -1,16 +0,0 @@
- machine:
-   pre:
-     - curl -sSL https://s3.amazonaws.com/circle-downloads/install-circleci-docker.sh | bash -s -- 1.10.0
-   services:
-     - docker
-   ruby:
-     version: 2.4.1
-
- dependencies:
-   pre:
-     - docker -v
-     - docker run -p 2181:2181 -p 9092:9092 --env ADVERTISED_HOST=localhost --env ADVERTISED_PORT=9092 -d spotify/kafka
-
- test:
-   override:
-     - bundle exec rspec