delivery_boy 0.2.8.beta1 → 0.2.8

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: a8b6f4609e705c72433240e90bd6c7bdda86f5ca99fdd9faf9d44f229d9f1e7e
-  data.tar.gz: 5db195d68c9eae6afd34a87984ec18e91e99f6976362a83f840abcc4d5de8117
+  metadata.gz: 7d1d10c8e6cf173eeca342252bf3e47327f08200fd6e34207e803d6ef490ead7
+  data.tar.gz: 4e7d841926323cc46aefec6cdf93812796ae2a454a562e4ad0c6d02eb79b229e
 SHA512:
-  metadata.gz: bb243655a65186433f0dde3b36a9c9c7fb6240f68e238afb361b57c5458732e7152745fb8026f67331986cecc7ceb9f1bb2319c4d76ba778141dc088ae412608
-  data.tar.gz: 19eeaf3237e612814226602c671b16f374d8c881da1a8e1095f15ea0785e0c9fff3453d238133f1bfb60d5897e9760b84a8cbe6759c68853f9e2dbd58065811a
+  metadata.gz: 6388bf8cadefd13a20ceabb7e887171e0f6c8dc42cb5f31a5370da7578bb5e5913192a20161d0370af43cc3a88798f74417886657f887e636fec2586dddfd27a
+  data.tar.gz: e1cbea934e040fcea51b9730e0340376aef0f59ede9c8d82199bba3b0adc60c309ef87a86ade3b7a6bb1d9f7136403dfb6d64408693974f4832f85a04ce24500
.circleci/config.yml ADDED
@@ -0,0 +1,33 @@
+version: 2
+jobs:
+  build:
+    docker:
+      - image: circleci/ruby:2.5.1-node
+        environment:
+          LOG_LEVEL: DEBUG
+      - image: wurstmeister/zookeeper
+      - image: wurstmeister/kafka:2.11-2.0.0
+        environment:
+          KAFKA_ADVERTISED_HOST_NAME: localhost
+          KAFKA_ADVERTISED_PORT: 9092
+          KAFKA_PORT: 9092
+          KAFKA_ZOOKEEPER_CONNECT: localhost:2181
+          KAFKA_DELETE_TOPIC_ENABLE: true
+      - image: wurstmeister/kafka:2.11-2.0.0
+        environment:
+          KAFKA_ADVERTISED_HOST_NAME: localhost
+          KAFKA_ADVERTISED_PORT: 9093
+          KAFKA_PORT: 9093
+          KAFKA_ZOOKEEPER_CONNECT: localhost:2181
+          KAFKA_DELETE_TOPIC_ENABLE: true
+      - image: wurstmeister/kafka:2.11-2.0.0
+        environment:
+          KAFKA_ADVERTISED_HOST_NAME: localhost
+          KAFKA_ADVERTISED_PORT: 9094
+          KAFKA_PORT: 9094
+          KAFKA_ZOOKEEPER_CONNECT: localhost:2181
+          KAFKA_DELETE_TOPIC_ENABLE: true
+    steps:
+      - checkout
+      - run: bundle install --path vendor/bundle
+      - run: bundle exec rspec
data/.gitignore CHANGED
@@ -9,4 +9,4 @@
 /tmp/
 
 # rspec failure tracking
-.rspec_status
+.rspec_status
data/CHANGELOG CHANGED
@@ -4,7 +4,10 @@
 
 ## v0.2.8
 
+* Support `log_level` config option.
+* Support `ssl_verify_hostname` in the configuration (#44)
 * Upgrade dependency on KingKonf.
+* Allow configuring `sasl_over_ssl`.
 
 ## v0.2.7
 
data/README.md CHANGED
@@ -59,7 +59,34 @@ end
 
 In addition to improving response time, delivering messages asynchronously also protects your application against Kafka availability issues -- if messages cannot be delivered, they'll be buffered for later and retried automatically.
 
-Both `deliver` and `deliver_async` take the following options:
+A third method is to `produce` messages first, without delivering them to Kafka yet, and then deliver the buffered messages synchronously later.
+
+```ruby
+# app/controllers/comments_controller.rb
+class CommentsController < ApplicationController
+  def create
+    @comment = Comment.create!(params)
+
+    event = {
+      name: "comment_created",
+      data: {
+        comment_id: @comment.id,
+        user_id: current_user.id
+      }
+    }
+
+    # This will queue the two messages in the internal buffer.
+    DeliveryBoy.produce(@comment.to_json, topic: "comments")
+    DeliveryBoy.produce(event.to_json, topic: "activity")
+
+    # This will deliver all messages in the buffer to Kafka.
+    # This call is blocking.
+    DeliveryBoy.deliver_messages
+  end
+end
+```
+
+The methods `deliver`, `deliver_async`, and `produce` take the following options:
 
 * `topic` – the Kafka topic that should be written to (required).
 * `key` – the key that should be set on the Kafka message (optional).
@@ -103,6 +130,10 @@ A list of Kafka brokers that should be used to initialize the client. Defaults t
 
 This is how the client will identify itself to the Kafka brokers. Default is `delivery_boy`.
 
+##### `log_level`
+
+The log level for the logger.
+
 #### Message delivery
 
 ##### `delivery_interval`
@@ -231,17 +262,17 @@ describe PostsController do
   describe "#show" do
     it "emits an event to Kafka" do
      post = Post.create!(body: "hello")
-
+
      get :show, id: post.id
-
+
      # Use this API to extract all messages written to a Kafka topic.
      messages = DeliveryBoy.testing.messages_for("post_views")
-
+
      expect(messages.count).to eq 1
-
+
      # In addition to #value, you can also pull out #key and #partition_key.
      event = JSON.parse(messages.first.value)
-
+
      expect(event["post_id"]).to eq post.id
    end
  end
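The option list above mentions `partition_key`, which deterministically maps a message to a partition. A common scheme (the one ruby-kafka's default partitioner uses, to the best of my knowledge) hashes the key modulo the partition count. This standalone sketch illustrates the idea; `partition_for` is a helper named here for illustration, not the gem's code:

```ruby
require "zlib"

# Deterministic partition assignment: the same partition_key always
# hashes to the same partition, so related messages stay ordered
# within one partition.
def partition_for(partition_key, num_partitions)
  Zlib.crc32(partition_key) % num_partitions
end

# The same key always lands on the same partition:
partition_for("user-42", 3) == partition_for("user-42", 3)  # => true
```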
delivery_boy.gemspec CHANGED
@@ -20,7 +20,7 @@ Gem::Specification.new do |spec|
 
   spec.require_paths = ["lib"]
 
-  spec.add_runtime_dependency "ruby-kafka", "~> 0.5"
+  spec.add_runtime_dependency "ruby-kafka", "~> 0.7.8"
   spec.add_runtime_dependency "king_konf", "~> 0.3"
 
   spec.add_development_dependency "bundler", "~> 1.15"
@@ -47,6 +47,39 @@ module DeliveryBoy
     instance.deliver_async!(value, topic: topic, **options)
   end
 
+  # Like {.produce!}, but handles +Kafka::BufferOverflow+ errors
+  # by logging them and just going on with normal business.
+  #
+  # @return [nil]
+  def produce(value, topic:, **options)
+    produce!(value, topic: topic, **options)
+  rescue Kafka::BufferOverflow
+    logger.error "Message for `#{topic}` dropped due to buffer overflow"
+  end
+
+  # Appends the given message to the producer buffer but does not send it until {.deliver_messages} is called.
+  #
+  # @param value [String] the message value.
+  # @param topic [String] the topic that the message should be written to.
+  # @param key [String, nil] the message key.
+  # @param partition [Integer, nil] the topic partition that the message should
+  #   be written to.
+  # @param partition_key [String, nil] a key used to deterministically assign
+  #   a partition to the message.
+  # @return [nil]
+  # @raise [Kafka::BufferOverflow] if the producer's buffer is full.
+  def produce!(value, topic:, **options)
+    instance.produce(value, topic: topic, **options)
+  end
+
+  # Delivers the items currently in the producer buffer.
+  #
+  # @return [nil]
+  # @raise [Kafka::DeliveryFailed] if delivery failed for some reason.
+  def deliver_messages
+    instance.deliver_messages
+  end
+
   # Shut down DeliveryBoy.
   #
   # Automatically called when the process exits.
@@ -60,7 +93,11 @@ module DeliveryBoy
   #
   # @return [Logger]
   def logger
-    @logger ||= Logger.new($stdout)
+    @logger ||= Logger.new($stdout).tap do |logger|
+      if config.log_level
+        logger.level = Object.const_get("Logger::#{config.log_level.upcase}")
+      end
+    end
   end
 
   attr_writer :logger
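The new `log_level` handling above resolves the configured string to a `Logger` severity constant via `Object.const_get`. A minimal standalone sketch of that mechanism, using only the stdlib logger (`resolve_level` is a helper named here for illustration):

```ruby
require "logger"

# Resolve a config string such as "debug" or "WARN" to the matching
# Logger severity constant (Logger::DEBUG, Logger::WARN, ...).
# An unknown name raises NameError, surfacing a misconfiguration early.
def resolve_level(name)
  Object.const_get("Logger::#{name.upcase}")
end

logger = Logger.new($stdout)
logger.level = resolve_level("debug")
```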
@@ -7,6 +7,7 @@ module DeliveryBoy
     # Basic
     list :brokers, items: :string, sep: ",", default: ["localhost:9092"]
     string :client_id, default: "delivery_boy"
+    string :log_level, default: nil
 
     # Buffering
     integer :max_buffer_bytesize, default: 10_000_000
@@ -35,6 +36,7 @@ module DeliveryBoy
     string :ssl_client_cert, default: nil
     string :ssl_client_cert_key, default: nil
     boolean :ssl_ca_certs_from_system, default: false
+    boolean :ssl_verify_hostname, default: true
 
     # SASL authentication
     string :sasl_gssapi_principal
@@ -45,6 +47,7 @@ module DeliveryBoy
     string :sasl_scram_username
     string :sasl_scram_password
     string :sasl_scram_mechanism
+    boolean :sasl_over_ssl, default: true
 
     # Datadog monitoring
     boolean :datadog_enabled
@@ -10,31 +10,62 @@ module DeliveryBoy
 
     def initialize
       @messages = Hash.new {|h, k| h[k] = [] }
+      @buffer = Hash.new {|h, k| h[k] = [] }
+      @delivery_lock = Mutex.new
     end
 
     def deliver(value, topic:, key: nil, partition: nil, partition_key: nil, create_time: Time.now)
-      offset = @messages[topic].count
-      message = FakeMessage.new(value, topic, key, offset, partition, partition_key, create_time)
+      @delivery_lock.synchronize do
+        offset = @messages[topic].count
+        message = FakeMessage.new(value, topic, key, offset, partition, partition_key, create_time)
 
-      @messages[topic] << message
+        @messages[topic] << message
+      end
 
       nil
     end
 
     alias deliver_async! deliver
 
+    def produce(value, topic:, key: nil, partition: nil, partition_key: nil, create_time: Time.now)
+      @delivery_lock.synchronize do
+        offset = @buffer[topic].count
+        message = FakeMessage.new(value, topic, key, offset, partition, partition_key, create_time)
+
+        @buffer[topic] << message
+      end
+
+      nil
+    end
+
+    def deliver_messages
+      @delivery_lock.synchronize do
+        @buffer.each do |topic, messages|
+          @messages[topic].push(*messages)
+        end
+        @buffer.clear
+      end
+    end
+
     def shutdown
       clear
     end
 
     # Clear all messages stored in memory.
     def clear
-      @messages.clear
+      @delivery_lock.synchronize do
+        @messages.clear
+        @buffer.clear
+      end
     end
 
     # Return all messages written to the specified topic.
     def messages_for(topic)
-      @messages[topic]
+      @delivery_lock.synchronize do
+        # Return a clone so that the list of messages can be traversed
+        # without worrying about a concurrent modification
+        @messages[topic].clone
+      end
    end
  end
 end
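The test-broker change above introduces a mutex-guarded two-stage buffer: `produce` appends to a per-topic staging buffer, and `deliver_messages` atomically moves everything staged into the delivered set. This standalone sketch mirrors that behavior with plain Ruby (the `BufferedBroker` class is an illustration, not the gem's actual class):

```ruby
# Two-stage message store: produced messages stay invisible to readers
# until deliver_messages moves them, all under one lock so concurrent
# producers and readers never see a half-delivered batch.
class BufferedBroker
  def initialize
    @messages = Hash.new { |h, k| h[k] = [] }
    @buffer = Hash.new { |h, k| h[k] = [] }
    @lock = Mutex.new
  end

  def produce(value, topic:)
    @lock.synchronize { @buffer[topic] << value }
    nil
  end

  def deliver_messages
    @lock.synchronize do
      @buffer.each { |topic, msgs| @messages[topic].push(*msgs) }
      @buffer.clear
    end
  end

  def messages_for(topic)
    # Clone so callers can iterate without holding the lock.
    @lock.synchronize { @messages[topic].clone }
  end
end

broker = BufferedBroker.new
broker.produce("hello", topic: "greetings")
broker.messages_for("greetings")  # => [] until delivered
broker.deliver_messages
broker.messages_for("greetings")  # => ["hello"]
```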
@@ -6,6 +6,7 @@ module DeliveryBoy
     def initialize(config, logger)
       @config = config
       @logger = logger
+      @async_producer = nil
     end
 
     def deliver(value, topic:, **options)
@@ -27,6 +28,14 @@ module DeliveryBoy
       async_producer.shutdown if async_producer?
     end
 
+    def produce(value, topic:, **options)
+      sync_producer.produce(value, topic: topic, **options)
+    end
+
+    def deliver_messages
+      sync_producer.deliver_messages
+    end
+
     private
 
     attr_reader :config, :logger
@@ -68,6 +77,7 @@ module DeliveryBoy
       ssl_client_cert: config.ssl_client_cert,
       ssl_client_cert_key: config.ssl_client_cert_key,
       ssl_ca_certs_from_system: config.ssl_ca_certs_from_system,
+      ssl_verify_hostname: config.ssl_verify_hostname,
       sasl_gssapi_principal: config.sasl_gssapi_principal,
       sasl_gssapi_keytab: config.sasl_gssapi_keytab,
       sasl_plain_authzid: config.sasl_plain_authzid,
@@ -76,6 +86,7 @@ module DeliveryBoy
       sasl_scram_username: config.sasl_scram_username,
       sasl_scram_password: config.sasl_scram_password,
       sasl_scram_mechanism: config.sasl_scram_mechanism,
+      sasl_over_ssl: config.sasl_over_ssl
     )
   end
 
 
@@ -1,3 +1,3 @@
 module DeliveryBoy
-  VERSION = "0.2.8.beta1"
+  VERSION = "0.2.8"
 end
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: delivery_boy
 version: !ruby/object:Gem::Version
-  version: 0.2.8.beta1
+  version: 0.2.8
 platform: ruby
 authors:
 - Daniel Schierbeck
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2019-01-21 00:00:00.000000000 Z
+date: 2019-07-31 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: ruby-kafka
@@ -16,14 +16,14 @@ dependencies:
   requirements:
   - - "~>"
     - !ruby/object:Gem::Version
-      version: '0.5'
+      version: 0.7.8
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '0.5'
+        version: 0.7.8
 - !ruby/object:Gem::Dependency
   name: king_konf
   requirement: !ruby/object:Gem::Requirement
@@ -87,6 +87,7 @@ executables: []
 extensions: []
 extra_rdoc_files: []
 files:
+- ".circleci/config.yml"
 - ".gitignore"
 - ".rspec"
 - ".travis.yml"
@@ -97,7 +98,6 @@ files:
 - Rakefile
 - bin/console
 - bin/setup
-- circle.yml
 - delivery_boy.gemspec
 - examples/async.rb
 - examples/sync.rb
@@ -126,12 +126,11 @@ required_ruby_version: !ruby/object:Gem::Requirement
     version: '0'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - ">"
+  - - ">="
     - !ruby/object:Gem::Version
-      version: 1.3.1
+      version: '0'
 requirements: []
-rubyforge_project:
-rubygems_version: 2.7.6
+rubygems_version: 3.0.3
 signing_key:
 specification_version: 4
 summary: A simple way to produce messages to Kafka from Ruby applications
data/circle.yml DELETED
@@ -1,16 +0,0 @@
-machine:
-  pre:
-    - curl -sSL https://s3.amazonaws.com/circle-downloads/install-circleci-docker.sh | bash -s -- 1.10.0
-  services:
-    - docker
-  ruby:
-    version: 2.4.1
-
-dependencies:
-  pre:
-    - docker -v
-    - docker run -p 2181:2181 -p 9092:9092 --env ADVERTISED_HOST=localhost --env ADVERTISED_PORT=9092 -d spotify/kafka
-
-test:
-  override:
-    - bundle exec rspec