deimos-ruby 1.0.0 → 1.1.0.pre.beta1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 65652367bda550248f06a89305f831f4edeaef1cdd1fc2f87f21b80b5c124adf
-  data.tar.gz: b58ca940506703e6c1f39edd1a8b9c058d711fa47925696e799593e7d741ce7d
+  metadata.gz: b9cb2755df1f6588b130a9171d427a54301e7e5258d6f414d8e9950393b0a361
+  data.tar.gz: dc475fcb4e450a693501b6c3a148c93dec90e0c40212e40de991e2aa7b8afb4f
 SHA512:
-  metadata.gz: b646a55133358dfe0e188cdc47338610b327eb5bd4a56cfb1e329d73f1798c00f85ac9b07eb6c675fceee1b0dedecd9f5d2c1028c10bc5d270856d33ba55a271
-  data.tar.gz: 8c4d4557d4a1a4aebfc9d55159c68c26864a87d065e68b14975f1843318851b024cc711ee12ed85ba1b0dcc1a14d7cedf80eba2be5961fdcd0d33dae3aa7141b
+  metadata.gz: 8b7594655e6051986ddc55efc7a5df32fafd3e445324b8d2592eeb406cbf607cf401b6c6ab9c4d7c76d6d8cfa3c8890d47400231dec15f330dbffb0341644a59
+  data.tar.gz: bc909ec5cb051ee58e6f7840a1d191e9881f72e3442d9eba35be14e28fada2ff070f318fd5d73d46eca3bdc05f7ed05e0b0ac4d8a98f8ff4f29377854dd129ca
data/.rubocop.yml CHANGED
@@ -179,6 +179,9 @@ Style/GuardClause:
 Style/HashSyntax:
   EnforcedStyle: ruby19_no_mixed_keys
 
+Style/IfUnlessModifier:
+  Enabled: false
+
 # Allow the following:
 # var x = "foo" +
 #         "bar"
data/CHANGELOG.md CHANGED
@@ -7,6 +7,9 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## UNRELEASED
 
+## [1.1.0-beta1] - 2019-09-10
+- Added BatchConsumer.
+
 ## [1.0.0] - 2019-09-03
 - Official release of Deimos 1.0!
 
data/Gemfile.lock CHANGED
@@ -1,10 +1,10 @@
 PATH
   remote: .
   specs:
-    deimos-ruby (1.0.0)
+    deimos-ruby (1.1.0.pre.beta1)
       avro-patches (~> 0.3)
       avro_turf (~> 0.8)
-      phobos (~> 1.8)
+      phobos (~> 1.8.2.pre.beta2)
       ruby-kafka (~> 0.7)
 
 GEM
@@ -87,7 +87,7 @@ GEM
     parser (2.6.3.0)
       ast (~> 2.4.0)
     pg (1.1.3)
-    phobos (1.8.1)
+    phobos (1.8.2.pre.beta2)
       activesupport (>= 3.0.0)
       concurrent-ruby (>= 1.0.2)
       concurrent-ruby-ext (>= 1.0.2)
data/README.md CHANGED
@@ -21,6 +21,7 @@ Built on Phobos and hence Ruby-Kafka.
 * [Kafka Message Keys](#kafka-message-keys)
 * [Consumers](#consumers)
 * [Rails Integration](#rails-integration)
+* [Database Backend](#database-backend)
 * [Running Consumers](#running-consumers)
 * [Metrics](#metrics)
 * [Testing](#testing)
@@ -335,8 +336,53 @@ class MyConsumer < Deimos::Consumer
   def consume(payload, metadata)
     # Same method as Phobos consumers.
     # payload is an Avro-decoded hash.
-    # Metadata is a hash that contains information like :key and :topic. Both
-    # key (if configured) and payload will be Avro-decoded.
+    # metadata is a hash that contains information like :key and :topic.
+    # In general, your key should be included in the payload itself. However,
+    # if you need to access it separately from the payload, you can use
+    # metadata[:key]
+  end
+end
+```
+
+### Batch Consumption
+
+Instead of consuming messages one at a time, consumers can receive a batch of
+messages as an array and then process them together. This can improve
+consumer throughput, depending on the use case. Batch consumers behave like
+other consumers with regard to key and payload decoding, etc.
+
+To enable batch consumption, create a listener in `phobos.yml` and ensure that
+the `delivery` property is set to `inline_batch`. For example:
+
+```yaml
+listeners:
+  - handler: Consumers::MyBatchConsumer
+    topic: my_batched_topic
+    group_id: my_group_id
+    delivery: inline_batch
+```
+
+Batch consumers must inherit from the Deimos::BatchConsumer class as in
+this sample:
+
+```ruby
+class MyBatchConsumer < Deimos::BatchConsumer
+
+  # See the Consumer sample in the previous section
+  schema 'MySchema'
+  namespace 'com.my-namespace'
+  key_config field: :my_id
+
+  def consume_batch(payloads, metadata)
+    # payloads is an array of Avro-decoded hashes.
+    # metadata is a hash that contains information like :keys and :topic.
+    # Keys are automatically decoded and available as an array with
+    # the same cardinality as the payloads. If you need to iterate
+    # over payloads and keys together, you can use something like this:
+
+    payloads.zip(metadata[:keys]) do |_payload, _key|
+      # Do something
+    end
   end
 end
 ```
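The `payloads.zip(metadata[:keys])` pattern in the sample above pairs each payload with its decoded key by index. A minimal plain-Ruby sketch of that pairing, using illustrative values rather than anything produced by Deimos:

```ruby
# Payloads and metadata shaped like what consume_batch receives (sample data).
payloads = [
  { 'test_id' => 'foo', 'some_int' => 123 },
  { 'test_id' => 'bar', 'some_int' => 456 }
]
metadata = { keys: %w(foo bar), topic: 'my_batched_topic' }

# zip preserves cardinality, so payload i lines up with key i.
pairs = payloads.zip(metadata[:keys]).map do |payload, key|
  "#{key}:#{payload['some_int']}"
end
# pairs is now ['foo:123', 'bar:456'] -- each key aligned with its payload.
```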
@@ -454,7 +500,7 @@ class Widget < ActiveRecord::Base
 end
 ```
 
-#### Database Backend
+# Database Backend
 
 Deimos provides a way to allow Kafka messages to be created inside a
 database transaction, and send them asynchronously. This ensures that your
@@ -479,6 +525,7 @@ of immediately sending to Kafka. Now, you just need to call
 
     Deimos.start_db_backend!
 
+You can do this inside a thread or fork block.
 If using Rails, you can use a Rake task to do this:
 
     rails deimos:db_producer
@@ -495,6 +542,16 @@ If you want to force a message to send immediately, just call the `publish_list`
 method with `force_send: true`. You can also pass `force_send` into any of the
 other methods that publish events, like `send_event` in `ActiveRecordProducer`.
 
+A couple of gotchas when using this feature:
+* This may result in high throughput depending on your scale. If you're
+  using Rails < 5.1, you should add a migration to change the `id` column
+  to `BIGINT`. Rails >= 5.1 sets it to `BIGINT` by default.
+* This table is high throughput but should generally be empty. Make sure
+  you optimize/vacuum this table regularly to reclaim the disk space.
+* Currently, threads allow you to scale the *number* of topics but not
+  a single large topic with lots of messages. There is an [issue](https://github.com/flipp-oss/deimos/issues/23)
+  opened that would help with this case.
+
 For more information on how the database backend works and why it was
 implemented, please see [Database Backends](docs/DATABASE_BACKEND.md).
 
@@ -562,7 +619,17 @@ The following metrics are reported:
     * `status:success`
     * `status:error`
     * `time:consume` (histogram)
+      * Amount of time spent executing handler for each message
+  * Batch Consumers - report counts by number of batches
+    * `status:batch_received`
+    * `status:batch_success`
+    * `status:batch_error`
+    * `time:consume_batch` (histogram)
+      * Amount of time spent executing handler for entire batch
     * `time:time_delayed` (histogram)
+      * Indicates the amount of time between the `timestamp` property of each
+        payload (if present) and the time that the consumer started processing
+        the message.
 * `publish` - a count of the number of messages received. Tagged
   with `topic:{topic_name}`
 * `publish_error` - a count of the number of messages which failed
@@ -655,6 +722,15 @@ test_consume_message(MyConsumer,
 test_consume_invalid_message(MyConsumer,
                              { 'some-invalid-payload' => 'some-value' })
 
+# For batch consumers, there are similar methods such as:
+test_consume_batch(MyBatchConsumer,
+                   [{ 'some-payload' => 'some-value' },
+                    { 'some-payload' => 'some-other-value' }]) do |payloads, metadata|
+  # Expectations here
+end
+
+## Producing
+
 # A matcher which allows you to test that a message was sent on the given
 # topic, without having to know which class produced it.
 expect(topic_name).to have_sent(payload, key=nil)
data/deimos-ruby.gemspec CHANGED
@@ -20,7 +20,7 @@ Gem::Specification.new do |spec|
 
   spec.add_runtime_dependency('avro-patches', '~> 0.3')
   spec.add_runtime_dependency('avro_turf', '~> 0.8')
-  spec.add_runtime_dependency('phobos', '~> 1.8')
+  spec.add_runtime_dependency('phobos', '~> 1.8.2.pre.beta2')
   spec.add_runtime_dependency('ruby-kafka', '~> 0.7')
 
   spec.add_development_dependency('activerecord', '~> 5.2')
data/lib/deimos/base_consumer.rb ADDED
@@ -0,0 +1,93 @@
+# frozen_string_literal: true
+
+module Deimos
+  # Shared methods for Kafka Consumers
+  class BaseConsumer
+    include SharedConfig
+
+    class << self
+      # @return [AvroDataDecoder]
+      def decoder
+        @decoder ||= AvroDataDecoder.new(schema: config[:schema],
+                                         namespace: config[:namespace])
+      end
+
+      # @return [AvroDataDecoder]
+      def key_decoder
+        @key_decoder ||= AvroDataDecoder.new(schema: config[:key_schema],
+                                             namespace: config[:namespace])
+      end
+    end
+
+    # Helper method to decode an Avro-encoded key.
+    # @param key [String]
+    # @return [Object] the decoded key.
+    def decode_key(key)
+      return nil if key.nil?
+
+      config = self.class.config
+      if config[:encode_key] && config[:key_field].nil? &&
+         config[:key_schema].nil?
+        raise 'No key config given - if you are not decoding keys, please use '\
+              '`key_config plain: true`'
+      end
+
+      if config[:key_field]
+        self.class.decoder.decode_key(key, config[:key_field])
+      elsif config[:key_schema]
+        self.class.key_decoder.decode(key, schema: config[:key_schema])
+      else # no encoding
+        key
+      end
+    end
+
+    protected
+
+    # @param payload [Hash|String]
+    # @param metadata [Hash]
+    def _with_error_span(payload, metadata)
+      @span = Deimos.config.tracer&.start(
+        'deimos-consumer',
+        resource: self.class.name.gsub('::', '-')
+      )
+      yield
+    rescue StandardError => e
+      _handle_error(e, payload, metadata)
+    ensure
+      Deimos.config.tracer&.finish(@span)
+    end
+
+    def _report_time_delayed(payload, metadata)
+      return if payload.nil? || payload['timestamp'].blank?
+
+      begin
+        time_delayed = Time.now.in_time_zone - payload['timestamp'].to_datetime
+      rescue ArgumentError
+        Deimos.config.logger.info(
+          message: "Error parsing timestamp! #{payload['timestamp']}"
+        )
+        return
+      end
+      Deimos.config.metrics&.histogram('handler', time_delayed, tags: %W(
+        time:time_delayed
+        topic:#{metadata[:topic]}
+      ))
+    end
+
+    # @param exception [Throwable]
+    # @param _payload [Hash]
+    # @param _metadata [Hash]
+    def _handle_error(exception, _payload, _metadata)
+      Deimos.config.tracer&.set_error(@span, exception)
+
+      raise if Deimos.config.reraise_consumer_errors
+    end
+
+    # @param _time_taken [Float]
+    # @param _payload [Hash]
+    # @param _metadata [Hash]
+    def _handle_success(_time_taken, _payload, _metadata)
+      raise NotImplementedError
+    end
+  end
+end
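`_report_time_delayed` above subtracts the payload's `timestamp` from the current time and reports the difference as a histogram. A rough standalone sketch of that subtraction using only the Ruby stdlib (Deimos itself uses ActiveSupport's `in_time_zone`/`to_datetime`; the payload here is illustrative):

```ruby
require 'time'

# A payload whose 'timestamp' field says it was produced two minutes ago.
payload = { 'timestamp' => (Time.now - 120).iso8601 }

# Seconds between now and the producer-supplied timestamp; this is the
# value the `time:time_delayed` histogram would receive (roughly 120 here).
time_delayed = Time.now - Time.parse(payload['timestamp'])
```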
data/lib/deimos/batch_consumer.rb ADDED
@@ -0,0 +1,147 @@
+# frozen_string_literal: true
+
+require 'deimos/avro_data_decoder'
+require 'deimos/base_consumer'
+require 'phobos/batch_handler'
+
+module Deimos
+  # Class to consume batches of messages in a topic
+  # Note: According to the docs, instances of your handler will be created
+  # for every incoming batch of messages. This class should be lightweight.
+  class BatchConsumer < BaseConsumer
+    include Phobos::BatchHandler
+
+    # :nodoc:
+    def around_consume_batch(payloads, metadata)
+      _received_batch(payloads, metadata)
+      benchmark = Benchmark.measure do
+        _with_error_span(payloads, metadata) { yield }
+      end
+      _handle_success(benchmark.real, payloads, metadata)
+    end
+
+    # :nodoc:
+    def before_consume_batch(batch, metadata)
+      _with_error_span(batch, metadata) do
+        metadata[:keys] = batch.map do |message|
+          decode_key(message.key)
+        end
+
+        batch.map do |message|
+          self.class.decoder.decode(message.payload) if message.payload.present?
+        end
+      end
+    end
+
+    # Consume a batch of incoming messages.
+    # @param _payloads [Array<Phobos::BatchMessage>]
+    # @param _metadata [Hash]
+    def consume_batch(_payloads, _metadata)
+      raise NotImplementedError
+    end
+
+    protected
+
+    def _received_batch(payloads, metadata)
+      Deimos.config.logger.info(
+        message: 'Got Kafka batch event',
+        message_ids: _payload_identifiers(payloads, metadata),
+        metadata: metadata.except(:keys)
+      )
+      Deimos.config.logger.debug(
+        message: 'Kafka batch event payloads',
+        payloads: payloads
+      )
+      Deimos.config.metrics&.increment(
+        'handler',
+        tags: %W(
+          status:batch_received
+          topic:#{metadata[:topic]}
+        ))
+      Deimos.config.metrics&.increment(
+        'handler',
+        by: metadata['batch_size'],
+        tags: %W(
+          status:received
+          topic:#{metadata[:topic]}
+        ))
+      if payloads.present?
+        payloads.each { |payload| _report_time_delayed(payload, metadata) }
+      end
+    end
+
+    # @param exception [Throwable]
+    # @param payloads [Array<Hash>]
+    # @param metadata [Hash]
+    def _handle_error(exception, payloads, metadata)
+      Deimos.config.metrics&.increment(
+        'handler',
+        tags: %W(
+          status:batch_error
+          topic:#{metadata[:topic]}
+        ))
+      Deimos.config.logger.warn(
+        message: 'Error consuming message batch',
+        handler: self.class.name,
+        metadata: metadata.except(:keys),
+        message_ids: _payload_identifiers(payloads, metadata),
+        error_message: exception.message,
+        error: exception.backtrace
+      )
+      super
+    end
+
+    # @param time_taken [Float]
+    # @param payloads [Array<Hash>]
+    # @param metadata [Hash]
+    def _handle_success(time_taken, payloads, metadata)
+      Deimos.config.metrics&.histogram('handler', time_taken, tags: %W(
+        time:consume_batch
+        topic:#{metadata[:topic]}
+      ))
+      Deimos.config.metrics&.increment(
+        'handler',
+        tags: %W(
+          status:batch_success
+          topic:#{metadata[:topic]}
+        ))
+      Deimos.config.metrics&.increment(
+        'handler',
+        by: metadata['batch_size'],
+        tags: %W(
+          status:success
+          topic:#{metadata[:topic]}
+        ))
+      Deimos.config.logger.info(
+        message: 'Finished processing Kafka batch event',
+        message_ids: _payload_identifiers(payloads, metadata),
+        time_elapsed: time_taken,
+        metadata: metadata.except(:keys)
+      )
+    end
+
+    # Get payload identifiers (key and message_id if present) for logging.
+    # @param payloads [Array<Hash>]
+    # @param metadata [Hash]
+    # @return [Array<Hash>] the identifiers.
+    def _payload_identifiers(payloads, metadata)
+      message_ids = payloads&.map do |payload|
+        if payload.is_a?(Hash) && payload.key?('message_id')
+          payload['message_id']
+        end
+      end
+
+      # Payloads may be nil if preprocessing failed
+      messages = payloads || metadata[:keys] || []
+
+      messages.zip(metadata[:keys] || [], message_ids || []).map do |_, k, m_id|
+        ids = {}
+
+        ids[:key] = k if k.present?
+        ids[:message_id] = m_id if m_id.present?
+
+        ids
+      end
+    end
+  end
+end
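`_payload_identifiers` above zips three same-length arrays (messages, keys, message IDs) and keeps only the identifiers that are present. The core pattern in isolation, with illustrative values and plain truthiness standing in for ActiveSupport's `present?`:

```ruby
# Sample data: the second payload has no message_id.
payloads    = [{ 'message_id' => 'one' }, { 'data' => 'no id here' }]
keys        = %w(key1 key2)
message_ids = payloads.map { |p| p['message_id'] }

# zip of three arrays yields [payload, key, message_id] triples by index;
# absent values (nil) are simply left out of the resulting hash.
identifiers = payloads.zip(keys, message_ids).map do |_payload, key, m_id|
  ids = {}
  ids[:key] = key if key
  ids[:message_id] = m_id if m_id
  ids
end
# identifiers pairs each message's key with its message_id when one exists.
```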
data/lib/deimos/consumer.rb CHANGED
@@ -1,6 +1,7 @@
 # frozen_string_literal: true
 
 require 'deimos/avro_data_decoder'
+require 'deimos/base_consumer'
 require 'deimos/shared_config'
 require 'phobos/handler'
 require 'active_support/all'
@@ -11,23 +12,8 @@ require 'ddtrace'
 # for every incoming message. This class should be lightweight.
 module Deimos
   # Parent consumer class.
-  class Consumer
+  class Consumer < BaseConsumer
     include Phobos::Handler
-    include SharedConfig
-
-    class << self
-      # @return [AvroDataEncoder]
-      def decoder
-        @decoder ||= AvroDataDecoder.new(schema: config[:schema],
-                                         namespace: config[:namespace])
-      end
-
-      # @return [AvroDataEncoder]
-      def key_decoder
-        @key_decoder ||= AvroDataDecoder.new(schema: config[:key_schema],
-                                             namespace: config[:namespace])
-      end
-    end
 
     # :nodoc:
     def around_consume(payload, metadata)
@@ -41,34 +27,11 @@ module Deimos
     # :nodoc:
     def before_consume(payload, metadata)
       _with_error_span(payload, metadata) do
-        if self.class.config[:key_schema] || self.class.config[:key_field]
-          metadata[:key] = decode_key(metadata[:key])
-        end
+        metadata[:key] = decode_key(metadata[:key])
         self.class.decoder.decode(payload) if payload.present?
       end
     end
 
-    # Helper method to decode an Avro-encoded key.
-    # @param key [String]
-    # @return [Object] the decoded key.
-    def decode_key(key)
-      return nil if key.nil?
-
-      config = self.class.config
-      if config[:encode_key] && config[:key_field].nil? &&
-         config[:key_schema].nil?
-        raise 'No key config given - if you are not decoding keys, please use `key_config plain: true`'
-      end
-
-      if config[:key_field]
-        self.class.decoder.decode_key(key, config[:key_field])
-      elsif config[:key_schema]
-        self.class.key_decoder.decode(key, schema: config[:key_schema])
-      else # no encoding
-        key
-      end
-    end
-
     # Consume incoming messages.
     # @param _payload [String]
     # @param _metadata [Hash]
@@ -78,20 +41,6 @@ module Deimos
 
     private
 
-    # @param payload [Hash|String]
-    # @param metadata [Hash]
-    def _with_error_span(payload, metadata)
-      @span = Deimos.config.tracer&.start(
-        'deimos-consumer',
-        resource: self.class.name.gsub('::', '-')
-      )
-      yield
-    rescue StandardError => e
-      _handle_error(e, payload, metadata)
-    ensure
-      Deimos.config.tracer&.finish(@span)
-    end
-
     def _received_message(payload, metadata)
       Deimos.config.logger.info(
         message: 'Got Kafka event',
@@ -102,13 +51,13 @@ module Deimos
           status:received
          topic:#{metadata[:topic]}
        ))
+      _report_time_delayed(payload, metadata)
     end
 
     # @param exception [Throwable]
     # @param payload [Hash]
     # @param metadata [Hash]
     def _handle_error(exception, payload, metadata)
-      Deimos.config.tracer&.set_error(@span, exception)
       Deimos.config.metrics&.increment(
         'handler',
         tags: %W(
@@ -124,7 +73,7 @@ module Deimos
         error_message: exception.message,
         error: exception.backtrace
       )
-      raise if Deimos.config.reraise_consumer_errors
+      super
     end
 
     # @param time_taken [Float]
@@ -145,20 +94,6 @@ module Deimos
         time_elapsed: time_taken,
         metadata: metadata
       )
-      return if payload.nil? || payload['timestamp'].blank?
-
-      begin
-        time_delayed = Time.now.in_time_zone - payload['timestamp'].to_datetime
-      rescue ArgumentError
-        Deimos.config.logger.info(
-          message: "Error parsing timestamp! #{payload['timestamp']}"
-        )
-        return
-      end
-      Deimos.config.metrics&.histogram('handler', time_delayed, tags: %W(
-        time:time_delayed
-        topic:#{metadata[:topic]}
-      ))
     end
   end
 end
data/lib/deimos/test_helpers.rb CHANGED
@@ -91,7 +91,7 @@ module Deimos
     end
   end
 
-  # Stub all already-loaded producers and consumers fir unit testing purposes.
+  # Stub all already-loaded producers and consumers for unit testing purposes.
   def stub_producers_and_consumers!
     Deimos::TestHelpers.sent_messages.clear
 
@@ -106,10 +106,17 @@ module Deimos
     end
 
     Deimos::Consumer.descendants.each do |klass|
+      # TODO: remove this when ActiveRecordConsumer uses batching
       next if klass == Deimos::ActiveRecordConsumer # "abstract" class
 
       stub_consumer(klass)
     end
+
+    Deimos::BatchConsumer.descendants.each do |klass|
+      next if klass == Deimos::ActiveRecordConsumer # "abstract" class
+
+      stub_batch_consumer(klass)
+    end
   end
 
   # Stub a given producer class.
@@ -126,9 +133,7 @@ module Deimos
   # Stub a given consumer class.
   # @param klass [Class < Deimos::Consumer]
   def stub_consumer(klass)
-    allow(klass).to receive(:decoder) do
-      create_decoder(klass.config[:schema], klass.config[:namespace])
-    end
+    _stub_base_consumer(klass)
     klass.class_eval do
       alias_method(:old_consume, :consume) unless self.instance_methods.include?(:old_consume)
     end
@@ -138,6 +143,12 @@ module Deimos
     end
   end
 
+  # Stub a given batch consumer class.
+  # @param klass [Class < Deimos::BatchConsumer]
+  def stub_batch_consumer(klass)
+    _stub_base_consumer(klass)
+  end
+
   # get the difference of 2 hashes.
   # @param hash1 [Hash]
   # @param hash2 [Hash]
@@ -314,6 +325,102 @@ module Deimos
     }.to raise_error(Avro::SchemaValidator::ValidationError)
   end
 
+  # Test that a given handler will consume a given batch payload correctly,
+  # i.e. that the Avro schema is correct. If
+  # a block is given, that block will be executed when `consume` is called.
+  # Otherwise it will just confirm that `consume` is called at all.
+  # @param handler_class_or_topic [Class|String] Class which inherits from
+  # Deimos::Consumer or the topic as a string
+  # @param payloads [Array<Hash>] the payload to consume
+  def test_consume_batch(handler_class_or_topic,
+                         payloads,
+                         keys: [],
+                         partition_keys: [],
+                         call_original: false,
+                         skip_expectation: false,
+                         &block)
+    if call_original && block_given?
+      raise 'Cannot have both call_original and be given a block!'
+    end
+
+    topic_name = 'my-topic'
+    handler_class = if handler_class_or_topic.is_a?(String)
+                      _get_handler_class_from_topic(handler_class_or_topic)
+                    else
+                      handler_class_or_topic
+                    end
+    handler = handler_class.new
+    allow(handler_class).to receive(:new).and_return(handler)
+    listener = double('listener',
+                      handler_class: handler_class,
+                      encoding: nil)
+    batch_messages = payloads.zip(keys, partition_keys).map do |payload, key, partition_key|
+      key ||= _key_from_consumer(handler_class)
+
+      double('message',
+             'key' => key,
+             'partition_key' => partition_key,
+             'partition' => 1,
+             'offset' => 1,
+             'value' => payload)
+    end
+    batch = double('fetched_batch',
+                   'messages' => batch_messages,
+                   'topic' => topic_name,
+                   'partition' => 1,
+                   'offset_lag' => 0)
+    unless skip_expectation
+      expectation = expect(handler).to receive(:consume_batch).
+        with(payloads, anything, &block)
+      expectation.and_call_original if call_original
+    end
+    action = Phobos::Actions::ProcessBatchInline.new(
+      listener: listener,
+      batch: batch,
+      metadata: { topic: topic_name }
+    )
+    allow(action).to receive(:backoff_interval).and_return(0)
+    allow(action).to receive(:handle_error) { |e| raise e }
+    action.send(:execute)
+  end
+
+  # Check to see that a given message will fail due to Avro errors.
+  # @param handler_class [Class]
+  # @param payloads [Array<Hash>]
+  def test_consume_batch_invalid_message(handler_class, payloads)
+    topic_name = 'my-topic'
+    handler = handler_class.new
+    allow(handler_class).to receive(:new).and_return(handler)
+    listener = double('listener',
+                      handler_class: handler_class,
+                      encoding: nil)
+    batch_messages = payloads.map do |payload|
+      key ||= _key_from_consumer(handler_class)
+
+      double('message',
+             'key' => key,
+             'partition' => 1,
+             'offset' => 1,
+             'value' => payload)
+    end
+    batch = double('fetched_batch',
+                   'messages' => batch_messages,
+                   'topic' => topic_name,
+                   'partition' => 1,
+                   'offset_lag' => 0)
+
+    action = Phobos::Actions::ProcessBatchInline.new(
+      listener: listener,
+      batch: batch,
+      metadata: { topic: topic_name }
+    )
+    allow(action).to receive(:backoff_interval).and_return(0)
+    allow(action).to receive(:handle_error) { |e| raise e }
+
+    expect { action.send(:execute) }.
+      to raise_error(Avro::SchemaValidator::ValidationError)
+  end
+
   # @param schema1 [String|Hash] a file path, JSON string, or
   # hash representing a schema.
   # @param schema2 [String|Hash] a file path, JSON string, or
@@ -352,5 +459,13 @@ module Deimos
 
     handler.handler.constantize
   end
+
+  # Stub shared methods between consumers/batch consumers
+  # @param [Class < Deimos::BaseConsumer] klass Consumer class to stub
+  def _stub_base_consumer(klass)
+    allow(klass).to receive(:decoder) do
+      create_decoder(klass.config[:schema], klass.config[:namespace])
+    end
+  end
 end
data/lib/deimos/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module Deimos
-  VERSION = '1.0.0'
+  VERSION = '1.1.0-beta1'
 end
data/lib/deimos.rb CHANGED
@@ -10,6 +10,7 @@ require 'deimos/producer'
 require 'deimos/active_record_producer'
 require 'deimos/active_record_consumer'
 require 'deimos/consumer'
+require 'deimos/batch_consumer'
 require 'deimos/configuration'
 require 'deimos/instrumentation'
 require 'deimos/utils/lag_reporter'
@@ -58,6 +59,8 @@ module Deimos
       configure_loggers(phobos_config)
 
       Phobos.configure(phobos_config)
+
+      validate_consumers
     end
 
     validate_db_backend if self.config.publish_backend == :db
@@ -78,7 +81,7 @@ module Deimos
     # Start the DB producers to send Kafka messages.
     # @param thread_count [Integer] the number of threads to start.
     def start_db_backend!(thread_count: 1)
-      if self.config.publish_backend != :db # rubocop:disable Style/IfUnlessModifier
+      if self.config.publish_backend != :db
        raise('Publish backend is not set to :db, exiting')
       end
 
@@ -119,6 +122,28 @@ module Deimos
     def ssl_var_contents(filename)
       File.exist?(filename) ? File.read(filename) : filename
     end
+
+    # Validate that consumers are configured correctly, including their
+    # delivery mode.
+    def validate_consumers
+      Phobos.config.listeners.each do |listener|
+        handler_class = listener.handler.constantize
+        delivery = listener.delivery
+
+        # Validate that Deimos consumers use proper delivery configs
+        if handler_class < Deimos::BatchConsumer
+          unless delivery == 'inline_batch'
+            raise "BatchConsumer #{listener.handler} must have delivery set to"\
+              ' `inline_batch`'
+          end
+        elsif handler_class < Deimos::Consumer
+          if delivery.present? && !%w(message batch).include?(delivery)
+            raise "Non-batch Consumer #{listener.handler} must have delivery"\
+              ' set to `message` or `batch`'
+          end
+        end
+      end
+    end
   end
 end
 
@@ -0,0 +1,247 @@
+# frozen_string_literal: true
+
+require 'deimos/batch_consumer'
+
+# :nodoc:
+module ConsumerTest
+  describe Deimos::BatchConsumer do
+
+    prepend_before(:each) do
+      # :nodoc:
+      consumer_class = Class.new(Deimos::BatchConsumer) do
+        schema 'MySchema'
+        namespace 'com.my-namespace'
+        key_config field: 'test_id'
+
+        # :nodoc:
+        def consume_batch(_payloads, _metadata)
+          raise 'This should not be called unless call_original is set'
+        end
+      end
+      stub_const('ConsumerTest::MyBatchConsumer', consumer_class)
+    end
+
+    let(:batch) do
+      [
+        { 'test_id' => 'foo', 'some_int' => 123 },
+        { 'test_id' => 'bar', 'some_int' => 456 }
+      ]
+    end
+
+    let(:invalid_payloads) do
+      batch.concat([{ 'invalid' => 'key' }])
+    end
+
+    it 'should consume a batch of messages' do
+      test_consume_batch(MyBatchConsumer, batch) do |received, _metadata|
+        expect(received).to eq(batch)
+      end
+    end
+
+    it 'should consume a message on a topic' do
+      test_consume_batch('my_batch_consume_topic', batch) do |received, _metadata|
+        expect(received).to eq(batch)
+      end
+    end
+
+    it 'should fail on an invalid message in the batch' do
+      test_consume_batch_invalid_message(MyBatchConsumer, batch.concat(invalid_payloads))
+    end
+
+    describe 'when reraising errors is disabled' do
+      before(:each) do
+        Deimos.configure { |config| config.reraise_consumer_errors = false }
+      end
+
+      it 'should not fail when before_consume_batch fails' do
+        expect {
+          test_consume_batch(
+            MyBatchConsumer,
+            batch,
+            skip_expectation: true
+          ) { raise 'OH NOES' }
+        }.not_to raise_error
+      end
+
+      it 'should not fail when consume_batch fails' do
+        expect {
+          test_consume_batch(
+            MyBatchConsumer,
+            invalid_payloads,
+            skip_expectation: true
+          )
+        }.not_to raise_error
+      end
+    end
+
+    describe 'decoding' do
+      let(:keys) do
+        batch.map { |v| v.slice('test_id') }
+      end
+
+      it 'should decode payloads for all messages in the batch' do
+        expect_any_instance_of(Deimos::AvroDataDecoder).
+          to receive(:decode).with(batch[0])
+        expect_any_instance_of(Deimos::AvroDataDecoder).
+          to receive(:decode).with(batch[1])
+
+        test_consume_batch('my_batch_consume_topic', batch) do |received, _metadata|
+          # Mock decoder simply returns the payload
+          expect(received).to eq(batch)
+        end
+      end
+
+      it 'should decode keys for all messages in the batch' do
+        expect_any_instance_of(ConsumerTest::MyBatchConsumer).
+          to receive(:decode_key).with(keys[0]).and_call_original
+        expect_any_instance_of(ConsumerTest::MyBatchConsumer).
+          to receive(:decode_key).with(keys[1]).and_call_original
+
+        test_consume_batch('my_batch_consume_topic', batch, keys: keys) do |_received, metadata|
+          # Mock decode_key extracts the value of the first field as the key
+          expect(metadata[:keys]).to eq(%w(foo bar))
+        end
+      end
+
+      it 'should decode plain keys for all messages in the batch' do
+        consumer_class = Class.new(Deimos::BatchConsumer) do
+          schema 'MySchema'
+          namespace 'com.my-namespace'
+          key_config plain: true
+        end
+        stub_const('ConsumerTest::MyBatchConsumer', consumer_class)
+        stub_batch_consumer(consumer_class)
+
+        test_consume_batch('my_batch_consume_topic', batch, keys: [1, 2]) do |_received, metadata|
+          expect(metadata[:keys]).to eq([1, 2])
+        end
+      end
+    end
+
+    describe 'timestamps' do
+      before(:each) do
+        # :nodoc:
+        consumer_class = Class.new(Deimos::BatchConsumer) do
+          schema 'MySchemaWithDateTimes'
+          namespace 'com.my-namespace'
+          key_config plain: true
+
+          # :nodoc:
+          def consume_batch(_payloads, _metadata)
+            raise 'This should not be called unless call_original is set'
+          end
+        end
+        stub_const('ConsumerTest::MyBatchConsumer', consumer_class)
+        stub_batch_consumer(consumer_class)
+        allow(Deimos.config.metrics).to receive(:histogram)
+      end
+
+      let(:batch_with_time) do
+        [
+          {
+            'test_id' => 'foo',
+            'some_int' => 123,
+            'updated_at' => Time.now.to_i,
+            'timestamp' => 2.minutes.ago.to_s
+          },
+          {
+            'test_id' => 'bar',
+            'some_int' => 456,
+            'updated_at' => Time.now.to_i,
+            'timestamp' => 3.minutes.ago.to_s
+          }
+        ]
+      end
+
+      let(:invalid_times) do
+        [
+          {
+            'test_id' => 'baz',
+            'some_int' => 123,
+            'updated_at' => Time.now.to_i,
+            'timestamp' => 'yesterday morning'
+          },
+          {
+            'test_id' => 'ok',
+            'some_int' => 456,
+            'updated_at' => Time.now.to_i,
+            'timestamp' => ''
+          },
+          {
+            'test_id' => 'hello',
+            'some_int' => 456,
+            'updated_at' => Time.now.to_i,
+            'timestamp' => '1234567890'
+          }
+        ]
+      end
+
+      it 'should consume a batch' do
+        expect(Deimos.config.metrics).
+          to receive(:histogram).with('handler',
+                                      a_kind_of(Numeric),
+                                      tags: %w(time:time_delayed topic:my-topic)).twice
+
+        test_consume_batch('my_batch_consume_topic', batch_with_time) do |received, _metadata|
+          expect(received).to eq(batch_with_time)
+        end
+      end
+
+      it 'should fail nicely and ignore timestamps with the wrong format' do
+        batch = invalid_times.concat(batch_with_time)
+
+        expect(Deimos.config.metrics).
+          to receive(:histogram).with('handler',
+                                      a_kind_of(Numeric),
+                                      tags: %w(time:time_delayed topic:my-topic)).twice
+
+        test_consume_batch('my_batch_consume_topic', batch) do |received, _metadata|
+          expect(received).to eq(batch)
+        end
+      end
+    end
+
+    describe 'logging' do
+      before(:each) do
+        # :nodoc:
+        consumer_class = Class.new(Deimos::BatchConsumer) do
+          schema 'MySchemaWithUniqueId'
+          namespace 'com.my-namespace'
+          key_config plain: true
+
+          # :nodoc:
+          def consume_batch(_payloads, _metadata)
+            raise 'This should not be called unless call_original is set'
+          end
+        end
+        stub_const('ConsumerTest::MyBatchConsumer', consumer_class)
+        stub_batch_consumer(consumer_class)
+        allow(Deimos.config.metrics).to receive(:histogram)
+      end
+
+      it 'should log message identifiers' do
+        batch_with_message_id = [
+          { 'id' => 1, 'test_id' => 'foo', 'some_int' => 5,
+            'timestamp' => 3.minutes.ago.to_s, 'message_id' => 'one' },
+          { 'id' => 2, 'test_id' => 'bar', 'some_int' => 6,
+            'timestamp' => 2.minutes.ago.to_s, 'message_id' => 'two' }
228
+ ]
229
+
230
+ allow(Deimos.config.logger).
231
+ to receive(:info)
232
+
233
+ expect(Deimos.config.logger).
234
+ to receive(:info).
235
+ with(hash_including(
236
+ message_ids: [
237
+ { key: 1, message_id: 'one' },
238
+ { key: 2, message_id: 'two' }
239
+ ]
240
+ )).
241
+ twice
242
+
243
+ test_consume_batch('my_batch_consume_topic', batch_with_message_id, keys: [1, 2])
244
+ end
245
+ end
246
+ end
247
+ end
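The new specs assert that with `key_config plain: true` the keys for every message in a batch are passed through undecoded into `metadata[:keys]`. A minimal plain-Ruby sketch of that per-batch key handling (a hypothetical simplified class, not Deimos's actual implementation):

```ruby
# Hypothetical sketch: with plain key config, decode_key is the identity,
# and the batch handler collects one decoded key per message, mirroring
# the metadata[:keys] array the specs assert on.
class PlainKeyBatch
  # Plain keys are used as-is (no schema decoding).
  def decode_key(key)
    key
  end

  # Decode the key of every message in the batch.
  def keys_for(batch_keys)
    batch_keys.map { |k| decode_key(k) }
  end
end

PlainKeyBatch.new.keys_for([1, 2]) # => [1, 2]
```

A schema-backed key config would replace `decode_key` with an Avro decode step, but the per-batch mapping stays the same.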
@@ -96,7 +96,7 @@ module ConsumerTest
       expect { MyConsumer.new.decode_key(123) }.to raise_error(NoMethodError)
     end
 
-    it 'should not encode if plain is set' do
+    it 'should not decode if plain is set' do
       consumer_class = Class.new(Deimos::Consumer) do
         schema 'MySchema'
         namespace 'com.my-namespace'
data/spec/deimos_spec.rb CHANGED
@@ -2,58 +2,66 @@
 
 describe Deimos do
 
+  let(:phobos_configuration) do
+    { 'logger' =>
+      { 'file' => 'log/phobos.log',
+        'stdout_json' => false,
+        'level' => 'debug',
+        'ruby_kafka' =>
+        { 'level' => 'debug' } },
+      'kafka' =>
+      { 'client_id' => 'phobos',
+        'connect_timeout' => 15,
+        'socket_timeout' => 15,
+        'seed_brokers' => 'my_seed_broker.com',
+        'ssl_ca_cert' => 'my_ssl_ca_cert',
+        'ssl_client_cert' => 'my_ssl_client_cert',
+        'ssl_client_cert_key' => 'my_ssl_client_cert_key' },
+      'producer' =>
+      { 'ack_timeout' => 5,
+        'required_acks' => :all,
+        'max_retries' => 2,
+        'retry_backoff' => 1,
+        'max_buffer_size' => 10_000,
+        'max_buffer_bytesize' => 10_000_000,
+        'compression_codec' => nil,
+        'compression_threshold' => 1,
+        'max_queue_size' => 10_000,
+        'delivery_threshold' => 0,
+        'delivery_interval' => 0 },
+      'consumer' =>
+      { 'session_timeout' => 300,
+        'offset_commit_interval' => 10,
+        'offset_commit_threshold' => 0,
+        'heartbeat_interval' => 10 },
+      'backoff' =>
+      { 'min_ms' => 1000,
+        'max_ms' => 60_000 },
+      'listeners' => [
+        { 'handler' => 'ConsumerTest::MyConsumer',
+          'topic' => 'my_consume_topic',
+          'group_id' => 'my_group_id',
+          'max_bytes_per_partition' => 524_288 },
+        { 'handler' => 'ConsumerTest::MyBatchConsumer',
+          'topic' => 'my_batch_consume_topic',
+          'group_id' => 'my_batch_group_id',
+          'delivery' => 'inline_batch' }
+      ],
+      'custom_logger' => nil,
+      'custom_kafka_logger' => nil }
+  end
+
+  let(:config_path) { File.join(File.dirname(__FILE__), 'phobos.yml') }
+
   it 'should have a version number' do
     expect(Deimos::VERSION).not_to be_nil
   end
 
   specify 'configure' do
-    phobos_configuration = { 'logger' =>
-      { 'file' => 'log/phobos.log',
-        'stdout_json' => false,
-        'level' => 'debug',
-        'ruby_kafka' =>
-        { 'level' => 'debug' } },
-      'kafka' =>
-      { 'client_id' => 'phobos',
-        'connect_timeout' => 15,
-        'socket_timeout' => 15,
-        'seed_brokers' => 'my_seed_broker.com',
-        'ssl_ca_cert' => 'my_ssl_ca_cert',
-        'ssl_client_cert' => 'my_ssl_client_cert',
-        'ssl_client_cert_key' => 'my_ssl_client_cert_key' },
-      'producer' =>
-      { 'ack_timeout' => 5,
-        'required_acks' => :all,
-        'max_retries' => 2,
-        'retry_backoff' => 1,
-        'max_buffer_size' => 10_000,
-        'max_buffer_bytesize' => 10_000_000,
-        'compression_codec' => nil,
-        'compression_threshold' => 1,
-        'max_queue_size' => 10_000,
-        'delivery_threshold' => 0,
-        'delivery_interval' => 0 },
-      'consumer' =>
-      { 'session_timeout' => 300,
-        'offset_commit_interval' => 10,
-        'offset_commit_threshold' => 0,
-        'heartbeat_interval' => 10 },
-      'backoff' =>
-      { 'min_ms' => 1000,
-        'max_ms' => 60_000 },
-      'listeners' => [
-        { 'handler' => 'ConsumerTest::MyConsumer',
-          'topic' => 'my_consume_topic',
-          'group_id' => 'my_group_id',
-          'max_bytes_per_partition' => 524_288 }
-      ],
-      'custom_logger' => nil,
-      'custom_kafka_logger' => nil }
-
     expect(Phobos).to receive(:configure).with(phobos_configuration)
     allow(described_class).to receive(:ssl_var_contents) { |key| key }
     described_class.configure do |config|
-      config.phobos_config_file = File.join(File.dirname(__FILE__), 'phobos.yml')
+      config.phobos_config_file = config_path
       config.seed_broker = 'my_seed_broker.com'
       config.ssl_enabled = true
       config.ssl_ca_cert = 'my_ssl_ca_cert'
@@ -118,5 +126,61 @@ describe Deimos do
         to raise_error('Thread count is not given or set to zero, exiting')
     end
 
+    describe 'delivery configuration' do
+      before(:each) do
+        described_class.config = nil
+        allow(YAML).to receive(:load).and_return(phobos_configuration)
+      end
+
+      it 'should not raise an error with properly configured handlers' do
+        # Add explicit consumers
+        phobos_configuration['listeners'] << { 'handler' => 'ConsumerTest::MyConsumer',
+                                               'delivery' => 'message' }
+        phobos_configuration['listeners'] << { 'handler' => 'ConsumerTest::MyConsumer',
+                                               'delivery' => 'batch' }
+
+        expect {
+          described_class.configure { |c| c.phobos_config_file = config_path }
+        }.not_to raise_error
+      end
+
+      it 'should raise an error if BatchConsumers do not have inline_batch delivery' do
+        phobos_configuration['listeners'] = [{ 'handler' => 'ConsumerTest::MyBatchConsumer',
                                                'delivery' => 'message' }]
+
+        expect {
+          described_class.configure { |c| c.phobos_config_file = config_path }
+        }.to raise_error('BatchConsumer ConsumerTest::MyBatchConsumer must have delivery set to `inline_batch`')
+      end
+
+      it 'should raise an error if Consumers do not have message or batch delivery' do
+        phobos_configuration['listeners'] = [{ 'handler' => 'ConsumerTest::MyConsumer',
+                                               'delivery' => 'inline_batch' }]
+
+        expect {
+          described_class.configure { |c| c.phobos_config_file = config_path }
+        }.to raise_error('Non-batch Consumer ConsumerTest::MyConsumer must have delivery set to `message` or `batch`')
+      end
+
+      it 'should treat nil as `batch`' do
+        phobos_configuration['listeners'] = [{ 'handler' => 'ConsumerTest::MyConsumer' }]
+
+        expect {
+          described_class.configure { |c| c.phobos_config_file = config_path }
+        }.not_to raise_error
+      end
+
+      it 'should ignore non-Deimos listeners' do
+        consumer_class = Class.new { include Phobos::Handler }
+        stub_const('ConsumerTest::MyOtherConsumer', consumer_class)
+        phobos_configuration['listeners'] = [{ 'handler' => 'ConsumerTest::MyOtherConsumer',
+                                               'topic' => 'my_consume_topic',
+                                               'group_id' => 'my_group_id' }]
+
+        expect {
+          described_class.configure { |c| c.phobos_config_file = config_path }
+        }.not_to raise_error
+      end
+    end
   end
 end
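The `delivery configuration` specs above pin down a validation rule: a listener backed by a `BatchConsumer` must declare `delivery: inline_batch`, while a non-batch `Consumer` accepts `message`, `batch`, or no delivery at all (nil is treated as `batch`). A standalone sketch of that rule, with error strings taken from the specs (hypothetical helper, not the gem's actual code):

```ruby
# Hypothetical sketch of the delivery validation implied by the specs.
# batch_consumer: whether the handler is a Deimos::BatchConsumer subclass.
def validate_delivery!(handler_name, batch_consumer:, delivery:)
  if batch_consumer
    # BatchConsumers only work with Phobos's inline_batch delivery.
    unless delivery == 'inline_batch'
      raise "BatchConsumer #{handler_name} must have delivery set to `inline_batch`"
    end
  else
    # Plain Consumers accept message/batch; nil defaults to batch.
    unless [nil, 'message', 'batch'].include?(delivery)
      raise "Non-batch Consumer #{handler_name} must have delivery set to `message` or `batch`"
    end
  end
  nil
end
```

Non-Deimos Phobos handlers would simply be skipped before this check, matching the "should ignore non-Deimos listeners" spec.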
@@ -0,0 +1,5 @@
+# frozen_string_literal: true
+
+module ConsumerTest
+  class MyBatchConsumer < Deimos::BatchConsumer; end
+end
@@ -0,0 +1,5 @@
+# frozen_string_literal: true
+
+module ConsumerTest
+  class MyConsumer < Deimos::Consumer; end
+end
data/spec/phobos.yml CHANGED
@@ -71,3 +71,7 @@ listeners:
     topic: my_consume_topic
     group_id: my_group_id
     max_bytes_per_partition: 524288 # 512 KB
+  - handler: ConsumerTest::MyBatchConsumer
+    topic: my_batch_consume_topic
+    group_id: my_batch_group_id
+    delivery: inline_batch
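The new `phobos.yml` entry routes `my_batch_consume_topic` to the batch handler with `inline_batch` delivery. A small sketch of reading such an entry with stdlib YAML (config snippet inlined here for illustration; the real file has more listeners):

```ruby
require 'yaml'

# Parse a listener entry shaped like the one added to spec/phobos.yml.
config = YAML.safe_load(<<~YAML)
  listeners:
    - handler: ConsumerTest::MyBatchConsumer
      topic: my_batch_consume_topic
      group_id: my_batch_group_id
      delivery: inline_batch
YAML

# Select the listeners that will receive whole batches inline.
batch_listeners = config['listeners'].select { |l| l['delivery'] == 'inline_batch' }
puts batch_listeners.first['topic'] # prints "my_batch_consume_topic"
```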
data/spec/spec_helper.rb CHANGED
@@ -8,6 +8,8 @@ require 'deimos/tracing/mock'
 require 'deimos/test_helpers'
 require 'active_support/testing/time_helpers'
 require 'activerecord-import'
+require 'handlers/my_batch_consumer'
+require 'handlers/my_consumer'
 
 # Helpers for Executor/DbProducer
 module TestRunners
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: deimos-ruby
 version: !ruby/object:Gem::Version
-  version: 1.0.0
+  version: 1.1.0.pre.beta1
 platform: ruby
 authors:
 - Daniel Orner
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2019-09-03 00:00:00.000000000 Z
+date: 2019-09-10 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: avro-patches
@@ -44,14 +44,14 @@ dependencies:
     requirements:
     - - "~>"
      - !ruby/object:Gem::Version
-        version: '1.8'
+        version: 1.8.2.pre.beta2
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '1.8'
+        version: 1.8.2.pre.beta2
 - !ruby/object:Gem::Dependency
   name: ruby-kafka
   requirement: !ruby/object:Gem::Requirement
@@ -328,6 +328,8 @@ files:
 - lib/deimos/backends/db.rb
 - lib/deimos/backends/kafka.rb
 - lib/deimos/backends/kafka_async.rb
+- lib/deimos/base_consumer.rb
+- lib/deimos/batch_consumer.rb
 - lib/deimos/configuration.rb
 - lib/deimos/consumer.rb
 - lib/deimos/instrumentation.rb
@@ -369,8 +371,11 @@ files:
 - spec/backends/db_spec.rb
 - spec/backends/kafka_async_spec.rb
 - spec/backends/kafka_spec.rb
+- spec/batch_consumer_spec.rb
 - spec/consumer_spec.rb
 - spec/deimos_spec.rb
+- spec/handlers/my_batch_consumer.rb
+- spec/handlers/my_consumer.rb
 - spec/kafka_source_spec.rb
 - spec/kafka_topic_info_spec.rb
 - spec/phobos.bad_db.yml
@@ -411,9 +416,9 @@ required_ruby_version: !ruby/object:Gem::Requirement
       version: '0'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - ">="
+  - - ">"
     - !ruby/object:Gem::Version
-      version: '0'
+      version: 1.3.1
 requirements: []
 rubygems_version: 3.0.2
 signing_key:
@@ -427,8 +432,11 @@ test_files:
 - spec/backends/db_spec.rb
 - spec/backends/kafka_async_spec.rb
 - spec/backends/kafka_spec.rb
+- spec/batch_consumer_spec.rb
 - spec/consumer_spec.rb
 - spec/deimos_spec.rb
+- spec/handlers/my_batch_consumer.rb
+- spec/handlers/my_consumer.rb
 - spec/kafka_source_spec.rb
 - spec/kafka_topic_info_spec.rb
 - spec/phobos.bad_db.yml