deimos-ruby 1.8.1.pre.beta9 → 1.8.4

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 18ef8f674c86fcdba3fc7a2556162b1104084fd38c6ee8046ed773e9352633bd
-  data.tar.gz: 60f5a05874a0cd2cd6abceb7cfc5af1a01068a2a3cc4cc4e65dd59d42783468a
+  metadata.gz: 497160ac15346d0acb471818c0f02e10f745761ba3777a70039b1a762e8f4edb
+  data.tar.gz: d85717deea2ac68d6c995d1dce1c42187d4fe5585b577c4468cafff7f0f71097
 SHA512:
-  metadata.gz: '0963acc646195fac41c3aaa71c67bbdc253bf26fa650c6132096a7912b665bb1956e3511f41db08309fbc9db35a6717c7a48f17a340039732bced4c5b6bd8b7a'
-  data.tar.gz: 0c642071d49e099d8516ed05c5856b1bba00a432c30567b3c79491654b0e1b1cc43e51a4c20f9765e0f2cbcd4af6550fbaa7620b1693d9ed342202609e496914
+  metadata.gz: 441577e56812dbc9d0dbbf22b43cb76e390c792165f7d21febcdf17a309fe85ec57bd9ddaad0929ddd3d49dd602a94601fc2660b124419e5943460e2fab94aa5
+  data.tar.gz: 2fdd3a94d91130c2430d189165504bdf437907fa1a9a459841bd4e260a2ff906469fb0f6989a8faf6c140cc133d22639af91a89549c85a9aeef2145636aa7b58
@@ -7,6 +7,56 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## UNRELEASED
 
+## 1.8.4 - 2020-12-02
+
+### Features :star:
+- Add an overridable "process_message?" method to ActiveRecordConsumer to allow skipping the saving/updating of records
+
+### Fixes :wrench:
+
+- Do not apply type coercion to `timestamp-millis` and `timestamp-micros` logical types (fixes [#97](https://github.com/flipp-oss/deimos/issues/97))
+
+## 1.8.3 - 2020-11-18
+
+### Fixes :wrench:
+- Do not resend already sent messages when splitting up batches
+  (fixes [#24](https://github.com/flipp-oss/deimos/issues/24))
+- KafkaSource crashing on bulk imports if import hooks are disabled
+  (fixes [#73](https://github.com/flipp-oss/deimos/issues/73))
+- Use string-safe encoding for partition keys
+  (fixes [#96](https://github.com/flipp-oss/deimos/issues/96))
+- Retry on offset seek failures in the inline consumer
+  (fixes [#5](https://github.com/flipp-oss/deimos/issues/5))
+
+## 1.8.2 - 2020-09-25
+
+### Features :star:
+- Add a "disabled" config field to consumers to allow disabling
+  individual consumers without having to comment out their
+  entries and possibly affecting unit tests.
+
+### Fixes :wrench:
+- Prepend topic_prefix while encoding messages
+  (fixes [#37](https://github.com/flipp-oss/deimos/issues/37))
+- Raise an error if producing without a topic
+  (fixes [#50](https://github.com/flipp-oss/deimos/issues/50))
+- Don't try to load producers/consumers when running rake tasks involving webpacker or assets
+
+## 1.8.2-beta2 - 2020-09-15
+
+### Features :star:
+
+- Add details on using the schema backend directly in the README.
+- Default to the provided schema if a topic is not provided when
+  encoding to `AvroSchemaRegistry`.
+- Add mapping syntax for the `schema` call in `SchemaControllerMixin`.
+
+## 1.8.2-beta1 - 2020-09-09
+
+### Features :star:
+
+- Added the ability to specify the topic for `publish`
+  and `publish_list` in a producer
+
 ## 1.8.1-beta9 - 2020-08-27
 
 ### Fixes :wrench:
@@ -1,7 +1,7 @@
 PATH
   remote: .
   specs:
-    deimos-ruby (1.8.1.pre.beta8)
+    deimos-ruby (1.8.4)
       avro_turf (~> 0.11)
       phobos (~> 1.9)
       ruby-kafka (~> 0.7)
data/README.md CHANGED
@@ -11,6 +11,7 @@ a useful toolbox of goodies for Ruby-based Kafka development.
 Built on Phobos and hence Ruby-Kafka.
 
 <!--ts-->
+   * [Additional Documentation](#additional-documentation)
    * [Installation](#installation)
    * [Versioning](#versioning)
    * [Configuration](#configuration)
@@ -29,9 +30,20 @@ Built on Phobos and hence Ruby-Kafka.
    * [Metrics](#metrics)
    * [Testing](#testing)
      * [Integration Test Helpers](#integration-test-helpers)
+   * [Utilities](#utilities)
    * [Contributing](#contributing)
 <!--te-->
 
+# Additional Documentation
+
+Please see the following for further information not covered by this readme:
+
+* [Architecture Design](docs/ARCHITECTURE.md)
+* [Configuration Reference](docs/CONFIGURATION.md)
+* [Database Backend Feature](docs/DATABASE_BACKEND.md)
+* [Upgrading Deimos](docs/UPGRADING.md)
+* [Contributing to Integration Tests](docs/INTEGRATION_TESTS.md)
+
 # Installation
 
 Add this line to your application's Gemfile:
@@ -108,6 +120,7 @@ class MyProducer < Deimos::Producer
         'some-key2' => an_object.bar
       }
       # You can also publish an array with self.publish_list(payloads)
+      # You may specify the topic here with self.publish(payload, topic: 'my-topic')
       self.publish(payload)
     end
 
@@ -481,6 +494,12 @@ class WhateverController < ApplicationController
     # will look for: my.namespace.requests.Index.avsc
     #                my.namespace.responses.Index.avsc
 
+    # Can use mapping to change the schema but keep the namespaces,
+    # i.e. use the same schema name across the two namespaces
+    schemas create: 'CreateTopic'
+    # will look for: my.namespace.requests.CreateTopic.avsc
+    #                my.namespace.responses.CreateTopic.avsc
+
     # If all routes use the default, you can add them all at once
     schemas :index, :show, :update
 
@@ -607,6 +626,15 @@ class MyConsumer < Deimos::ActiveRecordConsumer
   def record_key(payload)
     super
   end
+
+  # Optional override, returns true by default.
+  # When this method returns true, a record corresponding to the message
+  # is created/updated.
+  # When this method returns false, message processing is skipped and a
+  # corresponding record will NOT be created/updated.
+  def process_message?(payload)
+    super
+  end
 end
 ```
 
@@ -988,13 +1016,24 @@ Deimos::Utils::InlineConsumer.get_messages_for(
 )
 ```
 
+## Utilities
+
+You can use your configured schema backend directly if you want to
+encode and decode payloads outside of the context of sending messages.
+
+```ruby
+backend = Deimos.schema_backend(schema: 'MySchema', namespace: 'com.my-namespace')
+encoded = backend.encode(my_payload)
+decoded = backend.decode(my_encoded_payload)
+coerced = backend.coerce(my_payload) # coerce to correct types
+backend.validate(my_payload) # throws an error if not valid
+fields = backend.schema_fields # list of fields defined in the schema
+```
+
 ## Contributing
 
 Bug reports and pull requests are welcome on GitHub at https://github.com/flipp-oss/deimos .
 
-We have more information on the [internal architecture](docs/ARCHITECTURE.md) of Deimos
-for contributors!
-
 ### Linting
 
 Deimos uses Rubocop to lint the code. Please run Rubocop on your code
@@ -79,6 +79,7 @@ topic|nil|Topic to produce to.
 schema|nil|This is optional but strongly recommended for testing purposes; this will validate against a local schema file used as the reader schema, as well as being able to write tests against this schema. This is recommended since it ensures you are always getting the values you expect.
 namespace|nil|Namespace of the schema to use when finding it locally.
 key_config|nil|Configuration hash for message keys. See [Kafka Message Keys](../README.md#installation)
+disabled|false|Set to true to skip starting an actual listener for this consumer on startup.
 group_id|nil|ID of the consumer group.
 max_concurrency|1|Number of threads created for this listener. Each thread will behave as an independent consumer. They don't share any state.
 start_from_beginning|true|Once the consumer group has checkpointed its progress in the topic's partitions, the consumers will always start from the checkpointed offsets, regardless of config. As such, this setting only applies when the consumer initially starts consuming from a topic
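For illustration, a minimal sketch of an initializer using the new `disabled` flag, based on the consumer block exercised in the configuration spec further down this diff; the class, topic, and group names here are hypothetical:

```ruby
# Hypothetical initializer sketch; assumes the deimos-ruby gem is loaded.
Deimos.configure do
  consumer do
    class_name 'MyApp::MyConsumer' # hypothetical consumer class
    topic 'my-topic'
    group_id 'my-group-id'
    schema 'MySchema'
    namespace 'com.my-namespace'
    key_config none: true
    # No Phobos listener is started for this consumer, but the class
    # stays defined and usable in unit tests.
    disabled true
  end
end
```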
@@ -0,0 +1,52 @@
+# Running Integration Tests
+
+This repo includes integration tests in the [spec/utils](spec/utils) directory.
+These cover Deimos features that require a database integration, such as:
+* [Database Poller](README.md#database-poller)
+* [Database Backend](docs/DATABASE_BACKEND.md)
+* [Deadlock Retrying](lib/deimos/utils/deadlock_retry.rb)
+
+You will need to set up the following databases to develop and create unit tests in these test suites:
+* [SQLite](#sqlite)
+* [MySQL](#mysql)
+* [PostgreSQL](#postgresql)
+
+## SQLite
+This database is covered through the `sqlite3` gem.
+
+## MySQL
+### Setting up a local MySQL server (Mac)
+```bash
+# Download MySQL (optionally, choose a version you are comfortable with)
+brew install mysql
+# Start automatically after rebooting your machine
+brew services start mysql
+
+# Cleanup once you are done with MySQL
+brew services stop mysql
+```
+
+## PostgreSQL
+### Setting up a local PostgreSQL server (Mac)
+```bash
+# Install postgres if it's not already installed
+brew install postgres
+
+# Initialize and start up the postgres db
+brew services start postgres
+initdb /usr/local/var/postgres
+# Create the default database and user
+# Use the password "root"
+createuser -s --password postgres
+
+# Cleanup once done with Postgres
+killall postgres
+brew services stop postgres
+```
+
+## Running Integration Tests
+You must specify the tag "integration" when running these test suites.
+This can be done through the CLI with the `--tag integration` argument.
+```bash
+rspec spec/utils/ --tag integration
+```
@@ -28,6 +28,7 @@ Please describe the tests that you ran to verify your changes. Provide instructi
 - [ ] I have performed a self-review of my own code
 - [ ] I have commented my code, particularly in hard-to-understand areas
 - [ ] I have made corresponding changes to the documentation
+- [ ] I have added a line in the CHANGELOG describing this change, under the UNRELEASED heading
 - [ ] My changes generate no new warnings
 - [ ] I have added tests that prove my fix is effective or that my feature works
 - [ ] New and existing unit tests pass locally with my changes
@@ -0,0 +1,128 @@
+# Upgrading Deimos
+
+## Upgrading from < 1.5.0 to >= 1.5.0
+
+If you are using Confluent's schema registry to Avro-encode your
+messages, you will need to manually include the `avro_turf` gem
+in your Gemfile now.
+
+This update changes how to interact with Deimos's schema classes.
+Although these are meant to be internal, they are still "public"
+and can be used by calling code.
+
+Before 1.5.0:
+
+```ruby
+encoder = Deimos::AvroDataEncoder.new(schema: 'MySchema',
+                                      namespace: 'com.my-namespace')
+encoder.encode(my_payload)
+
+decoder = Deimos::AvroDataDecoder.new(schema: 'MySchema',
+                                      namespace: 'com.my-namespace')
+decoder.decode(my_payload)
+```
+
+After 1.5.0:
+```ruby
+backend = Deimos.schema_backend(schema: 'MySchema', namespace: 'com.my-namespace')
+backend.encode(my_payload)
+backend.decode(my_payload)
+```
+
+The two classes are different, and if you are using them to e.g.
+inspect Avro schema fields, please look at the source code for the following:
+* `Deimos::SchemaBackends::Base`
+* `Deimos::SchemaBackends::AvroBase`
+* `Deimos::SchemaBackends::AvroSchemaRegistry`
+
+Deprecated `Deimos::TestHelpers.sent_messages` in favor of
+`Deimos::Backends::Test.sent_messages`.
+
+## Upgrading from < 1.4.0 to >= 1.4.0
+
+Previously, configuration was handled as follows:
+* Kafka configuration, including listeners, lived in `phobos.yml`
+* Additional Deimos configuration would live in an initializer, e.g. `kafka.rb`
+* Producer and consumer configuration lived in each individual producer and consumer
+
+As of 1.4.0, all configuration is centralized in one initializer
+file, using default configuration.
+
+Before 1.4.0:
+```yaml
+# config/phobos.yml
+logger:
+  file: log/phobos.log
+  level: debug
+  ruby_kafka:
+    level: debug
+
+kafka:
+  client_id: phobos
+  connect_timeout: 15
+  socket_timeout: 15
+
+producer:
+  ack_timeout: 5
+  required_acks: :all
+  ...
+
+listeners:
+  - handler: ConsumerTest::MyConsumer
+    topic: my_consume_topic
+    group_id: my_group_id
+  - handler: ConsumerTest::MyBatchConsumer
+    topic: my_batch_consume_topic
+    group_id: my_batch_group_id
+    delivery: inline_batch
+```
+
+```ruby
+# kafka.rb
+Deimos.configure do |config|
+  config.reraise_consumer_errors = true
+  config.logger = Rails.logger
+  ...
+end
+
+# my_consumer.rb
+class ConsumerTest::MyConsumer < Deimos::Consumer
+  namespace 'com.my-namespace'
+  schema 'MySchema'
+  topic 'MyTopic'
+  key_config field: :id
+end
+```
+
+After 1.4.0:
+```ruby
+# kafka.rb
+Deimos.configure do
+  logger Rails.logger
+  kafka do
+    client_id 'phobos'
+    connect_timeout 15
+    socket_timeout 15
+  end
+  producers.ack_timeout 5
+  producers.required_acks :all
+  ...
+  consumer do
+    class_name 'ConsumerTest::MyConsumer'
+    topic 'my_consume_topic'
+    group_id 'my_group_id'
+    namespace 'com.my-namespace'
+    schema 'MySchema'
+    key_config field: :id
+  end
+  ...
+end
+```
+
+Note that the old way of configuring *will* still work if you set
+`config.phobos_config_file = "config/phobos.yml"`. You will
+get a number of deprecation notices, however. You can also still
+set the topic, namespace, etc. on the producer/consumer class,
+but it's much more convenient to centralize these configs
+in one place to see what your app does.
@@ -26,6 +26,15 @@ module Deimos
 
     # :nodoc:
     def consume(payload, metadata)
+      unless self.process_message?(payload)
+        Deimos.config.logger.debug(
+          message: 'Skipping processing of message',
+          payload: payload,
+          metadata: metadata
+        )
+        return
+      end
+
       key = metadata.with_indifferent_access[:key]
       klass = self.class.config[:record_class]
       record = fetch_record(klass, (payload || {}).with_indifferent_access, key)
@@ -55,5 +55,13 @@ module Deimos
     def record_attributes(payload, _key=nil)
       @converter.convert(payload)
     end
+
+    # Override this method to conditionally save records
+    # @param payload [Hash] The kafka message as a hash
+    # @return [Boolean] if true, the record is created/updated.
+    # If false, record processing is skipped but the message offset is still committed.
+    def process_message?(_payload)
+      true
+    end
   end
 end
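To illustrate the new hook, here is a hedged sketch of a consumer that overrides `process_message?`; the `WidgetConsumer` class and its skip condition are hypothetical, but the hook itself is the one added above:

```ruby
# Hypothetical consumer sketch using the new process_message? hook.
class WidgetConsumer < Deimos::ActiveRecordConsumer
  record_class Widget # Widget is assumed to be an existing model

  # Skip messages without a test_id; the message offset is still
  # committed, but no record is created or updated.
  def process_message?(payload)
    !payload['test_id'].nil?
  end
end
```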
@@ -14,7 +14,7 @@ module Deimos
         message = Deimos::KafkaMessage.new(
           message: m.encoded_payload ? m.encoded_payload.to_s.b : nil,
           topic: m.topic,
-          partition_key: m.partition_key || m.key
+          partition_key: partition_key_for(m)
         )
         message.key = m.encoded_key.to_s.b unless producer_class.config[:no_keys]
         message
@@ -26,6 +26,15 @@ module Deimos
           by: records.size
         )
       end
+
+      # @param message [Deimos::Message]
+      # @return [String] the partition key to use for this message
+      def partition_key_for(message)
+        return message.partition_key if message.partition_key.present?
+        return message.key unless message.key.is_a?(Hash)
+
+        message.key.to_yaml
+      end
     end
   end
 end
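The helper prefers an explicit partition key, falls back to a scalar message key, and YAML-encodes Hash keys so they always serialize to strings. A standalone sketch of the same fallback logic in plain Ruby (without Rails' `present?` helper):

```ruby
require 'yaml'

# Standalone sketch of the fallback logic in partition_key_for:
# prefer an explicit partition key, then a scalar message key,
# and YAML-encode Hash keys into string-safe values.
def partition_key_for(partition_key, key)
  return partition_key if partition_key && !partition_key.empty?
  return key unless key.is_a?(Hash)

  key.to_yaml
end

puts partition_key_for(nil, { 'test_id' => 0 }).inspect # => "---\ntest_id: 0\n"
puts partition_key_for(nil, 'abc').inspect              # => "abc"
puts partition_key_for('pk', 'abc').inspect             # => "pk"
```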
@@ -15,6 +15,11 @@ module Deimos
 #     enabled true
 #     ca_cert_file 'my_file'
 #   end
+#   config.kafka do
+#     ssl do
+#       enabled true
+#     end
+#   end
 # end
 # - Allows for arrays of configurations:
 # Deimos.configure do |config|
@@ -245,6 +250,13 @@ module Deimos
 
     # Configure the settings with values.
     def configure(&block)
+      if defined?(Rake) && defined?(Rake.application)
+        tasks = Rake.application.top_level_tasks
+        if tasks.any? { |t| %w(assets webpacker yarn).include?(t.split(':').first) }
+          puts 'Skipping Deimos configuration since we are in JS/CSS compilation'
+          return
+        end
+      end
       config.run_callbacks(:configure) do
         config.instance_eval(&block)
       end
@@ -319,6 +319,10 @@ module Deimos
       # Key configuration (see docs).
       # @return [Hash]
       setting :key_config
+      # Set to true to ignore the consumer in the Phobos config and not actually start up a
+      # listener.
+      # @return [Boolean]
+      setting :disabled, false
 
       # These are the phobos "listener" configs. See CONFIGURATION.md for more
       # info.
@@ -63,8 +63,10 @@ module Deimos
       }
 
       p_config[:listeners] = self.consumer_objects.map do |consumer|
+        next nil if consumer.disabled
+
         hash = consumer.to_h.reject do |k, _|
-          %i(class_name schema namespace key_config backoff).include?(k)
+          %i(class_name schema namespace key_config backoff disabled).include?(k)
         end
         hash = hash.map { |k, v| [k, v.is_a?(Symbol) ? v.to_s : v] }.to_h
         hash[:handler] = consumer.class_name
@@ -73,6 +75,7 @@ module Deimos
         end
         hash
       end
+      p_config[:listeners].compact!
 
       if self.kafka.ssl.enabled
         %w(ca_cert client_cert client_cert_key).each do |key|
@@ -88,8 +88,9 @@ module Deimos
                          array_of_attributes,
                          options={})
       results = super
-      return unless self.kafka_config[:import]
-      return if array_of_attributes.empty?
+      if !self.kafka_config[:import] || array_of_attributes.empty?
+        return results
+      end
 
       # This will contain an array of hashes, where each hash is the actual
       # attribute hash that created the object.
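With this change, `import` returns the result from `super` (activerecord-import's result object) instead of `nil` when import hooks are disabled or there is nothing to import, which is part of the bulk-import fix noted in the changelog. A hedged usage sketch, reusing the `WidgetNoImportHook` model defined in the specs later in this diff:

```ruby
# Sketch only: WidgetNoImportHook is the spec model whose
# kafka_config sets import: false, so no Kafka messages are sent.
result = WidgetNoImportHook.import([widget1, widget2],
                                   on_duplicate_key_update: %i(widget_id name))
# Callers now get the activerecord-import result back rather than nil.
result.failed_instances # => []
```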
@@ -87,8 +87,9 @@ module Deimos
 
     # Publish the payload to the topic.
     # @param payload [Hash] with an optional payload_key hash key.
-    def publish(payload)
-      publish_list([payload])
+    # @param topic [String] if specifying the topic
+    def publish(payload, topic: self.topic)
+      publish_list([payload], topic: topic)
     end
 
     # Publish a list of messages.
@@ -97,11 +98,14 @@ module Deimos
     # whether to publish synchronously.
     # @param force_send [Boolean] if true, ignore the configured backend
     # and send immediately to Kafka.
-    def publish_list(payloads, sync: nil, force_send: false)
+    # @param topic [String] if specifying the topic
+    def publish_list(payloads, sync: nil, force_send: false, topic: self.topic)
       return if Deimos.config.kafka.seed_brokers.blank? ||
                 Deimos.config.producers.disabled ||
                 Deimos.producers_disabled?(self)
 
+      raise 'Topic not specified. Please specify the topic.' if topic.blank?
+
       backend_class = determine_backend_class(sync, force_send)
       Deimos.instrument(
         'encode_messages',
@@ -110,7 +114,7 @@ module Deimos
         payloads: payloads
       ) do
         messages = Array(payloads).map { |p| Deimos::Message.new(p, self) }
-        messages.each(&method(:_process_message))
+        messages.each { |m| _process_message(m, topic) }
         messages.in_groups_of(MAX_BATCH_SIZE, false) do |batch|
           self.produce_batch(backend_class, batch)
         end
@@ -163,7 +167,8 @@ module Deimos
       private
 
       # @param message [Message]
-      def _process_message(message)
+      # @param topic [String]
+      def _process_message(message, topic)
         # this violates the Law of Demeter but it has to happen in a very
         # specific order and requires a bunch of methods on the producer
         # to work correctly.
@@ -175,12 +180,12 @@ module Deimos
         message.payload = nil if message.payload.blank?
         message.coerce_fields(encoder)
         message.encoded_key = _encode_key(message.key)
-        message.topic = self.topic
+        message.topic = topic
         message.encoded_payload = if message.payload.nil?
                                     nil
                                   else
                                     encoder.encode(message.payload,
-                                                   topic: "#{config[:topic]}-value")
+                                                   topic: "#{Deimos.config.producers.topic_prefix}#{config[:topic]}-value")
                                   end
       end
 
@@ -198,9 +203,9 @@ module Deimos
       end
 
       if config[:key_field]
-        encoder.encode_key(config[:key_field], key, topic: "#{config[:topic]}-key")
+        encoder.encode_key(config[:key_field], key, topic: "#{Deimos.config.producers.topic_prefix}#{config[:topic]}-key")
       elsif config[:key_schema]
-        key_encoder.encode(key, topic: "#{config[:topic]}-key")
+        key_encoder.encode(key, topic: "#{Deimos.config.producers.topic_prefix}#{config[:topic]}-key")
       else
         key
       end
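A hedged usage sketch of the new `topic:` parameter, mirroring the producer specs later in this diff; the producer class here is hypothetical:

```ruby
# Hypothetical producer sketch; assumes the deimos-ruby gem is loaded.
class MyTopicProducer < Deimos::Producer
  schema 'MySchema'
  namespace 'com.my-namespace'
  topic 'my-topic'
  key_config field: :test_id
end

# Publishes to the configured 'my-topic'.
MyTopicProducer.publish('test_id' => 'foo', 'some_int' => 123)

# Overrides the configured topic for this call only.
MyTopicProducer.publish({ 'test_id' => 'foo', 'some_int' => 123 }, topic: 'a-new-topic')
MyTopicProducer.publish_list([{ 'test_id' => 'bar', 'some_int' => 124 }], topic: 'a-new-topic')

# A blank topic (or a producer class with no topic at all) now raises:
#   RuntimeError: Topic not specified. Please specify the topic.
```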
@@ -44,9 +44,11 @@ module Deimos
 
       case field_type
       when :int, :long
-        if val.is_a?(Integer) ||
-           _is_integer_string?(val) ||
-           int_classes.any? { |klass| val.is_a?(klass) }
+        if %w(timestamp-millis timestamp-micros).include?(type.logical_type)
+          val
+        elsif val.is_a?(Integer) ||
+              _is_integer_string?(val) ||
+              int_classes.any? { |klass| val.is_a?(klass) }
           val.to_i
         else
          val # this will fail
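A simplified standalone sketch of the new rule (the real method also handles integer strings and the `int_classes` list): values for `timestamp-millis`/`timestamp-micros` logical types pass through untouched, while other int/long values are still coerced with `to_i`:

```ruby
# Simplified sketch of the coercion rule for int/long fields.
def coerce_int(val, logical_type: nil)
  # timestamp-millis / timestamp-micros values pass through untouched
  return val if %w(timestamp-millis timestamp-micros).include?(logical_type)

  val.to_i
end

t = Time.utc(2020, 11, 12, 13, 14, 15)
coerce_int(t, logical_type: 'timestamp-millis') # => 2020-11-12 13:14:15 UTC (unchanged)
coerce_int('123')                               # => 123
```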
@@ -15,7 +15,7 @@ module Deimos
 
       # @override
       def encode_payload(payload, schema: nil, topic: nil)
-        avro_turf_messaging.encode(payload, schema_name: schema, subject: topic)
+        avro_turf_messaging.encode(payload, schema_name: schema, subject: topic || schema)
      end
 
      private
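With this change, encoding through the registry backend without a topic falls back to the schema name as the registry subject. A hedged sketch of the effect through the public `Deimos.schema_backend` API (payload contents are illustrative):

```ruby
# Sketch only; assumes a running schema registry and the deimos-ruby gem.
backend = Deimos.schema_backend(schema: 'MySchema', namespace: 'com.my-namespace')
payload = { 'test_id' => 'foo', 'some_int' => 123 }

backend.encode(payload)                    # subject now defaults to 'MySchema'
backend.encode(payload, topic: 'my-topic') # subject is 'my-topic', as before
```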
@@ -190,11 +190,14 @@ module Deimos
         end
       end
 
+      # Produce messages in batches, reducing the size to 1/10 if the batch is too
+      # large. Does not retry batches of messages that have already been sent.
       # @param batch [Array<Hash>]
       def produce_messages(batch)
         batch_size = batch.size
+        current_index = 0
         begin
-          batch.in_groups_of(batch_size, false).each do |group|
+          batch[current_index..-1].in_groups_of(batch_size, false).each do |group|
             @logger.debug("Publishing #{group.size} messages to #{@current_topic}")
             producer.publish_list(group)
             Deimos.config.metrics&.increment(
@@ -202,6 +205,7 @@ module Deimos
               tags: %W(status:success topic:#{@current_topic}),
               by: group.size
             )
+            current_index += group.size
             @logger.info("Sent #{group.size} messages to #{@current_topic}")
           end
         rescue Kafka::BufferOverflow, Kafka::MessageSizeTooLarge,
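The essence of the fix: `current_index` tracks how many messages have already been published, so when the rescue clause shrinks the group size and retries, the already-sent prefix of the batch is skipped rather than resent. A simplified standalone sketch of that loop, with a generic error standing in for the Kafka buffer errors:

```ruby
# Simplified sketch of the resume-on-retry loop. On a "batch too large"
# error the group size shrinks to 1/10, but current_index ensures
# already-sent messages are never published twice.
def produce_in_shrinking_groups(batch)
  batch_size = batch.size
  current_index = 0
  begin
    batch[current_index..-1].each_slice(batch_size) do |group|
      yield group                  # stands in for producer.publish_list(group)
      current_index += group.size  # these messages are now safely sent
    end
  rescue StandardError             # stands in for Kafka::BufferOverflow etc.
    raise if batch_size == 1       # cannot shrink any further

    batch_size = [batch_size / 10, 1].max
    retry
  end
end
```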
@@ -6,6 +6,7 @@ module Deimos
   module Utils
     # Listener that can seek to get the last X messages in a topic.
     class SeekListener < Phobos::Listener
+      MAX_SEEK_RETRIES = 3
       attr_accessor :num_messages
 
       # :nodoc:
@@ -13,8 +14,10 @@ module Deimos
       @num_messages ||= 10
       @consumer = create_kafka_consumer
       @consumer.subscribe(topic, @subscribe_opts)
+      attempt = 0
 
       begin
+        attempt += 1
         last_offset = @kafka_client.last_offset_for(topic, 0)
         offset = last_offset - num_messages
         if offset.positive?
@@ -22,7 +25,11 @@ module Deimos
           @consumer.seek(topic, 0, offset)
         end
       rescue StandardError => e
-        "Could not seek to offset: #{e.message}"
+        if attempt < MAX_SEEK_RETRIES
+          sleep(1.seconds * attempt)
+          retry
+        end
+        log_error("Could not seek to offset: #{e.message} after #{MAX_SEEK_RETRIES} retries", listener_metadata)
       end
 
       instrument('listener.start_handler', listener_metadata) do
@@ -50,7 +57,6 @@ module Deimos
 
       # :nodoc:
       def consume(payload, metadata)
-        puts "Got #{payload}"
         self.class.total_messages << {
           key: metadata[:key],
           payload: payload
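The retry above is a bounded attempt counter with a linearly increasing sleep. A standalone sketch of the same pattern (the logging call is simplified; the real code uses `log_error` with listener metadata):

```ruby
# Standalone sketch of the bounded retry with linear backoff used above.
MAX_SEEK_RETRIES = 3

def with_seek_retries
  attempt = 0
  begin
    attempt += 1
    yield
  rescue StandardError => e
    if attempt < MAX_SEEK_RETRIES
      sleep(1 * attempt) # sleeps 1s, then 2s, between attempts
      retry
    end
    warn("Could not seek to offset: #{e.message} after #{MAX_SEEK_RETRIES} retries")
  end
end

with_seek_retries { raise 'seek failed' } # retries twice, then logs and returns
```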
@@ -28,14 +28,18 @@ module Deimos
 
       # Indicate which schemas should be assigned to actions.
       # @param actions [Symbol]
+      # @param kwactions [String]
       # @param request [String]
       # @param response [String]
-      def schemas(*actions, request: nil, response: nil)
+      def schemas(*actions, request: nil, response: nil, **kwactions)
         actions.each do |action|
           request ||= action.to_s.titleize
           response ||= action.to_s.titleize
           schema_mapping[action.to_s] = { request: request, response: response }
         end
+        kwactions.each do |key, val|
+          schema_mapping[key.to_s] = { request: val, response: val }
+        end
       end
 
       # @return [Hash<Symbol, String>]
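Taken together, `schemas` now supports three calling styles; a hedged controller sketch based on the README and spec changes in this diff (`TopicsController` itself is illustrative):

```ruby
# Hypothetical controller sketch; assumes the deimos-ruby gem is loaded.
class TopicsController < ApplicationController
  include Deimos::Utils::SchemaControllerMixin

  request_namespace 'com.my-namespace.request'
  response_namespace 'com.my-namespace.response'

  # Schema name derived from the action name (Index, Show):
  schemas :index, :show
  # Explicit request/response schemas:
  schemas :update, request: 'UpdateRequest', response: 'UpdateResponse'
  # New mapping syntax: one schema name for both namespaces:
  schemas create: 'CreateTopic'
end
```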
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module Deimos
-  VERSION = '1.8.1-beta9'
+  VERSION = '1.8.4'
 end
@@ -137,5 +137,18 @@ module ActiveRecordConsumerTest
       expect(Widget.find_by_test_id('id1').some_int).to eq(3)
       expect(Widget.find_by_test_id('id2').some_int).to eq(4)
     end
+
+    it 'should not create a record if process_message? returns false' do
+      MyConsumer.any_instance.stub(:process_message?).and_return(false)
+      expect(Widget.count).to eq(0)
+      test_consume_message(MyConsumer, {
+        test_id: 'abc',
+        some_int: 3,
+        updated_at: 1.day.ago.to_i,
+        some_datetime_int: Time.zone.now.to_i,
+        timestamp: 2.minutes.ago.to_s
+      }, { call_original: true, key: 5 })
+      expect(Widget.count).to eq(0)
+    end
   end
 end
@@ -43,6 +43,12 @@ each_db_config(Deimos::Backends::Db) do
     described_class.publish(producer_class: MyNoKeyProducer,
                             messages: [messages.first])
     expect(Deimos::KafkaMessage.count).to eq(4)
+  end
 
+  it 'should add messages with Hash keys with YAML encoding' do
+    described_class.publish(producer_class: MyProducer,
+                            messages: [build_message({ foo: 0 }, 'my-topic', { 'test_id' => 0 })])
+    expect(Deimos::KafkaMessage.count).to eq(1)
+    expect(Deimos::KafkaMessage.last.partition_key).to eq(%(---\ntest_id: 0\n))
   end
 end
@@ -6,6 +6,14 @@ class MyConfigConsumer < Deimos::Consumer
   def consume
   end
 end
+
+# Mock consumer 2
+class MyConfigConsumer2 < Deimos::Consumer
+  # :nodoc:
+  def consume
+  end
+end
+
 describe Deimos, 'configuration' do
   it 'should configure with deprecated fields' do
     logger = Logger.new(nil)
@@ -171,6 +179,13 @@ describe Deimos, 'configuration' do
       offset_retention_time 13
       heartbeat_interval 13
     end
+    consumer do
+      disabled true
+      class_name 'MyConfigConsumer2'
+      schema 'blah2'
+      topic 'blah2'
+      group_id 'myconsumerid2'
+    end
   end
 
   expect(described_class.config.phobos_config).
@@ -225,5 +225,88 @@ module KafkaSourceSpec
       expect(Deimos::KafkaMessage.count).to eq(0)
     end
   end
+
+  context 'with import hooks disabled' do
+    before(:each) do
+      # Dummy class we can include the mixin in. Has a backing table created
+      # earlier and has the import hook disabled
+      class WidgetNoImportHook < ActiveRecord::Base
+        include Deimos::KafkaSource
+        self.table_name = 'widgets'
+
+        # :nodoc:
+        def self.kafka_config
+          {
+            update: true,
+            delete: true,
+            import: false,
+            create: true
+          }
+        end
+
+        # :nodoc:
+        def self.kafka_producers
+          [WidgetProducer]
+        end
+      end
+      WidgetNoImportHook.reset_column_information
+    end
+
+    it 'should not fail when bulk-importing with existing records' do
+      widget1 = WidgetNoImportHook.create(widget_id: 1, name: 'Widget 1')
+      widget2 = WidgetNoImportHook.create(widget_id: 2, name: 'Widget 2')
+      widget1.name = 'New Widget No Import Hook 1'
+      widget2.name = 'New Widget No Import Hook 2'
+
+      expect {
+        WidgetNoImportHook.import([widget1, widget2], on_duplicate_key_update: %i(widget_id name))
+      }.not_to raise_error
+
+      expect('my-topic').not_to have_sent({
+        widget_id: 1,
+        name: 'New Widget No Import Hook 1',
+        id: widget1.id,
+        created_at: anything,
+        updated_at: anything
+      }, widget1.id)
+      expect('my-topic').not_to have_sent({
+        widget_id: 2,
+        name: 'New Widget No Import Hook 2',
+        id: widget2.id,
+        created_at: anything,
+        updated_at: anything
+      }, widget2.id)
+    end
+
+    it 'should not fail when mixing existing and new records' do
+      widget1 = WidgetNoImportHook.create(widget_id: 1, name: 'Widget 1')
+      expect('my-topic').to have_sent({
+        widget_id: 1,
+        name: 'Widget 1',
+        id: widget1.id,
+        created_at: anything,
+        updated_at: anything
+      }, widget1.id)
+
+      widget2 = WidgetNoImportHook.new(widget_id: 2, name: 'Widget 2')
+      widget1.name = 'New Widget 1'
+      WidgetNoImportHook.import([widget1, widget2], on_duplicate_key_update: %i(widget_id))
+      widgets = WidgetNoImportHook.all
+      expect('my-topic').not_to have_sent({
+        widget_id: 1,
+        name: 'New Widget 1',
+        id: widgets[0].id,
+        created_at: anything,
+        updated_at: anything
+      }, widgets[0].id)
+      expect('my-topic').not_to have_sent({
+        widget_id: 2,
+        name: 'Widget 2',
+        id: widgets[1].id,
+        created_at: anything,
+        updated_at: anything
+      }, widgets[1].id)
+    end
+  end
 end
 end
@@ -64,6 +64,14 @@ module ProducerTest
     end
     stub_const('MyErrorProducer', producer_class)
 
+    producer_class = Class.new(Deimos::Producer) do
+      schema 'MySchema'
+      namespace 'com.my-namespace'
+      topic nil
+      key_config none: true
+    end
+    stub_const('MyNoTopicProducer', producer_class)
+
   end
 
   it 'should fail on invalid message with error handler' do
@@ -102,6 +110,33 @@ module ProducerTest
     expect('my-topic').not_to have_sent('test_id' => 'foo2', 'some_int' => 123)
   end
 
+  it 'should allow setting the topic from publish_list' do
+    expect(described_class).to receive(:produce_batch).once.with(
+      Deimos::Backends::Test,
+      [
+        Deimos::Message.new({ 'test_id' => 'foo', 'some_int' => 123 },
+                            MyProducer,
+                            topic: 'a-new-topic',
+                            partition_key: 'foo',
+                            key: 'foo'),
+        Deimos::Message.new({ 'test_id' => 'bar', 'some_int' => 124 },
+                            MyProducer,
+                            topic: 'a-new-topic',
+                            partition_key: 'bar',
+                            key: 'bar')
+      ]
+    ).and_call_original
+
+    MyProducer.publish_list(
+      [{ 'test_id' => 'foo', 'some_int' => 123 },
+       { 'test_id' => 'bar', 'some_int' => 124 }],
+      topic: 'a-new-topic'
+    )
+    expect('a-new-topic').to have_sent('test_id' => 'foo', 'some_int' => 123)
+    expect('my-topic').not_to have_sent('test_id' => 'foo', 'some_int' => 123)
+    expect('my-topic').not_to have_sent('test_id' => 'foo2', 'some_int' => 123)
+  end
+
   it 'should add a message ID' do
     payload = { 'test_id' => 'foo',
                 'some_int' => 123,
@@ -201,6 +236,7 @@ module ProducerTest
   end
 
   it 'should encode the key' do
+    Deimos.configure { |c| c.producers.topic_prefix = nil }
     expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'foo', topic: 'my-topic-key')
     expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'bar', topic: 'my-topic-key')
     expect(MyProducer.encoder).to receive(:encode).with({
@@ -218,6 +254,21 @@ module ProducerTest
     )
   end
 
+  it 'should encode the key with topic prefix' do
+    Deimos.configure { |c| c.producers.topic_prefix = 'prefix.' }
+    expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'foo', topic: 'prefix.my-topic-key')
+    expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'bar', topic: 'prefix.my-topic-key')
+    expect(MyProducer.encoder).to receive(:encode).with({ 'test_id' => 'foo',
+                                                          'some_int' => 123 },
+                                                        { topic: 'prefix.my-topic-value' })
+    expect(MyProducer.encoder).to receive(:encode).with({ 'test_id' => 'bar',
+                                                          'some_int' => 124 },
+                                                        { topic: 'prefix.my-topic-value' })
+
+    MyProducer.publish_list([{ 'test_id' => 'foo', 'some_int' => 123 },
+                             { 'test_id' => 'bar', 'some_int' => 124 }])
+  end
+
   it 'should not encode with plaintext key' do
     expect(MyNonEncodedProducer.key_encoder).not_to receive(:encode_key)
 
@@ -269,6 +320,31 @@ module ProducerTest
     )
   end
 
+  it 'should raise error if blank topic is passed in explicitly' do
+    expect {
+      MyProducer.publish_list(
+        [{ 'test_id' => 'foo',
+           'some_int' => 123 },
+         { 'test_id' => 'bar',
+           'some_int' => 124 }],
+        topic: ''
+      )
+    }.to raise_error(RuntimeError,
+                     'Topic not specified. Please specify the topic.')
+  end
+
+  it 'should raise error if the producer has not been initialized with a topic' do
+    expect {
+      MyNoTopicProducer.publish_list(
+        [{ 'test_id' => 'foo',
+           'some_int' => 123 },
+         { 'test_id' => 'bar',
+           'some_int' => 124 }]
+      )
+    }.to raise_error(RuntimeError,
+                     'Topic not specified. Please specify the topic.')
+  end
+
   it 'should error with nothing set' do
     expect {
       MyErrorProducer.publish_list(
@@ -42,6 +42,20 @@ RSpec.shared_examples_for('an Avro backend') do
       {
         'name' => 'union-int-field',
         'type' => %w(null int)
+      },
+      {
+        'name' => 'timestamp-millis-field',
+        'type' => {
+          'type' => 'long',
+          'logicalType' => 'timestamp-millis'
+        }
+      },
+      {
+        'name' => 'timestamp-micros-field',
+        'type' => {
+          'type' => 'long',
+          'logicalType' => 'timestamp-micros'
+        }
       }
     ]
   }
@@ -95,7 +109,9 @@ RSpec.shared_examples_for('an Avro backend') do
     'string-field' => 'hi mom',
     'boolean-field' => true,
     'union-field' => nil,
-    'union-int-field' => nil
+    'union-int-field' => nil,
+    'timestamp-millis-field' => Time.utc(2020, 11, 12, 13, 14, 15, 909_090),
+    'timestamp-micros-field' => Time.utc(2020, 11, 12, 13, 14, 15, 909_090)
   }
 end
 
@@ -169,6 +185,15 @@ RSpec.shared_examples_for('an Avro backend') do
     expect(result['union-field']).to eq('itsme')
   end
 
+  it 'should not convert timestamp-millis' do
+    result = backend.coerce(payload)
+    expect(result['timestamp-millis-field']).to eq(Time.utc(2020, 11, 12, 13, 14, 15, 909_090))
+  end
+
+  it 'should not convert timestamp-micros' do
+    result = backend.coerce(payload)
+    expect(result['timestamp-micros-field']).to eq(Time.utc(2020, 11, 12, 13, 14, 15, 909_090))
+  end
 end
 
@@ -0,0 +1,11 @@
+{
+  "namespace": "com.my-namespace.request",
+  "name": "CreateTopic",
+  "type": "record",
+  "fields": [
+    {
+      "name": "request_id",
+      "type": "string"
+    }
+  ]
+}
@@ -0,0 +1,11 @@
+{
+  "namespace": "com.my-namespace.response",
+  "name": "CreateTopic",
+  "type": "record",
+  "fields": [
+    {
+      "name": "response_id",
+      "type": "string"
+    }
+  ]
+}
@@ -96,7 +96,32 @@ each_db_config(Deimos::Utils::DbProducer) do
     expect(phobos_producer).to have_received(:publish_list).with(['A'] * 100).once
     expect(phobos_producer).to have_received(:publish_list).with(['A'] * 10).once
     expect(phobos_producer).to have_received(:publish_list).with(['A']).once
+  end
+
+  it 'should not resend batches of sent messages' do
+    allow(phobos_producer).to receive(:publish_list) do |group|
+      raise Kafka::BufferOverflow if group.any?('A') && group.size >= 1000
+      raise Kafka::BufferOverflow if group.any?('BIG') && group.size >= 10
+    end
+    allow(Deimos.config.metrics).to receive(:increment)
+    batch = ['A'] * 450 + ['BIG'] * 550
+    producer.produce_messages(batch)
+
+    expect(phobos_producer).to have_received(:publish_list).with(batch)
+    expect(phobos_producer).to have_received(:publish_list).with(['A'] * 100).exactly(4).times
+    expect(phobos_producer).to have_received(:publish_list).with(['A'] * 50 + ['BIG'] * 50)
+    expect(phobos_producer).to have_received(:publish_list).with(['A'] * 10).exactly(5).times
+    expect(phobos_producer).to have_received(:publish_list).with(['BIG'] * 1).exactly(550).times
 
+    expect(Deimos.config.metrics).to have_received(:increment).with('publish',
+                                                                    tags: %w(status:success topic:),
+                                                                    by: 100).exactly(4).times
+    expect(Deimos.config.metrics).to have_received(:increment).with('publish',
+                                                                    tags: %w(status:success topic:),
+                                                                    by: 10).exactly(5).times
+    expect(Deimos.config.metrics).to have_received(:increment).with('publish',
+                                                                    tags: %w(status:success topic:),
+                                                                    by: 1).exactly(550).times
   end
 
   describe '#compact_messages' do
@@ -289,6 +314,8 @@ each_db_config(Deimos::Utils::DbProducer) do
         message: "mess#{i}",
         partition_key: "key#{i}"
       )
+    end
+    (5..8).each do |i|
       Deimos::KafkaMessage.create!(
         id: i,
         topic: 'my-topic2',
@@ -0,0 +1,31 @@
+# frozen_string_literal: true
+
+describe Deimos::Utils::SeekListener do
+
+  describe '#start_listener' do
+    let(:consumer) { instance_double(Kafka::Consumer) }
+    let(:handler) { class_double(Deimos::Utils::MessageBankHandler) }
+
+    before(:each) do
+      allow(handler).to receive(:start)
+      allow(consumer).to receive(:subscribe)
+      allow_any_instance_of(Phobos::Listener).to receive(:create_kafka_consumer).and_return(consumer)
+      allow_any_instance_of(Kafka::Client).to receive(:last_offset_for).and_return(100)
+      stub_const('Deimos::Utils::SeekListener::MAX_SEEK_RETRIES', 2)
+    end
+
+    it 'should seek offset' do
+      allow(consumer).to receive(:seek)
+      expect(consumer).to receive(:seek).once
+      seek_listener = described_class.new({ handler: handler, group_id: 999, topic: 'test_topic' })
+      seek_listener.start_listener
+    end
+
+    it 'should retry on errors when seeking offset' do
+      allow(consumer).to receive(:seek).and_raise(StandardError)
+      expect(consumer).to receive(:seek).twice
+      seek_listener = described_class.new({ handler: handler, group_id: 999, topic: 'test_topic' })
+      seek_listener.start_listener
+    end
+  end
+end
@@ -17,6 +17,7 @@ RSpec.describe Deimos::Utils::SchemaControllerMixin, type: :controller do
     request_namespace 'com.my-namespace.request'
     response_namespace 'com.my-namespace.response'
     schemas :index, :show
+    schemas create: 'CreateTopic'
     schemas :update, request: 'UpdateRequest', response: 'UpdateResponse'
 
     # :nodoc:
@@ -29,6 +30,11 @@ RSpec.describe Deimos::Utils::SchemaControllerMixin, type: :controller do
       render_schema({ 'response_id' => payload[:request_id] + ' dad' })
     end
 
+    # :nodoc:
+    def create
+      render_schema({ 'response_id' => payload[:request_id] + ' bro' })
+    end
+
     # :nodoc:
     def update
       render_schema({ 'update_response_id' => payload[:update_request_id] + ' sis' })
@@ -65,4 +71,14 @@ RSpec.describe Deimos::Utils::SchemaControllerMixin, type: :controller do
     expect(response_backend.decode(response.body)).to eq({ 'update_response_id' => 'hi sis' })
   end
 
+  it 'should render the correct response for create' do
+    request_backend = Deimos.schema_backend(schema: 'CreateTopic',
+                                            namespace: 'com.my-namespace.request')
+    response_backend = Deimos.schema_backend(schema: 'CreateTopic',
+                                             namespace: 'com.my-namespace.response')
+    request.content_type = 'avro/binary'
+    post :create, params: { id: 1 }, body: request_backend.encode({ 'request_id' => 'hi' })
+    expect(response_backend.decode(response.body)).to eq({ 'response_id' => 'hi bro' })
+  end
+
 end
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: deimos-ruby
 version: !ruby/object:Gem::Version
-  version: 1.8.1.pre.beta9
+  version: 1.8.4
 platform: ruby
 authors:
 - Daniel Orner
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2020-08-27 00:00:00.000000000 Z
+date: 2020-12-02 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: avro_turf
@@ -348,7 +348,9 @@ files:
 - docs/ARCHITECTURE.md
 - docs/CONFIGURATION.md
 - docs/DATABASE_BACKEND.md
+- docs/INTEGRATION_TESTS.md
 - docs/PULL_REQUEST_TEMPLATE.md
+- docs/UPGRADING.md
 - lib/deimos.rb
 - lib/deimos/active_record_consume/batch_consumption.rb
 - lib/deimos/active_record_consume/batch_slicer.rb
@@ -453,14 +455,17 @@ files:
 - spec/schemas/com/my-namespace/Wibble.avsc
 - spec/schemas/com/my-namespace/Widget.avsc
 - spec/schemas/com/my-namespace/WidgetTheSecond.avsc
+- spec/schemas/com/my-namespace/request/CreateTopic.avsc
 - spec/schemas/com/my-namespace/request/Index.avsc
 - spec/schemas/com/my-namespace/request/UpdateRequest.avsc
+- spec/schemas/com/my-namespace/response/CreateTopic.avsc
 - spec/schemas/com/my-namespace/response/Index.avsc
 - spec/schemas/com/my-namespace/response/UpdateResponse.avsc
 - spec/spec_helper.rb
 - spec/utils/db_poller_spec.rb
 - spec/utils/db_producer_spec.rb
 - spec/utils/deadlock_retry_spec.rb
+- spec/utils/inline_consumer_spec.rb
 - spec/utils/lag_reporter_spec.rb
 - spec/utils/platform_schema_validation_spec.rb
 - spec/utils/schema_controller_mixin_spec.rb
@@ -483,9 +488,9 @@ required_ruby_version: !ruby/object:Gem::Requirement
       version: '0'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - ">"
+  - - ">="
     - !ruby/object:Gem::Version
-      version: 1.3.1
+      version: '0'
 requirements: []
 rubygems_version: 3.1.3
 signing_key:
@@ -534,14 +539,17 @@ test_files:
 - spec/schemas/com/my-namespace/Wibble.avsc
 - spec/schemas/com/my-namespace/Widget.avsc
 - spec/schemas/com/my-namespace/WidgetTheSecond.avsc
+- spec/schemas/com/my-namespace/request/CreateTopic.avsc
 - spec/schemas/com/my-namespace/request/Index.avsc
 - spec/schemas/com/my-namespace/request/UpdateRequest.avsc
+- spec/schemas/com/my-namespace/response/CreateTopic.avsc
 - spec/schemas/com/my-namespace/response/Index.avsc
 - spec/schemas/com/my-namespace/response/UpdateResponse.avsc
 - spec/spec_helper.rb
 - spec/utils/db_poller_spec.rb
 - spec/utils/db_producer_spec.rb
 - spec/utils/deadlock_retry_spec.rb
+- spec/utils/inline_consumer_spec.rb
 - spec/utils/lag_reporter_spec.rb
 - spec/utils/platform_schema_validation_spec.rb
 - spec/utils/schema_controller_mixin_spec.rb