deimos-ruby 1.8.1.pre.beta8 → 1.8.3
- checksums.yaml +4 -4
- data/CHANGELOG.md +42 -2
- data/Gemfile.lock +1 -1
- data/README.md +33 -3
- data/docs/CONFIGURATION.md +1 -0
- data/docs/INTEGRATION_TESTS.md +52 -0
- data/docs/UPGRADING.md +128 -0
- data/lib/deimos/backends/db.rb +10 -1
- data/lib/deimos/config/configurable.rb +12 -0
- data/lib/deimos/config/configuration.rb +4 -0
- data/lib/deimos/config/phobos_config.rb +4 -1
- data/lib/deimos/kafka_source.rb +3 -2
- data/lib/deimos/producer.rb +14 -9
- data/lib/deimos/schema_backends/avro_schema_registry.rb +1 -1
- data/lib/deimos/test_helpers.rb +8 -7
- data/lib/deimos/utils/db_producer.rb +5 -1
- data/lib/deimos/utils/schema_controller_mixin.rb +5 -1
- data/lib/deimos/version.rb +1 -1
- data/spec/backends/db_spec.rb +6 -0
- data/spec/config/configuration_spec.rb +15 -0
- data/spec/kafka_source_spec.rb +83 -0
- data/spec/producer_spec.rb +76 -0
- data/spec/schemas/com/my-namespace/request/CreateTopic.avsc +11 -0
- data/spec/schemas/com/my-namespace/response/CreateTopic.avsc +11 -0
- data/spec/spec_helper.rb +1 -0
- data/spec/utils/db_producer_spec.rb +27 -0
- data/spec/utils/schema_controller_mixin_spec.rb +16 -0
- metadata +10 -4
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 2badd671866a8fbde743e03acbc2da623279ac08abd5ed983ae9a15a0bef415d
+  data.tar.gz: e4b3cd2e80b13fc00ff54ebc85b59ab2e1daad052ab170dafd9a75f57eb80ce3
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 98b41fa7354e624a4538ac5006a091e1a144a2caaa57e2ef77c214e350cd8dcbf2b126c01a3663365d28bbed19baa7525e3c52390cc60f6378537ad8fe6bee12
+  data.tar.gz: 208ba417a35c7042155383d5c0eb9d9e35be95c0459cf2e58cdc24168c8fdf30af16434fcc66c612d197aab38279d790907f146e19b6a36157cde008c5aa60a3
data/CHANGELOG.md
CHANGED
@@ -7,9 +7,49 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## UNRELEASED
 
-## 1.8.
+## 1.8.3 - 2020-11-18
+
+### Fixes :wrench:
+- Do not resend already sent messages when splitting up batches
+  (fixes [#24](https://github.com/flipp-oss/deimos/issues/24))
+- KafkaSource crashing on bulk-imports if import hooks are disabled
+  (fixes [#73](https://github.com/flipp-oss/deimos/issues/73))
+- #96 Use string-safe encoding for partition keys
+
+## 1.8.2 - 2020-09-25
+
+### Features :star:
+- Add "disabled" config field to consumers to allow disabling
+  individual consumers without having to comment out their
+  entries and possibly affecting unit tests.
+
+### Fixes :wrench:
+- Prepend topic_prefix while encoding messages
+  (fixes [#37](https://github.com/flipp-oss/deimos/issues/37))
+- Raise error if producing without a topic
+  (fixes [#50](https://github.com/flipp-oss/deimos/issues/50))
+- Don't try to load producers/consumers when running rake tasks involving webpacker or assets
+
+## 1.8.2-beta2 - 2020-09-15
+
+### Features :star:
+
+- Add details on using schema backend directly in README.
+- Default to the provided schema if topic is not provided when
+  encoding to `AvroSchemaRegistry`.
+- Add mapping syntax for the `schema` call in `SchemaControllerMixin`.
+
+## 1.8.2-beta1 - 2020-09-09
+
+### Features :star:
+
+- Added the ability to specify the topic for `publish`
+  and `publish_list` in a producer
+
+## 1.8.1-beta9 - 2020-08-27
+
 ### Fixes :wrench:
-- Moved the TestHelpers hook to `
+- Moved the TestHelpers hook to `before(:suite)` to allow for
   overriding e.g. in integration tests.
 
 ## 1.8.1-beta7 - 2020-08-25
data/Gemfile.lock
CHANGED
data/README.md
CHANGED
@@ -11,6 +11,7 @@ a useful toolbox of goodies for Ruby-based Kafka development.
 Built on Phobos and hence Ruby-Kafka.
 
 <!--ts-->
+   * [Additional Documentation](#additional-documentation)
    * [Installation](#installation)
    * [Versioning](#versioning)
    * [Configuration](#configuration)

@@ -29,9 +30,20 @@ Built on Phobos and hence Ruby-Kafka.
    * [Metrics](#metrics)
    * [Testing](#testing)
      * [Integration Test Helpers](#integration-test-helpers)
+   * [Utilities](#utilities)
    * [Contributing](#contributing)
 <!--te-->
 
+# Additional Documentation
+
+Please see the following for further information not covered by this readme:
+
+* [Architecture Design](docs/ARCHITECTURE.md)
+* [Configuration Reference](docs/CONFIGURATION.md)
+* [Database Backend Feature](docs/DATABASE_BACKEND.md)
+* [Upgrading Deimos](docs/UPGRADING.md)
+* [Contributing to Integration Tests](docs/INTEGRATION_TESTS.md)
+
 # Installation
 
 Add this line to your application's Gemfile:

@@ -108,6 +120,7 @@ class MyProducer < Deimos::Producer
       'some-key2' => an_object.bar
     }
     # You can also publish an array with self.publish_list(payloads)
+    # You may specify the topic here with self.publish(payload, topic: 'my-topic')
     self.publish(payload)
   end

@@ -481,6 +494,12 @@ class WhateverController < ApplicationController
     # will look for: my.namespace.requests.Index.avsc
     #                my.namespace.responses.Index.avsc
 
+    # Can use mapping to change the schema but keep the namespaces,
+    # i.e. use the same schema name across the two namespaces
+    schemas create: 'CreateTopic'
+    # will look for: my.namespace.requests.CreateTopic.avsc
+    #                my.namespace.responses.CreateTopic.avsc
+
     # If all routes use the default, you can add them all at once
     schemas :index, :show, :update

@@ -988,13 +1007,24 @@ Deimos::Utils::InlineConsumer.get_messages_for(
 )
 ```
 
+## Utilities
+
+You can use your configured schema backend directly if you want to
+encode and decode payloads outside of the context of sending messages.
+
+```ruby
+backend = Deimos.schema_backend(schema: 'MySchema', namespace: 'com.my-namespace')
+
+encoded = backend.encode(my_payload)
+decoded = backend.decode(my_encoded_payload)
+coerced = backend.coerce(my_payload) # coerce to correct types
+backend.validate(my_payload) # throws an error if not valid
+fields = backend.schema_fields # list of fields defined in the schema
+```
+
 ## Contributing
 
 Bug reports and pull requests are welcome on GitHub at https://github.com/flipp-oss/deimos .
 
-We have more information on the [internal architecture](docs/ARCHITECTURE.md) of Deimos
-for contributors!
-
 ### Linting
 
 Deimos uses Rubocop to lint the code. Please run Rubocop on your code
data/docs/CONFIGURATION.md
CHANGED
@@ -79,6 +79,7 @@ topic|nil|Topic to produce to.
 schema|nil|This is optional but strongly recommended for testing purposes; this will validate against a local schema file used as the reader schema, as well as being able to write tests against this schema. This is recommended since it ensures you are always getting the values you expect.
 namespace|nil|Namespace of the schema to use when finding it locally.
 key_config|nil|Configuration hash for message keys. See [Kafka Message Keys](../README.md#installation)
+disabled|false|Set to true to skip starting an actual listener for this consumer on startup.
 group_id|nil|ID of the consumer group.
 max_concurrency|1|Number of threads created for this listener. Each thread will behave as an independent consumer. They don't share any state.
 start_from_beginning|true|Once the consumer group has checkpointed its progress in the topic's partitions, the consumers will always start from the checkpointed offsets, regardless of config. As such, this setting only applies when the consumer initially starts consuming from a topic
data/docs/INTEGRATION_TESTS.md
ADDED

@@ -0,0 +1,52 @@
+# Running Integration Tests
+
+This repo includes integration tests in the [spec/utils](spec/utils) directory.
+Here, there are tests for Deimos features that include a database integration, such as:
+* [Database Poller](README.md#Database Poller)
+* [Database Backend](docs/DATABASE_BACKEND.md)
+* [Deadlock Retrying](lib/deimos/utils/deadlock_retry.rb)
+
+You will need to set up the following databases to develop and create unit tests in these test suites.
+* [SQLite](#SQLite)
+* [MySQL](#MySQL)
+* [PostgreSQL](#PostgreSQL)
+
+## SQLite
+This database is covered through the `sqlite3` gem.
+
+## MySQL
+### Setting up a local MySQL server (Mac)
+```bash
+# Download MySQL (optionally, choose a version you are comfortable with)
+brew install mysql
+# Start automatically after rebooting your machine
+brew services start mysql
+
+# Clean up once you are done with MySQL
+brew services stop mysql
+```
+
+## PostgreSQL
+### Setting up a local PostgreSQL server (Mac)
+```bash
+# Install postgres if it's not already installed
+brew install postgres
+
+# Initialize and start up the postgres db
+brew services start postgres
+initdb /usr/local/var/postgres
+# Create the default database and user
+# Use the password "root"
+createuser -s --password postgres
+
+# Clean up once done with Postgres
+killall postgres
+brew services stop postgres
+```
+
+## Running Integration Tests
+You must specify the tag "integration" when running these test suites.
+This can be done through the CLI with the `--tag integration` argument.
+```bash
+rspec spec/utils/ --tag integration
+```
data/docs/UPGRADING.md
ADDED
@@ -0,0 +1,128 @@
+# Upgrading Deimos
+
+## Upgrading from < 1.5.0 to >= 1.5.0
+
+If you are using Confluent's schema registry to Avro-encode your
+messages, you will need to manually include the `avro_turf` gem
+in your Gemfile now.
+
+This update changes how to interact with Deimos's schema classes.
+Although these are meant to be internal, they are still "public"
+and can be used by calling code.
+
+Before 1.5.0:
+
+```ruby
+encoder = Deimos::AvroDataEncoder.new(schema: 'MySchema',
+                                      namespace: 'com.my-namespace')
+encoder.encode(my_payload)
+
+decoder = Deimos::AvroDataDecoder.new(schema: 'MySchema',
+                                      namespace: 'com.my-namespace')
+decoder.decode(my_payload)
+```
+
+After 1.5.0:
+```ruby
+backend = Deimos.schema_backend(schema: 'MySchema', namespace: 'com.my-namespace')
+backend.encode(my_payload)
+backend.decode(my_payload)
+```
+
+The two classes are different and if you are using them to e.g.
+inspect Avro schema fields, please look at the source code for the following:
+* `Deimos::SchemaBackends::Base`
+* `Deimos::SchemaBackends::AvroBase`
+* `Deimos::SchemaBackends::AvroSchemaRegistry`
+
+Deprecated `Deimos::TestHelpers.sent_messages` in favor of
+`Deimos::Backends::Test.sent_messages`.
+
+## Upgrading from < 1.4.0 to >= 1.4.0
+
+Previously, configuration was handled as follows:
+* Kafka configuration, including listeners, lived in `phobos.yml`
+* Additional Deimos configuration would live in an initializer, e.g. `kafka.rb`
+* Producer and consumer configuration lived in each individual producer and consumer
+
+As of 1.4.0, all configuration is centralized in one initializer
+file, using default configuration.
+
+Before 1.4.0:
+```yaml
+# config/phobos.yml
+logger:
+  file: log/phobos.log
+  level: debug
+  ruby_kafka:
+    level: debug
+
+kafka:
+  client_id: phobos
+  connect_timeout: 15
+  socket_timeout: 15
+
+producer:
+  ack_timeout: 5
+  required_acks: :all
+  ...
+
+listeners:
+  - handler: ConsumerTest::MyConsumer
+    topic: my_consume_topic
+    group_id: my_group_id
+  - handler: ConsumerTest::MyBatchConsumer
+    topic: my_batch_consume_topic
+    group_id: my_batch_group_id
+    delivery: inline_batch
+```
+
+```ruby
+# kafka.rb
+Deimos.configure do |config|
+  config.reraise_consumer_errors = true
+  config.logger = Rails.logger
+  ...
+end
+
+# my_consumer.rb
+class ConsumerTest::MyConsumer < Deimos::Producer
+  namespace 'com.my-namespace'
+  schema 'MySchema'
+  topic 'MyTopic'
+  key_config field: :id
+end
+```
+
+After 1.4.0:
+```ruby
+# kafka.rb
+Deimos.configure do
+  logger Rails.logger
+  kafka do
+    client_id 'phobos'
+    connect_timeout 15
+    socket_timeout 15
+  end
+  producers.ack_timeout 5
+  producers.required_acks :all
+  ...
+  consumer do
+    class_name 'ConsumerTest::MyConsumer'
+    topic 'my_consume_topic'
+    group_id 'my_group_id'
+    namespace 'com.my-namespace'
+    schema 'MySchema'
+    topic 'MyTopic'
+    key_config field: :id
+  end
+  ...
+end
+```
+
+Note that the old configuration way *will* work if you set
+`config.phobos_config_file = "config/phobos.yml"`. You will
+get a number of deprecation notices, however. You can also still
+set the topic, namespace, etc. on the producer/consumer class,
+but it's much more convenient to centralize these configs
+in one place to see what your app does.
CHANGED
@@ -14,7 +14,7 @@ module Deimos
|
|
14
14
|
message = Deimos::KafkaMessage.new(
|
15
15
|
message: m.encoded_payload ? m.encoded_payload.to_s.b : nil,
|
16
16
|
topic: m.topic,
|
17
|
-
partition_key: m
|
17
|
+
partition_key: partition_key_for(m)
|
18
18
|
)
|
19
19
|
message.key = m.encoded_key.to_s.b unless producer_class.config[:no_keys]
|
20
20
|
message
|
@@ -26,6 +26,15 @@ module Deimos
|
|
26
26
|
by: records.size
|
27
27
|
)
|
28
28
|
end
|
29
|
+
|
30
|
+
# @param message [Deimos::Message]
|
31
|
+
# @return [String] the partition key to use for this message
|
32
|
+
def partition_key_for(message)
|
33
|
+
return message.partition_key if message.partition_key.present?
|
34
|
+
return message.key unless message.key.is_a?(Hash)
|
35
|
+
|
36
|
+
message.key.to_yaml
|
37
|
+
end
|
29
38
|
end
|
30
39
|
end
|
31
40
|
end
|
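The `partition_key_for` helper added above falls back to YAML serialization when the key is a Hash, so the partition key is always string-safe. A standalone sketch of that fallback (plain Ruby, with a nil/empty check standing in for Rails' `present?`; the method name is borrowed from the diff but this is not the real Deimos class):

```ruby
require 'yaml'

# Prefer an explicit partition key, pass plain keys through unchanged,
# and YAML-serialize Hash keys so they become deterministic strings.
def partition_key_for(partition_key, key)
  return partition_key if partition_key && !partition_key.to_s.empty?
  return key unless key.is_a?(Hash)

  key.to_yaml
end

puts partition_key_for(nil, { 'test_id' => 0 }) # YAML string for the hash key
```

The db_spec change later in this diff asserts exactly this YAML form (`---\ntest_id: 0\n`) for a Hash key.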
data/lib/deimos/config/configurable.rb
CHANGED

@@ -15,6 +15,11 @@ module Deimos
   #     enabled true
   #     ca_cert_file 'my_file'
   #   end
+  #   config.kafka do
+  #     ssl do
+  #       enabled true
+  #     end
+  #   end
   # end
   # - Allows for arrays of configurations:
   # Deimos.configure do |config|

@@ -245,6 +250,13 @@ module Deimos
 
     # Configure the settings with values.
     def configure(&block)
+      if defined?(Rake) && defined?(Rake.application)
+        tasks = Rake.application.top_level_tasks
+        if tasks.any? { |t| %w(assets webpacker yarn).include?(t.split(':').first) }
+          puts 'Skipping Deimos configuration since we are in JS/CSS compilation'
+          return
+        end
+      end
       config.run_callbacks(:configure) do
         config.instance_eval(&block)
       end
data/lib/deimos/config/configuration.rb
CHANGED

@@ -319,6 +319,10 @@ module Deimos
     # Key configuration (see docs).
     # @return [Hash]
     setting :key_config
+    # Set to true to ignore the consumer in the Phobos config and not actually start up a
+    # listener.
+    # @return [Boolean]
+    setting :disabled, false
 
     # These are the phobos "listener" configs. See CONFIGURATION.md for more
     # info.
data/lib/deimos/config/phobos_config.rb
CHANGED

@@ -63,8 +63,10 @@ module Deimos
       }
 
       p_config[:listeners] = self.consumer_objects.map do |consumer|
+        next nil if consumer.disabled
+
         hash = consumer.to_h.reject do |k, _|
-          %i(class_name schema namespace key_config backoff).include?(k)
+          %i(class_name schema namespace key_config backoff disabled).include?(k)
         end
         hash = hash.map { |k, v| [k, v.is_a?(Symbol) ? v.to_s : v] }.to_h
         hash[:handler] = consumer.class_name

@@ -73,6 +75,7 @@ module Deimos
         end
         hash
       end
+      p_config[:listeners].compact!
 
       if self.kafka.ssl.enabled
         %w(ca_cert client_cert client_cert_key).each do |key|
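The two-step pattern above (map disabled consumers to `nil`, then `compact!` the listener list) can be sketched in isolation. The hashes here are illustrative stand-ins, not the real Phobos/Deimos config objects:

```ruby
# Consumers marked disabled produce nil listeners, which are then removed,
# so Phobos never starts a listener for them.
consumers = [
  { class_name: 'ConsumerA', disabled: false },
  { class_name: 'ConsumerB', disabled: true }
]

listeners = consumers.map do |consumer|
  next nil if consumer[:disabled]

  { handler: consumer[:class_name] }
end
listeners.compact!

puts listeners.inspect
```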
data/lib/deimos/kafka_source.rb
CHANGED
@@ -88,8 +88,9 @@ module Deimos
                                  array_of_attributes,
                                  options={})
         results = super
-
-
+        if !self.kafka_config[:import] || array_of_attributes.empty?
+          return results
+        end
 
         # This will contain an array of hashes, where each hash is the actual
         # attribute hash that created the object.
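The guard above returns the import results untouched when import hooks are off or nothing was imported, instead of crashing while building Kafka messages. A hedged, much-simplified model (plain Hash for the config, placeholder method name; not the real `KafkaSource` API):

```ruby
# Skip Kafka message generation entirely when the import hook is disabled
# or the import was empty; otherwise message publishing would run (elided).
def handle_bulk_import(kafka_config, array_of_attributes, results)
  if !kafka_config[:import] || array_of_attributes.empty?
    return results
  end

  # ...build and publish Kafka messages for each imported row (not shown)
  results
end

puts handle_bulk_import({ import: false }, [{ name: 'w' }], :import_results)
```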
data/lib/deimos/producer.rb
CHANGED
@@ -87,8 +87,9 @@ module Deimos
 
       # Publish the payload to the topic.
       # @param payload [Hash] with an optional payload_key hash key.
-
-
+      # @param topic [String] if specifying the topic
+      def publish(payload, topic: self.topic)
+        publish_list([payload], topic: topic)
       end
 
       # Publish a list of messages.

@@ -97,11 +98,14 @@ module Deimos
       # whether to publish synchronously.
       # @param force_send [Boolean] if true, ignore the configured backend
       # and send immediately to Kafka.
-
+      # @param topic [String] if specifying the topic
+      def publish_list(payloads, sync: nil, force_send: false, topic: self.topic)
         return if Deimos.config.kafka.seed_brokers.blank? ||
                   Deimos.config.producers.disabled ||
                   Deimos.producers_disabled?(self)
 
+        raise 'Topic not specified. Please specify the topic.' if topic.blank?
+
         backend_class = determine_backend_class(sync, force_send)
         Deimos.instrument(
           'encode_messages',

@@ -110,7 +114,7 @@ module Deimos
           payloads: payloads
         ) do
           messages = Array(payloads).map { |p| Deimos::Message.new(p, self) }
-          messages.each(
+          messages.each { |m| _process_message(m, topic) }
           messages.in_groups_of(MAX_BATCH_SIZE, false) do |batch|
             self.produce_batch(backend_class, batch)
           end

@@ -163,7 +167,8 @@ module Deimos
       private
 
       # @param message [Message]
-
+      # @param topic [String]
+      def _process_message(message, topic)
         # this violates the Law of Demeter but it has to happen in a very
         # specific order and requires a bunch of methods on the producer
         # to work correctly.

@@ -175,12 +180,12 @@ module Deimos
         message.payload = nil if message.payload.blank?
         message.coerce_fields(encoder)
         message.encoded_key = _encode_key(message.key)
-        message.topic =
+        message.topic = topic
         message.encoded_payload = if message.payload.nil?
                                     nil
                                   else
                                     encoder.encode(message.payload,
-                                                   topic: "#{config[:topic]}-value")
+                                                   topic: "#{Deimos.config.producers.topic_prefix}#{config[:topic]}-value")
                                   end
       end

@@ -198,9 +203,9 @@ module Deimos
         end
 
         if config[:key_field]
-          encoder.encode_key(config[:key_field], key, topic: "#{config[:topic]}-key")
+          encoder.encode_key(config[:key_field], key, topic: "#{Deimos.config.producers.topic_prefix}#{config[:topic]}-key")
         elsif config[:key_schema]
-          key_encoder.encode(key, topic: "#{config[:topic]}-key")
+          key_encoder.encode(key, topic: "#{Deimos.config.producers.topic_prefix}#{config[:topic]}-key")
         else
           key
         end
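The producer changes above combine two behaviors: `publish` now delegates to `publish_list` with an overridable `topic:` keyword, and a blank topic raises. A minimal self-contained sketch of that control flow (hypothetical `SketchProducer` class, using a nil/empty check instead of Rails' `blank?`; encoding and backends are elided):

```ruby
# Models the new publish/publish_list topic handling: topic defaults to the
# class-level topic, can be overridden per call, and blank topics raise.
class SketchProducer
  def initialize(default_topic)
    @default_topic = default_topic
    @sent = []
  end

  attr_reader :sent

  def publish(payload, topic: @default_topic)
    publish_list([payload], topic: topic)
  end

  def publish_list(payloads, topic: @default_topic)
    raise 'Topic not specified. Please specify the topic.' if topic.nil? || topic.empty?

    payloads.each { |p| @sent << [topic, p] }
  end
end

producer = SketchProducer.new('my-topic')
producer.publish({ 'test_id' => 'foo' })                        # default topic
producer.publish({ 'test_id' => 'bar' }, topic: 'a-new-topic')  # override
puts producer.sent.inspect
```

This mirrors the producer_spec additions later in this diff, which publish to `'a-new-topic'` and expect a `RuntimeError` for a blank or missing topic.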
data/lib/deimos/schema_backends/avro_schema_registry.rb
CHANGED

@@ -15,7 +15,7 @@ module Deimos
 
     # @override
     def encode_payload(payload, schema: nil, topic: nil)
-      avro_turf_messaging.encode(payload, schema_name: schema, subject: topic)
+      avro_turf_messaging.encode(payload, schema_name: schema, subject: topic || schema)
     end
 
     private
data/lib/deimos/test_helpers.rb
CHANGED
@@ -30,18 +30,19 @@ module Deimos
           d_config.consumers.reraise_errors = true
           d_config.kafka.seed_brokers ||= ['test_broker']
           d_config.schema.backend = Deimos.schema_backend_class.mock_backend
+          d_config.producers.backend = :test
         end
       end
-      end
 
-
-
-
-
+      config.before(:each) do
+        client = double('client').as_null_object
+        allow(client).to receive(:time) do |*_args, &block|
+          block.call
+        end
+        Deimos::Backends::Test.sent_messages.clear
       end
-      Deimos.configure { |c| c.producers.backend = :test }
-      Deimos::Backends::Test.sent_messages.clear
     end
+
     end
 
     # @deprecated
data/lib/deimos/utils/db_producer.rb
CHANGED

@@ -190,11 +190,14 @@ module Deimos
       end
     end
 
+    # Produce messages in batches, reducing the size 1/10 if the batch is too
+    # large. Does not retry batches of messages that have already been sent.
     # @param batch [Array<Hash>]
     def produce_messages(batch)
       batch_size = batch.size
+      current_index = 0
       begin
-        batch.in_groups_of(batch_size, false).each do |group|
+        batch[current_index..-1].in_groups_of(batch_size, false).each do |group|
           @logger.debug("Publishing #{group.size} messages to #{@current_topic}")
           producer.publish_list(group)
           Deimos.config.metrics&.increment(

@@ -202,6 +205,7 @@ module Deimos
             tags: %W(status:success topic:#{@current_topic}),
             by: group.size
           )
+          current_index += group.size
           @logger.info("Sent #{group.size} messages to #{@current_topic}")
         end
       rescue Kafka::BufferOverflow, Kafka::MessageSizeTooLarge,
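The `current_index` bookkeeping above is what prevents resending messages that already went out before a size error forced a smaller batch. A simplified model of that retry loop (not the real `DbProducer` API: `each_slice` replaces Rails' `in_groups_of`, a plain `RuntimeError` stands in for the Kafka size errors, and the division-by-10 shrink is kept):

```ruby
# On a size error, shrink the batch size to a tenth and retry, but resume
# from the first message that has not yet been sent.
def produce_messages(batch, publisher)
  batch_size = batch.size
  current_index = 0
  begin
    batch[current_index..-1].each_slice(batch_size) do |group|
      publisher.call(group)
      current_index += group.size
    end
  rescue RuntimeError
    raise if batch_size == 1

    batch_size = [batch_size / 10, 1].max
    retry
  end
end

sent = []
# Simulated broker that rejects any group larger than 3 messages.
publisher = lambda do |group|
  raise 'message size too large' if group.size > 3

  sent.concat(group)
end

produce_messages((1..10).to_a, publisher)
puts sent.inspect # every message delivered exactly once
```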
data/lib/deimos/utils/schema_controller_mixin.rb
CHANGED

@@ -28,14 +28,18 @@ module Deimos
 
     # Indicate which schemas should be assigned to actions.
     # @param actions [Symbol]
+    # @param kwactions [String]
     # @param request [String]
     # @param response [String]
-    def schemas(*actions, request: nil, response: nil)
+    def schemas(*actions, request: nil, response: nil, **kwactions)
       actions.each do |action|
         request ||= action.to_s.titleize
         response ||= action.to_s.titleize
         schema_mapping[action.to_s] = { request: request, response: response }
       end
+      kwactions.each do |key, val|
+        schema_mapping[key.to_s] = { request: val, response: val }
+      end
     end
 
     # @return [Hash<Symbol, String>]
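The new keyword-argument branch above enables the `schemas create: 'CreateTopic'` mapping syntax shown in the README changes. An illustrative re-implementation of the two syntaxes side by side (hypothetical helper name, and a simplified titleize instead of ActiveSupport's):

```ruby
# Positional actions derive the schema name from the action name;
# keyword actions name the schema explicitly for both request and response.
def build_schema_mapping(*actions, request: nil, response: nil, **kwactions)
  mapping = {}
  actions.each do |action|
    name = action.to_s.split('_').map(&:capitalize).join # simplified titleize
    mapping[action.to_s] = { request: request || name, response: response || name }
  end
  kwactions.each do |key, val|
    mapping[key.to_s] = { request: val, response: val }
  end
  mapping
end

puts build_schema_mapping(:index, create: 'CreateTopic').inspect
```

So `:index` resolves to the `Index` request/response schemas, while `create:` uses the same `CreateTopic` schema name across both namespaces.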
data/lib/deimos/version.rb
CHANGED
data/spec/backends/db_spec.rb
CHANGED
@@ -43,6 +43,12 @@ each_db_config(Deimos::Backends::Db) do
       described_class.publish(producer_class: MyNoKeyProducer,
                               messages: [messages.first])
       expect(Deimos::KafkaMessage.count).to eq(4)
+    end
+
+    it 'should add messages with Hash keys with JSON encoding' do
+      described_class.publish(producer_class: MyProducer,
+                              messages: [build_message({ foo: 0 }, 'my-topic', { 'test_id' => 0 })])
+      expect(Deimos::KafkaMessage.count).to eq(1)
+      expect(Deimos::KafkaMessage.last.partition_key).to eq(%(---\ntest_id: 0\n))
     end
   end
data/spec/config/configuration_spec.rb
CHANGED

@@ -6,6 +6,14 @@ class MyConfigConsumer < Deimos::Consumer
   def consume
   end
 end
+
+# Mock consumer 2
+class MyConfigConsumer2 < Deimos::Consumer
+  # :no-doc:
+  def consume
+  end
+end
+
 describe Deimos, 'configuration' do
   it 'should configure with deprecated fields' do
     logger = Logger.new(nil)

@@ -171,6 +179,13 @@ describe Deimos, 'configuration' do
         offset_retention_time 13
         heartbeat_interval 13
       end
+      consumer do
+        disabled true
+        class_name 'MyConfigConsumer2'
+        schema 'blah2'
+        topic 'blah2'
+        group_id 'myconsumerid2'
+      end
     end
 
     expect(described_class.config.phobos_config).
data/spec/kafka_source_spec.rb
CHANGED
@@ -225,5 +225,88 @@ module KafkaSourceSpec
         expect(Deimos::KafkaMessage.count).to eq(0)
       end
     end
+
+    context 'with import hooks disabled' do
+      before(:each) do
+        # Dummy class we can include the mixin in. Has a backing table created
+        # earlier and has the import hook disabled
+        class WidgetNoImportHook < ActiveRecord::Base
+          include Deimos::KafkaSource
+          self.table_name = 'widgets'
+
+          # :nodoc:
+          def self.kafka_config
+            {
+              update: true,
+              delete: true,
+              import: false,
+              create: true
+            }
+          end
+
+          # :nodoc:
+          def self.kafka_producers
+            [WidgetProducer]
+          end
+        end
+        WidgetNoImportHook.reset_column_information
+      end
+
+      it 'should not fail when bulk-importing with existing records' do
+        widget1 = WidgetNoImportHook.create(widget_id: 1, name: 'Widget 1')
+        widget2 = WidgetNoImportHook.create(widget_id: 2, name: 'Widget 2')
+        widget1.name = 'New Widget No Import Hook 1'
+        widget2.name = 'New Widget No Import Hook 2'
+
+        expect {
+          WidgetNoImportHook.import([widget1, widget2], on_duplicate_key_update: %i(widget_id name))
+        }.not_to raise_error
+
+        expect('my-topic').not_to have_sent({
+          widget_id: 1,
+          name: 'New Widget No Import Hook 1',
+          id: widget1.id,
+          created_at: anything,
+          updated_at: anything
+        }, widget1.id)
+        expect('my-topic').not_to have_sent({
+          widget_id: 2,
+          name: 'New Widget No Import Hook 2',
+          id: widget2.id,
+          created_at: anything,
+          updated_at: anything
+        }, widget2.id)
+      end
+
+      it 'should not fail when mixing existing and new records' do
+        widget1 = WidgetNoImportHook.create(widget_id: 1, name: 'Widget 1')
+        expect('my-topic').to have_sent({
+          widget_id: 1,
+          name: 'Widget 1',
+          id: widget1.id,
+          created_at: anything,
+          updated_at: anything
+        }, widget1.id)
+
+        widget2 = WidgetNoImportHook.new(widget_id: 2, name: 'Widget 2')
+        widget1.name = 'New Widget 1'
+        WidgetNoImportHook.import([widget1, widget2], on_duplicate_key_update: %i(widget_id))
+        widgets = WidgetNoImportHook.all
+        expect('my-topic').not_to have_sent({
+          widget_id: 1,
+          name: 'New Widget 1',
+          id: widgets[0].id,
+          created_at: anything,
+          updated_at: anything
+        }, widgets[0].id)
+        expect('my-topic').not_to have_sent({
+          widget_id: 2,
+          name: 'Widget 2',
+          id: widgets[1].id,
+          created_at: anything,
+          updated_at: anything
+        }, widgets[1].id)
+      end
+    end
   end
 end
data/spec/producer_spec.rb
CHANGED
@@ -64,6 +64,14 @@ module ProducerTest
|
|
64
64
|
end
|
65
65
|
stub_const('MyErrorProducer', producer_class)
|
66
66
|
|
67
|
+
producer_class = Class.new(Deimos::Producer) do
|
68
|
+
schema 'MySchema'
|
69
|
+
namespace 'com.my-namespace'
|
70
|
+
topic nil
|
71
|
+
key_config none: true
|
72
|
+
end
|
73
|
+
stub_const('MyNoTopicProducer', producer_class)
|
74
|
+
|
67
75
|
end
|
68
76
|
|
69
77
|
it 'should fail on invalid message with error handler' do
|
@@ -102,6 +110,33 @@ module ProducerTest
|
|
102
110
|
expect('my-topic').not_to have_sent('test_id' => 'foo2', 'some_int' => 123)
|
103
111
|
end
|
104
112
|
|
113
|
+
it 'should allow setting the topic from publish_list' do
|
114
|
+
expect(described_class).to receive(:produce_batch).once.with(
|
115
|
+
Deimos::Backends::Test,
|
116
|
+
[
|
117
|
+
Deimos::Message.new({ 'test_id' => 'foo', 'some_int' => 123 },
|
118
|
+
MyProducer,
|
119
|
+
topic: 'a-new-topic',
|
120
|
+
+            partition_key: 'foo',
+            key: 'foo'),
+          Deimos::Message.new({ 'test_id' => 'bar', 'some_int' => 124 },
+            MyProducer,
+            topic: 'a-new-topic',
+            partition_key: 'bar',
+            key: 'bar')
+        ]
+      ).and_call_original
+
+      MyProducer.publish_list(
+        [{ 'test_id' => 'foo', 'some_int' => 123 },
+         { 'test_id' => 'bar', 'some_int' => 124 }],
+        topic: 'a-new-topic'
+      )
+      expect('a-new-topic').to have_sent('test_id' => 'foo', 'some_int' => 123)
+      expect('my-topic').not_to have_sent('test_id' => 'foo', 'some_int' => 123)
+      expect('my-topic').not_to have_sent('test_id' => 'foo2', 'some_int' => 123)
+    end
+
     it 'should add a message ID' do
       payload = { 'test_id' => 'foo',
                   'some_int' => 123,
@@ -201,6 +236,7 @@ module ProducerTest
     end
 
     it 'should encode the key' do
+      Deimos.configure { |c| c.producers.topic_prefix = nil }
       expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'foo', topic: 'my-topic-key')
       expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'bar', topic: 'my-topic-key')
       expect(MyProducer.encoder).to receive(:encode).with({
@@ -218,6 +254,21 @@ module ProducerTest
       )
     end
 
+    it 'should encode the key with topic prefix' do
+      Deimos.configure { |c| c.producers.topic_prefix = 'prefix.' }
+      expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'foo', topic: 'prefix.my-topic-key')
+      expect(MyProducer.encoder).to receive(:encode_key).with('test_id', 'bar', topic: 'prefix.my-topic-key')
+      expect(MyProducer.encoder).to receive(:encode).with({ 'test_id' => 'foo',
+                                                            'some_int' => 123 },
+                                                          { topic: 'prefix.my-topic-value' })
+      expect(MyProducer.encoder).to receive(:encode).with({ 'test_id' => 'bar',
+                                                            'some_int' => 124 },
+                                                          { topic: 'prefix.my-topic-value' })
+
+      MyProducer.publish_list([{ 'test_id' => 'foo', 'some_int' => 123 },
+                               { 'test_id' => 'bar', 'some_int' => 124 }])
+    end
+
     it 'should not encode with plaintext key' do
       expect(MyNonEncodedProducer.key_encoder).not_to receive(:encode_key)
 
@@ -269,6 +320,31 @@ module ProducerTest
       )
     end
 
+    it 'should raise error if blank topic is passed in explicitly' do
+      expect {
+        MyProducer.publish_list(
+          [{ 'test_id' => 'foo',
+             'some_int' => 123 },
+           { 'test_id' => 'bar',
+             'some_int' => 124 }],
+          topic: ''
+        )
+      }.to raise_error(RuntimeError,
+                       'Topic not specified. Please specify the topic.')
+    end
+
+    it 'should raise error if the producer has not been initialized with a topic' do
+      expect {
+        MyNoTopicProducer.publish_list(
+          [{ 'test_id' => 'foo',
+             'some_int' => 123 },
+           { 'test_id' => 'bar',
+             'some_int' => 124 }]
+        )
+      }.to raise_error(RuntimeError,
+                       'Topic not specified. Please specify the topic.')
+    end
+
     it 'should error with nothing set' do
       expect {
         MyErrorProducer.publish_list(
data/spec/spec_helper.rb
CHANGED
@@ -183,6 +183,7 @@ RSpec.configure do |config|
   config.before(:each) do
     Deimos.config.reset!
     Deimos.configure do |deimos_config|
+      deimos_config.producers.backend = :test
       deimos_config.phobos_config_file = File.join(File.dirname(__FILE__), 'phobos.yml')
       deimos_config.schema.path = File.join(File.expand_path(__dir__), 'schemas')
       deimos_config.consumers.reraise_errors = true
data/spec/utils/db_producer_spec.rb
CHANGED
@@ -96,7 +96,32 @@ each_db_config(Deimos::Utils::DbProducer) do
       expect(phobos_producer).to have_received(:publish_list).with(['A'] * 100).once
       expect(phobos_producer).to have_received(:publish_list).with(['A'] * 10).once
       expect(phobos_producer).to have_received(:publish_list).with(['A']).once
+    end
+
+    it 'should not resend batches of sent messages' do
+      allow(phobos_producer).to receive(:publish_list) do |group|
+        raise Kafka::BufferOverflow if group.any?('A') && group.size >= 1000
+        raise Kafka::BufferOverflow if group.any?('BIG') && group.size >= 10
+      end
+      allow(Deimos.config.metrics).to receive(:increment)
+      batch = ['A'] * 450 + ['BIG'] * 550
+      producer.produce_messages(batch)
+
+      expect(phobos_producer).to have_received(:publish_list).with(batch)
+      expect(phobos_producer).to have_received(:publish_list).with(['A'] * 100).exactly(4).times
+      expect(phobos_producer).to have_received(:publish_list).with(['A'] * 50 + ['BIG'] * 50)
+      expect(phobos_producer).to have_received(:publish_list).with(['A'] * 10).exactly(5).times
+      expect(phobos_producer).to have_received(:publish_list).with(['BIG'] * 1).exactly(550).times
 
+      expect(Deimos.config.metrics).to have_received(:increment).with('publish',
+                                                                      tags: %w(status:success topic:),
+                                                                      by: 100).exactly(4).times
+      expect(Deimos.config.metrics).to have_received(:increment).with('publish',
+                                                                      tags: %w(status:success topic:),
+                                                                      by: 10).exactly(5).times
+      expect(Deimos.config.metrics).to have_received(:increment).with('publish',
+                                                                      tags: %w(status:success topic:),
+                                                                      by: 1).exactly(550).times
     end
 
   describe '#compact_messages' do
@@ -289,6 +314,8 @@ each_db_config(Deimos::Utils::DbProducer) do
         message: "mess#{i}",
         partition_key: "key#{i}"
       )
+    end
+    (5..8).each do |i|
       Deimos::KafkaMessage.create!(
         id: i,
         topic: 'my-topic2',
data/spec/utils/schema_controller_mixin_spec.rb
CHANGED
@@ -17,6 +17,7 @@ RSpec.describe Deimos::Utils::SchemaControllerMixin, type: :controller do
     request_namespace 'com.my-namespace.request'
     response_namespace 'com.my-namespace.response'
     schemas :index, :show
+    schemas create: 'CreateTopic'
     schemas :update, request: 'UpdateRequest', response: 'UpdateResponse'
 
     # :nodoc:
@@ -29,6 +30,11 @@ RSpec.describe Deimos::Utils::SchemaControllerMixin, type: :controller do
       render_schema({ 'response_id' => payload[:request_id] + ' dad' })
     end
 
+    # :nodoc:
+    def create
+      render_schema({ 'response_id' => payload[:request_id] + ' bro' })
+    end
+
     # :nodoc:
     def update
       render_schema({ 'update_response_id' => payload[:update_request_id] + ' sis' })
@@ -65,4 +71,14 @@ RSpec.describe Deimos::Utils::SchemaControllerMixin, type: :controller do
     expect(response_backend.decode(response.body)).to eq({ 'update_response_id' => 'hi sis' })
   end
 
+  it 'should render the correct response for create' do
+    request_backend = Deimos.schema_backend(schema: 'CreateTopic',
+                                            namespace: 'com.my-namespace.request')
+    response_backend = Deimos.schema_backend(schema: 'CreateTopic',
+                                             namespace: 'com.my-namespace.response')
+    request.content_type = 'avro/binary'
+    post :create, params: { id: 1 }, body: request_backend.encode({ 'request_id' => 'hi' })
+    expect(response_backend.decode(response.body)).to eq({ 'response_id' => 'hi bro' })
+  end
+
 end
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: deimos-ruby
 version: !ruby/object:Gem::Version
-  version: 1.8.
+  version: 1.8.3
 platform: ruby
 authors:
 - Daniel Orner
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2020-
+date: 2020-11-18 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: avro_turf
@@ -348,7 +348,9 @@ files:
 - docs/ARCHITECTURE.md
 - docs/CONFIGURATION.md
 - docs/DATABASE_BACKEND.md
+- docs/INTEGRATION_TESTS.md
 - docs/PULL_REQUEST_TEMPLATE.md
+- docs/UPGRADING.md
 - lib/deimos.rb
 - lib/deimos/active_record_consume/batch_consumption.rb
 - lib/deimos/active_record_consume/batch_slicer.rb
@@ -453,8 +455,10 @@ files:
 - spec/schemas/com/my-namespace/Wibble.avsc
 - spec/schemas/com/my-namespace/Widget.avsc
 - spec/schemas/com/my-namespace/WidgetTheSecond.avsc
+- spec/schemas/com/my-namespace/request/CreateTopic.avsc
 - spec/schemas/com/my-namespace/request/Index.avsc
 - spec/schemas/com/my-namespace/request/UpdateRequest.avsc
+- spec/schemas/com/my-namespace/response/CreateTopic.avsc
 - spec/schemas/com/my-namespace/response/Index.avsc
 - spec/schemas/com/my-namespace/response/UpdateResponse.avsc
 - spec/spec_helper.rb
@@ -483,9 +487,9 @@ required_ruby_version: !ruby/object:Gem::Requirement
       version: '0'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - "
+  - - ">="
     - !ruby/object:Gem::Version
-      version:
+      version: '0'
 requirements: []
 rubygems_version: 3.1.3
 signing_key:
@@ -534,8 +538,10 @@ test_files:
 - spec/schemas/com/my-namespace/Wibble.avsc
 - spec/schemas/com/my-namespace/Widget.avsc
 - spec/schemas/com/my-namespace/WidgetTheSecond.avsc
+- spec/schemas/com/my-namespace/request/CreateTopic.avsc
 - spec/schemas/com/my-namespace/request/Index.avsc
 - spec/schemas/com/my-namespace/request/UpdateRequest.avsc
+- spec/schemas/com/my-namespace/response/CreateTopic.avsc
 - spec/schemas/com/my-namespace/response/Index.avsc
 - spec/schemas/com/my-namespace/response/UpdateResponse.avsc
 - spec/spec_helper.rb