phobos 1.9.0.pre.beta3 → 2.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 37dfd76749d3cbea4d6e7611ad70581eb938487ff37088e6f80d2b5ba78eb844
- data.tar.gz: ba3104759ff0d2d2369b5077abc487545f75b32815e4fb4af042575e4176c12f
+ metadata.gz: ec66514784d6b38c90a4ca0aaf063e001536d2680732c4afa535b49c2a55cb9c
+ data.tar.gz: 1d988dd76e39305cfb30d3dd8e8fd926065631e391ed077d7c0459e6760587b4
  SHA512:
- metadata.gz: 2cfb3e98b26a94f2adfd3ab579bc47aae565ece64bbdec1446833e33cb70c0e27e4413d0d4d083c5dc2cdf04827624c64c3ad51781c686f83013dfe21b363926
- data.tar.gz: 829f7fd8355cc7717535e9e6f108a6f3dd18dd25f144ae659b4c81db9a19e661a8c916caf8ab9891d49e8502c5027d68376ceb7fc2ea8632b3142efe88c8b414
+ metadata.gz: 07a92fdf61d970a61f98cfb8615568fa48af0550fb473458b6dce67b479b684ce8e280dbd97b20b5c16f84bf14fa18eea47f8bae490c6ac5712b8d21d2c4d037
+ data.tar.gz: 59bb2c58cfd017b8f9462eb431d2bb62322bdc817a1260816648481abff4ff0a5d6f1df5430f3888a0bad3cb92eeb11ee2ace3ced62df1e8ee02756bb1e8a795
data/.gitignore CHANGED
@@ -12,3 +12,5 @@ spec/examples.txt
  config/*.yml
  log/*.log
  .byebug_history
+ .idea
+ *.gem
data/CHANGELOG.md CHANGED
@@ -6,10 +6,32 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
  ``
  ## UNRELEASED

+ ## [2.1.0] - 2021-05-27
+
+ - Modify config to allow specifying kafka connection information separately for consumers and producers
+
+ ## [2.0.2] - 2021-01-28
+ - Additional fixes for Ruby 3.0
+
+ ## [2.0.1] - 2021-01-14
+ - Fix bug with Ruby 3.0 around OpenStruct
+
+ ## [2.0.0-beta1] - 2020-05-04
+
+ - Remove deprecated patterns:
+   - `before_consume` method
+   - `around_consume` as a class method or without yielding values
+   - `publish` and `async_publish` with positional arguments
+
+ ## [1.9.0] - 2020-03-05
+ - Bumped version to 1.9.0.
+
  ## [1.9.0-beta3] - 2020-02-05
+
  - Fix bug where deprecation errors would be shown when receiving nil payloads
  even if `around_consume` was updated to yield them.

+
  ## [1.9.0-beta2] - 2020-01-09

  - Allow `around_consume` to yield payload and metadata, and deprecate
data/README.md CHANGED
@@ -21,7 +21,6 @@ With Phobos by your side, all this becomes smooth sailing.
  ## Table of Contents

  1. [Installation](#installation)
- 1. [Upgrade Notes](#upgrade-notes)
  1. [Usage](#usage)
  1. [Standalone apps](#usage-standalone-apps)
  1. [Consuming messages from Kafka](#usage-consuming-messages-from-kafka)
@@ -32,6 +31,7 @@ With Phobos by your side, all this becomes smooth sailing.
  1. [Plugins](#plugins)
  1. [Development](#development)
  1. [Test](#test)
+ 1. [Upgrade Notes](#upgrade-notes)

  ## <a name="installation"></a> Installation

@@ -53,10 +53,6 @@ Or install it yourself as:
  $ gem install phobos
  ```

- ### <a name="upgrade-notes"></a> Upgrade Notes
-
- Version 1.8.2 introduced a new `persistent_connections` setting for regular producers. This reduces the number of connections used to produce messages and you should consider setting it to true. This does require a manual shutdown call - please see [Producers with persistent connections](#persistent-connection).
-
  ## <a name="usage"></a> Usage

  Phobos can be used in two ways: as a standalone application or to support Kafka features in your existing project - including Rails apps. It provides a CLI tool to run it.
@@ -197,8 +193,6 @@ class MyHandler
  end
  ```

- Note: Previous versions used a `before_consume` method to pre-process the payload. This is still supported, but deprecated. Going forward, `around_consume` should yield the payload and metadata back to the calling code, allowing it to act as a pre-processor, e.g. by decoding Avro messages into Ruby hashes.
-
  Take a look at the examples folder for some ideas.

  The handler life cycle can be illustrated as:
@@ -287,8 +281,6 @@ my = MyProducer.new
  my.producer.publish(topic: 'topic', payload: 'message-payload', key: 'partition and message key', headers: { header_1: 'value 1' })
  ```

- Older versions of Phobos provided a `publish` method that accepted positional arguments. That version is still supported but it's soon to be deprecated in favour of the keyword arguments version.
-
  It is also possible to publish several messages at once:

  ```ruby
@@ -412,9 +404,12 @@ All [options supported by `ruby-kafka`][ruby-kafka-client] can be provided.
  __producer__ provides configurations for all producers created over the application,
  the options are the same for regular and async producers.
  All [options supported by `ruby-kafka`][ruby-kafka-producer] can be provided.
+ If the __kafka__ key is present under __producer__, it is merged into the top-level __kafka__, allowing different connection configuration for producers.

  __consumer__ provides configurations for all consumer groups created over the application.
  All [options supported by `ruby-kafka`][ruby-kafka-consumer] can be provided.
+ If the __kafka__ key is present under __consumer__, it is merged into the top-level __kafka__, allowing different connection configuration for consumers.
+

  __backoff__ Phobos provides automatic retries for your handlers. If an exception
  is raised, the listener will retry following the back off configured here.
@@ -590,8 +585,11 @@ After checking out the repo:
  * make sure `docker` is installed and running (for windows and mac this also includes `docker-compose`).
  * Linux: make sure `docker-compose` is installed and running.
  * run `bin/setup` to install dependencies
- * run `docker-compose up` to start the required kafka containers in a window
- * run `rspec` to run the tests in another window
+ * run `docker-compose up -d --force-recreate kafka zookeeper` to start the required kafka containers
+ * run tests to confirm no environmental issues
+ * wait a few seconds for the kafka broker to get set up - `sleep 30`
+ * run `docker-compose run --rm test`
+ * make sure it reports `X examples, 0 failures`

  You can also run `bin/console` for an interactive prompt that will allow you to experiment.

@@ -604,8 +602,16 @@ Phobos exports a spec helper that can help you test your consumer. The Phobos li
  * `process_message(handler:, payload:, metadata: {}, encoding: nil)` - Invokes your handler with payload and metadata, using a dummy listener (encoding and metadata are optional).

  ```ruby
- require 'spec_helper'
+ ### spec_helper.rb
+ require 'phobos/test/helper'
+ RSpec.configure do |config|
+   config.include Phobos::Test::Helper
+   config.before(:each) do
+     Phobos.configure(path_to_my_config_file)
+   end
+ end

+ ### Spec file
  describe MyConsumer do
    let(:payload) { 'foo' }
    let(:metadata) { Hash(foo: 'bar') }
@@ -619,6 +625,57 @@ describe MyConsumer do
  end
  ```

+ ## <a name="upgrade-notes"></a> Upgrade Notes
+
+ Version 2.0 removes deprecated ways of defining producers and consumers:
+ * The `before_consume` method has been removed. You can have this behavior in the first part of an `around_consume` method.
+ * `around_consume` is now only available as an instance method, and it must yield the values to pass to the `consume` method.
+ * `publish` and `async_publish` now only accept keyword arguments, not positional arguments.
+
+ Example pre-2.0:
+ ```ruby
+ class MyHandler
+   include Phobos::Handler
+
+   def before_consume(payload, metadata)
+     payload[:id] = 1
+   end
+
+   def self.around_consume(payload, metadata)
+     metadata[:key] = 5
+     yield
+   end
+ end
+ ```
+
+ In 2.0:
+ ```ruby
+ class MyHandler
+   include Phobos::Handler
+
+   def around_consume(payload, metadata)
+     new_payload = payload.dup
+     new_metadata = metadata.dup
+     new_payload[:id] = 1
+     new_metadata[:key] = 5
+     yield new_payload, new_metadata
+   end
+ end
+ ```
+
+ Producer, 1.9:
+ ```ruby
+ producer.publish('my-topic', { payload_value: 1 }, 5, 3, { header_val: 5 })
+ ```
+
+ Producer 2.0:
+ ```ruby
+ producer.publish(topic: 'my-topic', payload: { payload_value: 1 }, key: 5,
+                  partition_key: 3, headers: { header_val: 5 })
+ ```
+
+ Version 1.8.2 introduced a new `persistent_connections` setting for regular producers. This reduces the number of connections used to produce messages and you should consider setting it to true. This does require a manual shutdown call - please see [Producers with persistent connections](#persistent-connection).
+
  ## Contributing

  Bug reports and pull requests are welcome on GitHub at https://github.com/klarna/phobos.
@@ -65,6 +65,10 @@ producer:
  # that you need to manually call sync_producer_shutdown before exiting,
  # similar to async_producer_shutdown.
  persistent_connections: false
+ # kafka here supports the same parameters as the top-level, allowing custom connection
+ # configuration details for producers
+ kafka:
+   connect_timeout: 120

  consumer:
  # number of seconds after which, if a client hasn't contacted the Kafka cluster,
@@ -79,6 +83,10 @@ consumer:
  offset_retention_time:
  # interval between heartbeats; must be less than the session window
  heartbeat_interval: 10
+ # kafka here supports the same parameters as the top-level, allowing custom connection
+ # configuration details for consumers
+ kafka:
+   connect_timeout: 130

  backoff:
  min_ms: 1000
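The new scoped `kafka` blocks in the hunks above are merged into the top-level `kafka` settings when the corresponding client is created, with the scoped keys winning. A minimal sketch of those merge semantics, using plain hashes in place of Phobos' config objects (the hash names here are illustrative, not Phobos API):

```ruby
# Top-level kafka settings shared by every client.
top_level = { seed_brokers: ['kafka:9092'], connect_timeout: 15 }

# Scoped overrides, as under producer.kafka / consumer.kafka above.
producer_overrides = { connect_timeout: 120 }
consumer_overrides = { connect_timeout: 130 }

# Hash#merge lets the scoped value win while untouched keys fall through.
producer_config = top_level.merge(producer_overrides)
consumer_config = top_level.merge(consumer_overrides)

producer_config[:connect_timeout] # => 120
consumer_config[:seed_brokers]    # => ["kafka:9092"]
```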
data/lib/phobos.rb CHANGED
@@ -55,8 +55,12 @@ module Phobos

  def configure(configuration)
    @config = fetch_configuration(configuration)
-   @config.class.send(:define_method, :producer_hash) { Phobos.config.producer&.to_hash }
-   @config.class.send(:define_method, :consumer_hash) { Phobos.config.consumer&.to_hash }
+   @config.class.send(:define_method, :producer_hash) do
+     Phobos.config.producer&.to_hash&.except(:kafka)
+   end
+   @config.class.send(:define_method, :consumer_hash) do
+     Phobos.config.consumer&.to_hash&.except(:kafka)
+   end
    @config.listeners ||= []
    configure_logger
  end
@@ -66,8 +70,14 @@ module Phobos
    @config.listeners += listeners_config.listeners
  end

- def create_kafka_client
-   Kafka.new(config.kafka.to_hash.merge(logger: @ruby_kafka_logger))
+ def create_kafka_client(config_key = nil)
+   kafka_config = config.kafka.to_hash.merge(logger: @ruby_kafka_logger)
+
+   if config_key
+     kafka_config = kafka_config.merge(**config.send(config_key)&.kafka&.to_hash || {})
+   end
+
+   Kafka.new(**kafka_config)
  end

  def create_exponential_backoff(backoff_config = nil)
@@ -47,32 +47,13 @@ module Phobos
    )
  end

- def preprocess(batch, handler)
-   if handler.respond_to?(:before_consume_batch)
-     Phobos.deprecate('before_consume_batch is deprecated and will be removed in 2.0. \
- Use around_consume_batch and yield payloads and metadata objects.')
-     handler.before_consume_batch(batch, @metadata)
-   else
-     batch
-   end
- end
-
  def process_batch(batch)
    instrument('listener.process_batch_inline', @metadata) do |_metadata|
      handler = @listener.handler_class.new

-     preprocessed_batch = preprocess(batch, handler)
-     consume_block = proc { |around_batch, around_metadata|
-       if around_batch
-         handler.consume_batch(around_batch, around_metadata)
-       else
-         Phobos.deprecate('Calling around_consume_batch without yielding payloads \
- and metadata is deprecated and will be removed in 2.0.')
-         handler.consume_batch(preprocessed_batch, @metadata)
-       end
-     }
-
-     handler.around_consume_batch(preprocessed_batch, @metadata, &consume_block)
+     handler.around_consume_batch(batch, @metadata) do |around_batch, around_metadata|
+       handler.consume_batch(around_batch, around_metadata)
+     end
    end
  end
 end
@@ -35,47 +35,12 @@ module Phobos

  private

- def preprocess(payload, handler)
-   if handler.respond_to?(:before_consume)
-     Phobos.deprecate('before_consume is deprecated and will be removed in 2.0. \
- Use around_consume and yield payload and metadata objects.')
-     begin
-       handler.before_consume(payload, @metadata)
-     rescue ArgumentError
-       handler.before_consume(payload)
-     end
-   else
-     payload
-   end
- end
-
- def consume_block(payload, handler)
-   proc { |around_payload, around_metadata|
-     if around_metadata
-       handler.consume(around_payload, around_metadata)
-     else
-       Phobos.deprecate('Calling around_consume without yielding payload and metadata \
- is deprecated and will be removed in 2.0.')
-       handler.consume(payload, @metadata)
-     end
-   }
- end
-
  def process_message(payload)
    instrument('listener.process_message', @metadata) do
      handler = @listener.handler_class.new

-     preprocessed_payload = preprocess(payload, handler)
-     block = consume_block(preprocessed_payload, handler)
-
-     if @listener.handler_class.respond_to?(:around_consume)
-       # around_consume class method implementation
-       Phobos.deprecate('around_consume has been moved to instance method, please update '\
- 'your consumer. This will not be backwards compatible in the future.')
-       @listener.handler_class.around_consume(preprocessed_payload, @metadata, &block)
-     else
-       # around_consume instance method implementation
-       handler.around_consume(preprocessed_payload, @metadata, &block)
+     handler.around_consume(payload, @metadata) do |around_payload, around_metadata|
+       handler.consume(around_payload, around_metadata)
      end
    end
  end
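With the deprecation shims removed above, the only supported contract is an instance-level `around_consume` that yields the (possibly transformed) payload and metadata through to `consume`. A dependency-free sketch of a conforming handler (a real one would `include Phobos::Handler`; the upcasing transformation here is illustrative):

```ruby
class UpcaseHandler
  # Pre-process by yielding new values; the block's return value is
  # passed back as the result of around_consume.
  def around_consume(payload, metadata)
    yield payload.upcase, metadata.merge(preprocessed: true)
  end

  def consume(payload, metadata)
    "#{payload} (preprocessed=#{metadata[:preprocessed]})"
  end
end

# Drive the handler the way process_message does after this change.
handler = UpcaseHandler.new
result = handler.around_consume('hello', {}) do |p, m|
  handler.consume(p, m)
end
result # => "HELLO (preprocessed=true)"
```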
@@ -8,7 +8,7 @@ module Phobos
  # Based on
  # https://docs.omniref.com/ruby/2.3.0/files/lib/ostruct.rb#line=88
  def initialize(hash = nil)
-   @table = {}
+   super
    @hash_table = {}

    hash&.each_pair do |key, value|
@@ -13,7 +13,7 @@ module Phobos
  max_concurrency = listener_configs[:max_concurrency] || 1
  Array.new(max_concurrency).map do
    configs = listener_configs.select { |k| Constants::LISTENER_OPTS.include?(k) }
-   Phobos::Listener.new(configs.merge(handler: handler_class))
+   Phobos::Listener.new(**configs.merge(handler: handler_class))
  end
 end
 end
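The added `**` splat here (and at the similar call sites below) is part of the Ruby 3.0 compatibility work noted in the changelog: Ruby 3 no longer implicitly converts a trailing hash argument into keyword arguments. A minimal illustration with a stand-in method (not Phobos code):

```ruby
# Stand-in for a constructor that, like Phobos::Listener#initialize,
# accepts only keyword arguments.
def build_listener(handler:, group_id: 'default')
  "#{handler}/#{group_id}"
end

opts = { handler: 'MyHandler', group_id: 'g1' }

# Under Ruby 3.0+, build_listener(opts) raises ArgumentError because the
# hash is passed positionally; the explicit double splat is required:
build_listener(**opts) # => "MyHandler/g1"
```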
@@ -35,7 +35,7 @@ module Phobos
    )
    @encoding = Encoding.const_get(force_encoding.to_sym) if force_encoding
    @message_processing_opts = compact(min_bytes: min_bytes, max_wait_time: max_wait_time)
-   @kafka_client = Phobos.create_kafka_client
+   @kafka_client = Phobos.create_kafka_client(:consumer)
    @producer_enabled = @handler_class.ancestors.include?(Phobos::Producer)
  end
  # rubocop:enable Metrics/MethodLength
@@ -93,7 +93,7 @@ module Phobos
  def start_listener
    instrument('listener.start', listener_metadata) do
      @consumer = create_kafka_consumer
-     @consumer.subscribe(topic, @subscribe_opts)
+     @consumer.subscribe(topic, **@subscribe_opts)

      # This is done here because the producer client is bound to the current thread and
      # since "start" blocks a thread might be used to call it
@@ -135,7 +135,7 @@ module Phobos
  end

  def consume_each_batch
-   @consumer.each_batch(@message_processing_opts) do |batch|
+   @consumer.each_batch(**@message_processing_opts) do |batch|
      batch_processor = Phobos::Actions::ProcessBatch.new(
        listener: self,
        batch: batch,
@@ -149,7 +149,7 @@ module Phobos
  end

  def consume_each_batch_inline
-   @consumer.each_batch(@message_processing_opts) do |batch|
+   @consumer.each_batch(**@message_processing_opts) do |batch|
      batch_processor = Phobos::Actions::ProcessBatchInline.new(
        listener: self,
        batch: batch,
@@ -163,7 +163,7 @@ module Phobos
  end

  def consume_each_message
-   @consumer.each_message(@message_processing_opts) do |message|
+   @consumer.each_message(**@message_processing_opts) do |message|
      message_processor = Phobos::Actions::ProcessMessage.new(
        listener: self,
        message: message,
@@ -181,7 +181,7 @@ module Phobos
      Constants::KAFKA_CONSUMER_OPTS.include?(k)
    end
    configs.merge!(@kafka_consumer_opts)
-   @kafka_client.consumer({ group_id: group_id }.merge(configs))
+   @kafka_client.consumer(**{ group_id: group_id }.merge(configs))
  end

  def compact(hash)
@@ -11,28 +11,24 @@ module Phobos
  end

  class PublicAPI
-   MissingRequiredArgumentsError = Class.new(StandardError) do
-     def initialize
-       super('You need to provide a topic name and a payload')
-     end
-   end
-
    def initialize(host_obj)
      @host_obj = host_obj
    end

-   def publish(*args, **kwargs)
-     Phobos.deprecate(deprecate_positional_args_message('publish')) if kwargs.empty?
-
-     args = normalize_arguments(*args, **kwargs)
-     class_producer.publish(**args)
+   def publish(topic:, payload:, key: nil, partition_key: nil, headers: nil)
+     class_producer.publish(topic: topic,
+                            payload: payload,
+                            key: key,
+                            partition_key: partition_key,
+                            headers: headers)
    end

-   def async_publish(*args, **kwargs)
-     Phobos.deprecate(deprecate_positional_args_message('async_publish')) if kwargs.empty?
-
-     args = normalize_arguments(*args, **kwargs)
-     class_producer.async_publish(**args)
+   def async_publish(topic:, payload:, key: nil, partition_key: nil, headers: nil)
+     class_producer.async_publish(topic: topic,
+                                  payload: payload,
+                                  key: key,
+                                  partition_key: partition_key,
+                                  headers: headers)
    end

    # @param messages [Array(Hash(:topic, :payload, :key, :headers))]
@@ -54,36 +50,6 @@ module Phobos
    def class_producer
      @host_obj.class.producer
    end
-
-   # rubocop:disable Metrics/ParameterLists
-   def normalize_arguments(p_topic = nil, p_payload = nil, p_key = nil,
-                           p_partition_key = nil, p_headers = {},
-                           **kwargs)
-     {}.tap do |args|
-       {
-         topic: p_topic,
-         payload: p_payload,
-         key: p_key,
-         partition_key: p_partition_key,
-         headers: p_headers
-       }.each { |k, v| args[k] = kwargs[k] || v }
-
-       raise MissingRequiredArgumentsError if [:topic, :payload].any? { |k| args[k].nil? }
-
-       kwargs.each do |k, v|
-         next if args.key?(k)
-
-         args[:headers][k] = v
-       end
-     end
-   end
-   # rubocop:enable Metrics/ParameterLists
-
-   def deprecate_positional_args_message(method_name)
-     "The `#{method_name}` method should now receive keyword arguments " \
-     'rather than positional ones. Please update your publishers. This will ' \
-     'not be backwards compatible in the future.'
-   end
  end

  module ClassMethods
@@ -111,8 +77,8 @@ module Phobos
  end

  def create_sync_producer
-   client = kafka_client || configure_kafka_client(Phobos.create_kafka_client)
-   sync_producer = client.producer(regular_configs)
+   client = kafka_client || configure_kafka_client(Phobos.create_kafka_client(:producer))
+   sync_producer = client.producer(**regular_configs)
    if Phobos.config.producer_hash[:persistent_connections]
      producer_store[:sync_producer] = sync_producer
    end
@@ -142,8 +108,8 @@ module Phobos
  end

  def create_async_producer
-   client = kafka_client || configure_kafka_client(Phobos.create_kafka_client)
-   async_producer = client.async_producer(async_configs)
+   client = kafka_client || configure_kafka_client(Phobos.create_kafka_client(:producer))
+   async_producer = client.async_producer(**async_configs)
    producer_store[:async_producer] = async_producer
  end

@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module Phobos
-   VERSION = '1.9.0-beta3'
+   VERSION = '2.1.0'
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: phobos
  version: !ruby/object:Gem::Version
-   version: 1.9.0.pre.beta3
+   version: 2.1.0
  platform: ruby
  authors:
  - Túlio Ornelas
@@ -12,10 +12,10 @@ authors:
  - Francisco Juan
  - Tommy Gustafsson
  - Daniel Orner
- autorequire:
+ autorequire:
  bindir: bin
  cert_chain: []
- date: 2020-02-05 00:00:00.000000000 Z
+ date: 2021-05-27 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: bundler
@@ -299,7 +299,7 @@ licenses:
  - Apache License Version 2.0
  metadata:
    allowed_push_host: https://rubygems.org
- post_install_message:
+ post_install_message:
  rdoc_options: []
  require_paths:
  - lib
@@ -310,12 +310,12 @@ required_ruby_version: !ruby/object:Gem::Requirement
      version: '2.3'
  required_rubygems_version: !ruby/object:Gem::Requirement
    requirements:
-   - - ">"
+   - - ">="
      - !ruby/object:Gem::Version
-       version: 1.3.1
+       version: '0'
  requirements: []
- rubygems_version: 3.1.2
- signing_key:
+ rubygems_version: 3.2.16
+ signing_key:
  specification_version: 4
  summary: Simplifying Kafka for ruby apps
  test_files: []