phobos 2.0.0.pre.beta1 → 2.1.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 646f966d9d307346090b88a3ef52f784da42f07a7f8fd3369131a4d1149fc293
-  data.tar.gz: 3631aef66014166e318f24b8b282265e0d3663ed006b915c49762301e8f02b89
+  metadata.gz: 6ec70d6f78852088846db2f5cee6673e9889962379bd82934758aa936886f1df
+  data.tar.gz: f0d5a40fb1acbf02e0ac6cae4121571eb72689bc13076505153f33cef48f2362
 SHA512:
-  metadata.gz: c19eedfd196604f55ea659d501323cecfca50c4c53a03ae1193457f344e4a1e580bbd7ca2b33559b20ab2c1418143b48b6d9e4942986f71b369f4a9e261ccfeb
-  data.tar.gz: cdbd2146b341d91bb7131eb986e24055bf73054ed12eb542e09c9d2a56b4b462bd4d3015e5ee3a00934e4e125658e5438987b402aa2edb0ca2fac2172b098f58
+  metadata.gz: c7192c98b0e3f3b22ab3aa4ef4f28bf3227410120012d3b77a66f018960f0e72309f72d47a00e505355d906826dc89bdc327ceea50a99ea7e28d233602bd8ec6
+  data.tar.gz: b1dab967fb374559b98d976b0628140692f9c080d3de55d17edaa2a885a4ef17a1bbf7555a380ef00199e62a002f0bdd6d56486e77fbe4f2176eb39756c4f185
data/.gitignore CHANGED
@@ -12,3 +12,5 @@ spec/examples.txt
 config/*.yml
 log/*.log
 .byebug_history
+.idea
+*.gem
data/CHANGELOG.md CHANGED
@@ -6,7 +6,21 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
 ``
 ## UNRELEASED

-## [2.0.0-beta1] - 2020-5-04
+## [2.1.1] - 2022-01-26
+
+- Fix crash on Ruby 3.1 involving a change to the Psych gem.
+
+## [2.1.0] - 2021-05-27
+
+- Modify config to allow specifying kafka connection information separately for consumers and producers
+
+## [2.0.2] - 2021-01-28
+- Additional fixes for Ruby 3.0
+
+## [2.0.1] - 2021-01-14
+- Fix bug with Ruby 3.0 around OpenStruct
+
+## [2.0.0-beta1] - 2020-05-04

 - Remove deprecated patterns:
 -- `before_consume` method
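The 2.1.1 fix above concerns Ruby 3.1, which bundles Psych 4: `YAML.safe_load` no longer accepts the old positional arguments for permitted classes and aliases. A minimal sketch of the API change (illustrative YAML, not Phobos's actual config):

```ruby
require 'yaml'

yaml = "timeout: :thirty\nretries: 3\n"

# Psych 3 style (raises ArgumentError on Ruby 3.1 / Psych 4):
#   YAML.safe_load(yaml, [Symbol], [], true)
# Psych 4 style, using keyword arguments:
config = YAML.safe_load(yaml, permitted_classes: [Symbol], aliases: true)

config['retries'] # => 3
config['timeout'] # => :thirty
```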
data/README.md CHANGED
@@ -404,9 +404,12 @@ All [options supported by `ruby-kafka`][ruby-kafka-client] can be provided.
 __producer__ provides configurations for all producers created over the application,
 the options are the same for regular and async producers.
 All [options supported by `ruby-kafka`][ruby-kafka-producer] can be provided.
+If the __kafka__ key is present under __producer__, it is merged into the top-level __kafka__, allowing different connection configuration for producers.

 __consumer__ provides configurations for all consumer groups created over the application.
 All [options supported by `ruby-kafka`][ruby-kafka-consumer] can be provided.
+If the __kafka__ key is present under __consumer__, it is merged into the top-level __kafka__, allowing different connection configuration for consumers.
+

 __backoff__ Phobos provides automatic retries for your handlers. If an exception
 is raised, the listener will retry following the back off configured here.
@@ -582,8 +585,11 @@ After checking out the repo:
 * make sure `docker` is installed and running (for windows and mac this also includes `docker-compose`).
 * Linux: make sure `docker-compose` is installed and running.
 * run `bin/setup` to install dependencies
-* run `docker-compose up` to start the required kafka containers in a window
-* run `rspec` to run the tests in another window
+* run `docker-compose up -d --force-recreate kafka zookeeper` to start the required kafka containers
+* run tests to confirm no environmental issues
+* wait a few seconds for kafka broker to get set up - `sleep 30`
+* run `docker-compose run --rm test`
+* make sure it reports `X examples, 0 failures`

 You can also run `bin/console` for an interactive prompt that will allow you to experiment.

@@ -596,8 +602,16 @@ Phobos exports a spec helper that can help you test your consumer. The Phobos li
 * `process_message(handler:, payload:, metadata: {}, encoding: nil)` - Invokes your handler with payload and metadata, using a dummy listener (encoding and metadata are optional).

 ```ruby
-require 'spec_helper'
+### spec_helper.rb
+require 'phobos/test/helper'
+RSpec.configure do |config|
+  config.include Phobos::Test::Helper
+  config.before(:each) do
+    Phobos.configure(path_to_my_config_file)
+  end
+end

+### Spec file
 describe MyConsumer do
   let(:payload) { 'foo' }
   let(:metadata) { Hash(foo: 'bar') }
@@ -65,6 +65,10 @@ producer:
   # that you need to manually call sync_producer_shutdown before exiting,
   # similar to async_producer_shutdown.
   persistent_connections: false
+  # kafka here supports the same parameters as the top-level, allowing custom connection
+  # configuration details for producers
+  kafka:
+    connect_timeout: 120

 consumer:
   # number of seconds after which, if a client hasn't contacted the Kafka cluster,
@@ -79,6 +83,10 @@ consumer:
   offset_retention_time:
   # interval between heartbeats; must be less than the session window
   heartbeat_interval: 10
+  # kafka here supports the same parameters as the top-level, allowing custom connection
+  # configuration details for consumers
+  kafka:
+    connect_timeout: 130

 backoff:
   min_ms: 1000
@@ -8,7 +8,7 @@ module Phobos
     # Based on
     # https://docs.omniref.com/ruby/2.3.0/files/lib/ostruct.rb#line=88
     def initialize(hash = nil)
-      @table = {}
+      super
      @hash_table = {}

      hash&.each_pair do |key, value|
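The `@table = {}` → `super` change in the hunk above follows from Ruby 3.0's OpenStruct rewrite: subclasses must let `OpenStruct#initialize` set up its own internal table instead of assigning `@table` directly. A minimal sketch of the pattern with a hypothetical subclass (not Phobos's actual `DeepStruct`):

```ruby
require 'ostruct'

# Hypothetical OpenStruct subclass that also keeps the raw input hash.
class RecordingStruct < OpenStruct
  attr_reader :raw

  def initialize(hash = nil)
    super # let OpenStruct initialize @table itself (required on Ruby 3.0+)
    @raw = hash ? hash.dup : {}
  end
end

s = RecordingStruct.new(name: 'phobos')
s.name # => "phobos"
s.raw  # => {:name=>"phobos"}
```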
@@ -13,7 +13,7 @@ module Phobos
       max_concurrency = listener_configs[:max_concurrency] || 1
       Array.new(max_concurrency).map do
         configs = listener_configs.select { |k| Constants::LISTENER_OPTS.include?(k) }
-        Phobos::Listener.new(configs.merge(handler: handler_class))
+        Phobos::Listener.new(**configs.merge(handler: handler_class))
       end
     end
   end
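The `**` added here (and in the subscribe, each_batch, each_message, consumer, and producer hunks below) reflects Ruby 3.0's keyword-argument separation: a trailing Hash is no longer implicitly converted into keyword arguments, so call sites must splat it explicitly. A small sketch with a hypothetical method:

```ruby
# Hypothetical method taking only keyword arguments, like Phobos::Listener#initialize.
def build_listener(handler:, group_id: 'default')
  [handler, group_id]
end

configs = { handler: 'MyHandler', group_id: 'group1' }

# build_listener(configs)  # Ruby 2.x converted the hash implicitly;
#                          # Ruby 3.x raises ArgumentError (wrong number of arguments)
build_listener(**configs)  # => ["MyHandler", "group1"] on both
```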
@@ -35,7 +35,7 @@ module Phobos
       )
       @encoding = Encoding.const_get(force_encoding.to_sym) if force_encoding
       @message_processing_opts = compact(min_bytes: min_bytes, max_wait_time: max_wait_time)
-      @kafka_client = Phobos.create_kafka_client
+      @kafka_client = Phobos.create_kafka_client(:consumer)
       @producer_enabled = @handler_class.ancestors.include?(Phobos::Producer)
     end
     # rubocop:enable Metrics/MethodLength
@@ -93,7 +93,7 @@ module Phobos
     def start_listener
       instrument('listener.start', listener_metadata) do
         @consumer = create_kafka_consumer
-        @consumer.subscribe(topic, @subscribe_opts)
+        @consumer.subscribe(topic, **@subscribe_opts)

         # This is done here because the producer client is bound to the current thread and
         # since "start" blocks a thread might be used to call it
@@ -135,7 +135,7 @@ module Phobos
     end

     def consume_each_batch
-      @consumer.each_batch(@message_processing_opts) do |batch|
+      @consumer.each_batch(**@message_processing_opts) do |batch|
         batch_processor = Phobos::Actions::ProcessBatch.new(
           listener: self,
           batch: batch,
@@ -149,7 +149,7 @@ module Phobos
     end

     def consume_each_batch_inline
-      @consumer.each_batch(@message_processing_opts) do |batch|
+      @consumer.each_batch(**@message_processing_opts) do |batch|
         batch_processor = Phobos::Actions::ProcessBatchInline.new(
           listener: self,
           batch: batch,
@@ -163,7 +163,7 @@ module Phobos
     end

     def consume_each_message
-      @consumer.each_message(@message_processing_opts) do |message|
+      @consumer.each_message(**@message_processing_opts) do |message|
         message_processor = Phobos::Actions::ProcessMessage.new(
           listener: self,
           message: message,
@@ -181,7 +181,7 @@ module Phobos
         Constants::KAFKA_CONSUMER_OPTS.include?(k)
       end
       configs.merge!(@kafka_consumer_opts)
-      @kafka_client.consumer({ group_id: group_id }.merge(configs))
+      @kafka_client.consumer(**{ group_id: group_id }.merge(configs))
     end

     def compact(hash)
@@ -77,8 +77,8 @@ module Phobos
     end

     def create_sync_producer
-      client = kafka_client || configure_kafka_client(Phobos.create_kafka_client)
-      sync_producer = client.producer(regular_configs)
+      client = kafka_client || configure_kafka_client(Phobos.create_kafka_client(:producer))
+      sync_producer = client.producer(**regular_configs)
       if Phobos.config.producer_hash[:persistent_connections]
         producer_store[:sync_producer] = sync_producer
       end
@@ -108,8 +108,8 @@ module Phobos
     end

     def create_async_producer
-      client = kafka_client || configure_kafka_client(Phobos.create_kafka_client)
-      async_producer = client.async_producer(async_configs)
+      client = kafka_client || configure_kafka_client(Phobos.create_kafka_client(:producer))
+      async_producer = client.async_producer(**async_configs)
       producer_store[:async_producer] = async_producer
     end

@@ -1,5 +1,5 @@
 # frozen_string_literal: true

 module Phobos
-  VERSION = '2.0.0-beta1'
+  VERSION = '2.1.1'
 end
data/lib/phobos.rb CHANGED
@@ -13,7 +13,7 @@ require 'kafka'
 require 'logging'
 require 'erb'

-require 'phobos/deep_struct'
+require 'phobos/configuration'
 require 'phobos/version'
 require 'phobos/constants'
 require 'phobos/log'
@@ -49,25 +49,24 @@ module Logging
 end

 module Phobos
+  extend Configuration
   class << self
     attr_reader :config, :logger
     attr_accessor :silence_log

-    def configure(configuration)
-      @config = fetch_configuration(configuration)
-      @config.class.send(:define_method, :producer_hash) { Phobos.config.producer&.to_hash }
-      @config.class.send(:define_method, :consumer_hash) { Phobos.config.consumer&.to_hash }
-      @config.listeners ||= []
-      configure_logger
-    end
-
     def add_listeners(configuration)
       listeners_config = fetch_configuration(configuration)
       @config.listeners += listeners_config.listeners
     end

-    def create_kafka_client
-      Kafka.new(config.kafka.to_hash.merge(logger: @ruby_kafka_logger))
+    def create_kafka_client(config_key = nil)
+      kafka_config = config.kafka.to_hash.merge(logger: @ruby_kafka_logger)
+
+      if config_key
+        kafka_config = kafka_config.merge(**config.send(config_key)&.kafka&.to_hash || {})
+      end
+
+      Kafka.new(**kafka_config)
     end

     def create_exponential_backoff(backoff_config = nil)
@@ -82,84 +81,5 @@ module Phobos
       warn "DEPRECATION WARNING: #{message}: #{location}"
     end

-    # :nodoc:
-    def configure_logger
-      Logging.backtrace(true)
-      Logging.logger.root.level = silence_log ? :fatal : config.logger.level
-
-      configure_ruby_kafka_logger
-      configure_phobos_logger
-
-      logger.info do
-        Hash(message: 'Phobos configured', env: ENV['RAILS_ENV'] || ENV['RACK_ENV'] || 'N/A')
-      end
-    end
-
-    private
-
-    def fetch_configuration(configuration)
-      DeepStruct.new(read_configuration(configuration))
-    end
-
-    def read_configuration(configuration)
-      return configuration.to_h if configuration.respond_to?(:to_h)
-
-      YAML.safe_load(
-        ERB.new(
-          File.read(File.expand_path(configuration))
-        ).result,
-        [Symbol],
-        [],
-        true
-      )
-    end
-
-    def configure_phobos_logger
-      if config.custom_logger
-        @logger = config.custom_logger
-      else
-        @logger = Logging.logger[self]
-        @logger.appenders = logger_appenders
-      end
-    end
-
-    def configure_ruby_kafka_logger
-      if config.custom_kafka_logger
-        @ruby_kafka_logger = config.custom_kafka_logger
-      elsif config.logger.ruby_kafka
-        @ruby_kafka_logger = Logging.logger['RubyKafka']
-        @ruby_kafka_logger.appenders = logger_appenders
-        @ruby_kafka_logger.level = silence_log ? :fatal : config.logger.ruby_kafka.level
-      else
-        @ruby_kafka_logger = nil
-      end
-    end
-
-    def logger_appenders
-      appenders = [Logging.appenders.stdout(layout: stdout_layout)]
-
-      if log_file
-        FileUtils.mkdir_p(File.dirname(log_file))
-        appenders << Logging.appenders.file(log_file, layout: json_layout)
-      end
-
-      appenders
-    end
-
-    def log_file
-      config.logger.file
-    end
-
-    def json_layout
-      Logging.layouts.json(date_pattern: Constants::LOG_DATE_PATTERN)
-    end
-
-    def stdout_layout
-      if config.logger.stdout_json == true
-        json_layout
-      else
-        Logging.layouts.pattern(date_pattern: Constants::LOG_DATE_PATTERN)
-      end
-    end
   end
 end
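The reworked `create_kafka_client` above shallow-merges the role-specific `kafka` section (if any) over the top-level one, so consumers and producers can override individual connection settings while inheriting the rest. The merge semantics can be sketched with plain hashes (illustrative values, not Phobos defaults):

```ruby
# Top-level kafka config plus a consumer-specific override, as in phobos.yml.
top_level = { seed_brokers: ['kafka:9092'], connect_timeout: 15 }
consumer  = { connect_timeout: 130 }

# Same shape as kafka_config.merge(**role_config) in create_kafka_client:
# overlapping keys take the role-specific value, the rest pass through.
effective = top_level.merge(consumer)

effective # => {:seed_brokers=>["kafka:9092"], :connect_timeout=>130}
```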
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: phobos
 version: !ruby/object:Gem::Version
-  version: 2.0.0.pre.beta1
+  version: 2.1.1
 platform: ruby
 authors:
 - Túlio Ornelas
@@ -12,10 +12,10 @@ authors:
 - Francisco Juan
 - Tommy Gustafsson
 - Daniel Orner
-autorequire:
+autorequire:
 bindir: bin
 cert_chain: []
-date: 2020-05-04 00:00:00.000000000 Z
+date: 2022-01-26 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: bundler
@@ -299,7 +299,7 @@ licenses:
 - Apache License Version 2.0
 metadata:
   allowed_push_host: https://rubygems.org
-post_install_message:
+post_install_message:
 rdoc_options: []
 require_paths:
 - lib
@@ -310,12 +310,12 @@ required_ruby_version: !ruby/object:Gem::Requirement
     version: '2.3'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - ">"
+  - - ">="
     - !ruby/object:Gem::Version
-      version: 1.3.1
+      version: '0'
 requirements: []
-rubygems_version: 3.1.2
-signing_key:
+rubygems_version: 3.3.3
+signing_key:
 specification_version: 4
 summary: Simplifying Kafka for ruby apps
 test_files: []