phobos 1.7.2 → 1.8.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: 9a1be0c7e093c341850150a4fc386bb858a4cb37
- data.tar.gz: 6873ff23a925070291916c4b227a4b139bc44463
+ SHA256:
+ metadata.gz: 0b37b13898c547e212449d8b598984b80ceb9796d377991b7dd73f6e6acfdeb8
+ data.tar.gz: 66d12f8461a5e3e796b47364644a948c2f1c030d227f61a6dd675baf7e368f62
  SHA512:
- metadata.gz: '08520a3dd1a8c53057561853ff1651a710a911128a2ef75ce65bb5f82c501c537317a92d473bdc82b4a2270acbd5c0cfd418397c68db286690b7387636f254bc'
- data.tar.gz: 88ffede29159f84ed85f9b1a9c2efb5bc5a4b39f4b04d37cb2d9c690e828d27236c00dce1ecb365ab58a5e8efd503c2e2854cbb45d0bb074e829b48d37b977ff
+ metadata.gz: 75e2ddf1d53ea28086f0442779c2b19f7b8e9d78d62c7b7f4257fdf59d37f70d145fc014da37678d3023a4395c7b84b1c6d81145187482344af493114673931d
+ data.tar.gz: da67af5cee56a7867f99f0353b847fd8d1faf23dd7f24e539fc6a90514b9e5c3c38bef04f29d03cbd4344242818127d1f69aafb4bd36d1ac11321179b75d7cb4
@@ -1 +1 @@
- 2.4.1
+ 2.5.1
@@ -1,8 +1,9 @@
  sudo: required
  language: ruby
  rvm:
- - 2.4.1
- - 2.3.6
+ - 2.5.1
+ - 2.4.4
+ - 2.3.7

  services:
  - docker
@@ -6,6 +6,14 @@ and this project adheres to [Semantic Versioning](http://semver.org/).

  ## UNRELEASED

+ ## [1.8.0] - 2018-07-22
+ ### Added
+ - Possibility to configure a custom logger #81
+ ### Changed
+ - Reduce the volume of info-level log messages #78
+ - Phobos Handler `around_consume` is now an instance method #82
+ - Send consumer heartbeats between retry attempts #83
+
  ## [1.7.2] - 2018-05-03
  ### Added
  - Add ability to override session_timeout, heartbeat_interval, offset_retention_time, offset_commit_interval, and offset_commit_threshold per listener
data/Dockerfile CHANGED
@@ -1,8 +1,8 @@
- FROM ruby:2.4.1-alpine
+ FROM ruby:2.5.1-alpine

  RUN apk update && apk upgrade && \
  apk add --no-cache bash git openssh build-base
- RUN gem install bundler -v 1.16.0
+ RUN gem install bundler -v 1.16.1

  WORKDIR /opt/phobos

data/README.md CHANGED
@@ -3,6 +3,7 @@
  [![Build Status](https://travis-ci.org/klarna/phobos.svg?branch=master)](https://travis-ci.org/klarna/phobos)
  [![Maintainability](https://api.codeclimate.com/v1/badges/2d00845fc6e7e83df6e7/maintainability)](https://codeclimate.com/github/klarna/phobos/maintainability)
  [![Test Coverage](https://api.codeclimate.com/v1/badges/2d00845fc6e7e83df6e7/test_coverage)](https://codeclimate.com/github/klarna/phobos/test_coverage)
+ [![Depfu](https://badges.depfu.com/badges/57da3d5ff1da449cf8739cfe30b8d2f8/count.svg)](https://depfu.com/github/klarna/phobos?project=Bundler)
  [![Chat with us on Discord](https://discordapp.com/api/guilds/379938130326847488/widget.png)](https://discord.gg/rfMUBVD)

  # Phobos
@@ -155,7 +156,25 @@ class MyHandler
  end
  ```

- It is also possible to control the execution of `#consume` with the class method `.around_consume(payload, metadata)`. This method receives the payload and metadata, and then invokes `#consume` method by means of a block; example:
+ It is also possible to control the execution of `#consume` with the method `#around_consume(payload, metadata)`. This method receives the payload and metadata, and then invokes the `#consume` method by means of a block; example:
+
+ ```ruby
+ class MyHandler
+ include Phobos::Handler
+
+ def around_consume(payload, metadata)
+ Phobos.logger.info "consuming..."
+ output = yield
+ Phobos.logger.info "done, output: #{output}"
+ end
+
+ def consume(payload, metadata)
+ # consume or skip message
+ end
+ end
+ ```
+
+ Note: `around_consume` was previously defined as a class method. The current code supports both implementations, giving precedence to the class method, but future versions will no longer support `.around_consume`.
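For comparison, here is a minimal sketch of the legacy class-method form that 1.8.0 still accepts; per the note above it takes precedence when both forms are defined and, per the diff below, it now triggers a deprecation warning:

```ruby
class MyHandler
  include Phobos::Handler

  # Deprecated form: defined on the class rather than on the instance.
  def self.around_consume(payload, metadata)
    yield
  end
end
```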

  ```ruby
  class MyHandler
@@ -173,13 +192,13 @@ class MyHandler
  end
  ```

- Finally, it is also possible to preprocess the message payload before consuming it using the `before_consume` hook which is invoked before `.around_consume` and `#consume`. The result of this operation will be assigned to payload, so it is important to return the modified payload. This can be very useful, for example if you want a single point of decoding Avro messages and want the payload as a hash instead of a binary.
+ Finally, it is also possible to preprocess the message payload before consuming it using the `before_consume` hook which is invoked before `#around_consume` and `#consume`. The result of this operation will be assigned to payload, so it is important to return the modified payload. This can be very useful, for example if you want a single point of decoding Avro messages and want the payload as a hash instead of a binary.

  ```ruby
  class MyHandler
  include Phobos::Handler

- def before_consume(payload)
+ def before_consume(payload, metadata)
  # optionally preprocess payload
  payload
  end
@@ -194,7 +213,7 @@ The handler life cycle can be illustrated as:

  or optionally,

- `.start` -> `#before_consume` -> `.around_consume` [ `#consume` ] -> `.stop`
+ `.start` -> `#before_consume` -> `#around_consume` [ `#consume` ] -> `.stop`
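To make that life cycle concrete, here is a minimal sketch of a handler that implements every hook named above; the comments are illustrative, and only `#consume` is actually required since Phobos::Handler provides defaults for the rest:

```ruby
class MyHandler
  include Phobos::Handler

  def self.start(kafka_client)
    # runs once when the listener starts
  end

  def self.stop
    # runs once when the listener stops
  end

  def before_consume(payload, metadata)
    payload # return the (optionally preprocessed) payload
  end

  def around_consume(payload, metadata)
    yield # must yield, otherwise #consume never runs
  end

  def consume(payload, metadata)
    # handle the message
  end
end
```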

  ### <a name="usage-producing-messages-to-kafka"></a> Producing messages to Kafka

@@ -319,17 +338,24 @@ The configuration file is organized in 6 sections. Take a look at the example fi

  The file will be parsed through ERB so ERB syntax/file extension is supported beside the YML format.

- __logger__ configures the logger for all Phobos components, it automatically outputs to `STDOUT` and it saves the log in the configured file
+ __logger__ configures the logger for all Phobos components. It automatically
+ outputs to `STDOUT` and it saves the log in the configured file.

- __kafka__ provides configurations for every `Kafka::Client` created over the application. All [options supported by `ruby-kafka`][ruby-kafka-client] can be provided.
+ __kafka__ provides configurations for every `Kafka::Client` created over the application.
+ All [options supported by `ruby-kafka`][ruby-kafka-client] can be provided.

- __producer__ provides configurations for all producers created over the application, the options are the same for regular and async producers. All [options supported by `ruby-kafka`][ruby-kafka-producer] can be provided.
+ __producer__ provides configurations for all producers created over the application,
+ the options are the same for regular and async producers.
+ All [options supported by `ruby-kafka`][ruby-kafka-producer] can be provided.

- __consumer__ provides configurations for all consumer groups created over the application. All [options supported by `ruby-kafka`][ruby-kafka-consumer] can be provided.
+ __consumer__ provides configurations for all consumer groups created over the application.
+ All [options supported by `ruby-kafka`][ruby-kafka-consumer] can be provided.

- __backoff__ Phobos provides automatic retries for your handlers, if an exception is raised the listener will retry following the back off configured here. Backoff can also be configured per listener.
+ __backoff__ Phobos provides automatic retries for your handlers. If an exception
+ is raised, the listener will retry following the back off configured here.
+ Backoff can also be configured per listener.

- __listeners__ is the list of listeners configured, each listener represents a consumers group
+ __listeners__ is the list of listeners configured. Each listener represents a consumer group.

  [ruby-kafka-client]: http://www.rubydoc.info/gems/ruby-kafka/Kafka%2FClient%3Ainitialize
  [ruby-kafka-consumer]: http://www.rubydoc.info/gems/ruby-kafka/Kafka%2FClient%3Aconsumer
@@ -349,6 +375,23 @@ $ phobos start -c /var/configs/my.yml -l /var/configs/additional_listeners.yml
  Note that the config file _must_ still specify a listeners section, though it
  can be empty.

+ #### Custom configuration/logging
+
+ Phobos can be configured using a hash rather than the config file directly. This
+ can be useful if you want to do some pre-processing before sending the file
+ to Phobos. One particularly useful aspect is the ability to provide Phobos
+ with a custom logger, e.g. by reusing the Rails logger:
+
+ ```ruby
+ Phobos.configure(
+ custom_logger: Rails.logger,
+ custom_kafka_logger: Rails.logger
+ )
+ ```
+
+ If these keys are given, they will override the `logger` keys in the Phobos
+ config file.
+
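Beyond the logger keys, the hash form makes the pre-processing mentioned above straightforward. A minimal sketch, assuming a conventional `config/phobos.yml` whose keys mirror the hash you pass in; the `KAFKA_BROKERS` variable and the override itself are purely illustrative:

```ruby
require 'erb'
require 'yaml'

# Render the ERB and parse the YAML ourselves instead of passing the file path.
settings = YAML.load(ERB.new(File.read('config/phobos.yml')).result)

# Tweak whatever is needed before handing the hash over, e.g. the broker list.
settings['kafka']['seed_brokers'] = ENV.fetch('KAFKA_BROKERS').split(',')

Phobos.configure(settings)
```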
  ### <a name="usage-instrumentation"></a> Instrumentation

  Some operations are instrumented using [Active Support Notifications](http://api.rubyonrails.org/classes/ActiveSupport/Notifications.html).
@@ -481,8 +524,8 @@ describe MyConsumer do
  let(:metadata) { Hash(foo: 'bar') }

  it 'consumes my message' do
- expect(described_class).to receive(:around_consume).with(payload, metadata).once.and_call_original
- expect_any_instance_of(described_class).to receive(:before_consume).with(payload).once.and_call_original
+ expect_any_instance_of(described_class).to receive(:around_consume).with(payload, metadata).once.and_call_original
+ expect_any_instance_of(described_class).to receive(:before_consume).with(payload, metadata).once.and_call_original
  expect_any_instance_of(described_class).to receive(:consume).with(payload, metadata).once.and_call_original

  process_message(handler: described_class, payload: payload, metadata: metadata)
@@ -13,18 +13,16 @@ services:
  - ./coverage:/opt/phobos/coverage

  zookeeper:
- image: jplock/zookeeper:3.4.10
+ image: wurstmeister/zookeeper:latest
  ports:
  - 2181:2181

  kafka:
  depends_on:
  - zookeeper
- image: ches/kafka:0.10.2.1
+ image: wurstmeister/kafka:0.11.0.1
  ports:
  - 9092:9092
  environment:
- - KAFKA_BROKER_ID=0
- - KAFKA_ADVERTISED_HOST_NAME=localhost
- - KAFKA_ADVERTISED_PORT=9092
- - ZOOKEEPER_CONNECTION_STRING=zookeeper:2181
+ KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
+ KAFKA_BROKER_ID: 0
@@ -55,38 +55,53 @@ module Phobos
  ExponentialBackoff.new(min, max).tap { |backoff| backoff.randomize_factor = rand }
  end

+ # :nodoc:
  def configure_logger
- log_file = config.logger.file
  ruby_kafka = config.logger.ruby_kafka
- date_pattern = '%Y-%m-%dT%H:%M:%S:%L%zZ'
- json_layout = Logging.layouts.json(date_pattern: date_pattern)
-
- stdout_layout = if config.logger.stdout_json == true
- json_layout
- else
- Logging.layouts.pattern(date_pattern: date_pattern)
- end
-
- appenders = [Logging.appenders.stdout(layout: stdout_layout)]

  Logging.backtrace(true)
  Logging.logger.root.level = silence_log ? :fatal : config.logger.level
-
- if log_file
- FileUtils.mkdir_p(File.dirname(log_file))
- appenders << Logging.appenders.file(log_file, layout: json_layout)
- end
+ appenders = logger_appenders

  @ruby_kafka_logger = nil

- if ruby_kafka
+ if config.custom_kafka_logger
+ @ruby_kafka_logger = config.custom_kafka_logger
+ elsif ruby_kafka
  @ruby_kafka_logger = Logging.logger['RubyKafka']
  @ruby_kafka_logger.appenders = appenders
  @ruby_kafka_logger.level = silence_log ? :fatal : ruby_kafka.level
  end

- @logger = Logging.logger[self]
- @logger.appenders = appenders
+ if config.custom_logger
+ @logger = config.custom_logger
+ else
+ @logger = Logging.logger[self]
+ @logger.appenders = appenders
+ end
+ end
+
+ def logger_appenders
+ date_pattern = '%Y-%m-%dT%H:%M:%S:%L%zZ'
+ json_layout = Logging.layouts.json(date_pattern: date_pattern)
+ log_file = config.logger.file
+ stdout_layout = if config.logger.stdout_json == true
+ json_layout
+ else
+ Logging.layouts.pattern(date_pattern: date_pattern)
+ end
+
+ appenders = [Logging.appenders.stdout(layout: stdout_layout)]
+
+ if log_file
+ FileUtils.mkdir_p(File.dirname(log_file))
+ appenders << Logging.appenders.file(log_file, layout: json_layout)
+ end
+ appenders
+ end
+
+ def deprecate(message)
+ warn "DEPRECATION WARNING: #{message} #{Kernel.caller.first}"
  end

  private
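Since `config.custom_logger` is assigned to `@logger` as-is, the key is not limited to `Rails.logger`. A minimal sketch, assuming any object exposing the standard `Logger` interface is acceptable and reusing the key names from the README example above:

```ruby
require 'logger'

Phobos.configure(
  custom_logger: Logger.new($stdout),              # used by all Phobos components
  custom_kafka_logger: Logger.new('log/kafka.log') # handed to ruby-kafka
)
```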
@@ -1,6 +1,8 @@
  module Phobos
  module Actions
  class ProcessMessage
+ MAX_SLEEP_INTERVAL = 3
+
  include Phobos::Instrumentation

  attr_reader :metadata
@@ -38,16 +40,26 @@ module Phobos
  { message: "error processing message, waiting #{interval}s" }.merge(error).merge(@metadata)
  end

- sleep interval
+ snooze(interval)
  end

- raise Phobos::AbortError if @listener.should_stop?
-
  @metadata.merge!(retry_count: retry_count + 1)
  retry
  end
  end

+ def snooze(interval)
+ remaining_interval = interval
+
+ @listener.send_heartbeat_if_necessary
+
+ while remaining_interval.positive?
+ sleep [remaining_interval, MAX_SLEEP_INTERVAL].min
+ remaining_interval -= MAX_SLEEP_INTERVAL
+ @listener.send_heartbeat_if_necessary
+ end
+ end
+
  private

  def force_encoding(value)
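To illustrate what `snooze` does with a longer backoff: instead of one blocking `sleep`, the interval is consumed in chunks of at most `MAX_SLEEP_INTERVAL` seconds, with a heartbeat opportunity before the first chunk and after every chunk. A standalone sketch of just the chunking arithmetic:

```ruby
MAX_SLEEP_INTERVAL = 3

# e.g. a 10s backoff becomes sleeps of 3, 3, 3 and 1 seconds,
# with a heartbeat sent around each one.
remaining = 10
chunks = []
while remaining.positive?
  chunks << [remaining, MAX_SLEEP_INTERVAL].min
  remaining -= MAX_SLEEP_INTERVAL
end
chunks # => [3, 3, 3, 1]
```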
@@ -57,10 +69,23 @@ module Phobos
  def process_message(payload)
  instrument('listener.process_message', @metadata) do
  handler = @listener.handler_class.new
- preprocessed_payload = handler.before_consume(payload)
+ preprocessed_payload = begin
+ handler.before_consume(payload, @metadata)
+ rescue ArgumentError => e
+ Phobos.deprecate("before_consume now expects metadata as second argument, please update your consumer."\
+ " This will not be backwards compatible in the future.")
+ handler.before_consume(payload)
+ end
+ consume_block = Proc.new { handler.consume(preprocessed_payload, @metadata) }

- @listener.handler_class.around_consume(preprocessed_payload, @metadata) do
- handler.consume(preprocessed_payload, @metadata)
+ if @listener.handler_class.respond_to?(:around_consume)
+ # around_consume class method implementation
+ Phobos.deprecate("around_consume has been moved to instance method, please update your consumer."\
+ " This will not be backwards compatible in the future.")
+ @listener.handler_class.around_consume(preprocessed_payload, @metadata, &consume_block)
+ else
+ # around_consume instance method implementation
+ handler.around_consume(preprocessed_payload, @metadata, &consume_block)
  end
  end
  end
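A minimal sketch of a handler updated for the new signature, so the `ArgumentError` fallback (and its deprecation warning) above never fires; the JSON decoding step is purely illustrative:

```ruby
require 'json'

class MyHandler
  include Phobos::Handler

  # Accept metadata as the second argument (the 1.8.0 signature).
  def before_consume(payload, _metadata)
    JSON.parse(payload)
  end

  def consume(payload, metadata)
    # payload arrives here as the Hash returned by before_consume
  end
end
```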
@@ -4,7 +4,7 @@ module Phobos
  base.extend(ClassMethods)
  end

- def before_consume(payload)
+ def before_consume(payload, metadata)
  payload
  end

@@ -12,16 +12,16 @@ module Phobos
  raise NotImplementedError
  end

+ def around_consume(payload, metadata)
+ yield
+ end
+
  module ClassMethods
  def start(kafka_client)
  end

  def stop
  end
-
- def around_consume(payload, metadata)
- yield
- end
  end
  end
  end
@@ -103,7 +103,7 @@ module Phobos
  )

  batch_processor.execute
- Phobos.logger.info { Hash(message: 'Committed offset').merge(batch_processor.metadata) }
+ Phobos.logger.debug { Hash(message: 'Committed offset').merge(batch_processor.metadata) }
  return if should_stop?
  end
  end
@@ -117,7 +117,7 @@ module Phobos
  )

  message_processor.execute
- Phobos.logger.info { Hash(message: 'Committed offset').merge(message_processor.metadata) }
+ Phobos.logger.debug { Hash(message: 'Committed offset').merge(message_processor.metadata) }
  return if should_stop?
  end
  end
@@ -139,6 +139,11 @@ module Phobos
  @signal_to_stop == true
  end

+ def send_heartbeat_if_necessary
+ raise Phobos::AbortError if should_stop?
+ @consumer&.send_heartbeat_if_necessary
+ end
+
  private

  def listener_metadata
@@ -1,3 +1,3 @@
  module Phobos
- VERSION = '1.7.2'
+ VERSION = '1.8.0'
  end
@@ -45,7 +45,7 @@ Gem::Specification.new do |spec|
  spec.required_ruby_version = '>= 2.3'

  spec.add_development_dependency 'bundler'
- spec.add_development_dependency 'rake', '~> 10.0'
+ spec.add_development_dependency 'rake', '~> 12.3'
  spec.add_development_dependency 'rspec', '~> 3.0'
  spec.add_development_dependency 'pry-byebug'
  spec.add_development_dependency 'simplecov'
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: phobos
  version: !ruby/object:Gem::Version
- version: 1.7.2
+ version: 1.8.0
  platform: ruby
  authors:
  - Túlio Ornelas
@@ -14,7 +14,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-05-03 00:00:00.000000000 Z
+ date: 2018-07-22 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: bundler
@@ -36,14 +36,14 @@ dependencies:
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '10.0'
+ version: '12.3'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '10.0'
+ version: '12.3'
  - !ruby/object:Gem::Dependency
  name: rspec
  requirement: !ruby/object:Gem::Requirement
@@ -275,7 +275,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.6.13
+ rubygems_version: 2.7.6
  signing_key:
  specification_version: 4
  summary: Simplifying Kafka for ruby apps