phobos 1.0.0 → 1.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 97e08e50c3b2cfdb020c71c181989cb90e865316
- data.tar.gz: 21bab9e83626f738b5903355e477b03ad5ff4d89
+ metadata.gz: 31d36810a7dbe760625a6f08d618d37780c1f584
+ data.tar.gz: ffbca77c17a57c8bc86d6497e4e4f6fccb3efdd4
  SHA512:
- metadata.gz: 17421090032e4212c626f708e80f5dca3d3c6421757982268980d954841473ab5176fdb907ac594bc079a5595b4f405b6402362767e01e12ec6b9a45d83e3d01
- data.tar.gz: aa1b7bbf6f07c99cde3f2246a0dc848b9e7a1cb395e42e1513e5a1e7265488cb46b81dc48ae9d049abf61ebab4854dece2b4d594ab52a2e3310fd97412a2c406
+ metadata.gz: a6f98e9ce96d3c0671db8c7c63cb833b3189f3541706258dc2e4757d2ca714dfc675973c23dd2edb344c5f888c987674a24341d98345de0dc0c6d76de4138db6
+ data.tar.gz: 7c98f56c03f8d479bb7067bde18e6ba9290184b08eadad0a680c42c6cf001db71e9c42a86357ffd9909bbf27b3364f95e2211fcd05c56eec5a88364101411aad
data/CHANGELOG.md ADDED
@@ -0,0 +1,14 @@
+ # Change Log
+ All notable changes to this project will be documented in this file.
+
+ The format is based on [Keep a Changelog](http://keepachangelog.com/)
+ and this project adheres to [Semantic Versioning](http://semver.org/).
+
+ ## 1.1.0 (2016-09-02)
+ - [enhancement] - Removed Hashie as a dependency #12
+ - [feature] Allow configuring consumers min_bytes & max_wait_time #15
+ - [feature] Allow configuring producers max_queue_size, delivery_threshold & delivery_interval #16
+ - [feature] Allow configuring force_encoding for message payload #18
+
+ ## 1.0.0 (2016-08-08)
+ - Published on Github with full fledged consumers and producers
data/README.md CHANGED
@@ -1,4 +1,7 @@
- ![Circle CI](https://circleci.com/gh/klarna/phobos.svg?style=shield&circle-token=2289e0fe5bd934074597b32e7f8f0bc98ea0e3c7)
+ ![Phobos](https://raw.githubusercontent.com/klarna/phobos/master/logo.png)
+
+ [![Circle CI](https://circleci.com/gh/klarna/phobos.svg?style=shield&circle-token=2289e0fe5bd934074597b32e7f8f0bc98ea0e3c7)](https://circleci.com/gh/klarna/phobos/tree/master)
+ [![Coverage Status](https://coveralls.io/repos/github/klarna/phobos/badge.svg?branch=master)](https://coveralls.io/github/klarna/phobos?branch=master)

  # Phobos

@@ -24,6 +27,7 @@ With Phobos by your side, all this becomes smooth sailing.
  1. [As library in another app](#usage-as-library)
  1. [Configuration file](#usage-configuration-file)
  1. [Instrumentation](#usage-instrumentation)
+ 1. [Plugins](#plugins)
  1. [Development](#development)

  ## <a name="installation"></a> Installation
@@ -73,7 +77,7 @@ $ phobos init

  ### Consumers (listeners and handlers)

- In Phobos apps __listeners__ are configured against Kafka - they are our consumers. A listener requires a __handler__ (a ruby class where you should process incoming messages), a __topic__, and a __group_id__. Consumer groups are used to coordinate the listeners across machines. We write the __handlers__ and Phobos makes sure to run them for us. An example of a handler is:
+ In Phobos apps __listeners__ are configured against Kafka - they are our consumers. A listener requires a __handler__ (a ruby class where you should process incoming messages), a Kafka __topic__, and a Kafka __group_id__. Consumer groups are used to coordinate the listeners across machines. We write the __handlers__ and Phobos makes sure to run them for us. An example of a handler is:

  ```ruby
  class MyHandler
@@ -201,9 +205,9 @@ MyProducer
  ])
  ```

- There are two flavors of producers: __normal__ producers and __async__ producers.
+ There are two flavors of producers: __regular__ producers and __async__ producers.

- Normal producers will deliver the messages synchronously and disconnect, it doesn't matter if you use `publish` or `publish_list` after the messages get delivered the producer will disconnect.
+ Regular producers will deliver the messages synchronously and disconnect, it doesn't matter if you use `publish` or `publish_list` after the messages get delivered the producer will disconnect.

  Async producers will accept your messages without blocking, use the methods `async_publish` and `async_publish_list` to use async producers.

@@ -289,7 +293,7 @@ __logger__ configures the logger for all Phobos components, it automatically out

  __kafka__ provides configurations for every `Kafka::Client` created over the application. All [options supported by `ruby-kafka`][ruby-kafka-client] can be provided.

- __producer__ provides configurations for all producers created over the application, the options are the same for normal and async producers. All [options supported by `ruby-kafka`][ruby-kafka-producer] can be provided.
+ __producer__ provides configurations for all producers created over the application, the options are the same for regular and async producers. All [options supported by `ruby-kafka`][ruby-kafka-producer] can be provided.

  __consumer__ provides configurations for all consumer groups created over the application. All [options supported by `ruby-kafka`][ruby-kafka-consumer] can be provided.

@@ -379,6 +383,15 @@ end
  * group_id
  * topic

+ ## <a name="plugins"></a> Plugins
+
+ List of gems that enhance Phobos:
+
+ * [Phobos DB Checkpoint](https://github.com/klarna/phobos_db_checkpoint) is drop in replacement to Phobos::Handler, extending it with the following features:
+ * Persists your Kafka events to an active record compatible database
+ * Ensures that your handler will consume messages only once
+ * Allows your system to quickly reprocess events in case of failures
+
  ## <a name="development"></a> Development

  After checking out the repo:
@@ -402,6 +415,10 @@ sh utils/stop-all.sh

  Bug reports and pull requests are welcome on GitHub at https://github.com/klarna/phobos.

+ ## Acknowledgements
+
+ Thanks to Sebastian Norde for the awesome logo!
+
  ## License

  Copyright 2016 Klarna
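
To make the producer hunk above concrete (the __regular__ vs __async__ flavors), here is a minimal sketch. It assumes a class that includes `Phobos::Producer` and the `producer.publish(topic, payload)` call style from the Phobos README; the topic and payloads are illustrative.

```ruby
require 'phobos'

class MyProducer
  include Phobos::Producer
end

# Regular producer: delivers synchronously and disconnects once the messages
# are written, whether publish or publish_list is used.
MyProducer.producer.publish('test', 'a regular payload')

# Async producer: accepts the message without blocking; actual delivery is
# governed by max_queue_size, delivery_threshold and delivery_interval from
# the producer configuration.
MyProducer.producer.async_publish('test', 'an async payload')
```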
data/config/phobos.yml.example CHANGED
@@ -45,6 +45,14 @@ producer:
  # number of messages that needs to be in a message set before it should be compressed.
  # Note that message sets are per-partition rather than per-topic or per-producer
  compression_threshold: 1
+ # maximum number of messages allowed in the queue. Only used for async_producer
+ max_queue_size: 1000
+ # if greater than zero, the number of buffered messages that will automatically
+ # trigger a delivery. Only used for async_producer
+ delivery_threshold: 0
+ # if greater than zero, the number of seconds between automatic message
+ # deliveries. Only used for async_producer
+ delivery_interval: 0

  consumer:
  # number of seconds after which, if a client hasn't contacted the Kafka cluster,
@@ -67,12 +75,19 @@ listeners:
  topic: test
  # id of the group that the consumer should join
  group_id: test-1
+ # Number of threads created for this listener, each thread will behave as an independent consumer.
+ # They don't share any state
+ max_concurrency: 1
  # Once the consumer group has checkpointed its progress in the topic's partitions,
  # the consumers will always start from the checkpointed offsets, regardless of config
  # As such, this setting only applies when the consumer initially starts consuming from a topic
  start_from_beginning: true
  # maximum amount of data fetched from a single partition at a time
  max_bytes_per_partition: 524288 # 512 KB
- # Number of threads created for this listener, each thread will behave as an independent consumer.
- # They don't share any state
- max_concurrency: 1
+ # Minimum number of bytes to read before returning messages from the server; if `max_wait_time` is reached, this is ignored.
+ min_bytes: 1
+ # Maximum duration of time to wait before returning messages from the server, in seconds
+ max_wait_time: 5
+ # Apply this encoding to the message payload, if blank it uses the original encoding. This property accepts values
+ # defined by the ruby Encoding class (https://ruby-doc.org/core-2.3.0/Encoding.html). Ex: UTF_8, ASCII_8BIT, etc
+ force_encoding:
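
This configuration is loaded with `Phobos.configure` (see the `lib/phobos.rb` change further down). A minimal sketch of reading the new options back after configuration, with the config path illustrative:

```ruby
require 'phobos'

# Load a YAML configuration like the example above (path is illustrative).
Phobos.configure('config/phobos.yml')

# Producer settings, including the new async-only keys, come back as a symbol-keyed hash.
Phobos.config.producer_hash
# => { ..., max_queue_size: 1000, delivery_threshold: 0, delivery_interval: 0 }

# Listener entries expose the new consumer options directly.
Phobos.config.listeners.first.max_wait_time # => 5
Phobos.config.listeners.first.min_bytes     # => 1
```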
data/examples/publishing_messages_without_consumer.rb RENAMED
@@ -1,5 +1,10 @@
- require "bundler/setup"
- require "phobos"
+ #
+ # This example assumes you want to create a threaded kafka generator which
+ # publish a stream of kafka messages without consuming them. It also shows
+ # what happens when you produce more messages than the producer can handle.
+ #
+ require 'bundler/setup'
+ require 'phobos'

  TOPIC = 'test-partitions'

@@ -40,8 +45,8 @@ Thread.new do

  puts "produced #{key}, total: #{total}"

- # Since this is a simplistic code we are going to generate more messages than
- # the producer can write to Kafka, so eventually we'll get some buffer overflows
+ # Since this is very simplistic code, we are going to generate more messages than
+ # the producer can write to Kafka. Eventually we'll get some buffer overflows
  #
  rescue Kafka::BufferOverflow => e
  puts "| waiting"
data/lib/phobos/deep_struct.rb ADDED
@@ -0,0 +1,40 @@
+ # Please use this with at least the same consideration as you would when using OpenStruct.
+ # Right now we only use this to parse our internal configuration files. It is not meant to
+ # be used on incoming data.
+ module Phobos
+ class DeepStruct < OpenStruct
+ # Based on
+ # https://docs.omniref.com/ruby/2.3.0/files/lib/ostruct.rb#line=88
+ def initialize(hash=nil)
+ @table = {}
+ @hash_table = {}
+
+ if hash
+ hash.each_pair do |k, v|
+ k = k.to_sym
+ @table[k] = to_deep_struct(v)
+ @hash_table[k] = v
+ end
+ end
+ end
+
+ def to_h
+ @hash_table.dup
+ end
+ alias_method :to_hash, :to_h
+
+ private
+
+ def to_deep_struct(v)
+ case v
+ when Hash
+ self.class.new(v)
+ when Enumerable
+ v.map { |el| to_deep_struct(el) }
+ else
+ v
+ end
+ end
+ protected :to_deep_struct
+ end
+ end
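
A short usage sketch of the class added above, based only on the code shown; the hash contents are illustrative:

```ruby
require 'ostruct'
require 'phobos/deep_struct'

config = Phobos::DeepStruct.new(
  'kafka'     => { 'client_id' => 'phobos' },
  'listeners' => [{ 'handler' => 'MyHandler', 'topic' => 'test' }]
)

# Nested hashes become DeepStructs, enumerables are mapped element by element.
config.kafka.client_id        # => "phobos"
config.listeners.first.topic  # => "test"

# to_h returns the top-level keys symbolized with the original values preserved.
config.to_h
# => { kafka: { 'client_id' => 'phobos' }, listeners: [{ 'handler' => 'MyHandler', 'topic' => 'test' }] }
```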
data/lib/phobos/executor.rb CHANGED
@@ -1,16 +1,16 @@
  module Phobos
  class Executor
  include Phobos::Instrumentation
- LISTENER_OPTS = %i(handler group_id topic start_from_beginning max_bytes_per_partition).freeze
+ LISTENER_OPTS = %i(handler group_id topic min_bytes max_wait_time force_encoding start_from_beginning max_bytes_per_partition).freeze

  def initialize
  @threads = Concurrent::Array.new
  @listeners = Phobos.config.listeners.flat_map do |config|
  handler_class = config.handler.constantize
- listener_configs = config.to_hash.symbolize_keys
+ listener_configs = config.to_hash
  max_concurrency = listener_configs[:max_concurrency] || 1
  max_concurrency.times.map do
- configs = listener_configs.select {|k| LISTENER_OPTS.include?(k)}
+ configs = listener_configs.select { |k| LISTENER_OPTS.include?(k) }
  Phobos::Listener.new(configs.merge(handler: handler_class))
  end
  end
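
The executor builds one `Phobos::Listener` per `max_concurrency` slot and only forwards the keys in `LISTENER_OPTS`. A standalone sketch of that fan-out, using a plain hash in place of the DeepStruct config entry (handler name and values are illustrative):

```ruby
LISTENER_OPTS = %i(handler group_id topic min_bytes max_wait_time force_encoding
                   start_from_beginning max_bytes_per_partition).freeze

listener_configs = {
  handler: 'MyHandler',
  topic: 'test',
  group_id: 'test-1',
  max_wait_time: 5,
  max_concurrency: 2 # consumed here for the fan-out, never passed to the listener
}

max_concurrency = listener_configs[:max_concurrency] || 1
listeners = max_concurrency.times.map do
  listener_configs.select { |k| LISTENER_OPTS.include?(k) }
end

listeners.size  # => 2 independent option hashes, one per consumer thread
listeners.first # => { handler: "MyHandler", topic: "test", group_id: "test-1", max_wait_time: 5 }
```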
data/lib/phobos/listener.rb CHANGED
@@ -7,7 +7,7 @@ module Phobos

  attr_reader :group_id, :topic, :id

- def initialize(handler:, group_id:, topic:, start_from_beginning: true, max_bytes_per_partition: DEFAULT_MAX_BYTES_PER_PARTITION)
+ def initialize(handler:, group_id:, topic:, min_bytes: nil, max_wait_time: nil, force_encoding: nil, start_from_beginning: true, max_bytes_per_partition: DEFAULT_MAX_BYTES_PER_PARTITION)
  @id = SecureRandom.hex[0...6]
  @handler_class = handler
  @group_id = group_id
@@ -16,6 +16,8 @@ module Phobos
  start_from_beginning: start_from_beginning,
  max_bytes_per_partition: max_bytes_per_partition
  }
+ @encoding = Encoding.const_get(force_encoding.to_sym) if force_encoding
+ @consumer_opts = compact(min_bytes: min_bytes, max_wait_time: max_wait_time)
  @kafka_client = Phobos.create_kafka_client
  @producer_enabled = @handler_class.ancestors.include?(Phobos::Producer)
  end
@@ -35,7 +37,7 @@ module Phobos
  end

  begin
- @consumer.each_batch do |batch|
+ @consumer.each_batch(@consumer_opts) do |batch|
  batch_metadata = {
  batch_size: batch.messages.count,
  partition: batch.partition,
@@ -139,7 +141,7 @@ module Phobos
  end

  def process_message(message, metadata)
- payload = message.value
+ payload = force_encoding(message.value)
  @handler_class.around_consume(payload, metadata) do
  @handler_class.new.consume(payload, metadata)
  end
@@ -149,5 +151,13 @@ module Phobos
  configs = Phobos.config.consumer_hash.select { |k| KAFKA_CONSUMER_OPTS.include?(k) }
  @kafka_client.consumer({group_id: group_id}.merge(configs))
  end
+
+ def force_encoding(value)
+ @encoding ? value.force_encoding(@encoding) : value
+ end
+
+ def compact(hash)
+ hash.delete_if { |_, v| v.nil? }
+ end
  end
  end
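
The two private helpers added above are easy to see with plain values; a quick sketch with illustrative inputs. `compact` drops options that were not configured, so only explicit `min_bytes`/`max_wait_time` values reach `each_batch`, and `force_encoding` merely relabels the payload's encoding when one was configured.

```ruby
# compact: strip unset options so the consumer's own defaults apply to the rest.
{ min_bytes: nil, max_wait_time: 5 }.delete_if { |_, v| v.nil? }
# => { max_wait_time: 5 }

# force_encoding: resolve the configured name once, then relabel each payload.
# Note this changes the encoding tag only; the bytes are not transcoded.
encoding = Encoding.const_get('UTF_8'.to_sym)
payload  = "caf\xC3\xA9".force_encoding(Encoding::ASCII_8BIT)
payload.force_encoding(encoding).encoding # => #<Encoding:UTF-8>
```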
data/lib/phobos/producer.rb CHANGED
@@ -49,6 +49,7 @@ module Phobos

  class PublicAPI
  NAMESPACE = :phobos_producer_store
+ ASYNC_PRODUCER_PARAMS = %i(max_queue_size delivery_threshold delivery_interval).freeze

  # This method configures the kafka client used with publish operations
  # performed by the host class
@@ -70,7 +71,7 @@ module Phobos

  def publish_list(messages)
  client = kafka_client || configure_kafka_client(Phobos.create_kafka_client)
- producer = client.producer(Phobos.config.producer_hash)
+ producer = client.producer(regular_configs)
  produce_messages(producer, messages)
  ensure
  producer&.shutdown
@@ -78,7 +79,7 @@

  def create_async_producer
  client = kafka_client || configure_kafka_client(Phobos.create_kafka_client)
- async_producer = client.async_producer(Phobos.config.producer_hash)
+ async_producer = client.async_producer(async_configs)
  producer_store[:async_producer] = async_producer
  end

@@ -101,6 +102,14 @@ module Phobos
  producer_store[:async_producer] = nil
  end

+ def regular_configs
+ Phobos.config.producer_hash.reject { |k, _| ASYNC_PRODUCER_PARAMS.include?(k)}
+ end
+
+ def async_configs
+ Phobos.config.producer_hash
+ end
+
  private

  def produce_messages(producer, messages)
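
The effect of `regular_configs` versus `async_configs` is just a key filter over the shared producer settings; a standalone sketch with a made-up producer hash (`required_acks` is illustrative, the async-only keys come from `ASYNC_PRODUCER_PARAMS` above):

```ruby
ASYNC_PRODUCER_PARAMS = %i(max_queue_size delivery_threshold delivery_interval).freeze

producer_hash = {
  required_acks: -1,    # a regular ruby-kafka producer option, shared by both flavors
  max_queue_size: 1000, # async-only options follow
  delivery_threshold: 0,
  delivery_interval: 0
}

# regular_configs: async-only keys are stripped before calling client.producer(...)
producer_hash.reject { |k, _| ASYNC_PRODUCER_PARAMS.include?(k) }
# => { required_acks: -1 }

# async_configs: the full hash is handed to client.async_producer(...)
producer_hash
```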
data/lib/phobos/version.rb CHANGED
@@ -1,3 +1,3 @@
  module Phobos
- VERSION = "1.0.0"
+ VERSION = "1.1.0"
  end
data/lib/phobos.rb CHANGED
@@ -1,15 +1,16 @@
- require 'yaml'
+ require 'ostruct'
  require 'securerandom'
+ require 'yaml'

+ require 'active_support/core_ext/hash/keys'
+ require 'active_support/core_ext/string/inflections'
+ require 'active_support/notifications'
  require 'concurrent'
+ require 'exponential_backoff'
  require 'kafka'
- require 'hashie'
  require 'logging'
- require 'exponential_backoff'
- require 'active_support/notifications'
- require 'active_support/core_ext/string/inflections'
- require 'active_support/core_ext/hash/keys'

+ require 'phobos/deep_struct'
  require 'phobos/version'
  require 'phobos/instrumentation'
  require 'phobos/errors'
@@ -28,15 +29,15 @@ module Phobos

  def configure(yml_path)
  ENV['RAILS_ENV'] = ENV['RACK_ENV'] ||= 'development'
- @config = Hashie::Mash.new(YAML.load_file(File.expand_path(yml_path)))
- @config.class.send(:define_method, :producer_hash) { Phobos.config.producer&.to_hash&.symbolize_keys }
- @config.class.send(:define_method, :consumer_hash) { Phobos.config.consumer&.to_hash&.symbolize_keys }
+ @config = DeepStruct.new(YAML.load_file(File.expand_path(yml_path)))
+ @config.class.send(:define_method, :producer_hash) { Phobos.config.producer&.to_hash }
+ @config.class.send(:define_method, :consumer_hash) { Phobos.config.consumer&.to_hash }
  configure_logger
  logger.info { Hash(message: 'Phobos configured', env: ENV['RACK_ENV']) }
  end

  def create_kafka_client
- Kafka.new(config.kafka.to_hash.symbolize_keys)
+ Kafka.new(config.kafka.to_hash)
  end

  def create_exponential_backoff
data/logo.png ADDED
Binary file
data/phobos.gemspec CHANGED
@@ -24,7 +24,7 @@ Gem::Specification.new do |spec|
  ]

  spec.summary = %q{Simplifying Kafka for ruby apps}
- spec.description = %q{Phobos is a microframework and library for kafka based applications, it wraps commons behaviors needed by consumers/producers in an easy an convenient API. It uses ruby-kafka as it's kafka client and core component.}
+ spec.description = %q{Phobos is a microframework and library for kafka based applications, it wraps common behaviors needed by consumers/producers in an easy an convenient API. It uses ruby-kafka as its kafka client and core component.}
  spec.homepage = 'https://github.com/klarna/phobos'
  spec.license = 'Apache License Version 2.0'

@@ -47,12 +47,13 @@ Gem::Specification.new do |spec|
  spec.add_development_dependency 'rspec', '~> 3.0'
  spec.add_development_dependency 'pry-byebug', '~> 3.4.0'
  spec.add_development_dependency 'rspec_junit_formatter', '0.2.2'
+ spec.add_development_dependency 'simplecov', '~> 0.12.0'
+ spec.add_development_dependency 'coveralls', '~> 0.8.15'

- spec.add_dependency 'ruby-kafka', '>= 0.3.13.beta4'
+ spec.add_dependency 'ruby-kafka', '>= 0.3.14'
  spec.add_dependency 'concurrent-ruby', '>= 1.0.2'
  spec.add_dependency 'concurrent-ruby-ext', '>= 1.0.2'
  spec.add_dependency 'activesupport', '>= 4.0.0'
- spec.add_dependency 'hashie'
  spec.add_dependency 'logging'
  spec.add_dependency 'exponential-backoff'
  spec.add_dependency 'thor'
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: phobos
  version: !ruby/object:Gem::Version
- version: 1.0.0
+ version: 1.1.0
  platform: ruby
  authors:
  - Túlio Ornelas
@@ -13,7 +13,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2016-08-18 00:00:00.000000000 Z
+ date: 2016-09-02 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: bundler
@@ -86,35 +86,49 @@ dependencies:
  - !ruby/object:Gem::Version
  version: 0.2.2
  - !ruby/object:Gem::Dependency
- name: ruby-kafka
+ name: simplecov
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: 0.3.13.beta4
- type: :runtime
+ version: 0.12.0
+ type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ">="
+ - - "~>"
  - !ruby/object:Gem::Version
- version: 0.3.13.beta4
+ version: 0.12.0
  - !ruby/object:Gem::Dependency
- name: concurrent-ruby
+ name: coveralls
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 0.8.15
+ type: :development
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 0.8.15
+ - !ruby/object:Gem::Dependency
+ name: ruby-kafka
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 1.0.2
+ version: 0.3.14
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 1.0.2
+ version: 0.3.14
  - !ruby/object:Gem::Dependency
- name: concurrent-ruby-ext
+ name: concurrent-ruby
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
@@ -128,33 +142,33 @@ dependencies:
  - !ruby/object:Gem::Version
  version: 1.0.2
  - !ruby/object:Gem::Dependency
- name: activesupport
+ name: concurrent-ruby-ext
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 4.0.0
+ version: 1.0.2
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 4.0.0
+ version: 1.0.2
  - !ruby/object:Gem::Dependency
- name: hashie
+ name: activesupport
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: '0'
+ version: 4.0.0
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: '0'
+ version: 4.0.0
  - !ruby/object:Gem::Dependency
  name: logging
  requirement: !ruby/object:Gem::Requirement
@@ -198,8 +212,8 @@ dependencies:
  - !ruby/object:Gem::Version
  version: '0'
  description: Phobos is a microframework and library for kafka based applications,
- it wraps commons behaviors needed by consumers/producers in an easy an convenient
- API. It uses ruby-kafka as it's kafka client and core component.
+ it wraps common behaviors needed by consumers/producers in an easy an convenient
+ API. It uses ruby-kafka as its kafka client and core component.
  email:
  - ornelas.tulio@gmail.com
  - mathias.klippinge@gmail.com
@@ -215,6 +229,7 @@ files:
  - ".gitignore"
  - ".rspec"
  - ".ruby-version"
+ - CHANGELOG.md
  - Dockerfile
  - Gemfile
  - LICENSE.txt
@@ -227,11 +242,12 @@ files:
  - config/phobos.yml.example
  - examples/handler_saving_events_database.rb
  - examples/handler_using_async_producer.rb
- - examples/publishing_messages_without_consumer.rb.rb
+ - examples/publishing_messages_without_consumer.rb
  - lib/phobos.rb
  - lib/phobos/cli.rb
  - lib/phobos/cli/runner.rb
  - lib/phobos/cli/start.rb
+ - lib/phobos/deep_struct.rb
  - lib/phobos/echo_handler.rb
  - lib/phobos/errors.rb
  - lib/phobos/executor.rb
@@ -240,6 +256,7 @@ files:
  - lib/phobos/listener.rb
  - lib/phobos/producer.rb
  - lib/phobos/version.rb
+ - logo.png
  - phobos.gemspec
  - utils/create-topic.sh
  - utils/env.sh