karafka 0.6.0.rc1 → 0.6.0.rc2

Files changed (43)
  1. checksums.yaml +4 -4
  2. data/.travis.yml +8 -2
  3. data/CHANGELOG.md +13 -2
  4. data/CONTRIBUTING.md +0 -25
  5. data/Gemfile +2 -0
  6. data/Gemfile.lock +8 -9
  7. data/README.md +22 -11
  8. data/Rakefile +1 -2
  9. data/bin/karafka +1 -1
  10. data/karafka.gemspec +2 -6
  11. data/lib/karafka.rb +9 -8
  12. data/lib/karafka/attributes_map.rb +3 -2
  13. data/lib/karafka/base_controller.rb +8 -1
  14. data/lib/karafka/cli.rb +1 -1
  15. data/lib/karafka/cli/console.rb +2 -2
  16. data/lib/karafka/cli/flow.rb +1 -1
  17. data/lib/karafka/cli/info.rb +2 -2
  18. data/lib/karafka/cli/install.rb +2 -2
  19. data/lib/karafka/cli/server.rb +14 -6
  20. data/lib/karafka/cli/worker.rb +1 -1
  21. data/lib/karafka/connection/config_adapter.rb +23 -22
  22. data/lib/karafka/connection/messages_consumer.rb +23 -3
  23. data/lib/karafka/connection/messages_processor.rb +6 -25
  24. data/lib/karafka/errors.rb +4 -0
  25. data/lib/karafka/fetcher.rb +1 -1
  26. data/lib/karafka/loader.rb +5 -55
  27. data/lib/karafka/monitor.rb +1 -1
  28. data/lib/karafka/params/params.rb +5 -1
  29. data/lib/karafka/parsers/json.rb +2 -2
  30. data/lib/karafka/routing/builder.rb +7 -0
  31. data/lib/karafka/routing/consumer_group.rb +10 -3
  32. data/lib/karafka/routing/topic.rb +1 -1
  33. data/lib/karafka/schemas/config.rb +4 -4
  34. data/lib/karafka/schemas/consumer_group.rb +2 -2
  35. data/lib/karafka/schemas/server_cli_options.rb +43 -0
  36. data/lib/karafka/server.rb +10 -0
  37. data/lib/karafka/setup/config.rb +9 -4
  38. data/lib/karafka/templates/application_controller.rb.example +2 -0
  39. data/lib/karafka/templates/application_responder.rb.example +2 -0
  40. data/lib/karafka/templates/application_worker.rb.example +2 -0
  41. data/lib/karafka/templates/{app.rb.example → karafka.rb.example} +13 -6
  42. data/lib/karafka/version.rb +1 -1
  43. metadata +11 -52
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: 2b23b384c41e678138f2dec19e303e3cd62dc537
-  data.tar.gz: 794671bee86cd69916ae16f48111179be4e45228
+  metadata.gz: f1ac31c4f943edba6e25886bc34b2bfad35e69fb
+  data.tar.gz: 488973afae1eb933b27dec7dd2fdcbb21fe6edae
 SHA512:
-  metadata.gz: 29f049589ec7f572b9de747c005522a2b40076b5fc8425f451f83f12d8eef549b6ce403cf2201068ca462dedae83df799735f4655df167714116a6990238a7df
-  data.tar.gz: 436c792d7b15fafd68754c75bc558559ed2d0828591aba5a0644a44a720dbf4bffeefa428d146517552ba6d66dfa266d76ab9b3a3fa1c6e3a19c685a4496eaf6
+  metadata.gz: 7e0ce78c37d1309d7b6adc51b4c59545df6eed3ee1dca0441c751b309090bd4d62e9da100fbe8fb7086af1b4f1d600ead0f9f7b873bf3d45e8f95092c30cdd76
+  data.tar.gz: d2555a9f9ca030df2d31a49d2a272e4d13b681ad53590708c6ed960004690761a0e3cf9e88ee000a0870d48d60e8987a7c0bf619c10cbfc5dad057fb374115b3
data/.travis.yml CHANGED
@@ -1,10 +1,16 @@
 language: ruby
+sudo: false
 rvm:
   - 2.3.0
   - 2.3.1
   - 2.3.2
   - 2.3.3
+  - 2.3.4
   - 2.4.0
   - 2.4.1
-script:
-  - bundle exec rake
+  - jruby-9.1.12.0
+script: bundle exec rspec spec/
+env:
+  global:
+    - JRUBY_OPTS='--debug'
+install: bundle install --jobs=3 --retry=3
data/CHANGELOG.md CHANGED
@@ -1,6 +1,6 @@
 # Karafka framework changelog

-## 0.6.0.rc1
+## 0.6.0.rc2 - released

 ### Closed issues:

@@ -20,11 +20,20 @@
 - #175 - Allow single consumer to subscribe to multiple topics
 - #178 - Remove parsing failover when cannot unparse data
 - #174 - Extended config validation
-- #180 - Switch from JSON parser to yajl-ruby
+- ~~#180 - Switch from JSON parser to yajl-ruby~~
 - #181 - When a responder is defined but not used (because ```respond_with``` is not triggered in the perform), it won't raise an exception.
 - #188 - Rename name in config to client id
 - #186 - Support ruby-kafka ```ssl_ca_cert_file_path``` config
 - #189 - karafka console does not preserve history on exit
+- #191 - Karafka 0.6.0rc1 does not work with JRuby / now it does :-)
+- Switch to multi_json so everyone can use their favourite JSON parser
+- Added JRuby support in general and in Travis
+- #196 - Topic mapper does not map topics when subscribing, thanks to @webandtech
+- #96 - Karafka server - possibility to run it only for certain topics
+- ~~karafka worker cli option is removed (please use sidekiq directly)~~ - restored, bad idea
+- (optional) pausing upon processing failures via ```pause_timeout```
+- Karafka console main process no longer intercepts irb errors
+- Wiki updates

 ### New features and improvements

@@ -38,6 +47,7 @@
 ### Incompatibilities

+- Default boot file is renamed from app.rb to karafka.rb
 - Removed worker glass as a dependency (now an independent gem)
 - ```kafka.hosts``` option renamed to ```kafka.seed_brokers``` - you don't need to provide all the hosts to work with Kafka
 - ```start_from_beginning``` moved into kafka scope (```kafka.start_from_beginning```)
@@ -47,6 +57,7 @@
 - Renamed content to value to better resemble ruby-kafka internal messages naming convention
 - When having a responder with ```required``` topics and not using ```#respond_with``` at all, it will raise an exception
 - Renamed ```inline_mode``` to ```inline_processing``` to resemble other settings conventions
+- Renamed ```inline_processing``` to ```processing_backend``` to reach 1.0 future compatibility

 ### Other changes
 - PolishGeeksDevTools removed (in favour of Coditsu)
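Two of the renames above are easy to miss when upgrading, so here is a hedged sketch of an rc2 setup block (class name and values are illustrative, mirroring the bundled karafka.rb.example shown further down):

```ruby
class KarafkaApp < Karafka::App
  setup do |config|
    config.kafka.seed_brokers = %w( 127.0.0.1:9092 )
    # 0.6.0.rc1: config.inline_processing = true / false
    # 0.6.0.rc2: pick the backend explicitly
    config.processing_backend = Karafka.env.development? ? :inline : :sidekiq
  end
end
```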
data/CONTRIBUTING.md CHANGED
@@ -40,28 +40,3 @@ You can also reach us at hello@karafka.opencollective.com.

 Thank you to all the people who have already contributed to karafka!
 <a href="graphs/contributors"><img src="https://opencollective.com/karafka/contributors.svg?width=890" /></a>
-
-
-### Backers
-
-Thank you to all our backers! [[Become a backer](https://opencollective.com/karafka#backer)]
-
-<a href="https://opencollective.com/karafka#backers" target="_blank"><img src="https://opencollective.com/karafka/backers.svg?width=890"></a>
-
-
-### Sponsors
-
-Thank you to all our sponsors! (please ask your company to also support this open source project by [becoming a sponsor](https://opencollective.com/karafka#sponsor))
-
-<a href="https://opencollective.com/karafka/sponsor/0/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/0/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/1/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/1/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/2/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/2/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/3/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/3/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/4/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/4/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/5/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/5/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/6/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/6/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/7/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/7/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/8/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/8/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/9/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/9/avatar.svg"></a>
-
-<!-- This `CONTRIBUTING.md` is based on @nayafia's template https://github.com/nayafia/contributing-template -->
data/Gemfile CHANGED
@@ -6,4 +6,6 @@ gemspec

 group :development, :test do
   gem 'timecop'
+  gem 'rspec'
+  gem 'simplecov'
 end
data/Gemfile.lock CHANGED
@@ -1,18 +1,19 @@
 PATH
   remote: .
   specs:
-    karafka (0.6.0.rc1)
+    karafka (0.6.0.rc2)
       activesupport (>= 5.0)
       celluloid
       dry-configurable (~> 0.7)
       dry-validation (~> 0.11)
       envlogic (~> 1.0)
+      multi_json (>= 1.12)
       rake (>= 11.3)
+      require_all (>= 1.4)
       ruby-kafka (>= 0.4)
       sidekiq (>= 4.2)
       thor (~> 0.19)
       waterdrop (>= 0.4)
-      yajl-ruby (>= 1.3.0)

 GEM
   remote: https://rubygems.org/
@@ -22,7 +23,6 @@ GEM
       i18n (~> 0.7)
       minitest (~> 5.1)
       tzinfo (~> 1.1)
-    byebug (9.0.6)
     celluloid (0.17.3)
       celluloid-essentials
       celluloid-extras
@@ -81,12 +81,14 @@ GEM
     inflecto (0.0.2)
     json (2.1.0)
     minitest (5.10.3)
+    multi_json (1.12.1)
     null-logger (0.1.4)
     rack (2.0.3)
     rack-protection (2.0.0)
       rack
     rake (12.0.0)
     redis (3.3.3)
+    require_all (1.4.0)
     rspec (3.6.0)
       rspec-core (~> 3.6.0)
       rspec-expectations (~> 3.6.0)
@@ -112,7 +114,7 @@ GEM
       json (>= 1.8, < 3)
       simplecov-html (~> 0.10.0)
     simplecov-html (0.10.2)
-    thor (0.19.4)
+    thor (0.20.0)
     thread_safe (0.3.6)
     timecop (0.9.1)
     timers (4.1.2)
@@ -126,17 +128,14 @@ GEM
       null-logger
       rake
       ruby-kafka (~> 0.4)
-    yajl-ruby (1.3.0)

 PLATFORMS
   ruby

 DEPENDENCIES
-  bundler (~> 1.2)
-  byebug
   karafka!
-  rspec (>= 3.6)
-  simplecov (>= 0.14)
+  rspec
+  simplecov
   timecop

 BUNDLED WITH
data/README.md CHANGED
@@ -1,4 +1,4 @@
-![karafka logo](http://mensfeld.github.io/karafka-framework-introduction/img/karafka-04.png)
+![karafka logo](https://raw.githubusercontent.com/karafka/misc/master/logo/karafka_logotype_transparent2.png)

 [![Build Status](https://travis-ci.org/karafka/karafka.png)](https://travis-ci.org/karafka/karafka)
 [![Backers on Open Collective](https://opencollective.com/karafka/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/karafka/sponsors/badge.svg)](#sponsors) [![Join the chat at https://gitter.im/karafka/karafka](https://badges.gitter.im/karafka/karafka.svg)](https://gitter.im/karafka/karafka?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
@@ -19,9 +19,28 @@ Karafka based applications can be easily deployed to any type of infrastructure,
 * Capistrano
 * Docker

-## Support
+## Getting started
+
+If you want to get started with Kafka and Karafka as fast as possible, the best idea is to just clone our example repository:
+
+```bash
+git clone https://github.com/karafka/karafka-example-app ./example_app
+```
+
+then just bundle install all the dependencies:
+
+```bash
+cd ./example_app
+bundle install
+```
+
+and follow the instructions from the [example app Wiki](https://github.com/karafka/karafka-example-app/blob/master/README.md).
+
+**Note**: you need to ensure that you have Kafka up and running, and you need to configure the Kafka seed_brokers in the ```karafka.rb``` file.
+
+If you need more details and know-how on starting Karafka from a clean installation, read the [Getting started page](https://github.com/karafka/karafka/wiki/Getting-started) section of our Wiki.

-**Warning**: We're currently in the middle of upgrading our [Wiki pages](https://github.com/karafka/karafka/wiki) to match our newest 0.6 release and its API. If you use the 0.5 version, you might encounter some incompatibilities. We're really sorry for the inconvenience.
+## Support

 Karafka has [Wiki pages](https://github.com/karafka/karafka/wiki) for almost everything. It covers the whole installation, setup and deployment along with other useful details on how to run Karafka.

@@ -39,14 +58,6 @@ If you are interested in our commercial services, please contact [Maciej Mensfel

 Karafka framework and Karafka team are __not__ related to the Kafka streaming service called CloudKarafka in any matter. We don't recommend nor discourage usage of their platform.

-## Requirements
-
-In order to use Karafka framework, you need to have:
-
-- Zookeeper (required by Kafka)
-- Kafka (at least 0.9.0)
-- Ruby (at least 2.3.0)
-
 ## Note on Patches/Pull Requests

 Fork the project.
data/Rakefile CHANGED
@@ -1,7 +1,6 @@
 # frozen_string_literal: true

-require 'bundler'
-require 'rake'
+require 'rspec'
 require 'rspec/core/rake_task'

 RSpec::Core::RakeTask.new(:spec)
data/bin/karafka CHANGED
@@ -1,7 +1,7 @@
 #!/usr/bin/env ruby

 require 'karafka'
-require Karafka.boot_file.to_s if File.exist?(Karafka.boot_file.to_s)
+require Karafka.boot_file.to_s

 Karafka::Cli.prepare
 Karafka::Cli.start
data/karafka.gemspec CHANGED
@@ -26,12 +26,8 @@ Gem::Specification.new do |spec|
   spec.add_dependency 'activesupport', '>= 5.0'
   spec.add_dependency 'dry-validation', '~> 0.11'
   spec.add_dependency 'dry-configurable', '~> 0.7'
-  spec.add_dependency 'yajl-ruby', '>= 1.3.0'
-
-  spec.add_development_dependency 'bundler', '~> 1.2'
-  spec.add_development_dependency 'rspec', '>= 3.6'
-  spec.add_development_dependency 'simplecov', '>= 0.14'
-  spec.add_development_dependency 'byebug'
+  spec.add_dependency 'multi_json', '>= 1.12'
+  spec.add_dependency 'require_all', '>= 1.4'

   spec.required_ruby_version = '>= 2.3.0'
data/lib/karafka.rb CHANGED
@@ -16,7 +16,8 @@
   envlogic
   thor
   fileutils
-  yajl
+  multi_json
+  require_all
   dry-configurable
   dry-validation
   active_support/callbacks
@@ -61,18 +62,18 @@ module Karafka
     end

     # @return [String] path to a default file that contains booting procedure etc
-    # @note By default it is a file called 'app.rb' but it can be specified as you wish if you
-    #   have Karafka that is merged into a Sinatra/Rails app and app.rb is taken.
+    # @note By default it is a file called 'karafka.rb' but it can be specified as you wish if you
+    #   have Karafka that is merged into a Sinatra/Rails app and karafka.rb is taken.
     #   It will be used for console/workers/etc
     # @example Standard only-Karafka case
-    #   Karafka.boot_file #=> '/home/app_path/app.rb'
-    # @example Non standard case
-    #   KARAFKA_BOOT_FILE='/home/app_path/karafka.rb'
     #   Karafka.boot_file #=> '/home/app_path/karafka.rb'
+    # @example Non standard case
+    #   KARAFKA_BOOT_FILE='/home/app_path/app.rb'
+    #   Karafka.boot_file #=> '/home/app_path/app.rb'
     def boot_file
-      Pathname.new(ENV['KARAFKA_BOOT_FILE'] || File.join(Karafka.root, 'app.rb'))
+      Pathname.new(ENV['KARAFKA_BOOT_FILE'] || File.join(Karafka.root, 'karafka.rb'))
     end
   end
 end

-Karafka::Loader.new.load!(Karafka.core_root)
+Karafka::Loader.load!(Karafka.core_root)
data/lib/karafka/attributes_map.rb CHANGED
@@ -22,6 +22,7 @@ module Karafka
       ],
       subscription: %i[start_from_beginning max_bytes_per_partition],
       consuming: %i[min_bytes max_wait_time],
+      pausing: %i[pause_timeout],
       # All the options that are under kafka config namespace, but are not used
       # directly with kafka api, but from the Karafka user perspective, they are
       # still related to kafka. They should not be proxied anywhere
@@ -32,7 +33,7 @@ module Karafka
     # @return [Array<Symbol>] properties that can be set on a per topic level
     def topic
       (config_adapter[:subscription] + %i[
-        inline_processing
+        processing_backend
         name
         worker
         parser
@@ -52,7 +53,7 @@ module Karafka
       # only when proxying details to ruby-kafka. We use ignored fields internally in karafka
       ignored_settings = config_adapter[:subscription]
       defined_settings = config_adapter.values.flatten
-      karafka_settings = %i[batch_consuming topic_mapper]
+      karafka_settings = %i[batch_consuming]
       # This is a dirty and bad hack on dry-configurable to get keys before setting values
       dynamically_proxied = Karafka::Setup::Config
         ._settings
data/lib/karafka/base_controller.rb CHANGED
@@ -107,7 +107,14 @@ module Karafka
     # will schedule a call task in sidekiq
     def schedule
       run_callbacks :schedule do
-        topic.inline_processing ? call_inline : call_async
+        case topic.processing_backend
+        when :inline then
+          call_inline
+        when :sidekiq then
+          call_async
+        else
+          raise Errors::InvalidProcessingBackend, topic.processing_backend
+        end
       end
     end
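The new dispatch is strict: anything other than ```:inline``` or ```:sidekiq``` raises. A tiny sketch of the failure mode (the ```:threads``` value below is deliberately bogus):

```ruby
backend = :threads # not a supported processing backend

case backend
when :inline then :call_inline
when :sidekiq then :call_async
else raise Karafka::Errors::InvalidProcessingBackend, backend
end
#=> raises Karafka::Errors::InvalidProcessingBackend (threads)
```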
data/lib/karafka/cli.rb CHANGED
@@ -7,7 +7,7 @@ module Karafka
   #
   # @note Whole Cli is built using Thor
   # @see https://github.com/erikhuda/thor
-  class Cli
+  class Cli < Thor
     package_name 'Karafka'

     class << self
data/lib/karafka/cli/console.rb CHANGED
@@ -2,7 +2,7 @@

 module Karafka
   # Karafka framework Cli
-  class Cli
+  class Cli < Thor
     # Console Karafka Cli action
     class Console < Base
       desc 'Start the Karafka console (short-cut alias: "c")'
@@ -22,7 +22,7 @@ module Karafka
       # Start the Karafka console
       def call
         cli.info
-        system self.class.command
+        exec self.class.command
       end
     end
   end
data/lib/karafka/cli/flow.rb CHANGED
@@ -2,7 +2,7 @@

 module Karafka
   # Karafka framework Cli
-  class Cli
+  class Cli < Thor
     # Description of topics flow (incoming/outgoing)
     class Flow < Base
       desc 'Print application data flow (incoming => outgoing)'
data/lib/karafka/cli/info.rb CHANGED
@@ -2,7 +2,7 @@

 module Karafka
   # Karafka framework Cli
-  class Cli
+  class Cli < Thor
     # Info Karafka Cli action
     class Info < Base
       desc 'Print configuration details and other options of your application'
@@ -14,7 +14,7 @@ module Karafka
         info = [
           "Karafka framework version: #{Karafka::VERSION}",
           "Application client id: #{config.client_id}",
-          "Inline processing: #{config.inline_processing}",
+          "Processing backend: #{config.processing_backend}",
           "Batch consuming: #{config.batch_consuming}",
           "Batch processing: #{config.batch_processing}",
           "Number of threads: #{config.concurrency}",
data/lib/karafka/cli/install.rb CHANGED
@@ -2,7 +2,7 @@

 module Karafka
   # Karafka framework Cli
-  class Cli
+  class Cli < Thor
     # Install Karafka Cli action
     class Install < Base
       desc 'Install all required things for Karafka application in current directory'
@@ -20,7 +20,7 @@ module Karafka

       # Where should we map proper files from templates
       INSTALL_FILES_MAP = {
-        'app.rb.example' => Karafka.boot_file.basename,
+        'karafka.rb.example' => Karafka.boot_file.basename,
         'sidekiq.yml.example' => 'config/sidekiq.yml.example',
         'application_worker.rb.example' => 'app/workers/application_worker.rb',
         'application_controller.rb.example' => 'app/controllers/application_controller.rb',
data/lib/karafka/cli/server.rb CHANGED
@@ -2,27 +2,30 @@

 module Karafka
   # Karafka framework Cli
-  class Cli
+  class Cli < Thor
     # Server Karafka Cli action
     class Server < Base
       desc 'Start the Karafka server (short-cut alias: "s")'
      option aliases: 's'
       option :daemon, default: false, type: :boolean, aliases: :d
       option :pid, default: 'tmp/pids/karafka', type: :string, aliases: :p
+      option :consumer_groups, type: :array, default: nil, aliases: :g

       # Start the Karafka server
       def call
+        validate!
+
         puts 'Starting Karafka server'
         cli.info

         if cli.options[:daemon]
+          FileUtils.mkdir_p File.dirname(cli.options[:pid])
           # For some reason Celluloid spins threads that break forking
           # Threads are not shut down immediately, so daemonization would stall until
           # those threads are killed by the Celluloid manager (via timeout)
           # There's nothing initialized here yet, so instead we shut down Celluloid
           # and run it again when we need it (after fork)
           Celluloid.shutdown
-          validate!
           daemonize
           Celluloid.boot
         end
@@ -30,17 +33,22 @@ module Karafka
         # Remove pidfile on shutdown
         ObjectSpace.define_finalizer(String.new, proc { send(:clean) })

+        # We assign active topics on a server level, as only the server is expected to listen on
+        # part of the topics
+        Karafka::Server.consumer_groups = cli.options[:consumer_groups]
+
         # After we fork, we can boot Celluloid again
         Karafka::Server.run
       end

       private

-      # Prepare (if not exists) directory for a pidfile and check if there is no running karafka
-      # instance already (and raise error if so)
+      # Checks the server cli configuration
+      # options validations in terms of app setup (topics, pid existence, etc)
       def validate!
-        FileUtils.mkdir_p File.dirname(cli.options[:pid])
-        raise "#{cli.options[:pid]} already exists" if File.exist?(cli.options[:pid])
+        result = Schemas::ServerCliOptions.call(cli.options)
+        return if result.success?
+        raise Errors::InvalidConfiguration, result.errors
       end

       # Detaches current process into background and writes its pidfile
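With the new ```--consumer_groups``` flag, a single application can be split across processes, each consuming only part of the routing, e.g. ```bundle exec karafka server --consumer_groups group_name1 group_name2``` (the group names here are placeholders). When the flag is omitted, ```Karafka::Server.consumer_groups``` defaults to all defined groups (see the server.rb change below), so existing deployments keep their behaviour.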
data/lib/karafka/cli/worker.rb CHANGED
@@ -2,7 +2,7 @@

 module Karafka
   # Karafka framework Cli
-  class Cli
+  class Cli < Thor
     # Worker Karafka Cli action
     class Worker < Base
       desc 'Start the Karafka Sidekiq worker (short-cut alias: "w")'
data/lib/karafka/connection/config_adapter.rb CHANGED
@@ -41,13 +41,7 @@ module Karafka
       # @return [Hash] hash with all the settings required by Kafka#consumer method
       def consumer(consumer_group)
         settings = { group_id: consumer_group.id }
-
-        kafka_configs.each do |setting_name, setting_value|
-          next unless AttributesMap.config_adapter[:consumer].include?(setting_name)
-          next if settings.keys.include?(setting_name)
-          settings[setting_name] = setting_value
-        end
-
+        settings = fetch_for(:consumer, settings)
         sanitize(settings)
       end

@@ -57,15 +51,7 @@ module Karafka
       # @return [Hash] hash with all the settings required by
       #   Kafka::Consumer#consume_each_message and Kafka::Consumer#consume_each_batch method
       def consuming(_consumer_group)
-        settings = {}
-
-        kafka_configs.each do |setting_name, setting_value|
-          next unless AttributesMap.config_adapter[:consuming].include?(setting_name)
-          next if settings.keys.include?(setting_name)
-          settings[setting_name] = setting_value
-        end
-
-        sanitize(settings)
+        sanitize(fetch_for(:consuming))
       end

       # Builds all the configuration settings for kafka consumer#subscribe method
@@ -73,18 +59,33 @@ module Karafka
       # @return [Hash] hash with all the settings required by kafka consumer#subscribe method
       def subscription(topic)
         settings = { start_from_beginning: topic.start_from_beginning }
+        settings = fetch_for(:subscription, settings)
+        [Karafka::App.config.topic_mapper.outgoing(topic.name), sanitize(settings)]
+      end
+
+      # Builds all the configuration settings required by kafka consumer#pause method
+      # @param consumer_group [Karafka::Routing::ConsumerGroup] consumer group details
+      # @return [Hash] hash with all the settings required to pause kafka consumer
+      def pausing(consumer_group)
+        { timeout: consumer_group.pause_timeout }
+      end

+      private
+
+      # Fetches proper settings for a given map namespace
+      # @param namespace_key [Symbol] namespace from attributes map config adapter hash
+      # @param preexisting_settings [Hash] hash with some preexisting settings that might have
+      #   been loaded in a different way
+      def fetch_for(namespace_key, preexisting_settings = {})
         kafka_configs.each do |setting_name, setting_value|
-          next unless AttributesMap.config_adapter[:subscription].include?(setting_name)
-          next if settings.keys.include?(setting_name)
-          settings[setting_name] = setting_value
+          next unless AttributesMap.config_adapter[namespace_key].include?(setting_name)
+          next if preexisting_settings.keys.include?(setting_name)
+          preexisting_settings[setting_name] = setting_value
         end

-        [topic.name, sanitize(settings)]
+        preexisting_settings
       end

-      private
-
       # Removes nil containing keys from the final settings so it can use Kafka's driver
       # defaults for those
       # @param settings [Hash] settings that may contain nil values
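For reference, the new ```pausing``` adapter output is trivial - with the default ```pause_timeout``` of 10 (see the setup/config.rb change below) it yields:

```ruby
Karafka::Connection::ConfigAdapter.pausing(consumer_group)
#=> { timeout: 10 }
```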
data/lib/karafka/connection/messages_consumer.rb CHANGED
@@ -20,9 +20,19 @@ module Karafka
     def fetch_loop
       send(
         consumer_group.batch_consuming ? :consume_each_batch : :consume_each_message
-      ) do |messages|
-        yield(messages)
-      end
+      ) { |messages| yield(messages) }
+    rescue Kafka::ProcessingError => e
+      # If there was an error during processing, we have to log it, pause the current partition
+      # and process other things
+      Karafka.monitor.notice_error(self.class, e.cause)
+      pause(e.topic, e.partition)
+      retry
+      # This is on purpose - see the notes for this method
+      # rubocop:disable RescueException
+    rescue Exception => e
+      # rubocop:enable RescueException
+      Karafka.monitor.notice_error(self.class, e)
+      retry
     end

     # Gracefully stops topic consumption
@@ -35,6 +45,16 @@ module Karafka

     attr_reader :consumer_group

+    # Pauses processing of a given topic partition
+    # @param topic [String] topic that we want to pause
+    # @param partition [Integer] number of the partition that we want to pause
+    def pause(topic, partition)
+      settings = ConfigAdapter.pausing(consumer_group)
+      return false unless settings[:timeout].positive?
+      kafka_consumer.pause(topic, partition, settings)
+      true
+    end
+
     # Consumes messages from Kafka in batches
     # @yieldparam [Array<Kafka::FetchedMessage>] kafka fetched messages
     def consume_each_batch
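Note that ```#pause``` returns early (and the consumer simply retries) when the timeout is not positive, so setting ```pause_timeout``` to 0 effectively disables partition pausing while keeping the retry-on-error loop intact.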
data/lib/karafka/connection/messages_processor.rb CHANGED
@@ -24,48 +24,29 @@ module Karafka
         controller = Karafka::Routing::Router.build("#{group_id}_#{mapped_topic}")
         handler = controller.topic.batch_processing ? :process_batch : :process_each

-        send(handler, controller, mapped_topic, kafka_messages)
-      # This is on purpose - see the notes for this method
-      # rubocop:disable RescueException
-      rescue Exception => e
-        # rubocop:enable RescueException
-        Karafka.monitor.notice_error(self, e)
+        send(handler, controller, kafka_messages)
       end

       private

       # Processes whole batch in one request (all at once)
       # @param controller [Karafka::BaseController] base controller descendant
-      # @param mapped_topic [String] mapped topic name
       # @param kafka_messages [Array<Kafka::FetchedMessage>] raw messages from kafka
-      def process_batch(controller, mapped_topic, kafka_messages)
-        messages_batch = kafka_messages.map do |kafka_message|
-          # Since we support topic mapping (for Kafka providers that require namespaces)
-          # we have to overwrite topic with our mapped topic version
-          # @note For the default mapper, it will be the same as topic
-          # @note We have to use instance_variable_set, as the Kafka::FetchedMessage does not
-          #   provide attribute writers
-          kafka_message.instance_variable_set(:'@topic', mapped_topic)
-          kafka_message
-        end
-
-        controller.params_batch = messages_batch
-
-        Karafka.monitor.notice(self, messages_batch)
-
+      def process_batch(controller, kafka_messages)
+        controller.params_batch = kafka_messages
+        Karafka.monitor.notice(self, kafka_messages)
         controller.schedule
       end

       # Processes messages one by one (like with std http requests)
       # @param controller [Karafka::BaseController] base controller descendant
-      # @param mapped_topic [String] mapped topic name
       # @param kafka_messages [Array<Kafka::FetchedMessage>] raw messages from kafka
-      def process_each(controller, mapped_topic, kafka_messages)
+      def process_each(controller, kafka_messages)
         kafka_messages.each do |kafka_message|
           # @note This is a simple trick - we just process one after another, but in order
           #   not to handle everywhere both cases (single vs batch), we just "fake" batching with
           #   a single message for each
-          process_batch(controller, mapped_topic, [kafka_message])
+          process_batch(controller, [kafka_message])
         end
       end
     end
data/lib/karafka/errors.rb CHANGED
@@ -41,5 +41,9 @@ module Karafka
     # Raised when processing messages in batches but still want to use #params instead of
     # #params_batch
     ParamsMethodUnavailable = Class.new(BaseError)
+
+    # Raised when for some reason we try to use an invalid processing backend and
+    # we bypass validations
+    InvalidProcessingBackend = Class.new(BaseError)
   end
 end
data/lib/karafka/fetcher.rb CHANGED
@@ -26,7 +26,7 @@ module Karafka

     # @return [Array<Karafka::Connection::Listener>] listeners that will consume messages
     def listeners
-      @listeners ||= App.consumer_groups.map do |consumer_group|
+      @listeners ||= App.consumer_groups.active.map do |consumer_group|
         Karafka::Connection::Listener.new(consumer_group)
       end
     end
data/lib/karafka/loader.rb CHANGED
@@ -2,78 +2,28 @@

 module Karafka
   # Loader for requiring all the files in a proper order
-  # Some files need to be required before others, so it will
-  # try to figure that out. It will load 'base*' files first and then
-  # any others.
-  class Loader
+  module Loader
     # Order in which we want to load app files
     DIRS = %w[
       config/initializers
       lib
-      app/helpers
-      app/inputs
-      app/decorators
-      app/models/concerns
-      app/models
-      app/responders
-      app/services
-      app/presenters
-      app/workers
-      app/controllers
-      app/aspects
       app
     ].freeze

     # Will load files in a proper order (based on DIRS)
     # @param [String] root path from which we want to start
-    def load(root)
+    def self.load(root)
       DIRS.each do |dir|
         path = File.join(root, dir)
+        next unless File.exist?(path)
         load!(path)
       end
     end

     # Requires all the ruby files from one path in a proper order
     # @param path [String] path (dir) from which we want to load ruby files in a proper order
-    # @note First we load all the base files that might be used in inheritance
-    def load!(path)
-      base_load!(path)
-      files_load!(path)
-    end
-
-    # Requires all the ruby files from one relative path inside application directory
-    # @param relative_path [String] relative path (dir) to a file inside application directory
-    #   from which we want to load ruby files in a proper order
-    def relative_load!(relative_path)
-      path = File.join(::Karafka.root, relative_path)
-      load!(path)
-    end
-
-    private
-
-    # Loads all the base files
-    # @param path [String] path (dir) from which we want to load ruby base files in a proper order
-    def base_load!(path)
-      bases = File.join(path, '**/base*.rb')
-      Dir[bases].sort(&method(:base_sorter)).each(&method(:require))
-    end
-
-    # Loads all other files (not base)
-    # @param path [String] path (dir) from which we want to load ruby files in a proper order
-    # @note Technically it will load the base files again but they are already loaded so nothing
-    #   will happen
-    def files_load!(path)
-      files = File.join(path, '**/*.rb')
-      Dir[files].sort.each(&method(:require))
-    end
-
-    # @return [Integer] order for sorting
-    # @note We need to sort all base files based on their position in a file tree
-    #   so all the files that are "higher" should be loaded first
-    # @param str1 [String] first string for comparison
-    # @param str2 [String] second string for comparison
-    def base_sorter(str1, str2)
-      str1.count('/') <=> str2.count('/')
+    def self.load!(path)
+      require_all(path)
     end
   end
 end
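The slimmed-down loader delegates the ordering problem to the require_all gem, which retries files whose constants are not defined yet instead of relying on the old 'base*'-first heuristic. A minimal sketch of what ```Loader.load!``` now boils down to:

```ruby
require 'require_all'

# Requires every *.rb file under the given directory; when a file references
# a constant that is not loaded yet, require_all defers it and retries after
# the remaining files have been required
require_all 'app/controllers'
```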
data/lib/karafka/monitor.rb CHANGED
@@ -55,7 +55,7 @@ module Karafka
     def caller_exceptions_map
       @caller_exceptions_map ||= {
         error: [
-          Karafka::Connection::MessagesProcessor,
+          Karafka::Connection::MessagesConsumer,
           Karafka::Connection::Listener,
           Karafka::Params::Params
         ],
data/lib/karafka/params/params.rb CHANGED
@@ -10,7 +10,6 @@ module Karafka
     class Params < HashWithIndifferentAccess
       # Kafka::FetchedMessage attributes that we want to use inside of params
       KAFKA_MESSAGE_ATTRIBUTES = %i[
-        topic
         value
         partition
         offset
@@ -44,6 +43,11 @@ module Karafka
           KAFKA_MESSAGE_ATTRIBUTES.each do |attribute|
            instance[attribute] = message.send(attribute)
           end
+
+          # When we get raw messages, they might have a topic that was modified by a
+          # topic mapper. We need to "reverse" this change and map back to the non-modified
+          # format, so our internal flow is not corrupted with the mapping
+          instance[:topic] = Karafka::App.config.topic_mapper.incoming(message.topic)
         end
       end
     end
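Together with the ```outgoing``` call added to the config adapter above, this makes topic mappers fully symmetrical. A hypothetical mapper sketch (class name and prefix are illustrative) of the kind a hosted-Kafka user might plug in via ```config.topic_mapper```:

```ruby
# Hypothetical mapper for providers that enforce a "namespace." topic prefix
class PrefixedTopicMapper
  def initialize(prefix)
    @prefix = prefix
  end

  # Invoked for fetched messages - strips the prefix before routing/params
  def incoming(topic)
    topic.to_s.sub("#{@prefix}.", '')
  end

  # Invoked when subscribing/responding - adds the prefix back
  def outgoing(topic)
    "#{@prefix}.#{topic}"
  end
end

# Karafka::App.config.topic_mapper = PrefixedTopicMapper.new('myapp')
```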
data/lib/karafka/parsers/json.rb CHANGED
@@ -10,8 +10,8 @@ module Karafka
       # @example
       #   Json.parse("{\"a\":1}") #=> { 'a' => 1 }
       def self.parse(content)
-        ::Yajl::Parser.parse(content)
-      rescue ::Yajl::ParseError => e
+        ::MultiJson.load(content)
+      rescue ::MultiJson::ParseError => e
         raise ::Karafka::Errors::ParserError, e
       end
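multi_json picks the fastest JSON engine available at runtime (oj, yajl-ruby, the bundled json, etc.), which is what lets the hard yajl-ruby dependency go away. A quick illustration:

```ruby
require 'multi_json'

# MultiJson.use :yajl # optional - force a specific engine if its gem is installed
MultiJson.load('{"a":1}')  #=> { "a" => 1 }
MultiJson.dump('a' => 1)   #=> "{\"a\":1}"
```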
data/lib/karafka/routing/builder.rb CHANGED
@@ -32,6 +32,13 @@ module Karafka
       end
     end

+    # @return [Array<Karafka::Routing::ConsumerGroup>] only the active consumer groups that
+    #   we want to use. Since Karafka supports a multi-process setup, we need to be able
+    #   to pick only those consumer groups that should be active in our given process context
+    def active
+      select(&:active?)
+    end
+
     private

     # Builds and saves given consumer group
data/lib/karafka/routing/consumer_group.rb CHANGED
@@ -10,16 +10,23 @@ module Karafka

     attr_reader :topics
     attr_reader :id
+    attr_reader :name

-    # @param id [String, Symbol] raw id of this consumer group. Raw means that it does not
+    # @param name [String, Symbol] raw name of this consumer group. Raw means that it does not
     #   yet have an application client_id namespace, this will be added here by default.
     #   We add it to make multi-system development easier for people that don't use
     #   kafka and don't understand the concept of consumer groups.
-    def initialize(id)
-      @id = "#{Karafka::App.config.client_id.to_s.underscore}_#{id}"
+    def initialize(name)
+      @name = name
+      @id = "#{Karafka::App.config.client_id.to_s.underscore}_#{@name}"
       @topics = []
     end

+    # @return [Boolean] true if this consumer group should be active in our current process
+    def active?
+      Karafka::Server.consumer_groups.include?(name)
+    end
+
     # Builds a topic representation inside of a current consumer group route
     # @param name [String, Symbol] name of topic to which we want to subscribe
     # @yield Evaluates a given block in a topic context
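A hedged example of how this plays out (the group names are hypothetical): starting the server with ```--consumer_groups videos``` makes only that group ```active?```, while the rest of the routing stays idle in this process:

```ruby
# karafka server --consumer_groups videos
Karafka::Server.consumer_groups                 #=> ['videos']
Karafka::App.consumer_groups.active.map(&:name) #=> ['videos'] - other groups are skipped
```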
data/lib/karafka/routing/topic.rb CHANGED
@@ -36,7 +36,7 @@ module Karafka
     #   background job
     # @note If not provided - will be built based on the provided controller
     def worker
-      @worker ||= inline_processing ? nil : Karafka::Workers::Builder.new(controller).build
+      @worker ||= processing_backend == :sidekiq ? Workers::Builder.new(controller).build : nil
     end

     # @return [Class, nil] Class (not an instance) of a responder that should respond from
data/lib/karafka/schemas/config.rb CHANGED
@@ -20,11 +20,11 @@ module Karafka
       end
     end

-    optional(:inline_processing).filled(:bool?)
+    optional(:processing_backend).filled(included_in?: %i[inline sidekiq])

-    # If inline_processing is true, redis should be filled
-    rule(redis_presence: %i[redis inline_processing]) do |redis, inline_processing|
-      inline_processing.false?.then(redis.filled?)
+    # If we want to use sidekiq, then redis needs to be configured
+    rule(redis_presence: %i[redis processing_backend]) do |redis, processing_backend|
+      processing_backend.eql?(:sidekiq).then(redis.filled?)
     end

     optional(:connection_pool).schema do
data/lib/karafka/schemas/consumer_group.rb CHANGED
@@ -6,7 +6,7 @@ module Karafka
   ConsumerGroupTopic = Dry::Validation.Schema do
     required(:id).filled(:str?, format?: Karafka::Schemas::TOPIC_REGEXP)
     required(:name).filled(:str?, format?: Karafka::Schemas::TOPIC_REGEXP)
-    required(:inline_processing).filled(:bool?)
+    required(:processing_backend).filled(included_in?: %i[inline sidekiq])
     required(:controller).filled
     required(:parser).filled
     required(:interchanger).filled
@@ -20,11 +20,11 @@ module Karafka
     required(:id).filled(:str?, format?: Karafka::Schemas::TOPIC_REGEXP)
     required(:seed_brokers).filled(:array?)
     required(:session_timeout).filled(:int?)
+    required(:pause_timeout).filled(:int?, gteq?: 0)
     required(:offset_commit_interval).filled(:int?)
     required(:offset_commit_threshold).filled(:int?)
     required(:offset_retention_time) { none?.not > int? }
     required(:heartbeat_interval).filled(:int?, gteq?: 0)
-    required(:topic_mapper).filled
     required(:connect_timeout).filled(:int?, gt?: 0)
     required(:socket_timeout).filled(:int?, gt?: 0)
     required(:max_wait_time).filled(:int?, gteq?: 0)
data/lib/karafka/schemas/server_cli_options.rb ADDED
@@ -0,0 +1,43 @@
+# frozen_string_literal: true
+
+module Karafka
+  module Schemas
+    # Schema for validating correctness of the server cli command options
+    # We validate some basics plus the list of consumer_groups that we want to use, to make
+    # sure that all of them are defined, plus that a pidfile does not exist
+    ServerCliOptions = Dry::Validation.Schema do
+      configure do
+        option :consumer_groups
+
+        def self.messages
+          super.merge(
+            en: {
+              errors: {
+                consumer_groups_inclusion: 'Unknown consumer group.',
+                pid_existence: 'Pidfile already exists.'
+              }
+            }
+          )
+        end
+      end
+
+      optional(:pid).filled(:str?)
+      optional(:daemon).filled(:bool?)
+      optional(:consumer_groups).filled(:array?)
+
+      validate(consumer_groups_inclusion: :consumer_groups) do |consumer_groups|
+        # If there were no consumer_groups declared in the server cli, it means that we will
+        # run all of them and there is no need to validate them here at all
+        if consumer_groups.nil?
+          true
+        else
+          (consumer_groups - Karafka::Routing::Builder.instance.map(&:name)).empty?
+        end
+      end
+
+      validate(pid_existence: :pid) do |pid|
+        pid ? !File.exist?(pid) : true
+      end
+    end
+  end
+end
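This is the schema that the reworked ```Cli::Server#validate!``` shown earlier feeds with ```cli.options```; a hedged usage sketch (the option values are illustrative):

```ruby
result = Karafka::Schemas::ServerCliOptions.call(
  pid: 'tmp/pids/karafka',
  daemon: true,
  consumer_groups: %w[events] # must match names defined in the routing
)
raise Karafka::Errors::InvalidConfiguration, result.errors unless result.success?
```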
data/lib/karafka/server.rb CHANGED
@@ -8,6 +8,9 @@ module Karafka
     # So we can have access to them later on and be able to stop them on exit
     attr_reader :consumers

+    # Writer for the list of consumer groups that we want to consume in our current process context
+    attr_writer :consumer_groups
+
     # Method which runs app
     def run
       @consumers = Concurrent::Array.new
@@ -17,6 +20,13 @@ module Karafka
       start_supervised
     end

+    # @return [Array<String>] array with names of consumer groups that should be consumed in a
+    #   current server context
+    def consumer_groups
+      # If not specified, a server will listen on all the topics
+      @consumer_groups ||= Karafka::App.consumer_groups.map(&:name).freeze
+    end
+
     private

     # @return [Karafka::Process] process wrapper instance used to catch system signal calls
data/lib/karafka/setup/config.rb CHANGED
@@ -18,9 +18,9 @@ module Karafka
     # option client_id [String] kafka client_id - used to provide
     #   default Kafka groups namespaces and identify that app in kafka
     setting :client_id
-    # If inline_processing is set to true, we won't enqueue jobs, instead we will run them
-    # immediately
-    setting :inline_processing, false
+    # How we should process messages. For now we support inline mode (asap in the process) or
+    # sidekiq mode (schedule to sidekiq)
+    setting :processing_backend, :inline
     # option logger [Instance] logger that we want to use
     setting :logger, -> { ::Karafka::Logger.instance }
     # option monitor [Instance] monitor that we want to use (defaults to Karafka::Monitor)
@@ -51,7 +51,7 @@ module Karafka
     #   or the opposite for bigger systems
     setting :size, lambda {
       [
-        ::Karafka::App.consumer_groups.count,
+        ::Karafka::App.consumer_groups.active.count,
         Sidekiq.options[:concurrency]
       ].max
     }
@@ -67,6 +67,11 @@ module Karafka
     # option session_timeout [Integer] the number of seconds after which, if a client
     #   hasn't contacted the Kafka cluster, it will be kicked out of the group.
     setting :session_timeout, 30
+    # Time for which a given partition will be paused from processing messages when message
+    # processing fails. It allows us to process other partitions while the error is being
+    # resolved, and also "slows" things down, preventing us from "eating" up all messages and
+    # processing them with failing code
+    setting :pause_timeout, 10
     # option offset_commit_interval [Integer] the interval between offset commits,
     #   in seconds.
     setting :offset_commit_interval, 10
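Pausing is therefore opt-out rather than opt-in: it defaults to 10 seconds. A sketch of tuning it (assuming, per the attributes map above, that the setting lives under the ```kafka``` scope):

```ruby
class KarafkaApp < Karafka::App
  setup do |config|
    # Pause a failing partition for 30 seconds instead of the default 10;
    # setting this to 0 disables pausing entirely
    config.kafka.pause_timeout = 30
  end
end
```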
data/lib/karafka/templates/application_controller.rb.example CHANGED
@@ -1,3 +1,5 @@
+# frozen_string_literal: true
+
 # Application controller from which all Karafka controllers should inherit
 # You can rename it if it would conflict with your current code base (in case you're integrating
 # Karafka with other frameworks)
data/lib/karafka/templates/application_responder.rb.example CHANGED
@@ -1,3 +1,5 @@
+# frozen_string_literal: true
+
 # Application responder from which all Karafka responders should inherit
 # You can rename it if it would conflict with your current code base (in case you're integrating
 # Karafka with other frameworks)
data/lib/karafka/templates/application_worker.rb.example CHANGED
@@ -1,3 +1,5 @@
+# frozen_string_literal: true
+
 # Application worker from which all workers should inherit
 # You can rename it if it would conflict with your current code base (in case you're integrating
 # Karafka with other frameworks). Karafka will use first direct descendant of Karafka::BaseWorker
data/lib/karafka/templates/{app.rb.example → karafka.rb.example} RENAMED
@@ -1,19 +1,26 @@
+# frozen_string_literal: true
+
+# Non Ruby on Rails setup
 ENV['RACK_ENV'] ||= 'development'
 ENV['KARAFKA_ENV'] ||= ENV['RACK_ENV']
-
 Bundler.require(:default, ENV['KARAFKA_ENV'])
+Karafka::Loader.load(Karafka::App.root)

-Karafka::Loader.new.load(Karafka::App.root)
+# Ruby on Rails setup
+# Remove the whole non-Rails setup that is above and uncomment the 4 lines below
+# ENV['RAILS_ENV'] ||= 'development'
+# ENV['KARAFKA_ENV'] = ENV['RAILS_ENV']
+# require ::File.expand_path('../config/environment', __FILE__)
+# Rails.application.eager_load!

-# App class
-class App < Karafka::App
+class KarafkaApp < Karafka::App
   setup do |config|
     config.kafka.seed_brokers = %w( 127.0.0.1:9092 )
     config.client_id = 'example_app'
     config.redis = {
       url: 'redis://localhost:6379'
     }
-    config.inline_processing = Karafka.env.development?
+    config.processing_backend = Karafka.env.development? ? :inline : :sidekiq
     config.batch_consuming = true
   end

@@ -35,4 +42,4 @@ class App < Karafka::App
   end
 end

-App.boot!
+KarafkaApp.boot!
data/lib/karafka/version.rb CHANGED
@@ -3,5 +3,5 @@
 # Main module namespace
 module Karafka
   # Current Karafka version
-  VERSION = '0.6.0.rc1'
+  VERSION = '0.6.0.rc2'
 end
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: karafka
 version: !ruby/object:Gem::Version
-  version: 0.6.0.rc1
+  version: 0.6.0.rc2
 platform: ruby
 authors:
 - Maciej Mensfeld
@@ -10,7 +10,7 @@ authors:
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2017-08-15 00:00:00.000000000 Z
+date: 2017-08-23 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: ruby-kafka
@@ -153,75 +153,33 @@ dependencies:
     - !ruby/object:Gem::Version
       version: '0.7'
 - !ruby/object:Gem::Dependency
-  name: yajl-ruby
+  name: multi_json
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
-        version: 1.3.0
+        version: '1.12'
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
-        version: 1.3.0
+        version: '1.12'
 - !ruby/object:Gem::Dependency
-  name: bundler
+  name: require_all
   requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - "~>"
-      - !ruby/object:Gem::Version
-        version: '1.2'
-  type: :development
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - "~>"
-      - !ruby/object:Gem::Version
-        version: '1.2'
-- !ruby/object:Gem::Dependency
-  name: rspec
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '3.6'
-  type: :development
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
-        version: '3.6'
-- !ruby/object:Gem::Dependency
-  name: simplecov
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0.14'
-  type: :development
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0.14'
-- !ruby/object:Gem::Dependency
-  name: byebug
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-  type: :development
+        version: '1.4'
+  type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
-        version: '0'
+        version: '1.4'
 description: Framework used to simplify Apache Kafka based Ruby applications development
 email:
 - maciej@coditsu.io
@@ -292,6 +250,7 @@ files:
 - lib/karafka/schemas/config.rb
 - lib/karafka/schemas/consumer_group.rb
 - lib/karafka/schemas/responder_usage.rb
+- lib/karafka/schemas/server_cli_options.rb
 - lib/karafka/server.rb
 - lib/karafka/setup/config.rb
 - lib/karafka/setup/configurators/base.rb
@@ -299,10 +258,10 @@ files:
 - lib/karafka/setup/configurators/sidekiq.rb
 - lib/karafka/setup/configurators/water_drop.rb
 - lib/karafka/status.rb
-- lib/karafka/templates/app.rb.example
 - lib/karafka/templates/application_controller.rb.example
 - lib/karafka/templates/application_responder.rb.example
 - lib/karafka/templates/application_worker.rb.example
+- lib/karafka/templates/karafka.rb.example
 - lib/karafka/templates/sidekiq.yml.example
 - lib/karafka/version.rb
 - lib/karafka/workers/builder.rb