karafka 2.0.20 → 2.0.22

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c3b857c4c930396d4cac4682e4350c4c5fdc2888128dfade3f06e21a02e90c29
- data.tar.gz: 4f8fdb66df24164ec65886d5b4c4ba766dda9c7d4d23a7a301920a8adb21e16e
+ metadata.gz: 6a096a44a1a2988ff394215b8d63f6f2e33b2a3e5100f68f75fcf33eeda4b490
+ data.tar.gz: 2af0875b550f37ef9ea47b2a3011a3ed3e535adc3f6632ab10510d91bc69590c
  SHA512:
- metadata.gz: d4ed9d036f7dae85b3ce2f9c45de64eed8a8da110f0a826baf100ec9c90dc7e7dbaf5c74f89b472eb89ee7d3bdc9e265fee259621d93e21c972776fa3f682cc0
- data.tar.gz: aed5fcd7447be757cd2ca641d1d4f6029db11296ac6b4012b4a02047126f27be4aef8d42da700f736c4f70eba2b0aff252a4ccf332fb6c8b88e6a911e3432171
+ metadata.gz: a81cf3305482e0e925f3a5aa48d978585766826e0990081d74f5f56bc535ae0a8a6be3a4ec986e6a7702d70d0d207ca9acc71b7aeb3909557b44518413ff9bc6
+ data.tar.gz: 6c66140b258135367a120a5dac4eb8a08f4dc94e24fb3a714b32ec6ad909bc9f85bf5e160aefe5f5d307c9d554c6ecab61cbdebf315ce6953476303e9783122c
checksums.yaml.gz.sig CHANGED
Binary file
@@ -101,6 +101,9 @@ jobs:
  - name: Install package dependencies
  run: "[ -e $APT_DEPS ] || sudo apt-get install -y --no-install-recommends $APT_DEPS"

+ - name: Remove libzstd-dev to check no supported compressions
+ run: sudo apt-get -y remove libzstd-dev
+
  - name: Start Kafka with docker-compose
  run: |
  docker-compose up -d
@@ -124,5 +127,8 @@ jobs:
  - name: Run integration tests
  env:
  KARAFKA_PRO_LICENSE_TOKEN: ${{ secrets.KARAFKA_PRO_LICENSE_TOKEN }}
+ KARAFKA_PRO_USERNAME: ${{ secrets.KARAFKA_PRO_USERNAME }}
+ KARAFKA_PRO_PASSWORD: ${{ secrets.KARAFKA_PRO_PASSWORD }}
+ KARAFKA_PRO_VERSION: ${{ secrets.KARAFKA_PRO_VERSION }}
  GITHUB_COVERAGE: ${{matrix.coverage}}
  run: bin/integrations
data/CHANGELOG.md CHANGED
@@ -1,5 +1,22 @@
  # Karafka framework changelog

+ ## 2.0.22 (2022-12-02)
+ - [Improvement] Load Pro components upon Karafka require so they can be altered prior to setup.
+ - [Improvement] Do not run LRJ jobs that were added to the jobs queue but were revoked meanwhile.
+ - [Improvement] Allow running particular named subscription groups similar to consumer groups.
+ - [Improvement] Allow running particular topics similar to consumer groups.
+ - [Improvement] Raise configuration error when trying to run Karafka with options leading to no subscriptions.
+ - [Fix] Fix `karafka info` subscription groups count reporting as it was misleading.
+ - [Fix] Allow for defining subscription groups with symbols similar to consumer groups and topics to align the API.
+ - [Fix] Do not allow for an explicit `nil` as a `subscription_group` block argument.
+ - [Fix] Fix instability in subscription groups static members ids when using `--consumer_groups` CLI flag.
+ - [Fix] Fix a case in routing, where anonymous subscription group could not be used inside of a consumer group.
+ - [Fix] Fix a case where shutdown prior to listeners build would crash the server initialization.
+ - [Fix] Duplicated logs in development environment for Rails when logger set to `$stdout`.
+
+ ## 2.0.21 (2022-11-25)
+ - [Improvement] Make revocation jobs for LRJ topics non-blocking to prevent blocking polling when someone uses non-revocation aware LRJ jobs and revocation happens.
+
  ## 2.0.20 (2022-11-24)
  - [Improvement] Support `group.instance.id` assignment (static group membership) for a case where a single consumer group has multiple subscription groups (#1173).

data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- karafka (2.0.20)
+ karafka (2.0.22)
  karafka-core (>= 2.0.4, < 3.0.0)
  rdkafka (>= 0.12)
  thor (>= 0.20)
@@ -79,4 +79,4 @@ DEPENDENCIES
  simplecov

  BUNDLED WITH
- 2.3.24
+ 2.3.26
data/README.md CHANGED
@@ -13,7 +13,7 @@ Karafka is a Ruby and Rails multi-threaded efficient Kafka processing framework
  - Supports parallel processing in [multiple threads](https://karafka.io/docs/Concurrency-and-multithreading) (also for a [single topic partition](https://karafka.io/docs/Pro-Virtual-Partitions) work)
  - [Automatically integrates](https://karafka.io/docs/Integrating-with-Ruby-on-Rails-and-other-frameworks#integrating-with-ruby-on-rails) with Ruby on Rails
  - Has [ActiveJob backend](https://karafka.io/docs/Active-Job) support (including [ordered jobs](https://karafka.io/docs/Pro-Enhanced-Active-Job#ordered-jobs))
- - Has a seamless [Dead Letter Queue](karafka.io/docs/Dead-Letter-Queue/) functionality built-in
+ - Has a seamless [Dead Letter Queue](https://karafka.io/docs/Dead-Letter-Queue/) functionality built-in
  - Supports in-development [code reloading](https://karafka.io/docs/Auto-reload-of-code-changes-in-development)
  - Is powered by [librdkafka](https://github.com/edenhill/librdkafka) (the Apache Kafka C/C++ client library)
  - Has an out-of the box [StatsD/DataDog monitoring](https://karafka.io/docs/Monitoring-and-logging) with a dashboard template.
data/bin/integrations CHANGED
@@ -74,8 +74,8 @@ class Scenario
  def type
  scenario_dir = File.dirname(@path)

- return :poro if scenario_dir.end_with?('_poro')
- return :pristine if scenario_dir.end_with?('_pristine')
+ return :poro if scenario_dir.include?('_poro')
+ return :pristine if scenario_dir.include?('_pristine')

  :regular
  end
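The switch from `end_with?` to `include?` above matters once scenario files live in nested directories: `File.dirname` of a nested spec no longer ends with the suffix but still contains it. A minimal plain-Ruby sketch with hypothetical paths:

```ruby
# Hypothetical scenario paths; File.dirname of a nested spec file no longer
# ends with the suffix, but still contains it.
flat   = File.dirname('integrations/consumption_poro/spec.rb')
nested = File.dirname('integrations/consumption_poro/deep/spec.rb')

puts flat.end_with?('_poro')   # true
puts nested.end_with?('_poro') # false - nested dirs were misclassified before
puts nested.include?('_poro')  # true - include? matches both layouts
```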
data/bin/rspecs CHANGED
@@ -1,4 +1,6 @@
  #!/usr/bin/env bash

+ set -e
+
  SPECS_TYPE=regular bundle exec rspec --tag ~type:pro
  SPECS_TYPE=pro bundle exec rspec --tag type:pro
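The `set -e` added above makes the script abort on the first failing rspec run instead of continuing to the second and masking the failure. A quick demonstration of the behavior in an isolated subshell:

```shell
#!/usr/bin/env bash

# Without set -e a script keeps going after a failing command; with it,
# the first non-zero exit status aborts the run.
bash -c 'set -e; false; echo "never printed"'
echo "subshell exited with: $?"
```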
data/config/errors.yml CHANGED
@@ -25,7 +25,10 @@ en:

  server_cli_options:
  missing: needs to be present
- consumer_groups_inclusion: Unknown consumer group
+ consumer_groups_inclusion: Unknown consumer group name
+ subscription_groups_inclusion: Unknown subscription group name
+ topics_inclusion: Unknown topic name
+ topics_missing: No topics to subscribe to

  topic:
  missing: needs to be present
@@ -34,7 +37,7 @@ en:
  consumer_format: needs to be present
  id_format: 'needs to be a string with a Kafka accepted format'
  initial_offset_format: needs to be either earliest or latest
- subscription_group_format: must be nil or a non-empty string
+ subscription_group_format: must be a non-empty string
  manual_offset_management.active_format: needs to be either true or false
  consumer_active_job_missing: ActiveJob needs to be available
  manual_offset_management_must_be_enabled: cannot be disabled for ActiveJob topics
data/karafka.gemspec CHANGED
@@ -12,9 +12,14 @@ Gem::Specification.new do |spec|
  spec.authors = ['Maciej Mensfeld']
  spec.email = %w[contact@karafka.io]
  spec.homepage = 'https://karafka.io'
- spec.summary = 'Efficient Kafka processing framework for Ruby and Rails'
- spec.description = 'Framework used to simplify Apache Kafka based Ruby applications development'
  spec.licenses = ['LGPL-3.0', 'Commercial']
+ spec.summary = 'Karafka is Ruby and Rails efficient Kafka processing framework.'
+ spec.description = <<-DESC
+ Karafka is Ruby and Rails efficient Kafka processing framework.
+
+ Karafka allows you to capture everything that happens in your systems in large scale,
+ without having to focus on things that are not your business domain.
+ DESC

  spec.add_dependency 'karafka-core', '>= 2.0.4', '< 3.0.0'
  spec.add_dependency 'rdkafka', '>= 0.12'
data/lib/karafka/app.rb CHANGED
@@ -16,9 +16,19 @@ module Karafka

  # @return [Hash] active subscription groups grouped based on consumer group in a hash
  def subscription_groups
+ # We first build all the subscription groups, so they all get the same position, despite
+ # later narrowing that. It allows us to maintain same position number for static members
+ # even when we want to run subset of consumer groups or subscription groups
+ #
+ # We then narrow this to active consumer groups from which we select active subscription
+ # groups.
  consumer_groups
- .active
- .map { |consumer_group| [consumer_group, consumer_group.subscription_groups] }
+ .map { |cg| [cg, cg.subscription_groups] }
+ .select { |cg, _| cg.active? }
+ .select { |_, sgs| sgs.delete_if { |sg| !sg.active? } }
+ .delete_if { |_, sgs| sgs.empty? }
+ .each { |_, sgs| sgs.each { |sg| sg.topics.delete_if { |top| !top.active? } } }
+ .each { |_, sgs| sgs.delete_if { |sg| sg.topics.empty? } }
  .to_h
  end

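The narrowing chain in `subscription_groups` can be pictured on plain data: filter out inactive subscription groups, then their empty remainders, then consumer groups left with nothing. A simplified sketch with hypothetical group and topic names (plain hashes in place of the routing objects, `:active` flags in place of the CLI-driven `active?` checks):

```ruby
# Plain-data stand-in for the narrowing chain used in App.subscription_groups.
groups = {
  'cg_a' => [{ name: 'sg_1', active: true,  topics: %w[orders payments] },
             { name: 'sg_2', active: false, topics: %w[logs] }],
  'cg_b' => [{ name: 'sg_3', active: true,  topics: [] }]
}

narrowed = groups
           .each { |_, sgs| sgs.delete_if { |sg| !sg[:active] } }
           .each { |_, sgs| sgs.delete_if { |sg| sg[:topics].empty? } }
           .delete_if { |_, sgs| sgs.empty? }

puts narrowed.keys.inspect # ["cg_a"]
```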
@@ -62,14 +62,8 @@ module Karafka
  # that may not yet kick in when error occurs. That way we pause always on the last processed
  # message.
  def on_consume
- Karafka.monitor.instrument('consumer.consumed', caller: self) do
- consume
- end
-
- coordinator.consumption(self).success!
+ handle_consume
  rescue StandardError => e
- coordinator.consumption(self).failure!(e)
-
  Karafka.monitor.instrument(
  'error.occurred',
  error: e,
@@ -77,9 +71,6 @@ module Karafka
  seek_offset: coordinator.seek_offset,
  type: 'consumer.consume.error'
  )
- ensure
- # We need to decrease number of jobs that this coordinator coordinates as it has finished
- coordinator.decrement
  end

  # @private
@@ -37,7 +37,8 @@ module Karafka
  "Karafka version: #{Karafka::VERSION}#{postfix}",
  "Ruby version: #{RUBY_DESCRIPTION}",
  "Rdkafka version: #{::Rdkafka::VERSION}",
- "Subscription groups count: #{Karafka::App.subscription_groups.size}",
+ "Consumer groups count: #{Karafka::App.consumer_groups.size}",
+ "Subscription groups count: #{Karafka::App.subscription_groups.values.flatten.size}",
  "Workers count: #{Karafka::App.config.concurrency}",
  "Application client id: #{config.client_id}",
  "Boot file: #{Karafka.boot_file}",
@@ -9,18 +9,19 @@ module Karafka

  desc 'Start the Karafka server (short-cut alias: "s")'
  option aliases: 's'
- option :consumer_groups, type: :array, default: nil, aliases: :g
+ option :consumer_groups, type: :array, default: [], aliases: :g
+ option :subscription_groups, type: :array, default: []
+ option :topics, type: :array, default: []

  # Start the Karafka server
  def call
  # Print our banner and info in the dev mode
  print_marketing_info if Karafka::App.env.development?

- Contracts::ServerCliOptions.new.validate!(cli.options)
-
- # We assign active topics on a server level, as only server is expected to listen on
- # part of the topics
- Karafka::Server.consumer_groups = cli.options[:consumer_groups]
+ active_routing_config = Karafka::App.config.internal.routing.active
+ active_routing_config.consumer_groups = cli.options[:consumer_groups]
+ active_routing_config.subscription_groups = cli.options[:subscription_groups]
+ active_routing_config.topics = cli.options[:topics]

  Karafka::Server.run
  end
@@ -12,7 +12,9 @@ module Karafka
  ).fetch('en').fetch('validations').fetch('server_cli_options')
  end

- optional(:consumer_groups) { |cg| cg.is_a?(Array) && !cg.empty? }
+ optional(:consumer_groups) { |cg| cg.is_a?(Array) }
+ optional(:subscription_groups) { |sg| sg.is_a?(Array) }
+ optional(:topics) { |topics| topics.is_a?(Array) }

  virtual do |data, errors|
  next unless errors.empty?
@@ -22,11 +24,66 @@ module Karafka

  # If there were no consumer_groups declared in the server cli, it means that we will
  # run all of them and no need to validate them here at all
- next if value.nil?
- next if (value - Karafka::App.config.internal.routing.builder.map(&:name)).empty?
+ next if value.empty?
+ next if (value - Karafka::App.consumer_groups.map(&:name)).empty?

+ # Found unknown consumer groups
  [[%i[consumer_groups], :consumer_groups_inclusion]]
  end
+
+ virtual do |data, errors|
+ next unless errors.empty?
+ next unless data.key?(:subscription_groups)
+
+ value = data.fetch(:subscription_groups)
+
+ # If there were no subscription_groups declared in the server cli, it means that we will
+ # run all of them and no need to validate them here at all
+ next if value.empty?
+
+ subscription_groups = Karafka::App
+ .consumer_groups
+ .map(&:subscription_groups)
+ .flatten
+ .map(&:name)
+
+ next if (value - subscription_groups).empty?
+
+ # Found unknown subscription groups
+ [[%i[subscription_groups], :subscription_groups_inclusion]]
+ end
+
+ virtual do |data, errors|
+ next unless errors.empty?
+ next unless data.key?(:topics)
+
+ value = data.fetch(:topics)
+
+ # If there were no topics declared in the server cli, it means that we will
+ # run all of them and no need to validate them here at all
+ next if value.empty?
+
+ topics = Karafka::App
+ .consumer_groups
+ .map(&:subscription_groups)
+ .flatten
+ .map(&:topics)
+ .map { |gtopics| gtopics.map(&:name) }
+ .flatten
+
+ next if (value - topics).empty?
+
+ # Found unknown topics
+ [[%i[topics], :topics_inclusion]]
+ end
+
+ # Makes sure we have anything to subscribe to when we start the server
+ virtual do |_, errors|
+ next unless errors.empty?
+ next unless Karafka::App.subscription_groups.empty?
+
+ [[%i[topics], :topics_missing]]
+ end
  end
  end
  end
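Each of the virtual rules above boils down to the same set-difference check: a selection is valid when every requested name is known, and an empty request means "run everything". The predicate in isolation, with hypothetical group names:

```ruby
# The inclusion rule used by the CLI option validations, in isolation.
# An empty request means "run everything" and is always valid.
def valid_selection?(requested, known)
  requested.empty? || (requested - known).empty?
end

known = %w[group_a group_b]

puts valid_selection?([], known)                  # true
puts valid_selection?(%w[group_a], known)         # true
puts valid_selection?(%w[group_a group_x], known) # false
```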
@@ -20,7 +20,7 @@ module Karafka
  required(:initial_offset) { |val| %w[earliest latest].include?(val) }
  required(:max_wait_time) { |val| val.is_a?(Integer) && val >= 10 }
  required(:name) { |val| val.is_a?(String) && Contracts::TOPIC_REGEXP.match?(val) }
- required(:subscription_group) { |val| val.nil? || (val.is_a?(String) && !val.empty?) }
+ required(:subscription_group) { |val| val.is_a?(String) && !val.empty? }

  virtual do |data, errors|
  next unless errors.empty?
@@ -8,68 +8,71 @@ module Karafka

  private_constant :PUBLIC_KEY_LOCATION

- # Tries to prepare license and verifies it
- #
- # @param license_config [Karafka::Core::Configurable::Node] config related to the licensing
- def prepare_and_verify(license_config)
- prepare(license_config)
- verify(license_config)
- end
-
- private
+ class << self
+ # Tries to load the license and yields if successful
+ def detect
+ # If required, do not require again
+ require('karafka-license') unless const_defined?('::Karafka::License')

- # @param license_config [Karafka::Core::Configurable::Node] config related to the licensing
- def prepare(license_config)
- # If there is token, no action needed
- # We support a case where someone would put the token in instead of using one from the
- # license. That's in case there are limitations to using external package sources, etc
- return if license_config.token
+ yield

- begin
- license_config.token || require('karafka-license')
+ true
  rescue LoadError
- return
+ false
  end

- license_config.token = Karafka::License.token
- end
-
- # Check license and setup license details (if needed)
- # @param license_config [Karafka::Core::Configurable::Node] config related to the licensing
- def verify(license_config)
- # If no license, it will just run LGPL components without anything extra
- return unless license_config.token
+ # Tries to prepare license and verifies it
+ #
+ # @param license_config [Karafka::Core::Configurable::Node] config related to the licensing
+ def prepare_and_verify(license_config)
+ # If license is not loaded, nothing to do
+ return unless const_defined?('::Karafka::License')

- public_key = OpenSSL::PKey::RSA.new(File.read(PUBLIC_KEY_LOCATION))
+ prepare(license_config)
+ verify(license_config)
+ end

- # We gsub and strip in case someone copy-pasted it as a multi line string
- formatted_token = license_config.token.strip.delete("\n").delete(' ')
- decoded_token = Base64.decode64(formatted_token)
+ private

- begin
- data = public_key.public_decrypt(decoded_token)
- rescue OpenSSL::OpenSSLError
- data = nil
+ # @param license_config [Karafka::Core::Configurable::Node] config related to the licensing
+ def prepare(license_config)
+ license_config.token = Karafka::License.token
  end

- details = data ? JSON.parse(data) : raise_invalid_license_token(license_config)
+ # Check license and setup license details (if needed)
+ # @param license_config [Karafka::Core::Configurable::Node] config related to the licensing
+ def verify(license_config)
+ public_key = OpenSSL::PKey::RSA.new(File.read(PUBLIC_KEY_LOCATION))

- license_config.entity = details.fetch('entity')
- end
+ # We gsub and strip in case someone copy-pasted it as a multi line string
+ formatted_token = license_config.token.strip.delete("\n").delete(' ')
+ decoded_token = Base64.decode64(formatted_token)

- # Raises an error with info, that used token is invalid
- # @param license_config [Karafka::Core::Configurable::Node]
- def raise_invalid_license_token(license_config)
- # We set it to false so `Karafka.pro?` method behaves as expected
- license_config.token = false
-
- raise(
- Errors::InvalidLicenseTokenError,
- <<~MSG.tr("\n", ' ')
- License key you provided is invalid.
- Please reach us at contact@karafka.io or visit https://karafka.io to obtain a valid one.
- MSG
- )
+ begin
+ data = public_key.public_decrypt(decoded_token)
+ rescue OpenSSL::OpenSSLError
+ data = nil
+ end
+
+ details = data ? JSON.parse(data) : raise_invalid_license_token(license_config)
+
+ license_config.entity = details.fetch('entity')
+ end
+
+ # Raises an error with info, that used token is invalid
+ # @param license_config [Karafka::Core::Configurable::Node]
+ def raise_invalid_license_token(license_config)
+ # We set it to false so `Karafka.pro?` method behaves as expected
+ license_config.token = false
+
+ raise(
+ Errors::InvalidLicenseTokenError,
+ <<~MSG.tr("\n", ' ')
+ License key you provided is invalid.
+ Please reach us at contact@karafka.io or visit https://karafka.io to obtain a valid one.
+ MSG
+ )
+ end
  end
  end
  end
@@ -45,8 +45,6 @@ module Karafka
  # @param config [Karafka::Core::Configurable::Node] app config that we can alter with pro
  # components
  def setup(config)
- require_all
-
  reconfigure(config)

  load_topic_features
@@ -18,8 +18,7 @@ module Karafka
  # Pro jobs
  module Jobs
  # The main job type in a non-blocking variant.
- # This variant works "like" the regular consumption but pauses the partition for as long
- # as it is needed until a job is done.
+ # This variant works "like" the regular consumption but does not block the queue.
  #
  # It can be useful when having long lasting jobs that would exceed `max.poll.interval`
  # if would block.
@@ -0,0 +1,37 @@
+ # frozen_string_literal: true
+
+ # This Karafka component is a Pro component under a commercial license.
+ # This Karafka component is NOT licensed under LGPL.
+ #
+ # All of the commercial components are present in the lib/karafka/pro directory of this
+ # repository and their usage requires commercial license agreement.
+ #
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
+ #
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
+ # your code to Maciej Mensfeld.
+
+ module Karafka
+ module Pro
+ # Pro components related to processing part of Karafka
+ module Processing
+ # Pro jobs
+ module Jobs
+ # The revoked job type in a non-blocking variant.
+ # This variant works "like" the regular revoked but does not block the queue.
+ #
+ # It can be useful when having long lasting jobs that would exceed `max.poll.interval`
+ # in scenarios where there are more jobs than threads, without this being async we
+ # would potentially stop polling
+ class RevokedNonBlocking < ::Karafka::Processing::Jobs::Revoked
+ # Makes this job non-blocking from the start
+ # @param args [Array] any arguments accepted by `::Karafka::Processing::Jobs::Revoked`
+ def initialize(*args)
+ super
+ @non_blocking = true
+ end
+ end
+ end
+ end
+ end
+ end
@@ -28,6 +28,18 @@ module Karafka
  super
  end
  end
+
+ # @param executor [Karafka::Processing::Executor]
+ # @return [Karafka::Processing::Jobs::Revoked] revocation job for non LRJ
+ # @return [Karafka::Processing::Jobs::RevokedNonBlocking] revocation job that is
+ # non-blocking, so when revocation job is scheduled for LRJ it also will not block
+ def revoked(executor)
+ if executor.topic.long_running_job?
+ Jobs::RevokedNonBlocking.new(executor)
+ else
+ super
+ end
+ end
  end
  end
  end
@@ -22,6 +22,7 @@ module Karafka
  # Nothing. Just standard, automatic flow
  module Default
  include Base
+ include ::Karafka::Processing::Strategies::Default

  # Apply strategy for a non-feature based flow
  FEATURES = %i[].freeze
@@ -39,6 +40,31 @@ module Karafka
  end
  end

+ # Run the user consumption code
+ def handle_consume
+ # We should not run the work at all on a partition that was revoked
+ # This can happen primarily when an LRJ job gets to the internal worker queue and
+ # this partition is revoked prior processing.
+ unless revoked?
+ Karafka.monitor.instrument('consumer.consumed', caller: self) do
+ consume
+ end
+ end
+
+ # Mark job as successful
+ coordinator.consumption(self).success!
+ rescue StandardError => e
+ # If failed, mark as failed
+ coordinator.consumption(self).failure!(e)
+
+ # Re-raise so reported in the consumer
+ raise e
+ ensure
+ # We need to decrease number of jobs that this coordinator coordinates as it has
+ # finished
+ coordinator.decrement
+ end
+
  # Standard flow without any features
  def handle_after_consume
  coordinator.on_finished do |last_group_message|
@@ -100,7 +100,7 @@ module Karafka
  return if @queue.closed?

  @queue.close
- @semaphores.values.each(&:close)
+ @semaphores.each_value(&:close)
  end
  end

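The `each_value` change above is behavior-preserving: it iterates the hash values directly, while `values.each` first materializes an intermediate array. A small demonstration with hypothetical semaphore keys:

```ruby
# Closing all semaphores: each_value iterates values directly, while
# values.each would first allocate an intermediate array. Same effect.
semaphores = { sg_1: Queue.new, sg_2: Queue.new }

semaphores.each_value(&:close)

puts semaphores.values.all?(&:closed?) # true
```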
@@ -22,6 +22,11 @@ module Karafka
  raise NotImplementedError, 'Implement in a subclass'
  end

+ # What should happen in the processing
+ def handle_consume
+ raise NotImplementedError, 'Implement in a subclass'
+ end
+
  # Post-consumption handling
  def handle_after_consume
  raise NotImplementedError, 'Implement in a subclass'
@@ -23,6 +23,25 @@ module Karafka
  coordinator.pause_tracker.increment
  end

+ # Run the user consumption code
+ def handle_consume
+ Karafka.monitor.instrument('consumer.consumed', caller: self) do
+ consume
+ end
+
+ # Mark job as successful
+ coordinator.consumption(self).success!
+ rescue StandardError => e
+ # If failed, mark as failed
+ coordinator.consumption(self).failure!(e)
+
+ # Re-raise so reported in the consumer
+ raise e
+ ensure
+ # We need to decrease number of jobs that this coordinator coordinates as it has finished
+ coordinator.decrement
+ end
+
  # Standard flow marks work as consumed and moves on if everything went ok.
  # If there was a processing error, we will pause and continue from the next message
  # (next that is +1 from the last one that was successfully marked as consumed)
@@ -45,6 +45,9 @@ if rails

  next unless Rails.env.development?
  next unless ENV.key?('KARAFKA_CLI')
+ # If we are already publishing to STDOUT, no need to add it again.
+ # If added again, would print stuff twice
+ next if ActiveSupport::Logger.logger_outputs_to?(Rails.logger, $stdout)

  logger = ActiveSupport::Logger.new($stdout)
  # Inherit the logger level from Rails, otherwise would always run with the debug level
@@ -82,7 +82,7 @@ module Karafka
  # @param block [Proc] further topics definitions
  def subscription_group(subscription_group_name = SecureRandom.uuid, &block)
  consumer_group('app') do
- target.public_send(:subscription_group=, subscription_group_name, &block)
+ target.public_send(:subscription_group=, subscription_group_name.to_s, &block)
  end
  end

@@ -31,7 +31,10 @@ module Karafka

  # @return [Boolean] true if this consumer group should be active in our current process
  def active?
- Karafka::Server.consumer_groups.include?(name)
+ cgs = Karafka::App.config.internal.routing.active.consumer_groups
+
+ # When empty it means no groups were specified, hence all should be used
+ cgs.empty? || cgs.include?(name)
  end

  # Builds a topic representation inside of a current consumer group route
@@ -50,9 +53,9 @@ module Karafka

  # Assigns the current subscription group id based on the defined one and allows for further
  # topic definition
- # @param name [String, Symbol]
+ # @param name [String, Symbol] name of the current subscription group
  # @param block [Proc] block that may include topics definitions
- def subscription_group=(name, &block)
+ def subscription_group=(name = SecureRandom.uuid, &block)
  # We cast it here, so the routing supports symbol based but that's anyhow later on
  # validated as a string
  @current_subscription_group_id = name
@@ -8,7 +8,7 @@ module Karafka
  # @note One subscription group will always belong to one consumer group, but one consumer
  # group can have multiple subscription groups.
  class SubscriptionGroup
- attr_reader :id, :topics, :kafka
+ attr_reader :id, :name, :topics, :kafka

  # @param position [Integer] position of this subscription group in all the subscriptions
  # groups array. We need to have this value for sake of static group memberships, where
@@ -16,7 +16,8 @@ module Karafka
  # @param topics [Karafka::Routing::Topics] all the topics that share the same key settings
  # @return [SubscriptionGroup] built subscription group
  def initialize(position, topics)
- @id = "#{topics.first.subscription_group}_#{position}"
+ @name = topics.first.subscription_group
+ @id = "#{@name}_#{position}"
  @position = position
  @topics = topics
  @kafka = build_kafka
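Splitting `name` out of the position-suffixed `id` above is what keeps static group membership stable: positions are assigned over the full, un-narrowed list of subscription groups, so narrowing afterwards (via CLI flags) leaves each id unchanged. A simplified stand-in (hypothetical `MiniSubscriptionGroup`, not the real class):

```ruby
# Simplified stand-in for the subscription group id composition.
class MiniSubscriptionGroup
  attr_reader :id, :name

  def initialize(position, name)
    @name = name
    @id = "#{name}_#{position}"
  end
end

all = %w[sg_a sg_b sg_c].each_with_index.map { |name, i| MiniSubscriptionGroup.new(i, name) }

# Selecting only sg_c after positions were assigned preserves its id
selected = all.select { |sg| sg.name == 'sg_c' }
puts selected.first.id # sg_c_2
```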
@@ -38,6 +39,14 @@ module Karafka
  @topics.first.max_wait_time
  end

+ # @return [Boolean] is this subscription group one of the active ones
+ def active?
+ sgs = Karafka::App.config.internal.routing.active.subscription_groups
+
+ # When empty it means no groups were specified, hence all should be used
+ sgs.empty? || sgs.include?(name)
+ end
+
  private

  # @return [Hash] kafka settings are a bit special. They are exactly the same for all of the
@@ -75,6 +75,14 @@ module Karafka
  consumer
  end

+ # @return [Boolean] should this topic be in use
+ def active?
+ topics = Karafka::App.config.internal.routing.active.topics
+
+ # When empty it means no topics were specified, hence all should be used
+ topics.empty? || topics.include?(name)
+ end
+
  # @return [Hash] hash with all the topic attributes
  # @note This is being used when we validate the consumer_group and its topics
  def to_h
@@ -23,6 +23,14 @@ module Karafka
  @accumulator.each(&block)
  end

+ # Allows us to remove elements from the topics
+ #
+ # Block to decide what to delete
+ # @param block [Proc]
+ def delete_if(&block)
+ @accumulator.delete_if(&block)
+ end
+
  # Finds topic by its name
  #
  # @param topic_name [String] topic name
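The new `delete_if` above simply delegates to the underlying accumulator array, which lets the routing code prune inactive topics in place. A sketch of the delegation pattern (hypothetical `MiniTopics`, not the real `Karafka::Routing::Topics`):

```ruby
# Minimal collection wrapper delegating mutation to an internal array.
class MiniTopics
  include Enumerable

  def initialize(topics)
    @accumulator = topics
  end

  def each(&block)
    @accumulator.each(&block)
  end

  # Removes elements for which the block is truthy, in place
  def delete_if(&block)
    @accumulator.delete_if(&block)
  end
end

topics = MiniTopics.new(%w[orders logs])
topics.delete_if { |name| name == 'logs' }
puts topics.to_a.inspect # ["orders"]
```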
@@ -20,11 +20,19 @@ module Karafka
  # Set of workers
  attr_accessor :workers

- # Writer for list of consumer groups that we want to consume in our current process context
- attr_writer :consumer_groups
-
  # Method which runs app
  def run
+ self.listeners = []
+ self.workers = []
+
+ # We need to validate this prior to running because it may be executed also from the
+ # embedded
+ # We cannot validate this during the start because config needs to be populated and routes
+ # need to be defined.
+ Contracts::ServerCliOptions.new.validate!(
+ Karafka::App.config.internal.routing.active.to_h
+ )
+
  process.on_sigint { stop }
  process.on_sigquit { stop }
  process.on_sigterm { stop }
@@ -49,13 +57,6 @@ module Karafka
  raise e
  end

- # @return [Array<String>] array with names of consumer groups that should be consumed in a
- # current server context
- def consumer_groups
- # If not specified, a server will listen on all the topics
- @consumer_groups ||= Karafka::App.consumer_groups.map(&:name).freeze
- end
-
  # Starts Karafka with a supervision
  # @note We don't need to sleep because Karafka::Fetcher is locking and waiting to
  # finish loop (and it won't happen until we explicitly want to stop)
@@ -107,6 +107,14 @@ module Karafka
  # option subscription_groups_builder [Routing::SubscriptionGroupsBuilder] subscription
  # group builder
  setting :subscription_groups_builder, default: Routing::SubscriptionGroupsBuilder.new
+
+ # Internally assigned list of limits on routings active for the current process
+ # This should be overwritten by the CLI command
+ setting :active do
+ setting :consumer_groups, default: [].freeze
+ setting :subscription_groups, default: [].freeze
+ setting :topics, default: [].freeze
+ end
  end

  setting :processing do
@@ -142,16 +150,18 @@ module Karafka
  # Configuring method
  # @param block [Proc] block we want to execute with the config instance
  def setup(&block)
+ # Will prepare and verify license if present
+ Licenser.prepare_and_verify(config.license)
+ # Will configure all the pro components
+ # This needs to happen before end user configuration as the end user may overwrite some
+ # of the pro defaults with custom components
+ Pro::Loader.setup(config) if Karafka.pro?
+
  configure(&block)
  merge_kafka_defaults!(config)

  Contracts::Config.new.validate!(config.to_h)

- licenser = Licenser.new
-
- # Tries to load our license gem and if present will try to load the correct license
- licenser.prepare_and_verify(config.license)
-
  configure_components

  Karafka::App.initialized!
@@ -188,12 +198,6 @@ module Karafka
  producer_config.kafka = AttributesMap.producer(config.kafka.dup)
  producer_config.logger = config.logger
  end
-
- return unless Karafka.pro?
-
- # Runs the pro loader that includes all the pro components
- require 'karafka/pro/loader'
- Pro::Loader.setup(config)
  end
  end
  end
@@ -3,5 +3,5 @@
  # Main module namespace
  module Karafka
  # Current Karafka version
- VERSION = '2.0.20'
+ VERSION = '2.0.22'
  end
data/lib/karafka.rb CHANGED
@@ -100,5 +100,14 @@ loader.eager_load
  # nor included here
  ::Karafka::Routing::Features::Base.load_all

+ # We need to detect and require (not setup) Pro components during the gem load, because we need
+ # to make pro components available in case anyone wants to use them as a base to their own
+ # custom components. Otherwise inheritance would not work.
+ Karafka::Licenser.detect do
+ require 'karafka/pro/loader'
+
+ Karafka::Pro::Loader.require_all
+ end
+
  # Load railtie after everything else is ready so we know we can rely on it.
  require 'karafka/railtie'
data.tar.gz.sig CHANGED
Binary file
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: karafka
  version: !ruby/object:Gem::Version
- version: 2.0.20
+ version: 2.0.22
  platform: ruby
  authors:
  - Maciej Mensfeld
@@ -35,7 +35,7 @@ cert_chain:
  Qf04B9ceLUaC4fPVEz10FyobjaFoY4i32xRto3XnrzeAgfEe4swLq8bQsR3w/EF3
  MGU0FeSV2Yj7Xc2x/7BzLK8xQn5l7Yy75iPF+KP3vVmDHnNl
  -----END CERTIFICATE-----
- date: 2022-11-24 00:00:00.000000000 Z
+ date: 2022-12-02 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: karafka-core
@@ -119,7 +119,11 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '2.3'
- description: Framework used to simplify Apache Kafka based Ruby applications development
+ description: |2
+ Karafka is Ruby and Rails efficient Kafka processing framework.
+
+ Karafka allows you to capture everything that happens in your systems in large scale,
+ without having to focus on things that are not your business domain.
  email:
  - contact@karafka.io
  executables:
@@ -227,6 +231,7 @@ files:
  - lib/karafka/pro/performance_tracker.rb
  - lib/karafka/pro/processing/coordinator.rb
  - lib/karafka/pro/processing/jobs/consume_non_blocking.rb
+ - lib/karafka/pro/processing/jobs/revoked_non_blocking.rb
  - lib/karafka/pro/processing/jobs_builder.rb
  - lib/karafka/pro/processing/partitioner.rb
  - lib/karafka/pro/processing/scheduler.rb
@@ -353,5 +358,5 @@ requirements: []
  rubygems_version: 3.3.7
  signing_key:
  specification_version: 4
- summary: Efficient Kafka processing framework for Ruby and Rails
+ summary: Karafka is Ruby and Rails efficient Kafka processing framework.
  test_files: []
metadata.gz.sig CHANGED
Binary file