karafka 2.0.0.alpha1 → 2.0.0.alpha2

Files changed (48)
  1. checksums.yaml +4 -4
  2. checksums.yaml.gz.sig +0 -0
  3. data/.github/workflows/ci.yml +1 -0
  4. data/CHANGELOG.md +4 -1
  5. data/CONTRIBUTING.md +6 -6
  6. data/Gemfile.lock +23 -23
  7. data/LICENSE +3 -0
  8. data/bin/integrations +1 -1
  9. data/bin/karafka +4 -0
  10. data/config/errors.yml +1 -0
  11. data/docker-compose.yml +1 -0
  12. data/karafka.gemspec +1 -1
  13. data/lib/active_job/karafka.rb +6 -4
  14. data/lib/active_job/queue_adapters/karafka_adapter.rb +3 -6
  15. data/lib/karafka/active_job/consumer.rb +24 -0
  16. data/lib/karafka/active_job/dispatcher.rb +38 -0
  17. data/lib/karafka/active_job/job_extensions.rb +34 -0
  18. data/lib/karafka/active_job/job_options_contract.rb +15 -0
  19. data/lib/karafka/active_job/routing_extensions.rb +18 -0
  20. data/lib/karafka/app.rb +1 -0
  21. data/lib/karafka/cli/info.rb +3 -3
  22. data/lib/karafka/cli/server.rb +2 -16
  23. data/lib/karafka/contracts/base.rb +23 -0
  24. data/lib/karafka/contracts/config.rb +21 -3
  25. data/lib/karafka/contracts/consumer_group.rb +1 -3
  26. data/lib/karafka/contracts/consumer_group_topic.rb +1 -3
  27. data/lib/karafka/contracts/server_cli_options.rb +1 -3
  28. data/lib/karafka/errors.rb +4 -0
  29. data/lib/karafka/instrumentation/stdout_listener.rb +3 -0
  30. data/lib/karafka/licenser.rb +20 -9
  31. data/lib/karafka/messages/batch_metadata.rb +2 -0
  32. data/lib/karafka/messages/builders/batch_metadata.rb +23 -1
  33. data/lib/karafka/pro/active_job/dispatcher.rb +58 -0
  34. data/lib/karafka/pro/active_job/job_options_contract.rb +27 -0
  35. data/lib/karafka/pro/loader.rb +29 -0
  36. data/lib/karafka/pro.rb +13 -0
  37. data/lib/karafka/processing/worker.rb +1 -1
  38. data/lib/karafka/railtie.rb +12 -2
  39. data/lib/karafka/routing/builder.rb +1 -11
  40. data/lib/karafka/routing/subscription_group.rb +5 -5
  41. data/lib/karafka/setup/config.rb +20 -20
  42. data/lib/karafka/version.rb +1 -1
  43. data/lib/karafka.rb +7 -2
  44. data.tar.gz.sig +0 -0
  45. metadata +15 -7
  46. metadata.gz.sig +0 -0
  47. data/lib/active_job/consumer.rb +0 -22
  48. data/lib/active_job/routing_extensions.rb +0 -15
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: a441a124346099aa31fb36a2241c2abfbc242ab6f3f9ad197b82d8786e78232e
4
- data.tar.gz: 81a706d58e469ecc48759dfe28c30e3779baab848122b72c194c30e06a7cdd7d
3
+ metadata.gz: 112496d14f74a8095df2b0b557e294140f422bbc8d6ea93f0709ac3b4ec017ec
4
+ data.tar.gz: a049427bd67d7887c0d5a352c724ecc56b156842859e9688068d36e25aeeb05d
5
5
  SHA512:
6
- metadata.gz: bf9ebff340150913cf273bc3f09c9f3dd3c02e6ffe427fd00ead842995e5f146d554609cd1ff4d1039ee96ef349b8e7712e2c557abc322a9e0300886bc765b6a
7
- data.tar.gz: 054cf8240a33a133c2c9dce5f1c3c2ac26877006c8a823c1e58c903fccd8cac7c31b5d0c0d2ec22257d8959d40bb69d49bf5676f378c33004807a78413278b3e
6
+ metadata.gz: d549e3c4bd31a2d011d453495ee6c3f3825ece067285834ef1bc5872cbc257f9695ff831eed7370c0668b33f4c297b1c3512fb068fecafdbd1a5d109d305e243
7
+ data.tar.gz: 2f7a48f6490e7ccef765c1697bdb54eb8e596246c853f8e30a85a134e50394c78c983b2ad4020d140da0a532aa1633cd425ba3af12a2d1e2f59fc9264288c3fb
checksums.yaml.gz.sig CHANGED
Binary file
data/.github/workflows/ci.yml CHANGED
@@ -122,5 +122,6 @@ jobs:
122
122
 
123
123
  - name: Run integration tests
124
124
  env:
125
+ KARAFKA_PRO_LICENSE_TOKEN: ${{ secrets.KARAFKA_PRO_LICENSE_TOKEN }}
125
126
  GITHUB_COVERAGE: ${{matrix.coverage}}
126
127
  run: bundle exec bin/integrations
data/CHANGELOG.md CHANGED
@@ -1,6 +1,9 @@
1
1
  # Karafka framework changelog
2
2
 
3
- ## 2.0.0-alpha1 (Unreleased)
3
+ ## 2.0.0-alpha2 (2022-02-19)
4
+ - Require `kafka` keys to be symbols
5
+
6
+ ## 2.0.0-alpha1 (2022-01-30)
4
7
  - Change license to `LGPL-3.0`
5
8
  - Introduce a Pro subscription
6
9
  - Switch from `ruby-kafka` to `librdkafka` as an underlying driver
data/CONTRIBUTING.md CHANGED
@@ -2,7 +2,7 @@
2
2
 
3
3
  ## Introduction
4
4
 
5
- First, thank you for considering contributing to karafka! It's people like you that make the open source community such a great community! 😊
5
+ First, thank you for considering contributing to the Karafka ecosystem! It's people like you that make the open source community such a great community! 😊
6
6
 
7
7
  We welcome any type of contribution, not only code. You can help with:
8
8
  - **QA**: file bug reports, the more details you can give the better (e.g. screenshots with the console open)
@@ -18,15 +18,17 @@ Working on your first Pull Request? You can learn how from this *free* series, [
18
18
 
19
19
  Any code change should be submitted as a pull request. The description should explain what the code does and give steps to execute it. The pull request should also contain tests.
20
20
 
21
- ## Code review process
21
+ ### Code review process
22
22
 
23
23
  Each pull request must pass all the rspec specs and meet our quality requirements.
24
24
 
25
25
  To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
26
26
 
27
- Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
27
+ ### Contributing to Pro components
28
28
 
29
- [![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/commit_builds)
29
+ All of the Karafka components are open-source. However, the `Pro` components are licensed under `LICENSE-COMM`.
30
+
31
+ By sending a pull request to the pro components, you are agreeing to transfer the copyright of your code to Maciej Mensfeld.
30
32
 
31
33
  ## Questions
32
34
 
@@ -35,7 +37,5 @@ You can also reach us at hello@karafka.opencollective.com.
35
37
 
36
38
  ## Credits
37
39
 
38
- ### Contributors
39
-
40
40
  Thank you to all the people who have already contributed to karafka!
41
41
  <a href="graphs/contributors"><img src="https://opencollective.com/karafka/contributors.svg?width=890" /></a>
data/Gemfile.lock CHANGED
@@ -1,22 +1,22 @@
1
1
  PATH
2
2
  remote: .
3
3
  specs:
4
- karafka (2.0.0.pre.alpha1)
4
+ karafka (2.0.0.alpha2)
5
5
  dry-configurable (~> 0.13)
6
6
  dry-monitor (~> 0.5)
7
7
  dry-validation (~> 1.7)
8
8
  rdkafka (>= 0.10)
9
9
  thor (>= 0.20)
10
- waterdrop (>= 2.1.0, < 3.0.0)
10
+ waterdrop (>= 2.2.0, < 3.0.0)
11
11
  zeitwerk (~> 2.3)
12
12
 
13
13
  GEM
14
14
  remote: https://rubygems.org/
15
15
  specs:
16
- activejob (7.0.1)
17
- activesupport (= 7.0.1)
16
+ activejob (7.0.2.2)
17
+ activesupport (= 7.0.2.2)
18
18
  globalid (>= 0.3.6)
19
- activesupport (7.0.1)
19
+ activesupport (7.0.2.2)
20
20
  concurrent-ruby (~> 1.0, >= 1.0.2)
21
21
  i18n (>= 1.6, < 2)
22
22
  minitest (>= 5.1)
@@ -45,7 +45,7 @@ GEM
45
45
  dry-configurable (~> 0.13, >= 0.13.0)
46
46
  dry-core (~> 0.5, >= 0.5)
47
47
  dry-events (~> 0.2)
48
- dry-schema (1.8.0)
48
+ dry-schema (1.9.1)
49
49
  concurrent-ruby (~> 1.0)
50
50
  dry-configurable (~> 0.13, >= 0.13.0)
51
51
  dry-core (~> 0.5, >= 0.5)
@@ -58,18 +58,18 @@ GEM
58
58
  dry-core (~> 0.5, >= 0.5)
59
59
  dry-inflector (~> 0.1, >= 0.1.2)
60
60
  dry-logic (~> 1.0, >= 1.0.2)
61
- dry-validation (1.7.0)
61
+ dry-validation (1.8.0)
62
62
  concurrent-ruby (~> 1.0)
63
63
  dry-container (~> 0.7, >= 0.7.1)
64
64
  dry-core (~> 0.5, >= 0.5)
65
65
  dry-initializer (~> 3.0)
66
- dry-schema (~> 1.8, >= 1.8.0)
66
+ dry-schema (~> 1.9, >= 1.9.1)
67
67
  factory_bot (6.2.0)
68
68
  activesupport (>= 5.0.0)
69
69
  ffi (1.15.5)
70
70
  globalid (1.0.0)
71
71
  activesupport (>= 5.0)
72
- i18n (1.9.1)
72
+ i18n (1.10.0)
73
73
  concurrent-ruby (~> 1.0)
74
74
  mini_portile2 (2.7.1)
75
75
  minitest (5.15.0)
@@ -78,29 +78,29 @@ GEM
78
78
  ffi (~> 1.15)
79
79
  mini_portile2 (~> 2.6)
80
80
  rake (> 12)
81
- rspec (3.10.0)
82
- rspec-core (~> 3.10.0)
83
- rspec-expectations (~> 3.10.0)
84
- rspec-mocks (~> 3.10.0)
85
- rspec-core (3.10.2)
86
- rspec-support (~> 3.10.0)
87
- rspec-expectations (3.10.2)
81
+ rspec (3.11.0)
82
+ rspec-core (~> 3.11.0)
83
+ rspec-expectations (~> 3.11.0)
84
+ rspec-mocks (~> 3.11.0)
85
+ rspec-core (3.11.0)
86
+ rspec-support (~> 3.11.0)
87
+ rspec-expectations (3.11.0)
88
88
  diff-lcs (>= 1.2.0, < 2.0)
89
- rspec-support (~> 3.10.0)
90
- rspec-mocks (3.10.3)
89
+ rspec-support (~> 3.11.0)
90
+ rspec-mocks (3.11.0)
91
91
  diff-lcs (>= 1.2.0, < 2.0)
92
- rspec-support (~> 3.10.0)
93
- rspec-support (3.10.3)
92
+ rspec-support (~> 3.11.0)
93
+ rspec-support (3.11.0)
94
94
  simplecov (0.21.2)
95
95
  docile (~> 1.1)
96
96
  simplecov-html (~> 0.11)
97
97
  simplecov_json_formatter (~> 0.1)
98
98
  simplecov-html (0.12.3)
99
- simplecov_json_formatter (0.1.3)
99
+ simplecov_json_formatter (0.1.4)
100
100
  thor (1.2.1)
101
101
  tzinfo (2.0.4)
102
102
  concurrent-ruby (~> 1.0)
103
- waterdrop (2.1.0)
103
+ waterdrop (2.2.0)
104
104
  concurrent-ruby (>= 1.1)
105
105
  dry-configurable (~> 0.13)
106
106
  dry-monitor (~> 0.5)
@@ -121,4 +121,4 @@ DEPENDENCIES
121
121
  simplecov
122
122
 
123
123
  BUNDLED WITH
124
- 2.3.5
124
+ 2.3.7
data/LICENSE CHANGED
@@ -9,6 +9,9 @@ Karafka has also commercial-friendly license, commercial support and commercial
9
9
  All of the commercial components are present in the lib/karafka/pro directory of this repository
10
10
  and their usage requires commercial license agreement.
11
11
 
12
+ By sending a pull request to the pro components, you are agreeing to transfer the copyright of your
13
+ code to Maciej Mensfeld.
14
+
12
15
  You can find the commercial license in LICENSE-COM.
13
16
 
14
17
  Please see https://karafka.io for purchasing options.
data/bin/integrations CHANGED
@@ -13,7 +13,7 @@ IntegrationTestError = Class.new(StandardError)
13
13
 
14
14
  # How many child processes with integration specs do we want to run in parallel
15
15
  # When the value is high, there's a problem with thread allocation on Github
16
- CONCURRENCY = 5
16
+ CONCURRENCY = 4
17
17
 
18
18
  # Abstraction around a single test scenario execution process
19
19
  class Scenario
data/bin/karafka CHANGED
@@ -2,6 +2,10 @@
2
2
 
3
3
  require 'karafka'
4
4
 
5
+ # We set this to indicate that the process we are in (whatever it does) was started using
6
+ # our bin/karafka CLI
7
+ ENV['KARAFKA_CLI'] = 'true'
8
+
5
9
  # If there is a boot file, we need to require it as we expect it to contain
6
10
  # Karafka app setup, routes, etc
7
11
  if File.exist?(Karafka.boot_file)
data/config/errors.yml CHANGED
@@ -5,3 +5,4 @@ en:
5
5
  topics_names_not_unique: all topic names within a single consumer group must be unique
6
6
  required_usage_count: Given topic must be used at least once
7
7
  consumer_groups_inclusion: Unknown consumer group
8
+ kafka_key_must_be_a_symbol: All keys under the kafka settings scope need to be symbols
data/docker-compose.yml CHANGED
@@ -16,6 +16,7 @@ services:
16
16
  KAFKA_CREATE_TOPICS:
17
17
  "integrations_0_03:3:1,\
18
18
  integrations_1_03:3:1,\
19
+ integrations_2_03:3:1,\
19
20
  integrations_0_10:10:1,\
20
21
  integrations_1_10:10:1,\
21
22
  benchmarks_0_01:1:1,\
data/karafka.gemspec CHANGED
@@ -21,7 +21,7 @@ Gem::Specification.new do |spec|
21
21
  spec.add_dependency 'dry-validation', '~> 1.7'
22
22
  spec.add_dependency 'rdkafka', '>= 0.10'
23
23
  spec.add_dependency 'thor', '>= 0.20'
24
- spec.add_dependency 'waterdrop', '>= 2.1.0', '< 3.0.0'
24
+ spec.add_dependency 'waterdrop', '>= 2.2.0', '< 3.0.0'
25
25
  spec.add_dependency 'zeitwerk', '~> 2.3'
26
26
 
27
27
  spec.required_ruby_version = '>= 2.6.0'
data/lib/active_job/karafka.rb CHANGED
@@ -2,8 +2,6 @@
2
2
 
3
3
  require 'active_job'
4
4
  require 'active_job/queue_adapters'
5
- require 'active_job/consumer'
6
- require 'active_job/routing_extensions'
7
5
  require 'active_job/queue_adapters/karafka_adapter'
8
6
 
9
7
  module ActiveJob
@@ -14,5 +12,9 @@ module ActiveJob
14
12
  end
15
13
 
16
14
  # We extend routing builder by adding a simple wrapper for easier jobs topics defining
17
- ::Karafka::Routing::Builder.include ActiveJob::RoutingExtensions
18
- ::Karafka::Routing::Proxy.include ActiveJob::RoutingExtensions
15
+ # This needs to be extended here as it is going to be used in karafka routes, hence doing that in
16
+ # the railtie initializer would be too late
17
+ ::Karafka::Routing::Builder.include ::Karafka::ActiveJob::RoutingExtensions
18
+ ::Karafka::Routing::Proxy.include ::Karafka::ActiveJob::RoutingExtensions
19
+
20
+ # We extend ActiveJob stuff in the railtie
data/lib/active_job/queue_adapters/karafka_adapter.rb CHANGED
@@ -5,16 +5,13 @@ module ActiveJob
5
5
  # ActiveJob queue adapters
6
6
  module QueueAdapters
7
7
  # Karafka adapter for enqueuing jobs
8
+ # This is here for ease of integration with ActiveJob.
8
9
  class KarafkaAdapter
9
- # Enqueues the job by sending all the payload to a dedicated topic in Kafka that will be
10
- # later on consumed by a special ActiveJob consumer
10
+ # Enqueues the job using the configured dispatcher
11
11
  #
12
12
  # @param job [Object] job that should be enqueued
13
13
  def enqueue(job)
14
- ::Karafka.producer.produce_async(
15
- topic: job.queue_name,
16
- payload: ActiveSupport::JSON.encode(job.serialize)
17
- )
14
+ ::Karafka::App.config.internal.active_job.dispatcher.call(job)
18
15
  end
19
16
 
20
17
  # Raises info, that Karafka backend does not support scheduling jobs
data/lib/karafka/active_job/consumer.rb ADDED
@@ -0,0 +1,24 @@
1
+ # frozen_string_literal: true
2
+
3
+ module Karafka
4
+ module ActiveJob
5
+ # This is the consumer for ActiveJob that eats the messages enqueued with it one after another.
6
+ # It marks the offset after each message, so we make sure none of the jobs is executed twice
7
+ class Consumer < ::Karafka::BaseConsumer
8
+ # Executes the ActiveJob logic
9
+ # @note ActiveJob does not support batches, so we just run one message after another
10
+ def consume
11
+ messages.each do |message|
12
+ ::ActiveJob::Base.execute(
13
+ # We technically speaking could set this as deserializer and reference it from the
14
+ # message instead of using the `#raw_payload`. This is not done on purpose to simplify
15
+ # the ActiveJob setup here
16
+ ::ActiveSupport::JSON.decode(message.raw_payload)
17
+ )
18
+
19
+ mark_as_consumed(message)
20
+ end
21
+ end
22
+ end
23
+ end
24
+ end
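The consume loop above boils down to decoding each raw JSON payload back into the hash originally produced by `job.serialize`. A minimal sketch of that decode step, using a hypothetical `Message` struct in place of a real Karafka message:

```ruby
require 'json'

# Hypothetical stand-in for a Karafka message; only #raw_payload matters here
Message = Struct.new(:raw_payload)

# Mirrors the decode step of the consumer above: each raw payload holds the
# serialized ActiveJob hash that `::ActiveJob::Base.execute` would receive
def decode_job_payloads(messages)
  messages.map { |message| JSON.parse(message.raw_payload) }
end

messages = [Message.new('{"job_class":"MyJob","arguments":[1]}')]
jobs = decode_job_payloads(messages)
puts jobs.first['job_class'] # prints MyJob
```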
data/lib/karafka/active_job/dispatcher.rb ADDED
@@ -0,0 +1,38 @@
1
+ # frozen_string_literal: true
2
+
3
+ module Karafka
4
+ module ActiveJob
5
+ # Dispatcher that sends the ActiveJob job to a proper topic based on the queue name
6
+ class Dispatcher
7
+ # Defaults for dispatching
8
+ # They can be updated by using `#karafka_options` on the job
9
+ DEFAULTS = {
10
+ dispatch_method: :produce_async
11
+ }.freeze
12
+
13
+ private_constant :DEFAULTS
14
+
15
+ # @param job [ActiveJob::Base] job
16
+ def call(job)
17
+ ::Karafka.producer.public_send(
18
+ fetch_option(job, :dispatch_method, DEFAULTS),
19
+ topic: job.queue_name,
20
+ payload: ::ActiveSupport::JSON.encode(job.serialize)
21
+ )
22
+ end
23
+
24
+ private
25
+
26
+ # @param job [ActiveJob::Base] job
27
+ # @param key [Symbol] key we want to fetch
28
+ # @param defaults [Hash]
29
+ # @return [Object] options we are interested in
30
+ def fetch_option(job, key, defaults)
31
+ job
32
+ .class
33
+ .karafka_options
34
+ .fetch(key, defaults.fetch(key))
35
+ end
36
+ end
37
+ end
38
+ end
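The `fetch_option` helper above implements a simple fallback: a per-job setting from `karafka_options` wins, otherwise the `DEFAULTS` entry is used. A self-contained sketch of that lookup (the hashes here are hypothetical):

```ruby
# Defaults mirroring the dispatcher's DEFAULTS constant
DEFAULTS = { dispatch_method: :produce_async }.freeze

# Per-job options win; otherwise fall back to the defaults entry
def fetch_option(karafka_options, key, defaults)
  karafka_options.fetch(key, defaults.fetch(key))
end

puts fetch_option({}, :dispatch_method, DEFAULTS) # prints produce_async
puts fetch_option({ dispatch_method: :produce_sync }, :dispatch_method, DEFAULTS) # prints produce_sync
```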
data/lib/karafka/active_job/job_extensions.rb ADDED
@@ -0,0 +1,34 @@
1
+ # frozen_string_literal: true
2
+
3
+ module Karafka
4
+ module ActiveJob
5
+ # Allows for setting karafka specific options in ActiveJob jobs
6
+ module JobExtensions
7
+ class << self
8
+ # Defines all the needed accessors and sets defaults
9
+ # @param klass [ActiveJob::Base] active job base
10
+ def extended(klass)
11
+ klass.class_attribute :_karafka_options
12
+ klass._karafka_options = {}
13
+ end
14
+ end
15
+
16
+ # @param new_options [Hash] additional options that allow for jobs Karafka related options
17
+ # customization
18
+ # @return [Hash] karafka options
19
+ def karafka_options(new_options = {})
20
+ return _karafka_options if new_options.empty?
21
+
22
+ # Make sure, that karafka options that someone wants to use are valid before assigning
23
+ # them
24
+ App.config.internal.active_job.job_options_contract.validate!(new_options)
25
+
26
+ new_options.each do |name, value|
27
+ _karafka_options[name] = value
28
+ end
29
+
30
+ _karafka_options
31
+ end
32
+ end
33
+ end
34
+ end
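The extension above accumulates options on the job class and returns them when called with no arguments. A simplified mimic of that accessor pattern, with the contract validation step left out and `FakeJob` as a hypothetical job class:

```ruby
# Simplified mimic of Karafka::ActiveJob::JobExtensions#karafka_options:
# calling with options stores them, calling without reads them back
class FakeJob
  @karafka_options = {}

  class << self
    def karafka_options(new_options = {})
      return @karafka_options if new_options.empty?

      # In the real extension the options are validated by a contract first
      new_options.each { |name, value| @karafka_options[name] = value }

      @karafka_options
    end
  end
end

FakeJob.karafka_options(dispatch_method: :produce_sync)
puts FakeJob.karafka_options[:dispatch_method] # prints produce_sync
```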
data/lib/karafka/active_job/job_options_contract.rb ADDED
@@ -0,0 +1,15 @@
1
+ # frozen_string_literal: true
2
+
3
+ module Karafka
4
+ module ActiveJob
5
+ # Contract for validating the options that can be altered with `#karafka_options` per job class
6
+ # @note We keep this in the `Karafka::ActiveJob` namespace instead of `Karafka::Contracts` as
7
+ # we want to keep ActiveJob related Karafka components outside of the core Karafka code and
8
+ # all in the same place
9
+ class JobOptionsContract < Contracts::Base
10
+ params do
11
+ optional(:dispatch_method).value(included_in?: %i[produce_async produce_sync])
12
+ end
13
+ end
14
+ end
15
+ end
data/lib/karafka/active_job/routing_extensions.rb ADDED
@@ -0,0 +1,18 @@
1
+ # frozen_string_literal: true
2
+
3
+ module Karafka
4
+ # ActiveJob related Karafka stuff
5
+ module ActiveJob
6
+ # Routing extensions for ActiveJob
7
+ module RoutingExtensions
8
+ # This method simplifies routes definition for ActiveJob topics / queues by auto-injecting
9
+ # the consumer class
10
+ # @param name [String, Symbol] name of the topic where ActiveJobs jobs should go
11
+ def active_job_topic(name)
12
+ topic(name) do
13
+ consumer App.config.internal.active_job.consumer
14
+ end
15
+ end
16
+ end
17
+ end
18
+ end
data/lib/karafka/app.rb CHANGED
@@ -36,6 +36,7 @@ module Karafka
36
36
  logger
37
37
  producer
38
38
  monitor
39
+ pro?
39
40
  ].each do |delegated|
40
41
  define_method(delegated) do
41
42
  Karafka.send(delegated)
data/lib/karafka/cli/info.rb CHANGED
@@ -31,9 +31,9 @@ module Karafka
31
31
  def core_info
32
32
  config = Karafka::App.config
33
33
 
34
- postfix = config.license.token ? ' + Pro' : ''
34
+ postfix = Karafka.pro? ? ' + Pro' : ''
35
35
 
36
- info = [
36
+ [
37
37
  "Karafka version: #{Karafka::VERSION}#{postfix}",
38
38
  "Ruby version: #{RUBY_VERSION}",
39
39
  "Rdkafka version: #{::Rdkafka::VERSION}",
@@ -49,7 +49,7 @@ module Karafka
49
49
  def license_info
50
50
  config = Karafka::App.config
51
51
 
52
- if config.license.token
52
+ if Karafka.pro?
53
53
  [
54
54
  'License: Commercial',
55
55
  "License entity: #{config.license.entity}",
data/lib/karafka/cli/server.rb CHANGED
@@ -5,11 +5,6 @@ module Karafka
5
5
  class Cli < Thor
6
6
  # Server Karafka Cli action
7
7
  class Server < Base
8
- # Server config settings contract
9
- CONTRACT = Contracts::ServerCliOptions.new.freeze
10
-
11
- private_constant :CONTRACT
12
-
13
8
  desc 'Start the Karafka server (short-cut alias: "s")'
14
9
  option aliases: 's'
15
10
  option :consumer_groups, type: :array, default: nil, aliases: :g
@@ -19,7 +14,7 @@ module Karafka
19
14
  # Print our banner and info in the dev mode
20
15
  print_marketing_info if Karafka::App.env.development?
21
16
 
22
- validate!
17
+ Contracts::ServerCliOptions.new.validate!(cli.options)
23
18
 
24
19
  # We assign active topics on a server level, as only server is expected to listen on
25
20
  # part of the topics
@@ -34,7 +29,7 @@ module Karafka
34
29
  def print_marketing_info
35
30
  Karafka.logger.info Info::BANNER
36
31
 
37
- if Karafka::App.config.license.token
32
+ if Karafka.pro?
38
33
  Karafka.logger.info(
39
34
  "\033[0;32mThank you for investing in the Karafka Pro subscription!\033[0m\n"
40
35
  )
@@ -44,15 +39,6 @@ module Karafka
44
39
  )
45
40
  end
46
41
  end
47
-
48
- # Checks the server cli configuration
49
- # options validations in terms of app setup (topics, pid existence, etc)
50
- def validate!
51
- result = CONTRACT.call(cli.options)
52
- return if result.success?
53
-
54
- raise Errors::InvalidConfigurationError, result.errors.to_h
55
- end
56
42
  end
57
43
  end
58
44
  end
data/lib/karafka/contracts/base.rb ADDED
@@ -0,0 +1,23 @@
1
+ # frozen_string_literal: true
2
+
3
+ module Karafka
4
+ module Contracts
5
+ # Base contract for all Karafka contracts
6
+ class Base < Dry::Validation::Contract
7
+ config.messages.load_paths << File.join(Karafka.gem_root, 'config', 'errors.yml')
8
+
9
+ # @param data [Hash] data for validation
10
+ # @return [Boolean] true if all good
11
+ # @raise [Errors::InvalidConfigurationError] invalid configuration error
12
+ # @note We use contracts only in the config validation context, so no need to add support
13
+ # for multiple error classes. It will be added when it is needed.
14
+ def validate!(data)
15
+ result = call(data)
16
+
17
+ return true if result.success?
18
+
19
+ raise Errors::InvalidConfigurationError, result.errors.to_h
20
+ end
21
+ end
22
+ end
23
+ end
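`Contracts::Base#validate!` centralizes the validate-or-raise pattern that was previously duplicated (for example in the server CLI). A sketch of that pattern, with a plain struct standing in for a dry-validation result object:

```ruby
# Error class mirroring Karafka::Errors::InvalidConfigurationError
InvalidConfigurationError = Class.new(StandardError)

# Hypothetical stand-in for a dry-validation result
Result = Struct.new(:errors) do
  def success?
    errors.empty?
  end
end

# Returns true on success, raises with the error hash otherwise
def validate!(result)
  return true if result.success?

  raise InvalidConfigurationError, result.errors.to_h
end

puts validate!(Result.new({})) # prints true
```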
data/lib/karafka/contracts/config.rb CHANGED
@@ -8,9 +8,7 @@ module Karafka
8
8
  # `Karafka::Setup::Config` model, but we don't validate them here as they are
9
9
  # validated per each route (topic + consumer_group) because they can be overwritten,
10
10
  # so we validate all of that once all the routes are defined and ready.
11
- class Config < Dry::Validation::Contract
12
- config.messages.load_paths << File.join(Karafka.gem_root, 'config', 'errors.yml')
13
-
11
+ class Config < Base
14
12
  params do
15
13
  # License validity happens in the licenser. Here we do only the simple consistency checks
16
14
  required(:license).schema do
@@ -27,6 +25,26 @@ module Karafka
27
25
  required(:pause_max_timeout) { int? & gt?(0) }
28
26
  required(:pause_with_exponential_backoff).filled(:bool?)
29
27
  required(:shutdown_timeout) { int? & gt?(0) }
28
+ required(:kafka).filled(:hash)
29
+
30
+ # We validate internals just to be sure that they are present and working
31
+ required(:internal).schema do
32
+ required(:routing_builder)
33
+ required(:status)
34
+ required(:process)
35
+ required(:subscription_groups_builder)
36
+ end
37
+ end
38
+
39
+ # We require all the keys under the kafka scope to be symbols, so we ensure that
40
+ rule(:kafka) do
41
+ next unless value.is_a?(Hash)
42
+
43
+ value.each_key do |key|
44
+ next if key.is_a?(Symbol)
45
+
46
+ key(:"kafka.#{key}").failure(:kafka_key_must_be_a_symbol)
47
+ end
30
48
  end
31
49
 
32
50
  rule(:pause_timeout, :pause_max_timeout) do
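The new `rule(:kafka)` above rejects any key under the `kafka` scope that is not a symbol, reporting `:kafka_key_must_be_a_symbol` for each offender. The key check in isolation can be sketched as:

```ruby
# Returns the kafka-scope keys that would trigger the
# :kafka_key_must_be_a_symbol failure in the contract above
def non_symbol_kafka_keys(kafka_settings)
  kafka_settings.keys.reject { |key| key.is_a?(Symbol) }
end

# String keys are now rejected; symbol keys pass
puts non_symbol_kafka_keys('bootstrap.servers' => 'localhost:9092').inspect
# prints ["bootstrap.servers"]
puts non_symbol_kafka_keys('bootstrap.servers': 'localhost:9092').inspect
# prints []
```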
data/lib/karafka/contracts/consumer_group.rb CHANGED
@@ -3,9 +3,7 @@
3
3
  module Karafka
4
4
  module Contracts
5
5
  # Contract for single full route (consumer group + topics) validation.
6
- class ConsumerGroup < Dry::Validation::Contract
7
- config.messages.load_paths << File.join(Karafka.gem_root, 'config', 'errors.yml')
8
-
6
+ class ConsumerGroup < Base
9
7
  # Internal contract for sub-validating topics schema
10
8
  TOPIC_CONTRACT = ConsumerGroupTopic.new.freeze
11
9
 
data/lib/karafka/contracts/consumer_group_topic.rb CHANGED
@@ -3,9 +3,7 @@
3
3
  module Karafka
4
4
  module Contracts
5
5
  # Consumer group topic validation rules.
6
- class ConsumerGroupTopic < Dry::Validation::Contract
7
- config.messages.load_paths << File.join(Karafka.gem_root, 'config', 'errors.yml')
8
-
6
+ class ConsumerGroupTopic < Base
9
7
  params do
10
8
  required(:consumer).filled
11
9
  required(:deserializer).filled
data/lib/karafka/contracts/server_cli_options.rb CHANGED
@@ -3,9 +3,7 @@
3
3
  module Karafka
4
4
  module Contracts
5
5
  # Contract for validating correctness of the server cli command options.
6
- class ServerCliOptions < Dry::Validation::Contract
7
- config.messages.load_paths << File.join(Karafka.gem_root, 'config', 'errors.yml')
8
-
6
+ class ServerCliOptions < Base
9
7
  params do
10
8
  optional(:consumer_groups).value(:array, :filled?)
11
9
  end
data/lib/karafka/errors.rb CHANGED
@@ -43,5 +43,9 @@ module Karafka
43
43
 
44
44
  # Raised when the license token is not valid
45
45
  InvalidLicenseTokenError = Class.new(BaseError)
46
+
47
+ # Used to instrument this error into the error notifications
48
+ # We do not raise it so we won't crash deployed systems
49
+ ExpiredLicenseTokenError = Class.new(BaseError)
46
50
  end
47
51
  end
data/lib/karafka/instrumentation/stdout_listener.rb CHANGED
@@ -88,6 +88,9 @@ module Karafka
88
88
  when 'connection.listener.fetch_loop.error'
89
89
  error "Listener fetch loop error: #{error}"
90
90
  error details
91
+ when 'licenser.expired'
92
+ error error
93
+ error details
91
94
  when 'runner.call.error'
92
95
  fatal "Runner crashed due to an error: #{error}"
93
96
  fatal details
data/lib/karafka/licenser.rb CHANGED
@@ -26,7 +26,7 @@ module Karafka
26
26
  data = nil
27
27
  end
28
28
 
29
- details = data ? JSON.parse(data) : raise_invalid_license_token
29
+ details = data ? JSON.parse(data) : raise_invalid_license_token(license_config)
30
30
 
31
31
  license_config.entity = details.fetch('entity')
32
32
  license_config.expires_on = Date.parse(details.fetch('expires_on'))
@@ -39,7 +39,11 @@ module Karafka
39
39
  private
40
40
 
41
41
  # Raises an error with info, that used token is invalid
42
- def raise_invalid_license_token
42
+ # @param license_config [Dry::Configurable::Config]
43
+ def raise_invalid_license_token(license_config)
44
+ # We set it to false so `Karafka.pro?` method behaves as expected
45
+ license_config.token = false
46
+
43
47
  raise(
44
48
  Errors::InvalidLicenseTokenError,
45
49
  <<~MSG.tr("\n", ' ')
@@ -50,15 +54,22 @@ module Karafka
50
54
  end
51
55
 
52
56
  # We do not raise an error here as we don't want to cause any problems to someone that runs
53
- # Karafka on production. Error is enough.
57
+ # Karafka on production. Error message is enough.
58
+ #
54
59
  # @param expires_on [Date] when the license expires
55
60
  def notify_if_license_expired(expires_on)
56
- Karafka.logger.error(
57
- <<~MSG.tr("\n", ' ')
58
- Your license expired on #{expires_on}.
59
- Karafka no longer uses any Pro capabilities.
60
- Please reach us at contact@karafka.io or visit https://karafka.io to obtain a valid one.
61
- MSG
61
+ message = <<~MSG.tr("\n", ' ')
62
+ Your license expired on #{expires_on}.
63
+ Please reach us at contact@karafka.io or visit https://karafka.io to obtain a valid one.
64
+ MSG
65
+
66
+ Karafka.logger.error(message)
67
+
68
+ Karafka.monitor.instrument(
69
+ 'error.occurred',
70
+ caller: self,
71
+ error: Errors::ExpiredLicenseTokenError.new(message),
72
+ type: 'licenser.expired'
62
73
  )
63
74
  end
64
75
  end
data/lib/karafka/messages/batch_metadata.rb CHANGED
@@ -14,6 +14,8 @@ module Karafka
14
14
  :partition,
15
15
  :topic,
16
16
  :scheduled_at,
17
+ :consumption_lag,
18
+ :processing_lag,
17
19
  keyword_init: true
18
20
  )
19
21
  end
data/lib/karafka/messages/builders/batch_metadata.rb CHANGED
@@ -12,7 +12,12 @@ module Karafka
12
12
  # @param topic [Karafka::Routing::Topic] topic for which we've fetched the batch
13
13
  # @param scheduled_at [Time] moment when the batch was scheduled for processing
14
14
  # @return [Karafka::Messages::BatchMetadata] batch metadata object
15
+ # @note Regarding the time lags: we can use the current time here, as batch metadata is
16
+ # created in the worker. So whenever this is being built, it means that the processing
17
+ # of this batch has already started.
15
18
  def call(kafka_batch, topic, scheduled_at)
19
+ now = Time.now
20
+
16
21
  Karafka::Messages::BatchMetadata.new(
17
22
  size: kafka_batch.count,
18
23
  first_offset: kafka_batch.first.offset,
@@ -20,9 +25,26 @@ module Karafka
20
25
  deserializer: topic.deserializer,
21
26
  partition: kafka_batch[0].partition,
22
27
  topic: topic.name,
23
- scheduled_at: scheduled_at
28
+ scheduled_at: scheduled_at,
29
+ # This lag describes how long did it take for a message to be consumed from the
30
+ # moment it was created
31
+ consumption_lag: time_distance_in_ms(now, kafka_batch.last.timestamp),
32
+ # This lag describes how long did a batch have to wait before it was picked up by
33
+ # one of the workers
34
+ processing_lag: time_distance_in_ms(now, scheduled_at)
24
35
  ).freeze
25
36
  end
37
+
38
+ private
39
+
40
+ # Computes time distance in between two times in ms
41
+ #
42
+ # @param time1 [Time]
43
+ # @param time2 [Time]
44
+ # @return [Integer] distance in between two times in ms
45
+ def time_distance_in_ms(time1, time2)
46
+ ((time1 - time2) * 1_000).round
47
+ end
26
48
  end
27
49
  end
28
50
  end
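The new lag fields both reduce to the same helper: the millisecond distance between two times, where `consumption_lag` measures from the last message's timestamp and `processing_lag` from `scheduled_at`. The helper in isolation:

```ruby
# Millisecond distance between two times, as used for both lag fields above
def time_distance_in_ms(time1, time2)
  ((time1 - time2) * 1_000).round
end

now = Time.at(1_000.0)
scheduled_at = Time.at(999.75)

# The batch waited 0.25s between scheduling and pickup by a worker
puts time_distance_in_ms(now, scheduled_at) # prints 250
```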
data/lib/karafka/pro/active_job/dispatcher.rb ADDED
@@ -0,0 +1,58 @@
1
+ # frozen_string_literal: true
2
+
3
+ # This Karafka component is a Pro component.
4
+ # All of the commercial components are present in the lib/karafka/pro directory of this repository
5
+ # and their usage requires commercial license agreement.
6
+ #
7
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
8
+ #
9
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
10
+ # your code to Maciej Mensfeld.
11
+
12
+ module Karafka
13
+ module Pro
14
+ # Karafka Pro ActiveJob components
15
+ module ActiveJob
16
+ # Pro dispatcher that sends the ActiveJob job to a proper topic based on the queue name
17
+ # and that allows injecting additional options into the producer, which enables much
18
+ # better and more granular control over the dispatch and consumption process.
19
+ class Dispatcher < ::Karafka::ActiveJob::Dispatcher
20
+ # Defaults for dispatching
21
+ # They can be updated by using `#karafka_options` on the job
22
+ DEFAULTS = {
23
+ dispatch_method: :produce_async,
24
+ # We don't create a dummy proc based partitioner as we would have to evaluate it with
25
+ # each job.
26
+ partitioner: nil
27
+ }.freeze
28
+
29
+ private_constant :DEFAULTS
30
+
31
+ # @param job [ActiveJob::Base] job
32
+ def call(job)
33
+ ::Karafka.producer.public_send(
34
+ fetch_option(job, :dispatch_method, DEFAULTS),
35
+ dispatch_details(job).merge!(
36
+ topic: job.queue_name,
37
+ payload: ::ActiveSupport::JSON.encode(job.serialize)
38
+ )
39
+ )
40
+ end
41
+
42
+ private
43
+
44
+ # @param job [ActiveJob::Base] job instance
45
+ # @return [Hash] hash with dispatch details to which we merge topic and payload
46
+ def dispatch_details(job)
47
+ partitioner = fetch_option(job, :partitioner, DEFAULTS)
48
+
49
+ return {} unless partitioner
50
+
51
+ {
52
+ partition_key: partitioner.call(job)
53
+ }
54
+ end
55
+ end
56
+ end
57
+ end
58
+ end
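The Pro `dispatch_details` above merges a `partition_key` into the producer arguments only when a partitioner is configured. A self-contained sketch of that behaviour, using a plain hash as a hypothetical stand-in for an ActiveJob instance:

```ruby
# Mirrors the Pro dispatcher's dispatch_details: without a partitioner the
# extra dispatch options are empty, with one its result becomes partition_key
def dispatch_details(job, partitioner)
  return {} unless partitioner

  { partition_key: partitioner.call(job) }
end

# Hypothetical partitioner keying all of a user's jobs to one partition
by_user = ->(job) { job[:user_id].to_s }

puts dispatch_details({ user_id: 42 }, by_user)[:partition_key] # prints 42
puts dispatch_details({ user_id: 42 }, nil).empty? # prints true
```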
data/lib/karafka/pro/active_job/job_options_contract.rb ADDED
@@ -0,0 +1,27 @@
1
+ # frozen_string_literal: true
2
+
3
+ # This Karafka component is a Pro component.
4
+ # All of the commercial components are present in the lib/karafka/pro directory of this repository
5
+ # and their usage requires commercial license agreement.
6
+ #
7
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
8
+ #
9
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
10
+ # your code to Maciej Mensfeld.
11
+
12
+ module Karafka
13
+ module Pro
14
+ module ActiveJob
15
+ # Contract for validating the options that can be altered with `#karafka_options` per job
16
+ # class that works with Pro features.
17
+ class JobOptionsContract < ::Karafka::ActiveJob::JobOptionsContract
18
+ # Dry types
19
+ Types = include Dry.Types()
20
+
21
+ params do
22
+ optional(:partitioner).value(Types.Interface(:call))
23
+ end
24
+ end
25
+ end
26
+ end
27
+ end
data/lib/karafka/pro/loader.rb ADDED
@@ -0,0 +1,29 @@
1
+ # frozen_string_literal: true
2
+
3
+ # This Karafka component is a Pro component.
4
+ # All of the commercial components are present in the lib/karafka/pro directory of this repository
5
+ # and their usage requires commercial license agreement.
6
+ #
7
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
8
+ #
9
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
10
+ # your code to Maciej Mensfeld.
11
+ module Karafka
12
+ module Pro
13
+ # Loader requires and loads all the pro components only when they are needed
14
+ class Loader
15
+ class << self
16
+ # Loads all the pro components and configures them wherever it is expected
17
+ # @param config [Dry::Configurable::Config] whole app config that we can alter with pro
18
+ # components
19
+ def setup(config)
20
+ require_relative 'active_job/dispatcher'
21
+ require_relative 'active_job/job_options_contract'
22
+
23
+ config.internal.active_job.dispatcher = ActiveJob::Dispatcher.new
24
+ config.internal.active_job.job_options_contract = ActiveJob::JobOptionsContract.new
25
+ end
26
+ end
27
+ end
28
+ end
29
+ end
data/lib/karafka/pro.rb ADDED
@@ -0,0 +1,13 @@
1
+ # frozen_string_literal: true
2
+
3
+ # This Karafka component is a Pro component.
4
+ # All of the commercial components are present in the lib/karafka/pro directory of this repository
5
+ # and their usage requires commercial license agreement.
6
+ #
7
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
8
+ #
9
+ module Karafka
10
+ # Namespace for pro components, licensed under the commercial license agreement.
11
+ module Pro
12
+ end
13
+ end
@@ -41,7 +41,7 @@ module Karafka
      # We signal critical exceptions, notify and do not allow worker to fail
      # rubocop:disable Lint/RescueException
    rescue Exception => e
-     # rubocop:enable Lint/RescueException
+       # rubocop:enable Lint/RescueException
      Karafka.monitor.instrument(
        'error.occurred',
        caller: self,
@@ -19,6 +19,7 @@ end
  if rails
    # Load Karafka
    require 'karafka'
+
    # Load ActiveJob adapter
    require 'active_job/karafka'

@@ -30,6 +31,13 @@ if rails
    class Railtie < Rails::Railtie
      railtie_name :karafka

+     initializer 'karafka.active_job_integration' do
+       ActiveSupport.on_load(:active_job) do
+         # Extend ActiveJob with some Karafka specific ActiveJob magic
+         extend ::Karafka::ActiveJob::JobExtensions
+       end
+     end
+
      initializer 'karafka.configure_rails_initialization' do |app|
        # Consumers should autoload by default in the Rails app so they are visible
        app.config.autoload_paths += %w[app/consumers]
@@ -37,8 +45,10 @@ if rails
        # Make Karafka use Rails logger
        ::Karafka::App.config.logger = Rails.logger

-       # This lines will make Karafka print to stdout like puma or unicorn
-       if Rails.env.development?
+       # This lines will make Karafka print to stdout like puma or unicorn when we run karafka
+       # server + will support code reloading with each fetched loop. We do it only for karafka
+       # based commands as Rails processes and console will have it enabled already
+       if Rails.env.development? && ENV.key?('KARAFKA_CLI')
          Rails.logger.extend(
            ActiveSupport::Logger.broadcast(
              ActiveSupport::Logger.new($stdout)
@@ -10,11 +10,6 @@ module Karafka
    #     end
    #   end
    class Builder < Concurrent::Array
-     # Consumer group consistency checking contract
-     CONTRACT = Karafka::Contracts::ConsumerGroup.new.freeze
-
-     private_constant :CONTRACT
-
      def initialize
        @draws = Concurrent::Array.new
        super
@@ -38,12 +33,7 @@ module Karafka
        instance_eval(&block)

        each do |consumer_group|
-         hashed_group = consumer_group.to_h
-         validation_result = CONTRACT.call(hashed_group)
-
-         next if validation_result.success?
-
-         raise Errors::InvalidConfigurationError, validation_result.errors.to_h
+         Contracts::ConsumerGroup.new.validate!(consumer_group.to_h)
        end
      end

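The hunk above collapses the inline `CONTRACT.call` + `success?` + `raise` dance into a single `validate!` call on a fresh contract instance. The actual helper lives in the new `lib/karafka/contracts/base.rb` (not shown in this chunk) and wraps dry-validation; the following is only a hypothetical plain-Ruby sketch of the pattern, with illustrative class and error names:

```ruby
# Hypothetical sketch of the validate! pattern: run the validation, raise
# with the error details on failure, return true otherwise. The real Karafka
# base contract wraps dry-validation; all names here are stand-ins.
InvalidConfigurationError = Class.new(StandardError)

class ConsumerGroupContract
  # Returns a hash of errors; an empty hash means the input is valid
  def call(hash)
    errors = {}
    unless hash[:id].is_a?(String) && !hash[:id].empty?
      errors[:id] = 'must be a non-empty string'
    end
    errors
  end

  # Raises on invalid input so callers do not have to branch on the result
  def validate!(hash)
    errors = call(hash)
    return true if errors.empty?

    raise InvalidConfigurationError, errors.to_s
  end
end

contract = ConsumerGroupContract.new
contract.validate!(id: 'my_group') # valid input, returns true
```

Moving the raise into the contract keeps every call site to a single line, which is exactly what the builder and config hunks in this diff take advantage of.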
@@ -20,7 +20,7 @@ module Karafka

      # @return [String] consumer group id
      def consumer_group_id
-       kafka['group.id']
+       kafka[:'group.id']
      end

      # @return [Integer] max messages fetched in a single go
@@ -39,12 +39,12 @@ module Karafka
      def kafka
        kafka = @topics.first.kafka.dup

-       kafka['client.id'] ||= Karafka::App.config.client_id
-       kafka['group.id'] ||= @topics.first.consumer_group.id
-       kafka['auto.offset.reset'] ||= 'earliest'
+       kafka[:'client.id'] ||= Karafka::App.config.client_id
+       kafka[:'group.id'] ||= @topics.first.consumer_group.id
+       kafka[:'auto.offset.reset'] ||= 'earliest'
        # Karafka manages the offsets based on the processing state, thus we do not rely on the
        # rdkafka offset auto-storing
-       kafka['enable.auto.offset.store'] = 'false'
+       kafka[:'enable.auto.offset.store'] = 'false'
        kafka.freeze
        kafka
      end
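The hunk above switches the kafka option keys from strings to symbols (`:'group.id'` instead of `'group.id'`). A plain-Ruby sketch of the resulting defaulting behaviour, with a hypothetical user-provided hash (the `defaults` constant mirrors `KAFKA_DEFAULTS` from the config change in this same release):

```ruby
# Sketch of the defaulting logic with symbol keys: `||=` only fills values
# that are absent, and offset auto-storing is always forced off because
# Karafka manages offsets based on the processing state.
defaults = { 'client.id': 'karafka' }.freeze # Ruby's "'a.b': v" syntax builds a :'a.b' symbol key

kafka = { 'group.id': 'my-consumer-group' } # stand-in for user-configured settings
defaults.each { |key, value| kafka[key] ||= value }
kafka[:'auto.offset.reset'] ||= 'earliest'
kafka[:'enable.auto.offset.store'] = 'false'
kafka.freeze
```

Note that `kafka['group.id']` and `kafka[:'group.id']` address different keys in the same hash, which is why every accessor in the hunk had to change together with the defaults.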
@@ -14,15 +14,12 @@ module Karafka
    class Config
      extend Dry::Configurable

-     # Contract for checking the config provided by the user
-     CONTRACT = Karafka::Contracts::Config.new.freeze
-
      # Defaults for kafka settings, that will be overwritten only if not present already
      KAFKA_DEFAULTS = {
-       'client.id' => 'karafka'
+       'client.id': 'karafka'
      }.freeze

-     private_constant :CONTRACT, :KAFKA_DEFAULTS
+     private_constant :KAFKA_DEFAULTS

      # Available settings

@@ -84,8 +81,6 @@ module Karafka
      setting :kafka, default: {}

      # Namespace for internal settings that should not be modified
-     # It's a temporary step to "declassify" several things internally before we move to a
-     # non global state
      setting :internal do
        # option routing_builder [Karafka::Routing::Builder] builder instance
        setting :routing_builder, default: Routing::Builder.new
@@ -98,6 +93,17 @@ module Karafka
        # option subscription_groups_builder [Routing::SubscriptionGroupsBuilder] subscription
        #   group builder
        setting :subscription_groups_builder, default: Routing::SubscriptionGroupsBuilder.new
+
+       # Karafka components for ActiveJob
+       setting :active_job do
+         # option dispatcher [Karafka::ActiveJob::Dispatcher] default dispatcher for ActiveJob
+         setting :dispatcher, default: ActiveJob::Dispatcher.new
+         # option job_options_contract [Karafka::Contracts::JobOptionsContract] contract for
+         #   ensuring, that extra job options defined are valid
+         setting :job_options_contract, default: ActiveJob::JobOptionsContract.new
+         # option consumer [Class] consumer class that should be used to consume ActiveJob data
+         setting :consumer, default: ActiveJob::Consumer
+       end
      end

      class << self
@@ -106,7 +112,7 @@ module Karafka
        def setup(&block)
          configure(&block)
          merge_kafka_defaults!(config)
-         validate!
+         Contracts::Config.new.validate!(config.to_h)

          # Check the license presence (if needed) and
          Licenser.new.verify(config.license)
@@ -128,18 +134,6 @@ module Karafka
          end
        end

-       # Validate config based on the config contract
-       # @return [Boolean] true if configuration is valid
-       # @raise [Karafka::Errors::InvalidConfigurationError] raised when configuration
-       #   doesn't match with the config contract
-       def validate!
-         validation_result = CONTRACT.call(config.to_h)
-
-         return true if validation_result.success?
-
-         raise Errors::InvalidConfigurationError, validation_result.errors.to_h
-       end
-
        # Sets up all the components that are based on the user configuration
        # @note At the moment it is only WaterDrop
        def configure_components
@@ -149,6 +143,12 @@ module Karafka
            producer_config.kafka = config.kafka.dup
            producer_config.logger = config.logger
          end
+
+         return unless Karafka.pro?
+
+         # Runs the pro loader that includes all the pro components
+         require 'karafka/pro/loader'
+         Pro::Loader.setup(config)
        end
      end
    end
@@ -3,5 +3,5 @@
  # Main module namespace
  module Karafka
    # Current Karafka version
-   VERSION = '2.0.0.alpha1'
+   VERSION = '2.0.0.alpha2'
  end
data/lib/karafka.rb CHANGED
@@ -65,6 +65,11 @@ module Karafka
      Pathname.new(File.expand_path('karafka', __dir__))
    end

+   # @return [Boolean] true if there is a valid pro token present
+   def pro?
+     App.config.license.token != false
+   end
+
    # @return [String] path to a default file that contains booting procedure etc
    # @note By default it is a file called 'karafka.rb' but it can be specified as you wish if you
    #   have Karafka that is merged into a Sinatra/Rails app and karafka.rb is taken.
@@ -82,8 +87,8 @@ end

  loader = Zeitwerk::Loader.for_gem
  # Do not load Rails extensions by default, this will be handled by Railtie if they are needed
- loader.do_not_eager_load(Karafka.gem_root.join('lib/active_job'))
+ loader.ignore(Karafka.gem_root.join('lib/active_job'))
  # Do not load pro components, this will be handled by license manager
- loader.do_not_eager_load(Karafka.gem_root.join('lib/karafka/pro'))
+ loader.ignore(Karafka.gem_root.join('lib/karafka/pro'))
  loader.setup
  loader.eager_load
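The new `Karafka.pro?` predicate above gates the pro loader: the license token is expected to default to `false`, so any other value means a commercial token is present. A minimal sketch, where the `License` struct is a stand-in for Karafka's `config.license` node:

```ruby
# Stand-in for the config.license node; in Karafka the token defaults to false
License = Struct.new(:token)

# Mirrors the pro? check from the diff: anything other than false counts as
# a configured token (including nil or an empty string, for better or worse)
def pro?(license)
  license.token != false
end

pro?(License.new(false))    # no token configured
pro?(License.new('abc123')) # commercial token present
```

Note that this check is purely presence-based; actual token verification happens separately in `Licenser#verify`, which `Config.setup` calls right after validation.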
data.tar.gz.sig CHANGED
Binary file
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: karafka
  version: !ruby/object:Gem::Version
-   version: 2.0.0.alpha1
+   version: 2.0.0.alpha2
  platform: ruby
  authors:
  - Maciej Mensfeld
@@ -34,7 +34,7 @@ cert_chain:
    R2P11bWoCtr70BsccVrN8jEhzwXngMyI2gVt750Y+dbTu1KgRqZKp/ECe7ZzPzXj
    pIy9vHxTANKYVyI4qj8OrFdEM5BQNu8oQpL0iQ==
    -----END CERTIFICATE-----
- date: 2022-01-30 00:00:00.000000000 Z
+ date: 2022-02-19 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: dry-configurable
@@ -112,7 +112,7 @@ dependencies:
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
-       version: 2.1.0
+       version: 2.2.0
    - - "<"
      - !ruby/object:Gem::Version
        version: 3.0.0
@@ -122,7 +122,7 @@ dependencies:
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
-       version: 2.1.0
+       version: 2.2.0
    - - "<"
      - !ruby/object:Gem::Version
        version: 3.0.0
@@ -178,11 +178,14 @@ files:
  - config/errors.yml
  - docker-compose.yml
  - karafka.gemspec
- - lib/active_job/consumer.rb
  - lib/active_job/karafka.rb
  - lib/active_job/queue_adapters/karafka_adapter.rb
- - lib/active_job/routing_extensions.rb
  - lib/karafka.rb
+ - lib/karafka/active_job/consumer.rb
+ - lib/karafka/active_job/dispatcher.rb
+ - lib/karafka/active_job/job_extensions.rb
+ - lib/karafka/active_job/job_options_contract.rb
+ - lib/karafka/active_job/routing_extensions.rb
  - lib/karafka/app.rb
  - lib/karafka/base_consumer.rb
  - lib/karafka/cli.rb
@@ -197,6 +200,7 @@ files:
  - lib/karafka/connection/pauses_manager.rb
  - lib/karafka/connection/rebalance_manager.rb
  - lib/karafka/contracts.rb
+ - lib/karafka/contracts/base.rb
  - lib/karafka/contracts/config.rb
  - lib/karafka/contracts/consumer_group.rb
  - lib/karafka/contracts/consumer_group_topic.rb
@@ -221,6 +225,10 @@ files:
  - lib/karafka/messages/metadata.rb
  - lib/karafka/messages/seek.rb
  - lib/karafka/patches/rdkafka/consumer.rb
+ - lib/karafka/pro.rb
+ - lib/karafka/pro/active_job/dispatcher.rb
+ - lib/karafka/pro/active_job/job_options_contract.rb
+ - lib/karafka/pro/loader.rb
  - lib/karafka/process.rb
  - lib/karafka/processing/executor.rb
  - lib/karafka/processing/executors_buffer.rb
@@ -275,7 +283,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
    - !ruby/object:Gem::Version
      version: 1.3.1
  requirements: []
- rubygems_version: 3.3.4
+ rubygems_version: 3.3.3
  signing_key:
  specification_version: 4
  summary: Ruby based framework for working with Apache Kafka
metadata.gz.sig CHANGED
Binary file
@@ -1,22 +0,0 @@
- # frozen_string_literal: true
-
- module ActiveJob
-   # This is the consumer for ActiveJob that eats the messages enqueued with it one after another.
-   # It marks the offset after each message, so we make sure, none of the jobs is executed twice
-   class Consumer < Karafka::BaseConsumer
-     # Executes the ActiveJob logic
-     # @note ActiveJob does not support batches, so we just run one message after another
-     def consume
-       messages.each do |message|
-         ActiveJob::Base.execute(
-           # We technically speaking could set this as deserializer and reference it from the
-           # message instead of using the `#raw_payload`. This is not done on purpose to simplify
-           # the ActiveJob setup here
-           ActiveSupport::JSON.decode(message.raw_payload)
-         )
-
-         mark_as_consumed(message)
-       end
-     end
-   end
- end
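The file removed above moves to `lib/karafka/active_job/consumer.rb` in this release, but its consumption loop stays the same in spirit: decode each JSON payload, execute the job, then mark the offset. A self-contained sketch of that loop, where `Message`, the `executed` log and the `marked` list are stand-ins for the Karafka message object, `ActiveJob::Base.execute` and `mark_as_consumed`:

```ruby
require 'json'

# Stand-in for a Karafka message wrapping a raw Kafka payload
Message = Struct.new(:raw_payload)

messages = [
  Message.new('{"job_class":"WelcomeJob","arguments":[1]}'),
  Message.new('{"job_class":"CleanupJob","arguments":[]}')
]

executed = []
marked = []

# One job at a time, marking after each, so a crash mid-batch never
# re-runs jobs that already completed
messages.each do |message|
  job_data = JSON.parse(message.raw_payload)
  executed << job_data['job_class'] # stands in for ActiveJob::Base.execute(job_data)
  marked << message                 # stands in for mark_as_consumed(message)
end
```

Marking inside the loop rather than once per batch is the key design choice the original comment calls out: it trades a little throughput for at-most-once execution of each job after a restart.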
@@ -1,15 +0,0 @@
- # frozen_string_literal: true
-
- module ActiveJob
-   # Routing extensions for ActiveJob
-   module RoutingExtensions
-     # This method simplifies routes definition for ActiveJob topics / queues by auto-injecting the
-     # consumer class
-     # @param name [String, Symbol] name of the topic where ActiveJobs jobs should go
-     def active_job_topic(name)
-       topic(name) do
-         consumer ActiveJob::Consumer
-       end
-     end
-   end
- end