karafka 2.5.0.beta1 → 2.5.0.beta2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 4bcac2dd9a093cd85ab76342d9c227d552e9927e681e79f55963a9d13e64f79d
- data.tar.gz: 8b5e32c1c1099b8e654599c6df25c33816c3a4a315ffe7e0de06c49a2b4e720e
+ metadata.gz: bbcaf396d7f2eff35ec3b59c96ffe7dd880f1f07294aa28bb623e95ce328e3e9
+ data.tar.gz: 410b037a79abbbc82fbd4b540dcb26378dca396e408f68280325e354ae2e275b
  SHA512:
- metadata.gz: d2ffa6709d42103eea487e44b6756d9b449bd762b162e8ce35283b2513b26c2fde8ef559b900c7932d1fe4e3e8c771d9874d1ed54a572e3e94f6df067eb354d8
- data.tar.gz: 9bfafb2b2c8ec8975e55fe1bddeaaedfb4d7fd1cedae36abdfed45819e8136eb614922d4f52c0b011d97602ad728dc7a860a3025e9ab1694720b0f35af00429d
+ metadata.gz: dcd79f3bda653b74d95938d440ccfe93d8d2cdc214e02de31d9ad56f738b554699bd1991d83dd1c9c7090e1c3a63669f8837677fef77efc22df4804bc16f25a0
+ data.tar.gz: dbca53f433a0e13ec7582f9ee1c6da15bf38678a1019a3a36e361e7bbbae2c5e5777bf44e61330692b5f38c4fa011dd9e3950027925767908f617df02e1e8219
@@ -31,7 +31,7 @@ jobs:
  fetch-depth: 0

  - name: Set up Ruby
- uses: ruby/setup-ruby@e34163cd15f4bb403dcd72d98e295997e6a55798 # v1.238.0
+ uses: ruby/setup-ruby@bb0f760b6c925183520ee0bcc9c4a432a7c8c3c6 # v1.241.0
  with:
  ruby-version: 3.4
  bundler-cache: true
@@ -118,7 +118,7 @@ jobs:
  run: rm -f Gemfile.lock

  - name: Set up Ruby
- uses: ruby/setup-ruby@e34163cd15f4bb403dcd72d98e295997e6a55798 # v1.238.0
+ uses: ruby/setup-ruby@bb0f760b6c925183520ee0bcc9c4a432a7c8c3c6 # v1.241.0
  with:
  ruby-version: ${{matrix.ruby}}
  bundler-cache: true
@@ -164,7 +164,7 @@ jobs:
  docker compose up -d || (sleep 5 && docker compose up -d)

  - name: Set up Ruby
- uses: ruby/setup-ruby@e34163cd15f4bb403dcd72d98e295997e6a55798 # v1.238.0
+ uses: ruby/setup-ruby@bb0f760b6c925183520ee0bcc9c4a432a7c8c3c6 # v1.241.0
  with:
  # Do not use cache here as we run bundle install also later in some of the integration
  # tests and we need to be able to run it without cache
@@ -228,7 +228,7 @@ jobs:
  docker compose up -d || (sleep 5 && docker compose up -d)

  - name: Set up Ruby
- uses: ruby/setup-ruby@e34163cd15f4bb403dcd72d98e295997e6a55798 # v1.238.0
+ uses: ruby/setup-ruby@bb0f760b6c925183520ee0bcc9c4a432a7c8c3c6 # v1.241.0
  with:
  ruby-version: ${{matrix.ruby}}
  bundler: 'latest'
@@ -0,0 +1,36 @@
+ name: Push Gem
+
+ on:
+ push:
+ tags:
+ - v*
+
+ permissions:
+ contents: read
+
+ jobs:
+ push:
+ if: github.repository_owner == 'karafka'
+ runs-on: ubuntu-latest
+ environment: deployment
+
+ permissions:
+ contents: write
+ id-token: write
+
+ steps:
+ - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+ with:
+ fetch-depth: 0
+
+ - name: Set up Ruby
+ uses: ruby/setup-ruby@bb0f760b6c925183520ee0bcc9c4a432a7c8c3c6 # v1.241.0
+ with:
+ bundler-cache: false
+
+ - name: Bundle install
+ run: |
+ bundle install --jobs 4 --retry 3
+
+ # Release
+ - uses: rubygems/release-gem@9e85cb11501bebc2ae661c1500176316d3987059 # v1
data/CHANGELOG.md CHANGED
@@ -1,6 +1,6 @@
  # Karafka Framework Changelog

- ## 2.4.19 (Unreleased)
+ ## 2.5.0 (Unreleased)
  - **[Breaking]** Use DLQ and Piping prefix `source_` instead of `original_` to align with naming convention of Kafka Streams and Apache Flink for future usage.
  - **[Breaking]** Rename scheduled jobs topics names in their config (Pro).
  - **[Feature]** Parallel Segments for concurrent processing of the same partition with more than partition count of processes (Pro).
@@ -30,6 +30,11 @@
  - [Enhancement] Set `topic.metadata.refresh.interval.ms` for default producer in dev to 5s to align with consumer setup.
  - [Enhancement] Alias `-2` and `-1` with `latest` and `earliest` for seeking.
  - [Enhancement] Allow for usage of `latest` and `earliest` in the `Karafka::Pro::Iterator`.
+ - [Enhancement] Failures during `topics migrate` (and other subcommands) don't show what topic failed, and why it's invalid.
+ - [Enhancement] Apply changes to topics configuration in atomic independent requests when using Declarative Topics.
+ - [Enhancement] Execute the help CLI command when no command provided (similar to Rails) to improve DX.
+ - [Enhancement] Remove backtrace from the CLI error for incorrect commands (similar to Rails) to improve DX.
+ - [Enhancement] Provide `karafka topics help` sub-help due to nesting of Declarative Topics actions.
  - [Refactor] Introduce a `bin/verify_kafka_warnings` script to clean Kafka from temporary test-suite topics.
  - [Refactor] Introduce a `bin/verify_topics_naming` script to ensure proper test topics naming convention.
  - [Refactor] Make sure all temporary topics have a `it-` prefix in their name.
@@ -52,6 +57,8 @@
  - [Fix] optparse double parse loses ARGV.
  - [Fix] `karafka` cannot be required without Bundler.
  - [Fix] Scheduled Messages re-seek moves to `latest` on inheritance of initial offset when `0` offset is compacted.
+ - [Fix] Seek to `:latest` without `topic_partition_position` (-1) will not seek at all.
+ - [Change] Move to trusted-publishers and remove signing since no longer needed.

  ## 2.4.18 (2025-04-09)
  - [Fix] Make sure `Bundler.with_unbundled_env` is not called multiple times.
data/Gemfile CHANGED
@@ -16,9 +16,9 @@ group :integrations, :test do
  end

  group :integrations do
- # gem 'activejob', require: false
- # gem 'karafka-testing', '>= 2.4.6', require: false
- # gem 'karafka-web', '>= 0.10.4', require: false
+ gem 'activejob', require: false
+ gem 'karafka-testing', '>= 2.5.0', require: false
+ gem 'karafka-web', '>= 0.11.0.beta1', require: false
  end

  group :test do
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- karafka (2.5.0.beta1)
+ karafka (2.5.0.beta2)
  base64 (~> 0.2)
  karafka-core (>= 2.5.0, < 2.6.0)
  karafka-rdkafka (>= 0.19.2)
@@ -11,6 +11,9 @@ PATH
  GEM
  remote: https://rubygems.org/
  specs:
+ activejob (8.0.2)
+ activesupport (= 8.0.2)
+ globalid (>= 0.3.6)
  activesupport (8.0.2)
  base64
  benchmark (>= 0.3)
@@ -33,6 +36,7 @@ GEM
  diff-lcs (1.6.2)
  docile (1.4.1)
  drb (2.2.3)
+ erubi (1.13.1)
  et-orbi (1.2.11)
  tzinfo
  factory_bot (6.5.1)
@@ -51,21 +55,35 @@ GEM
  fugit (1.11.1)
  et-orbi (~> 1, >= 1.2.11)
  raabro (~> 1.4)
+ globalid (1.2.1)
+ activesupport (>= 6.1)
  i18n (1.14.7)
  concurrent-ruby (~> 1.0)
- karafka-core (2.5.0)
+ karafka-core (2.5.1)
  karafka-rdkafka (>= 0.19.2, < 0.21.0)
  logger (>= 1.6.0)
- karafka-rdkafka (0.19.2)
+ karafka-rdkafka (0.19.4)
  ffi (~> 1.15)
  mini_portile2 (~> 2.6)
  rake (> 12)
+ karafka-testing (2.5.0)
+ karafka (>= 2.5.0.beta1, < 2.6.0)
+ waterdrop (>= 2.8.0)
+ karafka-web (0.11.0.beta3)
+ erubi (~> 1.4)
+ karafka (>= 2.5.0.beta1, < 2.6.0)
+ karafka-core (>= 2.5.0, < 2.6.0)
+ roda (~> 3.68, >= 3.69)
+ tilt (~> 2.0)
  logger (1.7.0)
  mini_portile2 (2.8.9)
  minitest (5.25.5)
  ostruct (0.6.1)
  raabro (1.4.0)
+ rack (3.1.15)
  rake (13.2.1)
+ roda (3.92.0)
+ rack
  rspec (3.13.0)
  rspec-core (~> 3.13.0)
  rspec-expectations (~> 3.13.0)
@@ -87,14 +105,15 @@ GEM
  simplecov-html (0.13.1)
  simplecov_json_formatter (0.1.4)
  stringio (3.1.7)
+ tilt (2.6.0)
  tzinfo (2.0.6)
  concurrent-ruby (~> 1.0)
  uri (1.0.3)
- waterdrop (2.8.3)
+ waterdrop (2.8.4)
  karafka-core (>= 2.4.9, < 3.0.0)
- karafka-rdkafka (>= 0.19.1)
+ karafka-rdkafka (>= 0.19.2)
  zeitwerk (~> 2.3)
- zeitwerk (2.7.3)
+ zeitwerk (2.6.18)

  PLATFORMS
  aarch64-linux-gnu
@@ -110,10 +129,13 @@ PLATFORMS
  x86_64-linux-musl

  DEPENDENCIES
+ activejob
  byebug
  factory_bot
  fugit
  karafka!
+ karafka-testing (>= 2.5.0)
+ karafka-web (>= 0.11.0.beta1)
  ostruct
  rspec
  simplecov
data/README.md CHANGED
@@ -84,7 +84,7 @@ bundle exec karafka server

  I also sell Karafka Pro subscriptions. It includes a commercial-friendly license, priority support, architecture consultations, enhanced Web UI and high throughput data processing-related features (virtual partitions, long-running jobs, and more).

- **10%** of the income will be distributed back to other OSS projects that Karafka uses under the hood.
+ Part of the income is [distributed back](https://github.com/orgs/karafka/sponsoring) to other OSS projects that Karafka uses under the hood.

  Help me provide high-quality open-source software. Please see the Karafka [homepage](https://karafka.io/#become-pro) for more details.

data/Rakefile ADDED
@@ -0,0 +1,4 @@
+ # frozen_string_literal: true
+
+ require 'bundler/setup'
+ require 'bundler/gem_tasks'
data/bin/integrations CHANGED
@@ -48,7 +48,8 @@ class Scenario
  'instrumentation/post_errors_instrumentation_error_spec.rb' => [1].freeze,
  'cli/declaratives/delete/existing_with_exit_code_spec.rb' => [2].freeze,
  'cli/declaratives/create/new_with_exit_code_spec.rb' => [2].freeze,
- 'cli/declaratives/plan/when_changes_with_detailed_exit_code_spec.rb' => [2].freeze
+ 'cli/declaratives/plan/when_changes_with_detailed_exit_code_spec.rb' => [2].freeze,
+ 'cli/declaratives/align/incorrectly_spec.rb' => [1].freeze
  }.freeze

  private_constant :MAX_RUN_TIME, :EXIT_CODES
data/karafka.gemspec CHANGED
@@ -29,11 +29,6 @@ Gem::Specification.new do |spec|

  spec.required_ruby_version = '>= 3.0.0'

- if $PROGRAM_NAME.end_with?('gem')
- spec.signing_key = File.expand_path('~/.ssh/gem-private_key.pem')
- end
-
- spec.cert_chain = %w[certs/cert.pem]
  spec.files = `git ls-files -z`.split("\x0").reject { |f| f.match(%r{^(spec)/}) }
  spec.executables = %w[karafka]
  spec.require_paths = %w[lib]
@@ -10,6 +10,10 @@ module Karafka
  #
  # Altering is done in the incremental way.
  module Configs
+ extend Helpers::ConfigImporter.new(
+ max_wait_time: %i[admin max_wait_time]
+ )
+
  class << self
  # Fetches given resources configurations from Kafka
  #
@@ -94,7 +98,7 @@ module Karafka
  # Makes sure that admin is closed afterwards.
  def with_admin_wait
  Admin.with_admin do |admin|
- yield(admin).wait(max_wait_timeout: Karafka::App.config.admin.max_wait_time)
+ yield(admin).wait(max_wait_timeout: max_wait_time)
  end
  end
  end
data/lib/karafka/admin.rb CHANGED
@@ -10,6 +10,15 @@ module Karafka
  # Cluster on which operations are performed can be changed via `admin.kafka` config, however
  # there is no multi-cluster runtime support.
  module Admin
+ extend Helpers::ConfigImporter.new(
+ max_wait_time: %i[admin max_wait_time],
+ poll_timeout: %i[admin poll_timeout],
+ max_attempts: %i[admin max_attempts],
+ group_id: %i[admin group_id],
+ app_kafka: %i[kafka],
+ admin_kafka: %i[admin kafka]
+ )
+
  # 2010-01-01 00:00:00 - way before Kafka was released so no messages should exist prior to
  # this date
  # We do not use the explicit -2 librdkafka value here because we resolve this offset without
@@ -113,7 +122,7 @@ module Karafka
  handler = admin.create_topic(name, partitions, replication_factor, topic_config)

  with_re_wait(
- -> { handler.wait(max_wait_timeout: app_config.admin.max_wait_time) },
+ -> { handler.wait(max_wait_timeout: max_wait_time) },
  -> { topics_names.include?(name) }
  )
  end
@@ -127,7 +136,7 @@ module Karafka
  handler = admin.delete_topic(name)

  with_re_wait(
- -> { handler.wait(max_wait_timeout: app_config.admin.max_wait_time) },
+ -> { handler.wait(max_wait_timeout: max_wait_time) },
  -> { !topics_names.include?(name) }
  )
  end
@@ -142,7 +151,7 @@ module Karafka
  handler = admin.create_partitions(name, partitions)

  with_re_wait(
- -> { handler.wait(max_wait_timeout: app_config.admin.max_wait_time) },
+ -> { handler.wait(max_wait_timeout: max_wait_time) },
  -> { topic_info(name).fetch(:partition_count) >= partitions }
  )
  end
@@ -353,7 +362,7 @@ module Karafka
  def delete_consumer_group(consumer_group_id)
  with_admin do |admin|
  handler = admin.delete_group(consumer_group_id)
- handler.wait(max_wait_timeout: app_config.admin.max_wait_time)
+ handler.wait(max_wait_timeout: max_wait_time)
  end
  end

@@ -539,7 +548,7 @@ module Karafka

  admin = config(:producer, {}).admin(
  native_kafka_auto_start: false,
- native_kafka_poll_timeout_ms: app_config.admin.poll_timeout
+ native_kafka_poll_timeout_ms: poll_timeout
  )

  bind_oauth(bind_id, admin)
@@ -604,7 +613,7 @@ module Karafka
  rescue Rdkafka::AbstractHandle::WaitTimeoutError, Errors::ResultNotVisibleError
  return if breaker.call

- retry if attempt <= app_config.admin.max_attempts
+ retry if attempt <= max_attempts

  raise
  end
@@ -613,11 +622,10 @@ module Karafka
  # @param settings [Hash] extra settings for config (if needed)
  # @return [::Rdkafka::Config] rdkafka config
  def config(type, settings)
- app_config
- .kafka
+ app_kafka
  .then(&:dup)
- .merge(app_config.admin.kafka)
- .tap { |config| config[:'group.id'] = app_config.admin.group_id }
+ .merge(admin_kafka)
+ .tap { |config| config[:'group.id'] = group_id }
  # We merge after setting the group id so it can be altered if needed
  # In general in admin we only should alter it when we need to impersonate a given
  # consumer group or do something similar
@@ -651,11 +659,6 @@ module Karafka
  offset
  end
  end
-
- # @return [Karafka::Core::Configurable::Node] root node config
- def app_config
- ::Karafka::App.config
- end
  end
  end
  end
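The `Helpers::ConfigImporter` mappings above replace the removed `app_config` helper: each symbol array is a path into `Karafka::App.config` that becomes a reader method. A rough sketch of that pattern, assuming Karafka is already loaded; this is an illustration only, not the gem's actual `ConfigImporter` implementation:

```ruby
# Illustrative approximation of the config-importer pattern (not Karafka's
# real Helpers::ConfigImporter). Each mapping defines a reader that walks a
# nested config path such as %i[admin max_wait_time].
class ConfigImporterSketch < Module
  # @param map [Hash{Symbol => Array<Symbol>}] reader name => config path
  def initialize(map)
    super()

    map.each do |name, path|
      # Methods defined on this anonymous module become class-level readers
      # once a class or module calls `extend ConfigImporterSketch.new(...)`
      define_method(name) do
        path.reduce(Karafka::App.config) { |node, key| node.public_send(key) }
      end
    end
  end
end

module AdminLike
  extend ConfigImporterSketch.new(max_wait_time: %i[admin max_wait_time])
end

# AdminLike.max_wait_time now resolves Karafka::App.config.admin.max_wait_time
```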
@@ -30,10 +30,13 @@ module Karafka
  return false
  end

- names = resources_to_migrate.map(&:name).join(', ')
- puts "Updating configuration of the following topics: #{names}"
- Karafka::Admin::Configs.alter(resources_to_migrate)
- puts "#{green('Updated')} all requested topics configuration."
+ resources_to_migrate.each do |resource|
+ supervised("Updating topic: #{resource.name} configuration") do
+ Karafka::Admin::Configs.alter(resource)
+ end
+
+ puts "#{green('Updated')} topic #{resource.name} configuration."
+ end

  true
  end
@@ -12,6 +12,23 @@ module Karafka

  private

+ # Used to run Karafka Admin commands that talk with Kafka and that can fail due to broker
+ # errors and other issues. We catch errors and provide nicer printed output prior to
+ # re-raising the mapped error for proper exit code status handling
+ #
+ # @param operation_message [String] message that we use to print that it is going to run
+ # and if case if failed with a failure indication.
+ def supervised(operation_message)
+ puts "#{operation_message}..."
+
+ yield
+ rescue Rdkafka::RdkafkaError => e
+ puts "#{operation_message} #{red('failed')}:"
+ puts e
+
+ raise Errors::CommandValidationError, cause: e
+ end
+
  # @return [Array<Karafka::Routing::Topic>] all available topics that can be managed
  # @note If topic is defined in multiple consumer groups, first config will be used. This
  # means, that this CLI will not work for simultaneous management of multiple clusters
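The subcommands in the hunks that follow wrap their Admin calls with this `supervised` helper. A minimal usage sketch, with a made-up topic name and settings:

```ruby
# Hypothetical subcommand body; 'example_topic' and its settings are
# placeholders, not values shipped in this release.
supervised("Creating topic example_topic") do
  Karafka::Admin.create_topic('example_topic', 3, 1, 'cleanup.policy': 'compact')
end
# On Rdkafka::RdkafkaError the helper prints the operation with a "failed"
# marker plus the broker error, then re-raises it wrapped in
# Karafka::Errors::CommandValidationError so the CLI can exit with an error status.
```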
@@ -15,13 +15,15 @@ module Karafka
  if existing_topics_names.include?(name)
  puts "#{yellow('Skipping')} because topic #{name} already exists."
  else
- puts "Creating topic #{name}..."
- Admin.create_topic(
- name,
- topic.declaratives.partitions,
- topic.declaratives.replication_factor,
- topic.declaratives.details
- )
+ supervised("Creating topic #{name}") do
+ Admin.create_topic(
+ name,
+ topic.declaratives.partitions,
+ topic.declaratives.replication_factor,
+ topic.declaratives.details
+ )
+ end
+
  puts "#{green('Created')} topic #{name}."
  any_created = true
  end
@@ -13,8 +13,10 @@ module Karafka
  name = topic.name

  if existing_topics_names.include?(name)
- puts "Deleting topic #{name}..."
- Admin.delete_topic(name)
+ supervised("Deleting topic #{name}") do
+ Admin.delete_topic(name)
+ end
+
  puts "#{green('Deleted')} topic #{name}."
  any_deleted = true
  else
@@ -0,0 +1,39 @@
+ # frozen_string_literal: true
+
+ module Karafka
+ class Cli
+ class Topics < Cli::Base
+ # Declarative topics CLI sub-help
+ class Help < Base
+ # Displays help information for all available topics management commands
+ def call
+ puts <<~HELP
+ Karafka topics commands:
+ align # Aligns configuration of all declarative topics based on definitions
+ create # Creates topics with appropriate settings
+ delete # Deletes all topics defined in the routes
+ help # Describes available topics management commands
+ migrate # Creates missing topics, repartitions existing and aligns configuration
+ plan # Plans migration process and prints changes to be applied
+ repartition # Adds additional partitions to topics with fewer partitions than expected
+ reset # Deletes and re-creates all topics
+
+ Options:
+ --detailed-exitcode # Provides detailed exit codes (0=no changes, 1=error, 2=changes applied)
+
+ Examples:
+ karafka topics create
+ karafka topics plan --detailed-exitcode
+ karafka topics migrate
+ karafka topics align
+
+ Note: All admin operations run on the default cluster only.
+ HELP
+
+ # We return false to indicate with exit code 0 that no changes were applied
+ false
+ end
+ end
+ end
+ end
+ end
@@ -21,8 +21,10 @@ module Karafka
  existing_count = existing_partitions.fetch(name, false)

  if existing_count && existing_count < desired_count
- puts "Increasing number of partitions to #{desired_count} on topic #{name}..."
- Admin.create_partitions(name, desired_count)
+ supervised("Increasing number of partitions to #{desired_count} on topic #{name}") do
+ Admin.create_partitions(name, desired_count)
+ end
+
  change = desired_count - existing_count
  puts "#{green('Created')} #{change} additional partitions on topic #{name}."
  any_repartitioned = true
@@ -27,10 +27,13 @@ module Karafka
  # crashes
  CHANGES_EXIT_CODE = 2

- private_constant :NO_CHANGES_EXIT_CODE, :CHANGES_EXIT_CODE
+ # Used when there was an error during execution.
+ ERROR_EXIT_CODE = 1
+
+ private_constant :NO_CHANGES_EXIT_CODE, :CHANGES_EXIT_CODE, :ERROR_EXIT_CODE

  # @param action [String] action we want to take
- def call(action = 'missing')
+ def call(action = 'help')
  detailed_exit_code = options.fetch(:detailed_exitcode, false)

  command = case action
@@ -48,8 +51,10 @@ module Karafka
  Topics::Align
  when 'plan'
  Topics::Plan
+ when 'help'
+ Topics::Help
  else
- raise ::ArgumentError, "Invalid topics action: #{action}"
+ raise Errors::UnrecognizedCommandError, "Unrecognized topics action: #{action}"
  end

  changes = command.new.call
@@ -57,6 +62,8 @@ module Karafka
  return unless detailed_exit_code

  changes ? exit(CHANGES_EXIT_CODE) : exit(NO_CHANGES_EXIT_CODE)
+ rescue Errors::CommandValidationError
+ exit(ERROR_EXIT_CODE)
  end
  end
  end
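Together with the `Topics::Help` command above, the new `ERROR_EXIT_CODE` means `--detailed-exitcode` runs can signal 0 (no changes), 2 (changes) or 1 (failure). A hypothetical deploy-time check, not shipped with the gem, that consumes these codes:

```ruby
# Hypothetical CI/deploy snippet consuming the detailed exit codes.
system('bundle exec karafka topics plan --detailed-exitcode')

case $?.exitstatus
when 0 then puts 'Topics already up to date'
when 2 then system('bundle exec karafka topics migrate') || abort('Migration failed')
else abort('Topic planning failed, see the error printed above')
end
```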
data/lib/karafka/cli.rb CHANGED
@@ -21,6 +21,8 @@ module Karafka
  args = action ? [action] : []

  command.new.call(*args)
+ elsif command_name.nil?
+ Help.new.call
  else
  raise(
  Karafka::Errors::UnrecognizedCommandError,
@@ -491,9 +491,17 @@ module Karafka
  #
  # This code adds around 0.01 ms per seek but saves from many user unexpected behaviours in
  # seeking and pausing
- return if message.offset == topic_partition_position(message.topic, message.partition)
+ position = topic_partition_position(message.topic, message.partition)

- kafka.seek(message)
+ # Always seek if current position cannot be fetched or is negative. Offset seek can also
+ # be negative (-1 or -2) and we should not compare it with the position because they are
+ # special (earliest or latest)
+ return kafka.seek(message) if position.negative?
+ # If offset is the same as the next position, we don't have to seek to get there, hence
+ # only in such case we can do nothing.
+ return kafka.seek(message) if message.offset != position
+
+ nil
  end

  # Commits the stored offsets in a sync way and closes the consumer.
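The reworked seek guard above reduces to a small decision rule. A condensed, illustrative restatement (the real logic is the client code in the hunk):

```ruby
# Condensed restatement of the seek decision from the hunk above.
# current_position is negative when it cannot be fetched; requested_offset may
# be one of the special markers (-1 latest, -2 earliest).
def seek_needed?(requested_offset, current_position)
  # Unknown or special (negative) current position: we must always seek
  return true if current_position.negative?

  # A negative requested offset never equals a real (>= 0) position, so this
  # inequality also forces a seek for earliest/latest requests; we only skip
  # when we already sit exactly at the requested offset
  requested_offset != current_position
end
```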
@@ -15,13 +15,13 @@ module Karafka
  # Skip verification if web is not used at all
  return unless require_version('karafka/web')

- # All good if version higher than 0.9.0.rc3 because we expect 0.9.0.rc3 or higher
- return if version(Karafka::Web::VERSION) >= version('0.9.0.rc3')
+ # All good if version higher than 0.10.0 because we expect 0.10.0 or higher
+ return if version(Karafka::Web::VERSION) >= version('0.10.0')

  # If older web-ui used, we cannot allow it
  raise(
  Errors::DependencyConstraintsError,
- 'karafka-web < 0.9.0 is not compatible with this karafka version'
+ 'karafka-web < 0.10.0 is not compatible with this karafka version'
  )
  end

@@ -22,7 +22,34 @@ module Karafka
  InvalidConfigurationError = Class.new(BaseError)

  # Raised when we try to use Karafka CLI commands (except install) without a boot file
- MissingBootFileError = Class.new(BaseError)
+ MissingBootFileError = Class.new(BaseError) do
+ # @param boot_file_path [Pathname] path where the boot file should be
+ def initialize(boot_file_path)
+ message = <<~MSG
+
+ \e[31mKarafka Boot File Missing:\e[0m #{boot_file_path}
+
+ Cannot find Karafka boot file - this file configures your Karafka application.
+
+ \e[33mQuick fixes:\e[0m
+ \e[32m1.\e[0m Navigate to your Karafka app directory
+ \e[32m2.\e[0m Check if following file exists: \e[36m#{boot_file_path}\e[0m
+ \e[32m3.\e[0m Install Karafka if needed: \e[36mkarafka install\e[0m
+
+ \e[33mCommon causes:\e[0m
+ \e[31m•\e[0m Wrong directory (not in Karafka app root)
+ \e[31m•\e[0m File was accidentally moved or deleted
+ \e[31m•\e[0m New project needing initialization
+
+ For setup help: \e[34mhttps://karafka.io/docs/Getting-Started\e[0m
+ MSG
+
+ super(message)
+ # In case of this error backtrace is irrelevant and we want to print comprehensive error
+ # message without backtrace, this is why nullified.
+ set_backtrace([])
+ end
+ end

  # Raised when we've waited enough for shutting down a non-responsive process
  ForcefulShutdownError = Class.new(BaseError)
@@ -65,7 +92,13 @@ module Karafka
  ResultNotVisibleError = Class.new(BaseError)

  # Raised when there is an attempt to run an unrecognized CLI command
- UnrecognizedCommandError = Class.new(BaseError)
+ UnrecognizedCommandError = Class.new(BaseError) do
+ # Overwritten not to print backtrace for unknown CLI command
+ def initialize(*args)
+ super
+ set_backtrace([])
+ end
+ end

  # Raised when you were executing a command and it could not finish successfully because of
  # a setup state or parameters configuration
@@ -8,6 +8,10 @@ module Karafka
  module ScheduledMessages
  # Consumer that coordinates scheduling of messages when the time comes
  class Consumer < ::Karafka::BaseConsumer
+ include Helpers::ConfigImporter.new(
+ dispatcher_class: %i[scheduled_messages dispatcher_class]
+ )
+
  # Prepares the initial state of all stateful components
  def initialized
  clear!
@@ -155,7 +159,7 @@ module Karafka
  @today = Day.new
  @tracker = Tracker.new
  @state = State.new(false)
- @dispatcher = config.dispatcher_class.new(topic.name, partition)
+ @dispatcher = dispatcher_class.new(topic.name, partition)
  @states_reporter = Helpers::IntervalRunner.new do
  @tracker.today = @daily_buffer.size
  @tracker.state = @state.to_s
@@ -165,11 +169,6 @@ module Karafka

  tags.add(:state, @state.to_s)
  end
-
- # @return [Karafka::Core::Configurable::Node] Schedules config node
- def config
- @config ||= Karafka::App.config.scheduled_messages
- end
  end
  end
  end
@@ -9,6 +9,15 @@ module Karafka

  private_constant :FORCEFUL_SHUTDOWN_WAIT

+ extend Helpers::ConfigImporter.new(
+ cli_contract: %i[internal cli contract],
+ activity_manager: %i[internal routing activity_manager],
+ supervision_sleep: %i[internal supervision_sleep],
+ shutdown_timeout: %i[shutdown_timeout],
+ forceful_exit_code: %i[internal forceful_exit_code],
+ process: %i[internal process]
+ )
+
  class << self
  # Set of consuming threads. Each consumer thread contains a single consumer
  attr_accessor :listeners
@@ -42,9 +51,7 @@ module Karafka
  # embedded
  # We cannot validate this during the start because config needs to be populated and routes
  # need to be defined.
- config.internal.cli.contract.validate!(
- config.internal.routing.activity_manager.to_h
- )
+ cli_contract.validate!(activity_manager.to_h)

  # We clear as we do not want parent handlers in case of working from fork
  process.clear
@@ -99,18 +106,18 @@ module Karafka

  Karafka::App.stop!

- timeout = config.shutdown_timeout
+ timeout = shutdown_timeout

  # We check from time to time (for the timeout period) if all the threads finished
  # their work and if so, we can just return and normal shutdown process will take place
  # We divide it by 1000 because we use time in ms.
- ((timeout / 1_000) * (1 / config.internal.supervision_sleep)).to_i.times do
+ ((timeout / 1_000) * (1 / supervision_sleep)).to_i.times do
  all_listeners_stopped = listeners.all?(&:stopped?)
  all_workers_stopped = workers.none?(&:alive?)

  return if all_listeners_stopped && all_workers_stopped

- sleep(config.internal.supervision_sleep)
+ sleep(supervision_sleep)
  end

  raise Errors::ForcefulShutdownError
@@ -148,7 +155,7 @@ module Karafka
  return unless process.supervised?

  # exit! is not within the instrumentation as it would not trigger due to exit
- Kernel.exit!(config.internal.forceful_exit_code)
+ Kernel.exit!(forceful_exit_code)
  ensure
  # We need to check if it wasn't an early exit to make sure that only on stop invocation
  # can change the status after everything is closed
@@ -172,18 +179,6 @@ module Karafka
  # in one direction
  Karafka::App.quiet!
  end
-
- private
-
- # @return [Karafka::Core::Configurable::Node] root config node
- def config
- Karafka::App.config
- end
-
- # @return [Karafka::Process] process wrapper instance used to catch system signal calls
- def process
- config.internal.process
- end
  end

  # Always start with standalone so there always is a value for the execution mode.
@@ -3,5 +3,5 @@
  # Main module namespace
  module Karafka
  # Current Karafka version
- VERSION = '2.5.0.beta1'
+ VERSION = '2.5.0.beta2'
  end
metadata CHANGED
@@ -1,39 +1,12 @@
  --- !ruby/object:Gem::Specification
  name: karafka
  version: !ruby/object:Gem::Version
- version: 2.5.0.beta1
+ version: 2.5.0.beta2
  platform: ruby
  authors:
  - Maciej Mensfeld
  bindir: bin
- cert_chain:
- - |
- -----BEGIN CERTIFICATE-----
- MIIEcDCCAtigAwIBAgIBATANBgkqhkiG9w0BAQsFADA/MRAwDgYDVQQDDAdjb250
- YWN0MRcwFQYKCZImiZPyLGQBGRYHa2FyYWZrYTESMBAGCgmSJomT8ixkARkWAmlv
- MB4XDTI0MDgyMzEwMTkyMFoXDTQ5MDgxNzEwMTkyMFowPzEQMA4GA1UEAwwHY29u
- dGFjdDEXMBUGCgmSJomT8ixkARkWB2thcmFma2ExEjAQBgoJkiaJk/IsZAEZFgJp
- bzCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKjLhLjQqUlNayxkXnO+
- PsmCDs/KFIzhrsYMfLZRZNaWmzV3ujljMOdDjd4snM2X06C41iVdQPWjpe3j8vVe
- ZXEWR/twSbOP6Eeg8WVH2wCOo0x5i7yhVn4UBLH4JpfEMCbemVcWQ9ry9OMg4WpH
- Uu4dRwxFV7hzCz3p0QfNLRI4miAxnGWcnlD98IJRjBAksTuR1Llj0vbOrDGsL9ZT
- JeXP2gdRLd8SqzAFJEWrbeTBCBU7gfSh3oMg5SVDLjaqf7Kz5wC/8bDZydzanOxB
- T6CDXPsCnllmvTNx2ei2T5rGYJOzJeNTmJLLK6hJWUlAvaQSvCwZRvFJ0tVGLEoS
- flqSr6uGyyl1eMUsNmsH4BqPEYcAV6P2PKTv2vUR8AP0raDvZ3xL1TKvfRb8xRpo
- vPopCGlY5XBWEc6QERHfVLTIVsjnls2/Ujj4h8/TSfqqYnaHKefIMLbuD/tquMjD
- iWQsW2qStBV0T+U7FijKxVfrfqZP7GxQmDAc9o1iiyAa3QIDAQABo3cwdTAJBgNV
- HRMEAjAAMAsGA1UdDwQEAwIEsDAdBgNVHQ4EFgQU3O4dTXmvE7YpAkszGzR9DdL9
- sbEwHQYDVR0RBBYwFIESY29udGFjdEBrYXJhZmthLmlvMB0GA1UdEgQWMBSBEmNv
- bnRhY3RAa2FyYWZrYS5pbzANBgkqhkiG9w0BAQsFAAOCAYEAVKTfoLXn7mqdSxIR
- eqxcR6Huudg1jes81s1+X0uiRTR3hxxKZ3Y82cPsee9zYWyBrN8TA4KA0WILTru7
- Ygxvzha0SRPsSiaKLmgOJ+61ebI4+bOORzIJLpD6GxCxu1r7MI4+0r1u1xe0EWi8
- agkVo1k4Vi8cKMLm6Gl9b3wG9zQBw6fcgKwmpjKiNnOLP+OytzUANrIUJjoq6oal
- TC+f/Uc0TLaRqUaW/bejxzDWWHoM3SU6aoLPuerglzp9zZVzihXwx3jPLUVKDFpF
- Rl2lcBDxlpYGueGo0/oNzGJAAy6js8jhtHC9+19PD53vk7wHtFTZ/0ugDQYnwQ+x
- oml2fAAuVWpTBCgOVFe6XCQpMKopzoxQ1PjKztW2KYxgJdIBX87SnL3aWuBQmhRd
- i9zWxov0mr44TWegTVeypcWGd/0nxu1+QHVNHJrpqlPBRvwQsUm7fwmRInGpcaB8
- ap8wNYvryYzrzvzUxIVFBVM5PacgkFqRmolCa8I7tdKQN+R1
- -----END CERTIFICATE-----
+ cert_chain: []
  date: 1980-01-02 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
@@ -138,6 +111,7 @@ files:
  - ".github/ISSUE_TEMPLATE/bug_report.md"
  - ".github/ISSUE_TEMPLATE/feature_request.md"
  - ".github/workflows/ci.yml"
+ - ".github/workflows/push.yml"
  - ".github/workflows/verify-action-pins.yml"
  - ".gitignore"
  - ".rspec"
@@ -152,6 +126,7 @@ files:
  - LICENSE-COMM
  - LICENSE-LGPL
  - README.md
+ - Rakefile
  - SECURITY.md
  - bin/benchmarks
  - bin/clean_kafka
@@ -167,7 +142,6 @@ files:
  - bin/verify_license_integrity
  - bin/verify_topics_naming
  - bin/wait_for_kafka
- - certs/cert.pem
  - certs/karafka-pro.pem
  - config/locales/errors.yml
  - config/locales/pro_errors.yml
@@ -207,6 +181,7 @@ files:
  - lib/karafka/cli/topics/base.rb
  - lib/karafka/cli/topics/create.rb
  - lib/karafka/cli/topics/delete.rb
+ - lib/karafka/cli/topics/help.rb
  - lib/karafka/cli/topics/migrate.rb
  - lib/karafka/cli/topics/plan.rb
  - lib/karafka/cli/topics/repartition.rb
@@ -644,7 +619,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.6.9
+ rubygems_version: 3.6.7
  specification_version: 4
  summary: Karafka is Ruby and Rails efficient Kafka processing framework.
  test_files: []
checksums.yaml.gz.sig DELETED
Binary file
data/certs/cert.pem DELETED
@@ -1,26 +0,0 @@
- -----BEGIN CERTIFICATE-----
- MIIEcDCCAtigAwIBAgIBATANBgkqhkiG9w0BAQsFADA/MRAwDgYDVQQDDAdjb250
- YWN0MRcwFQYKCZImiZPyLGQBGRYHa2FyYWZrYTESMBAGCgmSJomT8ixkARkWAmlv
- MB4XDTI0MDgyMzEwMTkyMFoXDTQ5MDgxNzEwMTkyMFowPzEQMA4GA1UEAwwHY29u
- dGFjdDEXMBUGCgmSJomT8ixkARkWB2thcmFma2ExEjAQBgoJkiaJk/IsZAEZFgJp
- bzCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKjLhLjQqUlNayxkXnO+
- PsmCDs/KFIzhrsYMfLZRZNaWmzV3ujljMOdDjd4snM2X06C41iVdQPWjpe3j8vVe
- ZXEWR/twSbOP6Eeg8WVH2wCOo0x5i7yhVn4UBLH4JpfEMCbemVcWQ9ry9OMg4WpH
- Uu4dRwxFV7hzCz3p0QfNLRI4miAxnGWcnlD98IJRjBAksTuR1Llj0vbOrDGsL9ZT
- JeXP2gdRLd8SqzAFJEWrbeTBCBU7gfSh3oMg5SVDLjaqf7Kz5wC/8bDZydzanOxB
- T6CDXPsCnllmvTNx2ei2T5rGYJOzJeNTmJLLK6hJWUlAvaQSvCwZRvFJ0tVGLEoS
- flqSr6uGyyl1eMUsNmsH4BqPEYcAV6P2PKTv2vUR8AP0raDvZ3xL1TKvfRb8xRpo
- vPopCGlY5XBWEc6QERHfVLTIVsjnls2/Ujj4h8/TSfqqYnaHKefIMLbuD/tquMjD
- iWQsW2qStBV0T+U7FijKxVfrfqZP7GxQmDAc9o1iiyAa3QIDAQABo3cwdTAJBgNV
- HRMEAjAAMAsGA1UdDwQEAwIEsDAdBgNVHQ4EFgQU3O4dTXmvE7YpAkszGzR9DdL9
- sbEwHQYDVR0RBBYwFIESY29udGFjdEBrYXJhZmthLmlvMB0GA1UdEgQWMBSBEmNv
- bnRhY3RAa2FyYWZrYS5pbzANBgkqhkiG9w0BAQsFAAOCAYEAVKTfoLXn7mqdSxIR
- eqxcR6Huudg1jes81s1+X0uiRTR3hxxKZ3Y82cPsee9zYWyBrN8TA4KA0WILTru7
- Ygxvzha0SRPsSiaKLmgOJ+61ebI4+bOORzIJLpD6GxCxu1r7MI4+0r1u1xe0EWi8
- agkVo1k4Vi8cKMLm6Gl9b3wG9zQBw6fcgKwmpjKiNnOLP+OytzUANrIUJjoq6oal
- TC+f/Uc0TLaRqUaW/bejxzDWWHoM3SU6aoLPuerglzp9zZVzihXwx3jPLUVKDFpF
- Rl2lcBDxlpYGueGo0/oNzGJAAy6js8jhtHC9+19PD53vk7wHtFTZ/0ugDQYnwQ+x
- oml2fAAuVWpTBCgOVFe6XCQpMKopzoxQ1PjKztW2KYxgJdIBX87SnL3aWuBQmhRd
- i9zWxov0mr44TWegTVeypcWGd/0nxu1+QHVNHJrpqlPBRvwQsUm7fwmRInGpcaB8
- ap8wNYvryYzrzvzUxIVFBVM5PacgkFqRmolCa8I7tdKQN+R1
- -----END CERTIFICATE-----
data.tar.gz.sig DELETED
Binary file
metadata.gz.sig DELETED
Binary file