karafka 1.3.0 → 1.3.6

This diff shows the changes between publicly released versions of the package, as they appear in their respective public registries. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: ffe10d1ca48b0b218191231ec6969f3a767f5a1a4010dca0ef00fa44c2eaab67
- data.tar.gz: 27d1b52ba3782b562176b7a6de73563bdd48c250bc0576a1821c55bfcec07bff
+ metadata.gz: b4de7fea7ee892f9c83db91dea9e3d16d48d423985477d3f40440bea77907906
+ data.tar.gz: ccfe4744b3bb2c19b31032e8f904716116faa50f96a9be19a4e3cdc79364ebe6
  SHA512:
- metadata.gz: 85bdedfe0791c7d17abbc54ed49df83e2c9cbfb048124107d403559e72b2884909bcc5974c98a738744f80184315b2ee31cc8bd1048cc5a10230e6b6f0184e3d
- data.tar.gz: 382074bee041ed27776571816a982fbd7f0af6b6f9cfa8fa393113acf1c2d4cbfb0a7573d2bcd16f2662667d1b8175546e281deb5a7b66ce30b5d9c4c19d005d
+ metadata.gz: 74dbf97aec22f4e0f5f9a07ac9b0457c09846332c65f548d30e2b0342a2adec4b8542004aa1937012892f3374b39f83f557ef5adecacc40ae538290842165150
+ data.tar.gz: fa2072670fe7c17720e593373a4d8222163f4e789c68ec72e4e1f58f4a08688839a0c01e98115db4dfbc5c57d633b0251fe481fc9838362203961c1e4e236159
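Digests like the ones above are plain SHA256/SHA512 hex digests of the archives inside the `.gem` file. A minimal Ruby sketch (the payload here is a stand-in, not the real `metadata.gz`/`data.tar.gz` bytes):

```ruby
require 'digest'

# Stand-in payload; in reality you would hash the metadata.gz and
# data.tar.gz archives extracted from the downloaded .gem file.
payload = 'example archive bytes'

sha256 = Digest::SHA256.hexdigest(payload) # 64 hex characters
sha512 = Digest::SHA512.hexdigest(payload) # 128 hex characters
```

Comparing these digests against `checksums.yaml` is how RubyGems detects a tampered archive.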
data.tar.gz.sig CHANGED
Binary file
@@ -1 +1 @@
- 2.6.3
+ 2.7.1
@@ -12,25 +12,25 @@ test: &test
  stage: Test
  language: ruby
  before_install:
- - gem install bundler
- - gem update --system
+ - yes | gem update --system
  script: bundle exec rspec

  jobs:
  include:
  - <<: *test
- rvm: 2.6.3
+ rvm: 2.7.1
  - <<: *test
- rvm: 2.5.5
+ rvm: 2.6.6
+ - <<: *test
+ rvm: 2.5.8

  - stage: coditsu
  language: ruby
- rvm: 2.6.3
+ rvm: 2.7.1
  before_install:
- - gem update --system
- - gem install bundler
+ - yes | gem update --system
  script: \curl -sSL https://api.coditsu.io/run/ci | bash

  stages:
- - coditsu
  - test
+ - coditsu
@@ -1,5 +1,29 @@
  # Karafka framework changelog

+ ## 1.3.6 (2020-04-24)
+ - #583 - Use Karafka.logger for CLI messages (prikha)
+ - #582 - Cannot only define seed brokers in consumer groups
+
+ ## 1.3.5 (2020-04-02)
+ - #578 - ThreadError: can't be called from trap context patch
+
+ ## 1.3.4 (2020-02-17)
+ - `dry-configurable` upgrade (solnic)
+ - Remove temporary `thor` patches that are no longer needed
+
+ ## 1.3.3 (2019-12-23)
+ - Require `delegate` to fix missing dependency in `ruby-kafka`
+
+ ## 1.3.2 (2019-12-23)
+ - #561 - Allow `thor` 1.0.x usage in Karafka
+ - #567 - Ruby 2.7.0 support + unfreeze of a frozen string fix
+
+ ## 1.3.1 (2019-11-11)
+ - #545 - Makes sure the log directory exists when is possible (robertomiranda)
+ - Ruby 2.6.5 support
+ - #551 - add support for DSA keys
+ - #549 - Missing directories after `karafka install` (nijikon)
+
  ## 1.3.0 (2019-09-09)
  - Drop support for Ruby 2.4
  - YARD docs tags cleanup
@@ -92,7 +116,7 @@
  ## 1.2.3
  - #313 - support PLAINTEXT and SSL for scheme
  - #288 - drop activesupport callbacks in favor of notifications
- - #320 - Pausing indefinetely with nil pause timeout doesn't work
+ - #320 - Pausing indefinitely with nil pause timeout doesn't work
  - #318 - Partition pausing doesn't work with custom topic mappers
  - Rename ConfigAdapter to ApiAdapter to better reflect what it does
  - #317 - Manual offset committing doesn't work with custom topic mappers
@@ -221,7 +245,7 @@
  - Switch to multi json so everyone can use their favourite JSON parser
  - Added jruby support in general and in Travis
  - #196 - Topic mapper does not map topics when subscribing thanks to @webandtech
- - #96 - Karafka server - possiblity to run it only for a certain topics
+ - #96 - Karafka server - possibility to run it only for a certain topics
  - ~~karafka worker cli option is removed (please use sidekiq directly)~~ - restored, bad idea
  - (optional) pausing upon processing failures ```pause_timeout```
  - Karafka console main process no longer intercepts irb errors
@@ -229,7 +253,7 @@
  - #204 - Long running controllers
  - Better internal API to handle multiple usage cases using ```Karafka::Controllers::Includer```
  - #207 - Rename before_enqueued to after_received
- - #147 - Deattach Karafka from Sidekiq by extracting Sidekiq backend
+ - #147 - De-attach Karafka from Sidekiq by extracting Sidekiq backend

  ### New features and improvements

@@ -314,7 +338,7 @@
  - Waterdrop 0.3.2.1 with kafka.hosts instead of kafka_hosts
  - #105 - Karafka::Monitor#caller_label not working with inherited monitors
  - #99 - Standalone mode (without Sidekiq)
- - #97 - Buffer responders single topics before send (prevalidation)
+ - #97 - Buffer responders single topics before send (pre-validation)
  - Better control over consumer thanks to additional config options
  - #111 - Dynamic worker assignment based on the income params
  - Long shutdown time fix
@@ -322,7 +346,7 @@
  ## 0.5.0
  - Removed Zookeeper totally as dependency
  - Better group and partition rebalancing
- - Automatic thread management (no need for tunning) - each topic is a separate actor/thread
+ - Automatic thread management (no need for tuning) - each topic is a separate actor/thread
  - Moved from Poseidon into Ruby-Kafka
  - No more max_concurrency setting
  - After you define your App class and routes (and everything else) you need to add execute App.boot!
@@ -338,14 +362,14 @@
  - Ruby 2.2.* support dropped
  - Using App name as a Kafka client_id
  - Automatic Capistrano integration
- - Responders support for handling better responses pipelining and better responses flow description and design (see README for more details)
+ - Responders support for handling better responses pipe-lining and better responses flow description and design (see README for more details)
  - Gem bump
  - Readme updates
  - karafka flow CLI command for printing the application flow
- - Some internal refactorings
+ - Some internal refactoring

  ## 0.4.2
- - #87 - Reconsume mode with crone for better Rails/Rack integration
+ - #87 - Re-consume mode with crone for better Rails/Rack integration
  - Moved Karafka server related stuff into separate Karafka::Server class
  - Renamed Karafka::Runner into Karafka::Fetcher
  - Gem bump
@@ -357,7 +381,7 @@

  ## 0.4.1
  - Explicit throw(:abort) required to halt before_enqueue (like in Rails 5)
- - #61 - Autodiscover Kafka brokers based on Zookeeper data
+ - #61 - autodiscovery of Kafka brokers based on Zookeeper data
  - #63 - Graceful shutdown with current offset state during data processing
  - #65 - Example of NewRelic monitor is outdated
  - #71 - Setup should be executed after user code is loaded
@@ -413,7 +437,7 @@
  - Added Karafka::Monitoring that allows to add custom logging and monitoring with external libraries and systems
  - Moved logging functionality into Karafka::Monitoring default monitoring
  - Added possibility to provide own monitoring as long as in responds to #notice and #notice_error
- - Standarized logging format for all logs
+ - Standardized logging format for all logs

  ## 0.3.0
  - Switched from custom ParserError for each parser to general catching of Karafka::Errors::ParseError and its descendants
@@ -430,7 +454,7 @@

  ## 0.1.19
  - Internal call - schedule naming change
- - Enqueue to perform_async naming in controller to follow Sidekiqs naming convention
+ - Enqueue to perform_async naming in controller to follow Sidekiq naming convention
  - Gem bump

  ## 0.1.18
@@ -441,7 +465,7 @@
  - Changed Karafka::Connection::Cluster tp Karafka::Connection::ActorCluster to distinguish between a single thread actor cluster for multiple topic connection and a future feature that will allow process clusterization.
  - Add an ability to use user-defined parsers for a messages
  - Lazy load params for before callbacks
- - Automatic loading/initializng all workers classes during startup (so Sidekiq won't fail with unknown workers exception)
+ - Automatic loading/initializing all workers classes during startup (so Sidekiq won't fail with unknown workers exception)
  - Params are now private to controller
  - Added bootstrap method to app.rb

@@ -482,7 +506,7 @@
  - Added worker logger

  ## 0.1.8
- - Droped local env suppot in favour of [Envlogic](https://github.com/karafka/envlogic) - no changes in API
+ - Dropped local env support in favour of [Envlogic](https://github.com/karafka/envlogic) - no changes in API

  ## 0.1.7
  - Karafka option for Redis hosts (not localhost only)
@@ -512,7 +536,7 @@

  ## 0.1.1
  - README updates
- - Raketasks updates
+ - Rake tasks updates
  - Rake installation task
  - Changelog file added

@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- karafka (1.3.0)
+ karafka (1.3.6)
  dry-configurable (~> 0.8)
  dry-inflector (~> 0.1)
  dry-monitor (~> 0.3)
@@ -11,117 +11,120 @@ PATH
  multi_json (>= 1.12)
  rake (>= 11.3)
  ruby-kafka (>= 0.7.8)
- thor (~> 0.20)
+ thor (>= 0.20)
  waterdrop (~> 1.3.0)
  zeitwerk (~> 2.1)

  GEM
  remote: https://rubygems.org/
  specs:
- activesupport (6.0.0)
+ activesupport (6.0.2.2)
  concurrent-ruby (~> 1.0, >= 1.0.2)
  i18n (>= 0.7, < 2)
  minitest (~> 5.1)
  tzinfo (~> 1.1)
- zeitwerk (~> 2.1, >= 2.1.8)
- byebug (11.0.1)
- concurrent-ruby (1.1.5)
- delivery_boy (0.2.8)
+ zeitwerk (~> 2.2)
+ byebug (11.1.3)
+ concurrent-ruby (1.1.6)
+ delivery_boy (1.0.1)
  king_konf (~> 0.3)
- ruby-kafka (~> 0.7.8)
+ ruby-kafka (~> 1.0)
  diff-lcs (1.3)
- digest-crc (0.4.1)
+ digest-crc (0.5.1)
  docile (1.3.2)
- dry-configurable (0.8.3)
+ dry-configurable (0.11.5)
  concurrent-ruby (~> 1.0)
  dry-core (~> 0.4, >= 0.4.7)
+ dry-equalizer (~> 0.2)
  dry-container (0.7.2)
  concurrent-ruby (~> 1.0)
  dry-configurable (~> 0.1, >= 0.1.3)
  dry-core (0.4.9)
  concurrent-ruby (~> 1.0)
- dry-equalizer (0.2.2)
+ dry-equalizer (0.3.0)
  dry-events (0.2.0)
  concurrent-ruby (~> 1.0)
  dry-core (~> 0.4)
  dry-equalizer (~> 0.2)
- dry-inflector (0.1.2)
- dry-initializer (3.0.1)
- dry-logic (1.0.3)
+ dry-inflector (0.2.0)
+ dry-initializer (3.0.3)
+ dry-logic (1.0.6)
  concurrent-ruby (~> 1.0)
  dry-core (~> 0.2)
  dry-equalizer (~> 0.2)
- dry-monitor (0.3.1)
+ dry-monitor (0.3.2)
  dry-configurable (~> 0.5)
  dry-core (~> 0.4)
  dry-equalizer (~> 0.2)
- dry-events (~> 0.1)
- dry-schema (1.3.3)
+ dry-events (~> 0.2)
+ dry-schema (1.5.0)
  concurrent-ruby (~> 1.0)
  dry-configurable (~> 0.8, >= 0.8.3)
  dry-core (~> 0.4)
  dry-equalizer (~> 0.2)
  dry-initializer (~> 3.0)
  dry-logic (~> 1.0)
- dry-types (~> 1.0)
- dry-types (1.1.1)
+ dry-types (~> 1.4)
+ dry-types (1.4.0)
  concurrent-ruby (~> 1.0)
  dry-container (~> 0.3)
  dry-core (~> 0.4, >= 0.4.4)
- dry-equalizer (~> 0.2, >= 0.2.2)
+ dry-equalizer (~> 0.3)
  dry-inflector (~> 0.1, >= 0.1.2)
  dry-logic (~> 1.0, >= 1.0.2)
- dry-validation (1.3.1)
+ dry-validation (1.5.0)
  concurrent-ruby (~> 1.0)
  dry-container (~> 0.7, >= 0.7.1)
  dry-core (~> 0.4)
  dry-equalizer (~> 0.2)
  dry-initializer (~> 3.0)
- dry-schema (~> 1.0, >= 1.3.1)
- envlogic (1.1.0)
+ dry-schema (~> 1.5)
+ envlogic (1.1.2)
  dry-inflector (~> 0.1)
- factory_bot (5.0.2)
+ factory_bot (5.1.2)
  activesupport (>= 4.2.0)
- i18n (1.6.0)
+ i18n (1.8.2)
  concurrent-ruby (~> 1.0)
- irb (1.0.0)
- json (2.2.0)
+ io-console (0.5.6)
+ irb (1.2.3)
+ reline (>= 0.0.1)
  king_konf (0.3.7)
- minitest (5.11.3)
- multi_json (1.13.1)
- rake (12.3.3)
- rspec (3.8.0)
- rspec-core (~> 3.8.0)
- rspec-expectations (~> 3.8.0)
- rspec-mocks (~> 3.8.0)
- rspec-core (3.8.2)
- rspec-support (~> 3.8.0)
- rspec-expectations (3.8.4)
+ minitest (5.14.0)
+ multi_json (1.14.1)
+ rake (13.0.1)
+ reline (0.1.3)
+ io-console (~> 0.5)
+ rspec (3.9.0)
+ rspec-core (~> 3.9.0)
+ rspec-expectations (~> 3.9.0)
+ rspec-mocks (~> 3.9.0)
+ rspec-core (3.9.1)
+ rspec-support (~> 3.9.1)
+ rspec-expectations (3.9.1)
  diff-lcs (>= 1.2.0, < 2.0)
- rspec-support (~> 3.8.0)
- rspec-mocks (3.8.1)
+ rspec-support (~> 3.9.0)
+ rspec-mocks (3.9.1)
  diff-lcs (>= 1.2.0, < 2.0)
- rspec-support (~> 3.8.0)
- rspec-support (3.8.2)
- ruby-kafka (0.7.10)
+ rspec-support (~> 3.9.0)
+ rspec-support (3.9.2)
+ ruby-kafka (1.0.0)
  digest-crc
- simplecov (0.17.0)
+ simplecov (0.18.5)
  docile (~> 1.1)
- json (>= 1.8, < 3)
- simplecov-html (~> 0.10.0)
- simplecov-html (0.10.2)
- thor (0.20.3)
+ simplecov-html (~> 0.11)
+ simplecov-html (0.12.2)
+ thor (1.0.1)
  thread_safe (0.3.6)
- tzinfo (1.2.5)
+ tzinfo (1.2.7)
  thread_safe (~> 0.1)
- waterdrop (1.3.0)
- delivery_boy (~> 0.2)
+ waterdrop (1.3.4)
+ delivery_boy (>= 0.2, < 2.x)
  dry-configurable (~> 0.8)
  dry-monitor (~> 0.3)
  dry-validation (~> 1.2)
  ruby-kafka (>= 0.7.8)
  zeitwerk (~> 2.1)
- zeitwerk (2.1.10)
+ zeitwerk (2.3.0)

  PLATFORMS
  ruby
@@ -134,4 +137,4 @@ DEPENDENCIES
  simplecov

  BUNDLED WITH
- 2.0.2
+ 2.1.4
data/README.md CHANGED
@@ -2,11 +2,11 @@

  [![Build Status](https://travis-ci.org/karafka/karafka.svg?branch=master)](https://travis-ci.org/karafka/karafka)

- ## New release in progress!
+ **Note**: Documentation presented here refers to Karafka `1.3.x`.

- **Note**: Documentation presented here refers to Karafka `1.3.0`.
+ If you're upgrading from `1.2.0`, please refer to our [Upgrade Notes article](https://mensfeld.pl/2019/09/karafka-framework-1-3-0-release-notes-ruby-kafka/).

- If you are looking for the documentation for Karafka `1.2.*`, it can be found [here](https://github.com/karafka/wiki/tree/1.2).
+ If you are looking for the documentation for Karafka `1.2.x`, it can be found [here](https://github.com/karafka/wiki/tree/1.2).

  ## About Karafka

@@ -26,11 +26,11 @@ Gem::Specification.new do |spec|
  spec.add_dependency 'multi_json', '>= 1.12'
  spec.add_dependency 'rake', '>= 11.3'
  spec.add_dependency 'ruby-kafka', '>= 0.7.8'
- spec.add_dependency 'thor', '~> 0.20'
+ spec.add_dependency 'thor', '>= 0.20'
  spec.add_dependency 'waterdrop', '~> 1.3.0'
  spec.add_dependency 'zeitwerk', '~> 2.1'

- spec.required_ruby_version = '>= 2.4.0'
+ spec.required_ruby_version = '>= 2.5.0'

  if $PROGRAM_NAME.end_with?('gem')
  spec.signing_key = File.expand_path('~/.ssh/gem-private_key.pem')
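The loosened thor constraint is what unlocks thor 1.0.x (changelog entry #561): the pessimistic operator `~> 0.20` implies an upper bound below 1.0. This can be verified with RubyGems' own requirement classes:

```ruby
require 'rubygems'

old_req  = Gem::Requirement.new('~> 0.20') # means >= 0.20 and < 1.0
new_req  = Gem::Requirement.new('>= 0.20') # no upper bound
thor_one = Gem::Version.new('1.0.1')       # the version pinned in Gemfile.lock

old_req.satisfied_by?(thor_one) # => false
new_req.satisfied_by?(thor_one) # => true
```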
@@ -1,6 +1,7 @@
  # frozen_string_literal: true

  %w[
+ delegate
  English
  waterdrop
  kafka
@@ -52,14 +52,8 @@ module Karafka
  ignored_settings = api_adapter[:subscribe]
  defined_settings = api_adapter.values.flatten
  karafka_settings = %i[batch_fetching]
- # This is a dirty and bad hack of dry-configurable to get keys before setting values
- dynamically_proxied = Karafka::Setup::Config
- ._settings
- .settings
- .find { |s| s.name == :kafka }
- .value
- .names
- .to_a
+
+ dynamically_proxied = Karafka::Setup::Config.config.kafka.to_h.keys

  (defined_settings + dynamically_proxied).uniq + karafka_settings - ignored_settings
  end
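The set arithmetic in the final line of that method can be illustrated standalone. The setting names below are hypothetical stand-ins; in the real code the dynamically proxied keys come from `Config.config.kafka.to_h.keys`:

```ruby
# Hypothetical adapter mapping: which kafka settings each API call consumes
api_adapter = {
  subscribe: %i[start_from_beginning max_bytes_per_partition],
  consumer:  %i[session_timeout]
}

ignored_settings    = api_adapter[:subscribe]
defined_settings    = api_adapter.values.flatten
karafka_settings    = %i[batch_fetching]
dynamically_proxied = %i[seed_brokers session_timeout] # stand-in for the kafka config keys

# Union of adapter-defined and proxied settings, plus Karafka's own,
# minus the ones handled exclusively by the subscribe call
allowed = (defined_settings + dynamically_proxied).uniq + karafka_settings - ignored_settings
```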
@@ -47,7 +47,7 @@ end
  if ENV['KARAFKA_CONSOLE']
  # Reloads Karafka irb console session
  def reload!
- puts "Reloading...\n"
+ Karafka.logger.info "Reloading...\n"
  Kernel.exec Karafka::Cli::Console.command
  end
  end
@@ -11,19 +11,22 @@ module Karafka
  def call
  topics.each do |topic|
  any_topics = !topic.responder&.topics.nil?
+ log_messages = []

  if any_topics
- puts "#{topic.name} =>"
+ log_messages << "#{topic.name} =>"

  topic.responder.topics.each_value do |responder_topic|
  features = []
  features << (responder_topic.required? ? 'always' : 'conditionally')

- print responder_topic.name, "(#{features.join(', ')})"
+ log_messages << format(responder_topic.name, "(#{features.join(', ')})")
  end
  else
- puts "#{topic.name} => (nothing)"
+ log_messages << "#{topic.name} => (nothing)"
  end
+
+ Karafka.logger.info(log_messages.join("\n"))
  end
  end

@@ -34,11 +37,11 @@ module Karafka
  Karafka::App.consumer_groups.map(&:topics).flatten.sort_by(&:name)
  end

- # Prints a given value with label in a nice way
+ # Formats a given value with label in a nice way
  # @param label [String] label describing value
  # @param value [String] value that should be printed
- def print(label, value)
- printf "%-25s %s\n", " - #{label}:", value
+ def format(label, value)
+ " - #{label}: #{value}"
  end
  end
  end
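The `karafka flow` change above (part of #583) swaps direct `puts`/`printf` calls for a collect-then-log pattern, so each topic's whole tree lands in a single log entry. A standalone sketch with hypothetical topic names:

```ruby
require 'logger'
require 'stringio'

buffer = StringIO.new
logger = Logger.new(buffer) # stand-in for Karafka.logger

# Collect all lines for one topic first...
log_messages = []
log_messages << 'videos_topic =>'            # hypothetical topic name
log_messages << ' - uploads_topic: (always)' # formatted like the new #format helper

# ...then emit them as a single, atomic log entry
logger.info(log_messages.join("\n"))
```

Routing through a logger also means the output respects whatever log destination and level the application configured, instead of always writing to stdout.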
@@ -24,7 +24,7 @@ module Karafka
  "Kafka seed brokers: #{config.kafka.seed_brokers}"
  ]

- puts(info.join("\n"))
+ Karafka.logger.info(info.join("\n"))
  end
  end
  end
@@ -13,7 +13,9 @@ module Karafka
  INSTALL_DIRS = %w[
  app/consumers
  app/responders
+ app/workers
  config
+ lib
  log
  tmp/pids
  ].freeze
@@ -14,11 +14,12 @@ module Karafka
  module ApiAdapter
  class << self
  # Builds all the configuration settings for Kafka.new method
+ # @param consumer_group [Karafka::Routing::ConsumerGroup] consumer group details
  # @return [Array<Hash>] Array with all the client arguments including hash with all
  # the settings required by Kafka.new method
  # @note We return array, so we can inject any arguments we want, in case of changes in the
  # raw driver
- def client
+ def client(consumer_group)
  # This one is a default that takes all the settings except special
  # cases defined in the map
  settings = {
@@ -26,14 +27,17 @@ module Karafka
  client_id: ::Karafka::App.config.client_id
  }

- kafka_configs.each do |setting_name, setting_value|
+ kafka_configs.each_key do |setting_name|
  # All options for config adapter should be ignored as we're just interested
  # in what is left, as we want to pass all the options that are "typical"
  # and not listed in the api_adapter special cases mapping. All the values
  # from the api_adapter mapping go somewhere else, not to the client directly
  next if AttributesMap.api_adapter.values.flatten.include?(setting_name)

- settings[setting_name] = setting_value
+ # Settings for each consumer group are either defined per consumer group or are
+ # inherited from the global/general settings level, thus we don't have to fetch them
+ # from the kafka settings as they are already on a consumer group level
+ settings[setting_name] = consumer_group.public_send(setting_name)
  end

  settings_hash = sanitize(settings)
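This is the core of the #582 fix: client settings are now read off the consumer group, which already inherits defaults from the global config, so per-group seed brokers work. A sketch with hypothetical stand-ins for the real classes and setting names:

```ruby
# Settings the adapter mapping claims for other API calls (assumed names)
ADAPTER_CLAIMED = %i[start_from_beginning].freeze

# Stand-in for Karafka::Routing::ConsumerGroup
ConsumerGroup = Struct.new(:seed_brokers, :socket_timeout, :start_from_beginning,
                           keyword_init: true)

def client_settings(consumer_group, kafka_config_keys)
  settings = { client_id: 'example_app' }
  kafka_config_keys.each do |setting_name|
    next if ADAPTER_CLAIMED.include?(setting_name)

    # Read the value from the consumer group, not the global kafka config
    settings[setting_name] = consumer_group.public_send(setting_name)
  end
  settings
end

group = ConsumerGroup.new(
  seed_brokers: ['kafka://localhost:9092'],
  socket_timeout: 30,
  start_from_beginning: true
)

settings = client_settings(group, %i[seed_brokers socket_timeout start_from_beginning])
```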
@@ -6,9 +6,11 @@ module Karafka
  module Builder
  class << self
  # Builds a Kafka::Client instance that we use to work with Kafka cluster
+ # @param consumer_group [Karafka::Routing::ConsumerGroup] consumer group for which we want
+ # to have a new Kafka client
  # @return [::Kafka::Client] returns a Kafka client
- def call
- Kafka.new(*ApiAdapter.client)
+ def call(consumer_group)
+ Kafka.new(*ApiAdapter.client(consumer_group))
  end
  end
  end
@@ -97,7 +97,7 @@ module Karafka
  def kafka_consumer
  # @note We don't cache the connection internally because we cache kafka_consumer that uses
  # kafka client object instance
- @kafka_consumer ||= Builder.call.consumer(
+ @kafka_consumer ||= Builder.call(consumer_group).consumer(
  *ApiAdapter.consumer(consumer_group)
  ).tap do |consumer|
  consumer_group.topics.each do |topic|
@@ -47,10 +47,10 @@ module Karafka
  end
  end
  # This is on purpose - see the notes for this method
- # rubocop:disable RescueException
+ # rubocop:disable Lint/RescueException
  rescue Exception => e
  Karafka.monitor.instrument('connection.listener.fetch_loop.error', caller: self, error: e)
- # rubocop:enable Lint/RescueException
+ # rubocop:enable Lint/RescueException
  # We can stop client without a problem, as it will reinitialize itself when running the
  # `fetch_loop` again
  @client.stop
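The hunk above only updates the rubocop directive names, but the pattern it guards is worth noting: rescuing `Exception` (not `StandardError`) keeps even low-level errors from killing the listener thread, routing them to instrumentation instead. A standalone sketch with a stand-in client:

```ruby
events = []

client = Class.new do
  def fetch
    # NotImplementedError descends from ScriptError, which a plain
    # `rescue` (i.e. rescue StandardError) would NOT catch
    raise NotImplementedError, 'driver hook not implemented'
  end
end.new

begin
  client.fetch
rescue Exception => e # rubocop:disable Lint/RescueException
  events << e.class   # stand-in for Karafka.monitor.instrument(...)
end

events # => [NotImplementedError]
```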
@@ -178,9 +178,9 @@ module Karafka
  # @param value [String] potential RSA key value
  # @return [Boolean] is the given string a valid RSA key
  def valid_private_key?(value)
- OpenSSL::PKey::RSA.new(value)
+ OpenSSL::PKey.read(value)
  true
- rescue OpenSSL::PKey::RSAError
+ rescue OpenSSL::PKey::PKeyError
  false
  end

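This is the #551 DSA-key fix: `OpenSSL::PKey.read` parses any supported private key type (RSA, DSA, EC), whereas the old `OpenSSL::PKey::RSA.new` call accepted only RSA. A minimal sketch of the widened validator:

```ruby
require 'openssl'

def valid_private_key?(value)
  # PKey.read autodetects the key type; PKeyError covers all parse failures
  OpenSSL::PKey.read(value)
  true
rescue OpenSSL::PKey::PKeyError
  false
end

rsa_pem = OpenSSL::PKey::RSA.new(2048).to_pem

valid_private_key?(rsa_pem)     # => true
valid_private_key?('not a key') # => false
```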
@@ -6,6 +6,8 @@ module Karafka
  # We validate some basics + the list of consumer_groups on which we want to use, to make
  # sure that all of them are defined, plus that a pidfile does not exist
  class ServerCliOptions < Dry::Validation::Contract
+ config.messages.load_paths << File.join(Karafka.gem_root, 'config', 'errors.yml')
+
  params do
  optional(:pid).filled(:str?)
  optional(:daemon).filled(:bool?)
@@ -44,7 +44,7 @@ module Karafka
  # @example From Namespaced::Super2Consumer matching responder
  # matcher.name #=> Super2Responder
  def name
- inflected = @klass.to_s.split('::').last.to_s
+ inflected = +@klass.to_s.split('::').last.to_s
  # We inject the from into the name just in case it is missing as in a situation like
  # that it would just sanitize the name without adding the "to" postfix.
  # It could create cases when we want to build for example a responder to a consumer
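This one-character change is the "unfreeze of a frozen string fix" from 1.3.2 (#567): unary `+` returns the receiver when a string is already mutable and a mutable copy when it is frozen, so the inflected name can safely be mutated afterwards:

```ruby
# Simulate receiving a frozen string (as can happen under Ruby 2.7
# with frozen string literals); unary + yields a mutable copy.
frozen_name = 'Super2'.freeze

inflected = +frozen_name
inflected << 'Responder' # would raise FrozenError without the +

inflected         # => "Super2Responder"
inflected.frozen? # => false
```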
@@ -36,10 +36,11 @@ module Karafka
  .to(STDOUT, file)
  end

- # Makes sure the log directory exists
+ # Makes sure the log directory exists as long as we can write to it
  def ensure_dir_exists
- dir = File.dirname(log_path)
- FileUtils.mkdir_p(dir) unless Dir.exist?(dir)
+ FileUtils.mkdir_p(File.dirname(log_path))
+ rescue Errno::EACCES
+ nil
  end

  # @return [Pathname] Path to a file to which we should log
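This is the #545 hardening: `FileUtils.mkdir_p` is already idempotent, so the `Dir.exist?` guard was redundant, and swallowing `Errno::EACCES` means logger setup cannot crash a process that lacks write permission. A standalone sketch:

```ruby
require 'fileutils'
require 'tmpdir'

def ensure_dir_exists(log_path)
  # mkdir_p creates all missing parents and is a no-op if they exist
  FileUtils.mkdir_p(File.dirname(log_path))
rescue Errno::EACCES
  nil # read-only filesystem: silently skip, logging falls back elsewhere
end

log_path = File.join(Dir.mktmpdir, 'log', 'nested', 'karafka.log')
ensure_dir_exists(log_path)
ensure_dir_exists(log_path) # second call is a no-op, not an error
```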
@@ -58,9 +58,12 @@ module Karafka
  def stop_supervised
  Karafka::App.stop!

+ # Temporary patch until https://github.com/dry-rb/dry-configurable/issues/93 is fixed
+ timeout = Thread.new { Karafka::App.config.shutdown_timeout }.join.value
+
  # We check from time to time (for the timeout period) if all the threads finished
  # their work and if so, we can just return and normal shutdown process will take place
- (Karafka::App.config.shutdown_timeout * SUPERVISION_CHECK_FACTOR).to_i.times do
+ (timeout * SUPERVISION_CHECK_FACTOR).to_i.times do
  if consumer_threads.count(&:alive?).zero?
  Thread.new { Karafka.monitor.instrument('app.stopped') }.join
  return
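This is the #578 trap-context patch: `stop_supervised` runs from a signal handler, and dry-configurable guards config reads with a `Mutex`, which cannot be locked from a trap context on the Rubies Karafka 1.3 supported ("ThreadError: can't be called from trap context"). Delegating the read to a throwaway thread and joining it sidesteps the restriction. A minimal sketch of the technique with stand-in config:

```ruby
CONFIG = { shutdown_timeout: 60 }.freeze
LOCK = Mutex.new # stand-in for dry-configurable's internal mutex

def shutdown_timeout
  # A fresh thread is not "in" the trap context, so it may take the lock;
  # join.value hands the computed result back to the caller
  Thread.new { LOCK.synchronize { CONFIG[:shutdown_timeout] } }.join.value
end

shutdown_timeout # => 60
```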
@@ -3,5 +3,5 @@
  # Main module namespace
  module Karafka
  # Current Karafka version
- VERSION = '1.3.0'
+ VERSION = '1.3.6'
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: karafka
  version: !ruby/object:Gem::Version
- version: 1.3.0
+ version: 1.3.6
  platform: ruby
  authors:
  - Maciej Mensfeld
@@ -36,7 +36,7 @@ cert_chain:
  KJG/fhg1JV5vVDdVy6x+tv5SQ5ctU0feCsVfESi3rE3zRd+nvzE9HcZ5aXeL1UtJ
  nT5Xrioegu2w1jPyVEgyZgTZC5rvD0nNS5sFNQ==
  -----END CERTIFICATE-----
- date: 2019-09-09 00:00:00.000000000 Z
+ date: 2020-04-24 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: dry-configurable
@@ -168,14 +168,14 @@ dependencies:
  name: thor
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">="
  - !ruby/object:Gem::Version
  version: '0.20'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">="
  - !ruby/object:Gem::Version
  version: '0.20'
  - !ruby/object:Gem::Dependency
@@ -322,14 +322,14 @@ required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 2.4.0
+ version: 2.5.0
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.0.3
+ rubygems_version: 3.1.2
  signing_key:
  specification_version: 4
  summary: Ruby based framework for working with Apache Kafka
metadata.gz.sig CHANGED
Binary file