karafka 2.0.0.rc1 → 2.0.0.rc4
- checksums.yaml +4 -4
- checksums.yaml.gz.sig +0 -0
- data/CHANGELOG.md +24 -0
- data/CONTRIBUTING.md +4 -8
- data/Gemfile.lock +14 -56
- data/LICENSE-COMM +1 -1
- data/README.md +46 -10
- data/config/errors.yml +52 -5
- data/docker-compose.yml +3 -0
- data/karafka.gemspec +4 -6
- data/lib/karafka/active_job/consumer.rb +2 -0
- data/lib/karafka/active_job/job_options_contract.rb +8 -2
- data/lib/karafka/cli/install.rb +15 -2
- data/lib/karafka/cli/server.rb +4 -2
- data/lib/karafka/connection/client.rb +4 -4
- data/lib/karafka/contracts/base.rb +2 -8
- data/lib/karafka/contracts/config.rb +71 -52
- data/lib/karafka/contracts/consumer_group.rb +25 -18
- data/lib/karafka/contracts/consumer_group_topic.rb +30 -16
- data/lib/karafka/contracts/server_cli_options.rb +18 -7
- data/lib/karafka/errors.rb +0 -3
- data/lib/karafka/helpers/colorize.rb +20 -0
- data/lib/karafka/instrumentation/logger_listener.rb +8 -2
- data/lib/karafka/instrumentation/monitor.rb +14 -59
- data/lib/karafka/instrumentation/notifications.rb +52 -0
- data/lib/karafka/instrumentation/vendors/datadog/dashboard.json +1 -0
- data/lib/karafka/instrumentation/vendors/datadog/listener.rb +232 -0
- data/lib/karafka/pro/active_job/dispatcher.rb +5 -2
- data/lib/karafka/pro/active_job/job_options_contract.rb +11 -6
- data/lib/karafka/pro/contracts/base.rb +21 -0
- data/lib/karafka/pro/contracts/consumer_group.rb +34 -0
- data/lib/karafka/pro/contracts/consumer_group_topic.rb +33 -0
- data/lib/karafka/pro/loader.rb +21 -3
- data/lib/karafka/pro/processing/partitioner.rb +22 -3
- data/lib/karafka/pro/routing/builder_extensions.rb +30 -0
- data/lib/karafka/pro/routing/{extensions.rb → topic_extensions.rb} +1 -1
- data/lib/karafka/processing/jobs_queue.rb +11 -0
- data/lib/karafka/processing/worker.rb +4 -2
- data/lib/karafka/setup/config.rb +7 -3
- data/lib/karafka/templates/example_consumer.rb.erb +2 -2
- data/lib/karafka/version.rb +1 -1
- data/lib/karafka.rb +3 -4
- data.tar.gz.sig +0 -0
- metadata +24 -38
- metadata.gz.sig +0 -0
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 526402b906d00f844c5b25925a854bc45f5127dc8c06646f1cd1432f265dfb81
+  data.tar.gz: c58f6be491c4c4e82237307d37874100588c97241439788325600bea77d4254f
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: '08922eef9890af84f1b444329061283f7987258200a9e338b892944a6ca061a60abed273e02acf44f91728f0ed219ec494e96c05a1155db2321e941e24c8c328'
+  data.tar.gz: 5b3de319f1887af51eeda3ffc5b42a86601a382a55ab30e8cc7abce0526d4471ff274b3627250d47ec9a70de763fd337a7c8eff79add2b9409c935133bbe3610
checksums.yaml.gz.sig
CHANGED
Binary file
data/CHANGELOG.md
CHANGED
@@ -1,5 +1,29 @@
 # Karafka framework changelog
 
+## 2.0.0.rc4 (2022-07-28)
+- Remove `dry-monitor`
+- Use `karafka-core`
+
+## 2.0.0.rc3 (2022-07-26)
+- Fix Pro partitioner hash function may not utilize all the threads (#907).
+- Improve virtual partitions messages distribution.
+- Add StatsD/DataDog optional monitoring listener + dashboard template.
+- Validate that Pro consumer is always used for Pro subscription.
+- Improve ActiveJob consumer shutdown behaviour.
+- Change default `max_wait_time` to 1 second.
+- Change default `max_messages` to 100 (#915).
+- Move logger listener polling reporting level to debug when no messages (#916).
+- Improve stability on aggressive rebalancing (multiple rebalances in a short period).
+- Improve specs stability.
+- Allow using `:key` and `:partition_key` for Enhanced Active Job partitioning.
+
+## 2.0.0.rc2 (2022-07-19)
+- Fix `example_consumer.rb.erb` `#shutdown` and `#revoked` signatures to correct once.
+- Improve the install user experience (print status and created files).
+- Change default `max_wait_time` from 10s to 5s.
+- Remove direct dependency on `dry-configurable` in favour of a home-brew.
+- Remove direct dependency on `dry-validation` in favour of a home-brew.
+
 ## 2.0.0-rc1 (2022-07-08)
 - Extract consumption partitioner out of listener inline code.
 - Introduce virtual partitioner concept for parallel processing of data from a single topic partition.
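The `:key` / `:partition_key` entry above concerns the Pro Enhanced ActiveJob dispatcher, whose options are validated by the `job_options` messages added to `errors.yml` further down this diff. A minimal, hedged sketch of a job opting into it follows; `karafka_options` comes from Karafka's ActiveJob integration, while the job class, queue name, and partitioner lambda are illustrative assumptions:

```ruby
# Hedged sketch - not part of this diff. Assumes Karafka Pro's Enhanced ActiveJob
# dispatcher; the job class, queue name and partitioner lambda are made up.
class OrdersProcessingJob < ActiveJob::Base
  queue_as :orders_jobs

  karafka_options(
    # Build the Kafka message key from the first job argument (illustrative)
    partitioner: ->(job) { job.arguments.first.to_s },
    # Dispatch using :key (alternatively :partition_key), per the new job_options rules
    partition_key_type: :key
  )

  def perform(order_id)
    # order processing goes here
  end
end
```

Jobs dispatched this way with the same key land in the same partition, which is what enables the ordered-jobs behaviour the README mentions.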
data/CONTRIBUTING.md
CHANGED
@@ -1,4 +1,4 @@
-#
+# Contributing
 
 ## Introduction
 
@@ -8,11 +8,7 @@ We welcome any type of contribution, not only code. You can help with:
 - **QA**: file bug reports, the more details you can give the better (e.g. screenshots with the console open)
 - **Marketing**: writing blog posts, howto's, printing stickers, ...
 - **Community**: presenting the project at meetups, organizing a dedicated meetup for the local community, ...
-- **Code**: take a look at the [open issues](issues). Even if you can't write code, commenting on them, showing that you care about a given issue matters. It helps us triage them.
-
-## Your First Contribution
-
-Working on your first Pull Request? You can learn how from this *free* series, [How to Contribute to an Open Source Project on GitHub](https://egghead.io/series/how-to-contribute-to-an-open-source-project-on-github).
+- **Code**: take a look at the [open issues](https://github.com/karafka/karafka/issues). Even if you can't write code, commenting on them, showing that you care about a given issue matters. It helps us triage them.
 
 ## Submitting code
 
@@ -32,5 +28,5 @@ By sending a pull request to the pro components, you are agreeing to transfer th
 
 ## Questions
 
-If you have any questions, create an [issue](
-You can also reach us at
+If you have any questions, create an [issue](https://github.com/karafka/karafka/issues) (protip: do a quick search first to see if someone else didn't ask the same question before!).
+You can also reach us at contact@karafka.io.
data/Gemfile.lock
CHANGED
@@ -1,22 +1,20 @@
 PATH
   remote: .
   specs:
-    karafka (2.0.0.
-      dry-configurable (~> 0.13)
-      dry-monitor (~> 0.5)
-      dry-validation (~> 1.7)
+    karafka (2.0.0.rc4)
+      karafka-core (>= 2.0.0, < 3.0.0)
       rdkafka (>= 0.10)
       thor (>= 0.20)
-      waterdrop (>= 2.
+      waterdrop (>= 2.4.0, < 3.0.0)
       zeitwerk (~> 2.3)
 
 GEM
   remote: https://rubygems.org/
   specs:
-    activejob (7.0.3)
-      activesupport (= 7.0.3)
+    activejob (7.0.3.1)
+      activesupport (= 7.0.3.1)
       globalid (>= 0.3.6)
-    activesupport (7.0.3)
+    activesupport (7.0.3.1)
       concurrent-ruby (~> 1.0, >= 1.0.2)
       i18n (>= 1.6, < 2)
       minitest (>= 5.1)
@@ -25,54 +23,17 @@ GEM
     concurrent-ruby (1.1.10)
     diff-lcs (1.5.0)
     docile (1.4.0)
-    dry-configurable (0.15.0)
-      concurrent-ruby (~> 1.0)
-      dry-core (~> 0.6)
-    dry-container (0.9.0)
-      concurrent-ruby (~> 1.0)
-      dry-configurable (~> 0.13, >= 0.13.0)
-    dry-core (0.7.1)
-      concurrent-ruby (~> 1.0)
-    dry-events (0.3.0)
-      concurrent-ruby (~> 1.0)
-      dry-core (~> 0.5, >= 0.5)
-    dry-inflector (0.2.1)
-    dry-initializer (3.1.1)
-    dry-logic (1.2.0)
-      concurrent-ruby (~> 1.0)
-      dry-core (~> 0.5, >= 0.5)
-    dry-monitor (0.5.0)
-      dry-configurable (~> 0.13, >= 0.13.0)
-      dry-core (~> 0.5, >= 0.5)
-      dry-events (~> 0.2)
-    dry-schema (1.9.3)
-      concurrent-ruby (~> 1.0)
-      dry-configurable (~> 0.13, >= 0.13.0)
-      dry-core (~> 0.5, >= 0.5)
-      dry-initializer (~> 3.0)
-      dry-logic (~> 1.0)
-      dry-types (~> 1.5)
-    dry-types (1.5.1)
-      concurrent-ruby (~> 1.0)
-      dry-container (~> 0.3)
-      dry-core (~> 0.5, >= 0.5)
-      dry-inflector (~> 0.1, >= 0.1.2)
-      dry-logic (~> 1.0, >= 1.0.2)
-    dry-validation (1.8.1)
-      concurrent-ruby (~> 1.0)
-      dry-container (~> 0.7, >= 0.7.1)
-      dry-core (~> 0.5, >= 0.5)
-      dry-initializer (~> 3.0)
-      dry-schema (~> 1.8, >= 1.8.0)
     factory_bot (6.2.1)
       activesupport (>= 5.0.0)
     ffi (1.15.5)
     globalid (1.0.0)
       activesupport (>= 5.0)
-    i18n (1.
+    i18n (1.12.0)
       concurrent-ruby (~> 1.0)
+    karafka-core (2.0.0)
+      concurrent-ruby (>= 1.1)
     mini_portile2 (2.8.0)
-    minitest (5.
+    minitest (5.16.2)
     rake (13.0.6)
     rdkafka (0.12.0)
       ffi (~> 1.15)
@@ -98,13 +59,10 @@ GEM
     simplecov-html (0.12.3)
     simplecov_json_formatter (0.1.4)
     thor (1.2.1)
-    tzinfo (2.0.
+    tzinfo (2.0.5)
       concurrent-ruby (~> 1.0)
-    waterdrop (2.
-
-      dry-configurable (~> 0.13)
-      dry-monitor (~> 0.5)
-      dry-validation (~> 1.7)
+    waterdrop (2.4.0)
+      karafka-core (~> 2.0)
       rdkafka (>= 0.10)
       zeitwerk (~> 2.3)
     zeitwerk (2.6.0)
@@ -121,4 +79,4 @@ DEPENDENCIES
   simplecov
 
 BUNDLED WITH
-   2.3.
+   2.3.15
data/LICENSE-COMM
CHANGED
@@ -30,7 +30,7 @@ The Open Source version of the Software (“LGPL Version”) is licensed under t
 
 4. Ownership. Notwithstanding anything to the contrary contained herein, except for the limited license rights expressly provided herein, Maciej Mensfeld and its suppliers have and will retain all rights, title and interest (including, without limitation, all patent, copyright, trademark, trade secret and other intellectual property rights) in and to the Software and all copies, modifications and derivative works thereof (including any changes which incorporate any of your ideas, feedback or suggestions). You acknowledge that you are obtaining only a limited license right to the Software, and that irrespective of any use of the words “purchase”, “sale” or like terms hereunder no ownership rights are being conveyed to you under this Agreement or otherwise.
 
-5. Fees and Payment. The Software license fees will be due and payable in full as set forth in the applicable invoice or at the time of purchase.
+5. Fees and Payment. The Software license fees will be due and payable in full as set forth in the applicable invoice or at the time of purchase. There are no refunds beyond the remedy refund.
 
 6. Support, Maintenance and Services. Subject to the terms and conditions of this Agreement, as set forth in your invoice, and as set forth on the Karafka Pro support page (https://github.com/karafka/karafka/wiki/Commercial-Support), support and maintenance services may be included with the purchase of your license subscription.
 
data/README.md
CHANGED
@@ -4,11 +4,18 @@
 [![Gem Version](https://badge.fury.io/rb/karafka.svg)](http://badge.fury.io/rb/karafka)
 [![Join the chat at https://slack.karafka.io](https://raw.githubusercontent.com/karafka/misc/master/slack.svg)](https://slack.karafka.io)
 
-**Note**: All of the documentation here refers to Karafka `2.0
+**Note**: All of the documentation here refers to Karafka `2.0.0.rc3` or higher. If you are looking for the documentation for Karafka `1.4`, please click [here](https://github.com/karafka/wiki/tree/1.4).
 
 ## About Karafka
 
-Karafka is a multi-threaded
+Karafka is a Ruby and Rails multi-threaded efficient Kafka processing framework that:
+
+- Supports parallel processing in [multiple threads](https://github.com/karafka/karafka/wiki/Concurrency-and-multithreading) (also for a [single topic partition](https://github.com/karafka/karafka/wiki/Pro-Virtual-Partitions) work)
+- Has [ActiveJob backend](https://github.com/karafka/karafka/wiki/Active-Job) support (including ordered jobs)
+- [Automatically integrates](https://github.com/karafka/karafka/wiki/Integrating-with-Ruby-on-Rails-and-other-frameworks#integrating-with-ruby-on-rails=) with Ruby on Rails
+- Supports in-development [code reloading](https://github.com/karafka/karafka/wiki/Auto-reload-of-code-changes-in-development)
+- Is powered by [librdkafka](https://github.com/edenhill/librdkafka) (the Apache Kafka C/C++ client library)
+- Has an out-of the box [StatsD/DataDog monitoring](https://github.com/karafka/karafka/wiki/Monitoring-and-logging) with a dashboard template.
 
 ```ruby
 # Define what topics you want to consume with which consumers in karafka.rb
@@ -28,24 +35,51 @@ class EventsConsumer < ApplicationConsumer
 end
 ```
 
-Karafka
-
-Karafka **uses** threads to handle many messages at the same time in the same process. It does not require Rails but will integrate tightly with any Ruby on Rails applications to make event processing dead simple.
+Karafka **uses** threads to handle many messages simultaneously in the same process. It does not require Rails but will integrate tightly with any Ruby on Rails applications to make event processing dead simple.
 
 ## Getting started
 
-If you're
+If you're entirely new to the subject, you can start with our "Kafka on Rails" articles series, which will get you up and running with the terminology and basic ideas behind using Kafka:
 
 - [Kafka on Rails: Using Kafka with Ruby on Rails – Part 1 – Kafka basics and its advantages](https://mensfeld.pl/2017/11/kafka-on-rails-using-kafka-with-ruby-on-rails-part-1-kafka-basics-and-its-advantages/)
-- [Kafka on Rails: Using Kafka with Ruby on Rails – Part 2 – Getting started with
+- [Kafka on Rails: Using Kafka with Ruby on Rails – Part 2 – Getting started with Rails and Kafka](https://mensfeld.pl/2018/01/kafka-on-rails-using-kafka-with-ruby-on-rails-part-2-getting-started-with-ruby-and-kafka/)
 
 If you want to get started with Kafka and Karafka as fast as possible, then the best idea is to visit our [Getting started](https://github.com/karafka/karafka/wiki/Getting-started) guides and the [example apps repository](https://github.com/karafka/example-apps).
 
 We also maintain many [integration specs](https://github.com/karafka/karafka/tree/master/spec/integrations) illustrating various use-cases and features of the framework.
 
+### TL;DR (1 minute from setup to publishing and consuming messages)
+
+**Prerequisites**: Kafka running. You can start it by following instructions from [here](https://github.com/karafka/karafka/wiki/Setting-up-Kafka).
+
+1. Add and install Karafka:
+
+```bash
+bundle add karafka -v 2.0.0.rc3
+
+bundle exec karafka install
+```
+
+2. Dispatch a message to the example topic using the Rails or Ruby console:
+
+```ruby
+Karafka.producer.produce_sync(topic: 'example', payload: { 'ping' => 'pong' }.to_json)
+```
+
+3. Run Karafka server and see the consumption magic happen:
+
+```bash
+bundle exec karafka server
+
+[7616dc24-505a-417f-b87b-6bf8fc2d98c5] Polled 1 message in 1000ms
+[dcf3a8d8-0bd9-433a-8f63-b70a0cdb0732] Consume job for ExampleConsumer on example started
+{"ping"=>"pong"}
+[dcf3a8d8-0bd9-433a-8f63-b70a0cdb0732] Consume job for ExampleConsumer on example finished in 0ms
+```
+
 ## Want to Upgrade? LGPL is not for you? Want to help?
 
-I also sell Karafka Pro
+I also sell Karafka Pro subscriptions. It includes a commercial-friendly license, priority support, architecture consultations, and high throughput data processing-related features (virtual partitions, long-running jobs, and more).
 
 **20%** of the income will be distributed back to other OSS projects that Karafka uses under the hood.
 
@@ -53,6 +87,8 @@ Help me provide high-quality open-source software. Please see the Karafka [homep
 
 ## Support
 
-Karafka has [Wiki pages](https://github.com/karafka/karafka/wiki) for almost everything and a pretty decent [FAQ](https://github.com/karafka/karafka/wiki/FAQ). It covers the
+Karafka has [Wiki pages](https://github.com/karafka/karafka/wiki) for almost everything and a pretty decent [FAQ](https://github.com/karafka/karafka/wiki/FAQ). It covers the installation, setup, and deployment, along with other useful details on how to run Karafka.
+
+If you have questions about using Karafka, feel free to join our [Slack](https://slack.karafka.io) channel.
 
-
+Karafka has priority support for technical and architectural questions that is part of the Karafka Pro subscription.
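The README section above advertises the StatsD/DataDog monitoring this release ships under `lib/karafka/instrumentation/vendors/datadog/` (listener plus `dashboard.json` template, per the file list at the top). The snippet below is a hedged sketch of wiring it up; `Karafka.monitor.subscribe` is Karafka's public instrumentation API, while the listener's configuration block and the `client` option are assumptions to be checked against the shipped listener source:

```ruby
# Hedged sketch - not part of this diff. Requires the dogstatsd-ruby gem; the
# listener's setup block and option names are assumptions.
require 'datadog/statsd'
require 'karafka/instrumentation/vendors/datadog/listener'

datadog_listener = Karafka::Instrumentation::Vendors::Datadog::Listener.new do |config|
  # Where metrics should be shipped (assumed option name)
  config.client = Datadog::Statsd.new('localhost', 8125)
end

# Route framework notifications (polling, consumption, errors, statistics)
# through the listener so they can feed the bundled dashboard template
Karafka.monitor.subscribe(datadog_listener)
```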
data/config/errors.yml
CHANGED
@@ -1,9 +1,56 @@
 en:
-
-
+  validations:
+    config:
+      missing: needs to be present
+      client_id_format: 'needs to be a string with a Kafka accepted format'
+      license.entity_format: needs to be a string
+      license.token_format: needs to be either false or a string
+      license.expires_on_format: needs to be a valid date
+      concurrency_format: needs to be an integer bigger than 0
+      consumer_mapper_format: needs to be present
+      consumer_persistence_format: needs to be either true or false
+      pause_timeout_format: needs to be an integer bigger than 0
+      pause_max_timeout_format: needs to be an integer bigger than 0
+      pause_with_exponential_backoff_format: needs to be either true or false
+      shutdown_timeout_format: needs to be an integer bigger than 0
+      max_wait_time_format: needs to be an integer bigger than 0
+      kafka_format: needs to be a filled hash
+      internal.status_format: needs to be present
+      internal.process_format: needs to be present
+      internal.routing.builder_format: needs to be present
+      internal.routing.subscription_groups_builder_format: needs to be present
+      key_must_be_a_symbol: All keys under the kafka settings scope need to be symbols
       max_timeout_vs_pause_max_timeout: pause_timeout must be less or equal to pause_max_timeout
       shutdown_timeout_vs_max_wait_time: shutdown_timeout must be more than max_wait_time
-
-
+
+    server_cli_options:
+      missing: needs to be present
       consumer_groups_inclusion: Unknown consumer group
-
+
+    consumer_group_topic:
+      missing: needs to be present
+      name_format: 'needs to be a string with a Kafka accepted format'
+      deserializer_format: needs to be present
+      manual_offset_management_format: needs to be either true or false
+      consumer_format: needs to be present
+      id_format: 'needs to be a string with a Kafka accepted format'
+      initial_offset_format: needs to be either earliest or latest
+
+    consumer_group:
+      missing: needs to be present
+      topics_names_not_unique: all topic names within a single consumer group must be unique
+      id_format: 'needs to be a string with a Kafka accepted format'
+      topics_format: needs to be a non-empty array
+
+    job_options:
+      missing: needs to be present
+      dispatch_method_format: needs to be either :produce_async or :produce_sync
+      partitioner_format: 'needs to respond to #call'
+      partition_key_type_format: 'needs to be either :key or :partition_key'
+
+    test:
+      missing: needs to be present
+      id_format: needs to be a String
+
+    pro_consumer_group_topic:
+      consumer_format: needs to inherit from Karafka::Pro::BaseConsumer and not Karafka::Consumer
data/docker-compose.yml
CHANGED
@@ -36,10 +36,13 @@ services:
       integrations_17_02:2:1,\
       integrations_18_02:2:1,\
       integrations_19_02:2:1,\
+      integrations_20_02:2:1,\
+      integrations_21_02:2:1,\
       integrations_00_03:3:1,\
       integrations_01_03:3:1,\
       integrations_02_03:3:1,\
       integrations_03_03:3:1,\
+      integrations_04_03:3:1,\
       integrations_00_10:10:1,\
       integrations_01_10:10:1,\
       benchmarks_00_01:1:1,\
data/karafka.gemspec
CHANGED
@@ -12,19 +12,17 @@ Gem::Specification.new do |spec|
   spec.authors = ['Maciej Mensfeld']
   spec.email = %w[maciej@mensfeld.pl]
   spec.homepage = 'https://karafka.io'
-  spec.summary = '
+  spec.summary = 'Efficient Kafka processing framework for Ruby and Rails'
   spec.description = 'Framework used to simplify Apache Kafka based Ruby applications development'
   spec.licenses = ['LGPL-3.0', 'Commercial']
 
-  spec.add_dependency '
-  spec.add_dependency 'dry-monitor', '~> 0.5'
-  spec.add_dependency 'dry-validation', '~> 1.7'
+  spec.add_dependency 'karafka-core', '>= 2.0.0', '< 3.0.0'
   spec.add_dependency 'rdkafka', '>= 0.10'
   spec.add_dependency 'thor', '>= 0.20'
-  spec.add_dependency 'waterdrop', '>= 2.
+  spec.add_dependency 'waterdrop', '>= 2.4.0', '< 3.0.0'
   spec.add_dependency 'zeitwerk', '~> 2.3'
 
-  spec.required_ruby_version = '>= 2.
+  spec.required_ruby_version = '>= 2.7.0'
 
   if $PROGRAM_NAME.end_with?('gem')
     spec.signing_key = File.expand_path('~/.ssh/gem-private_key.pem')
data/lib/karafka/active_job/consumer.rb
CHANGED
@@ -9,6 +9,8 @@ module Karafka
     # @note ActiveJob does not support batches, so we just run one message after another
     def consume
       messages.each do |message|
+        break if Karafka::App.stopping?
+
         ::ActiveJob::Base.execute(
           # We technically speaking could set this as deserializer and reference it from the
           # message instead of using the `#raw_payload`. This is not done on purpose to simplify
data/lib/karafka/active_job/job_options_contract.rb
CHANGED
@@ -7,9 +7,15 @@ module Karafka
     # we want to keep ActiveJob related Karafka components outside of the core Karafka code and
     # all in the same place
     class JobOptionsContract < Contracts::Base
-
-
+      configure do |config|
+        config.error_messages = YAML.safe_load(
+          File.read(
+            File.join(Karafka.gem_root, 'config', 'errors.yml')
+          )
+        ).fetch('en').fetch('validations').fetch('job_options')
       end
+
+      optional(:dispatch_method) { |val| %i[produce_async produce_sync].include?(val) }
     end
   end
 end
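For context on what the contract above enforces: `Contracts::Base#validate!` (see the `contracts/base.rb` diff at the end of this page) raises `Karafka::Errors::InvalidConfigurationError` when a rule fails. A hedged usage sketch follows; the fully qualified class name is assumed from the file path, and calling the contract directly like this is purely illustrative:

```ruby
# Hedged illustration - not from this diff. Assumes the contract is reachable as
# Karafka::ActiveJob::JobOptionsContract, as its file path suggests.
contract = Karafka::ActiveJob::JobOptionsContract.new

# A supported dispatch method passes validation
contract.validate!(dispatch_method: :produce_sync) # => true

# An unsupported value fails the optional(:dispatch_method) rule above
begin
  contract.validate!(dispatch_method: :produce_later)
rescue Karafka::Errors::InvalidConfigurationError => e
  puts "rejected: #{e.message}"
end
```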
data/lib/karafka/cli/install.rb
CHANGED
@@ -7,6 +7,8 @@ module Karafka
   class Cli < Thor
     # Install Karafka Cli action
     class Install < Base
+      include Helpers::Colorize
+
       desc 'Install all required things for Karafka application in current directory'
 
       # Directories created by default
@@ -42,14 +44,25 @@ module Karafka
           FileUtils.mkdir_p Karafka.root.join(dir)
         end
 
+        puts
+        puts 'Installing Karafka framework...'
+        puts 'Ruby on Rails detected...' if rails?
+        puts
+
         INSTALL_FILES_MAP.each do |source, target|
-
+          pathed_target = Karafka.root.join(target)
 
           template = File.read(Karafka.core_root.join("templates/#{source}"))
           render = ::ERB.new(template, trim_mode: '-').result(binding)
 
-          File.open(
+          File.open(pathed_target, 'w') { |file| file.write(render) }
+
+          puts "#{green('Created')} #{target}"
         end
+
+        puts
+        puts("Installation #{green('completed')}. Have fun!")
+        puts
       end
 
       # @return [Boolean] true if we have Rails loaded
data/lib/karafka/cli/server.rb
CHANGED
@@ -5,6 +5,8 @@ module Karafka
   class Cli < Thor
     # Server Karafka Cli action
     class Server < Base
+      include Helpers::Colorize
+
       desc 'Start the Karafka server (short-cut alias: "s")'
       option aliases: 's'
       option :consumer_groups, type: :array, default: nil, aliases: :g
@@ -31,11 +33,11 @@ module Karafka
 
         if Karafka.pro?
           Karafka.logger.info(
-
+            green('Thank you for investing in the Karafka Pro subscription!')
           )
         else
           Karafka.logger.info(
-
+            red('You like Karafka? Please consider getting a Pro version!')
           )
         end
       end
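Both CLI diffs above rely on the new `Karafka::Helpers::Colorize` module (added in this release as `data/lib/karafka/helpers/colorize.rb`, +20 lines in the summary). Its source is not shown on this page, so the sketch below is a hedged reconstruction inferred from the `green`/`red` calls; the actual implementation may differ:

```ruby
# Hedged sketch of Helpers::Colorize, inferred from the green/red usage in the
# CLI diffs above; the real file is not shown here, so treat this as an assumption.
module Karafka
  module Helpers
    # Simple ANSI colorization helpers for terminal output
    module Colorize
      # @param string [String] text to print in green
      # @return [String] string wrapped in green ANSI escape codes
      def green(string)
        "\033[0;32m#{string}\033[0m"
      end

      # @param string [String] text to print in red
      # @return [String] string wrapped in red ANSI escape codes
      def red(string)
        "\033[0;31m#{string}\033[0m"
      end
    end
  end
end
```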
data/lib/karafka/connection/client.rb
CHANGED
@@ -69,9 +69,6 @@ module Karafka
         # Put a message to the buffer if there is one
         @buffer << message if message
 
-        # Track time spent on all of the processing and polling
-        time_poll.checkpoint
-
         # Upon polling rebalance manager might have been updated.
         # If partition revocation happens, we need to remove messages from revoked partitions
         # as well as ensure we do not have duplicated due to the offset reset for partitions
@@ -82,6 +79,9 @@ module Karafka
           break
         end
 
+        # Track time spent on all of the processing and polling
+        time_poll.checkpoint
+
         # Finally once we've (potentially) removed revoked, etc, if no messages were returned
         # we can break.
         # Worth keeping in mind, that the rebalance manager might have been updated despite no
@@ -268,7 +268,7 @@ module Karafka
         true
       rescue Rdkafka::RdkafkaError => e
         return false if e.code == :assignment_lost
-        return
+        return true if e.code == :no_offset
 
         raise e
       end
data/lib/karafka/contracts/base.rb
CHANGED
@@ -3,20 +3,14 @@
 module Karafka
   module Contracts
     # Base contract for all Karafka contracts
-    class Base <
-      config.messages.load_paths << File.join(Karafka.gem_root, 'config', 'errors.yml')
-
+    class Base < ::Karafka::Core::Contractable::Contract
      # @param data [Hash] data for validation
      # @return [Boolean] true if all good
      # @raise [Errors::InvalidConfigurationError] invalid configuration error
      # @note We use contracts only in the config validation context, so no need to add support
      #   for multiple error classes. It will be added when it will be needed.
      def validate!(data)
-
-
-        return true if result.success?
-
-        raise Errors::InvalidConfigurationError, result.errors.to_h
+        super(data, Errors::InvalidConfigurationError)
      end
    end
  end