karafka 1.0.1 → 1.4.14
- checksums.yaml +5 -5
- checksums.yaml.gz.sig +0 -0
- data/.coditsu/ci.yml +3 -0
- data/.console_irbrc +1 -3
- data/.diffend.yml +3 -0
- data/.github/ISSUE_TEMPLATE/bug_report.md +50 -0
- data/.github/ISSUE_TEMPLATE/feature_request.md +20 -0
- data/.github/workflows/ci.yml +76 -0
- data/.gitignore +1 -0
- data/.ruby-version +1 -1
- data/CHANGELOG.md +286 -16
- data/CODE_OF_CONDUCT.md +1 -1
- data/CONTRIBUTING.md +6 -7
- data/Gemfile +5 -2
- data/Gemfile.lock +100 -103
- data/README.md +54 -74
- data/bin/karafka +1 -1
- data/certs/mensfeld.pem +26 -0
- data/config/errors.yml +40 -5
- data/docker-compose.yml +17 -0
- data/karafka.gemspec +31 -15
- data/lib/karafka/app.rb +19 -18
- data/lib/karafka/assignment_strategies/round_robin.rb +13 -0
- data/lib/karafka/attributes_map.rb +17 -21
- data/lib/karafka/backends/inline.rb +2 -3
- data/lib/karafka/base_consumer.rb +57 -0
- data/lib/karafka/base_responder.rb +77 -31
- data/lib/karafka/cli/base.rb +4 -4
- data/lib/karafka/cli/console.rb +11 -9
- data/lib/karafka/cli/flow.rb +9 -7
- data/lib/karafka/cli/info.rb +5 -4
- data/lib/karafka/cli/install.rb +32 -8
- data/lib/karafka/cli/missingno.rb +19 -0
- data/lib/karafka/cli/server.rb +18 -16
- data/lib/karafka/cli.rb +10 -2
- data/lib/karafka/code_reloader.rb +67 -0
- data/lib/karafka/connection/{config_adapter.rb → api_adapter.rb} +71 -22
- data/lib/karafka/connection/batch_delegator.rb +55 -0
- data/lib/karafka/connection/builder.rb +23 -0
- data/lib/karafka/connection/client.rb +120 -0
- data/lib/karafka/connection/listener.rb +39 -26
- data/lib/karafka/connection/message_delegator.rb +36 -0
- data/lib/karafka/consumers/batch_metadata.rb +10 -0
- data/lib/karafka/consumers/callbacks.rb +71 -0
- data/lib/karafka/consumers/includer.rb +64 -0
- data/lib/karafka/consumers/responders.rb +24 -0
- data/lib/karafka/{controllers → consumers}/single_params.rb +3 -3
- data/lib/karafka/contracts/config.rb +21 -0
- data/lib/karafka/contracts/consumer_group.rb +211 -0
- data/lib/karafka/contracts/consumer_group_topic.rb +19 -0
- data/lib/karafka/contracts/responder_usage.rb +54 -0
- data/lib/karafka/contracts/server_cli_options.rb +31 -0
- data/lib/karafka/contracts.rb +10 -0
- data/lib/karafka/errors.rb +27 -12
- data/lib/karafka/fetcher.rb +15 -15
- data/lib/karafka/helpers/class_matcher.rb +20 -10
- data/lib/karafka/helpers/config_retriever.rb +3 -3
- data/lib/karafka/helpers/inflector.rb +26 -0
- data/lib/karafka/helpers/multi_delegator.rb +0 -1
- data/lib/karafka/instrumentation/logger.rb +54 -0
- data/lib/karafka/instrumentation/monitor.rb +70 -0
- data/lib/karafka/instrumentation/proctitle_listener.rb +36 -0
- data/lib/karafka/instrumentation/stdout_listener.rb +140 -0
- data/lib/karafka/params/batch_metadata.rb +26 -0
- data/lib/karafka/params/builders/batch_metadata.rb +30 -0
- data/lib/karafka/params/builders/params.rb +38 -0
- data/lib/karafka/params/builders/params_batch.rb +25 -0
- data/lib/karafka/params/metadata.rb +20 -0
- data/lib/karafka/params/params.rb +35 -107
- data/lib/karafka/params/params_batch.rb +38 -19
- data/lib/karafka/patches/ruby_kafka.rb +47 -0
- data/lib/karafka/persistence/client.rb +29 -0
- data/lib/karafka/persistence/consumers.rb +45 -0
- data/lib/karafka/persistence/topics.rb +48 -0
- data/lib/karafka/process.rb +6 -9
- data/lib/karafka/responders/builder.rb +15 -14
- data/lib/karafka/responders/topic.rb +14 -9
- data/lib/karafka/routing/builder.rb +38 -9
- data/lib/karafka/routing/consumer_group.rb +6 -4
- data/lib/karafka/routing/consumer_mapper.rb +10 -9
- data/lib/karafka/routing/proxy.rb +10 -1
- data/lib/karafka/routing/router.rb +1 -1
- data/lib/karafka/routing/topic.rb +8 -12
- data/lib/karafka/routing/topic_mapper.rb +16 -18
- data/lib/karafka/serialization/json/deserializer.rb +27 -0
- data/lib/karafka/serialization/json/serializer.rb +31 -0
- data/lib/karafka/server.rb +50 -39
- data/lib/karafka/setup/config.rb +138 -91
- data/lib/karafka/setup/configurators/water_drop.rb +21 -16
- data/lib/karafka/setup/dsl.rb +21 -0
- data/lib/karafka/status.rb +7 -3
- data/lib/karafka/templates/{application_controller.rb.example → application_consumer.rb.erb} +2 -2
- data/lib/karafka/templates/karafka.rb.erb +92 -0
- data/lib/karafka/version.rb +1 -1
- data/lib/karafka.rb +19 -15
- data.tar.gz.sig +0 -0
- metadata +119 -81
- metadata.gz.sig +5 -0
- data/.github/ISSUE_TEMPLATE.md +0 -2
- data/.travis.yml +0 -17
- data/Rakefile +0 -7
- data/lib/karafka/base_controller.rb +0 -117
- data/lib/karafka/connection/messages_consumer.rb +0 -106
- data/lib/karafka/connection/messages_processor.rb +0 -61
- data/lib/karafka/controllers/includer.rb +0 -51
- data/lib/karafka/controllers/responders.rb +0 -19
- data/lib/karafka/loader.rb +0 -29
- data/lib/karafka/logger.rb +0 -53
- data/lib/karafka/monitor.rb +0 -98
- data/lib/karafka/parsers/json.rb +0 -38
- data/lib/karafka/patches/dry_configurable.rb +0 -33
- data/lib/karafka/persistence/controller.rb +0 -23
- data/lib/karafka/schemas/config.rb +0 -31
- data/lib/karafka/schemas/consumer_group.rb +0 -64
- data/lib/karafka/schemas/consumer_group_topic.rb +0 -18
- data/lib/karafka/schemas/responder_usage.rb +0 -38
- data/lib/karafka/schemas/server_cli_options.rb +0 -43
- data/lib/karafka/setup/configurators/base.rb +0 -35
- data/lib/karafka/setup/configurators/celluloid.rb +0 -19
- data/lib/karafka/templates/karafka.rb.example +0 -41
- data/lib/karafka/templates/{application_responder.rb.example → application_responder.rb.erb} +0 -0
checksums.yaml
CHANGED

@@ -1,7 +1,7 @@
 ---
-
-metadata.gz:
-data.tar.gz:
+SHA256:
+metadata.gz: 7dd06a7ace623ae63695899e2cff1293482390ccbaeabcf7b1cc4b4aa6ec6a9e
+data.tar.gz: 60e7c986a94c9552c1adc754b6bdb02f5e5cb5012881a15333fa8eff854485a8
 SHA512:
-metadata.gz:
-data.tar.gz:
+metadata.gz: cc34ba15cd7f8f138202fd0a9b53c3f63fcde13dd353e8da810a2b1f5e153c87335b0f4d26e6dccca4c484ddc12ef59d8331315130772bd5f05ef39a34f1a7c7
+data.tar.gz: 9ef7cfce8c382091072e1c1df4cfece7bd5220cc3b8c74f82f4b1166bda756c354b4632df66ef95c912d3f13c2264914d96359fb72d4663592f98cd4b313f269
checksums.yaml.gz.sig
ADDED

Binary file

data/.coditsu/ci.yml
ADDED
data/.console_irbrc
CHANGED

@@ -1,11 +1,9 @@
 # irbrc for Karafka console
-require 'karafka'
-require Karafka.boot_file
 
 IRB.conf[:AUTO_INDENT] = true
 IRB.conf[:SAVE_HISTORY] = 1000
 IRB.conf[:USE_READLINE] = true
-IRB.conf[:HISTORY_FILE] = "
+IRB.conf[:HISTORY_FILE] = ".irb-history"
 IRB.conf[:LOAD_MODULES] = [] unless IRB.conf.key?(:LOAD_MODULES)
 
 unless IRB.conf[:LOAD_MODULES].include?('irb/completion')
data/.diffend.yml
ADDED

data/.github/ISSUE_TEMPLATE/bug_report.md
ADDED

@@ -0,0 +1,50 @@
+---
+name: Bug Report
+about: Report an issue with Karafka you've discovered.
+---
+
+*Be clear, concise and precise in your description of the problem.
+Open an issue with a descriptive title and a summary in grammatically correct,
+complete sentences.*
+
+*Use the template below when reporting bugs. Please, make sure that
+you're running the latest stable Karafka and that the problem you're reporting
+hasn't been reported (and potentially fixed) already.*
+
+*Before filing the ticket you should replace all text above the horizontal
+rule with your own words.*
+
+--------
+
+## Expected behavior
+
+Describe here how you expected Karafka to behave in this particular situation.
+
+## Actual behavior
+
+Describe here what actually happened.
+
+## Steps to reproduce the problem
+
+This is extremely important! Providing us with a reliable way to reproduce
+a problem will expedite its solution.
+
+## Your setup details
+
+Please provide kafka version and the output of `karafka info` or `bundle exec karafka info` if using Bundler.
+
+Here's an example:
+
+```
+$ [bundle exec] karafka info
+Karafka version: 1.3.0
+Ruby version: 2.6.3
+Ruby-kafka version: 0.7.9
+Application client id: karafka-local
+Backend: inline
+Batch fetching: true
+Batch consuming: true
+Boot file: /app/karafka/karafka.rb
+Environment: development
+Kafka seed brokers: ["kafka://kafka:9092"]
+```
data/.github/ISSUE_TEMPLATE/feature_request.md
ADDED

@@ -0,0 +1,20 @@
+---
+name: Feature Request
+about: Suggest new Karafka features or improvements to existing features.
+---
+
+## Is your feature request related to a problem? Please describe.
+
+A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
+
+## Describe the solution you'd like
+
+A clear and concise description of what you want to happen.
+
+## Describe alternatives you've considered
+
+A clear and concise description of any alternative solutions or features you've considered.
+
+## Additional context
+
+Add any other context or screenshots about the feature request here.
data/.github/workflows/ci.yml
ADDED

@@ -0,0 +1,76 @@
+name: ci
+
+concurrency: ci-${{ github.ref }}
+
+on:
+  pull_request:
+  push:
+  schedule:
+    - cron: '0 1 * * *'
+
+jobs:
+  specs:
+    runs-on: ubuntu-latest
+    needs: diffend
+    strategy:
+      fail-fast: false
+      matrix:
+        ruby:
+          - '3.1'
+          - '3.0'
+          - '2.7'
+        include:
+          - ruby: '3.1'
+            coverage: 'true'
+    steps:
+      - uses: actions/checkout@v2
+      - name: Install package dependencies
+        run: "[ -e $APT_DEPS ] || sudo apt-get install -y --no-install-recommends $APT_DEPS"
+      - name: Set up Ruby
+        uses: ruby/setup-ruby@v1
+        with:
+          ruby-version: ${{matrix.ruby}}
+      - name: Install latest bundler
+        run: |
+          gem install bundler --no-document
+          bundle config set without 'tools benchmarks docs'
+      - name: Bundle install
+        run: |
+          bundle config set without development
+          bundle install --jobs 4 --retry 3
+      - name: Run Kafka with docker-compose
+        run: docker-compose up -d
+      - name: Run all tests
+        env:
+          GITHUB_COVERAGE: ${{matrix.coverage}}
+        run: bundle exec rspec
+
+  diffend:
+    runs-on: ubuntu-latest
+    strategy:
+      fail-fast: false
+    steps:
+      - uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+      - name: Set up Ruby
+        uses: ruby/setup-ruby@v1
+        with:
+          ruby-version: 3.1
+      - name: Install latest bundler
+        run: gem install bundler --no-document
+      - name: Install Diffend plugin
+        run: bundle plugin install diffend
+      - name: Bundle Secure
+        run: bundle secure
+
+  coditsu:
+    runs-on: ubuntu-latest
+    strategy:
+      fail-fast: false
+    steps:
+      - uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+      - name: Run Coditsu
+        run: \curl -sSL https://api.coditsu.io/run/ci | bash
data/.gitignore
CHANGED

data/.ruby-version
CHANGED

@@ -1 +1 @@
-
+3.1.2
data/CHANGELOG.md
CHANGED

@@ -1,5 +1,275 @@
 # Karafka framework changelog
 
+## 1.4.14 (2022-10-14)
+- Fix `concurrent-ruby` missing as a dependency (Azdaroth)
+- Warn about upcoming end of 1.4 support.
+
+## 1.4.13 (2022-02-19)
+- Drop support for ruby 2.6
+- Add mfa requirement
+
+## 1.4.12 (2022-01-13)
+- Ruby 3.1 support
+- `irb` dependency removal (vbyno)
+
+## 1.4.11 (2021-12-04)
+- Source code metadata url added to the gemspec
+- Gem bump
+
+## 1.4.10 (2021-10-30)
+- update gems requirements in the gemspec (nijikon)
+
+## 1.4.9 (2021-09-29)
+- fix `dry-configurable` deprecation warnings for default value as positional argument
+
+## 1.4.8 (2021-09-08)
+- Allow 'rails' in Gemfile to enable rails-aware generator (rewritten)
+
+## 1.4.7 (2021-09-04)
+- Update ruby-kafka to `1.4.0`
+- Support for `resolve_seed_brokers` option (with Azdaroth)
+- Set minimum `ruby-kafka` requirement to `1.3.0`
+
+## 1.4.6 (2021-08-05)
+- #700 Fix Ruby 3 compatibility issues in Connection::Client#pause (MmKolodziej)
+
+## 1.4.5 (2021-06-16)
+- Fixup logger checks for non-writeable logfile (ojab)
+- #689 - Update the stdout initialization message for framework initialization
+
+## 1.4.4 (2021-04-19)
+- Remove Ruby 2.5 support and update minimum Ruby requirement to 2.6
+- Remove rake dependency
+
+## 1.4.3 (2021-03-24)
+- Fixes for Ruby 3.0 compatibility
+
+## 1.4.2 (2021-02-16)
+- Rescue Errno::EROFS in ensure_dir_exists (unasuke)
+
+## 1.4.1 (2020-12-04)
+- Return non-zero exit code when printing usage
+- Add support for :assignment_strategy for consumers
+
+## 1.4.0 (2020-09-05)
+- Rename `Karafka::Params::Metadata` to `Karafka::Params::BatchMetadata`
+- Rename consumer `#metadata` to `#batch_metadata`
+- Separate metadata (including Karafka native metadata) from the root of params (backwards compatibility preserved thanks to rabotyaga)
+- Remove metadata hash dependency
+- Remove params dependency on a hash in favour of PORO
+- Remove batch metadata dependency on a hash
+- Remove MultiJson in favour of JSON in the default deserializer
+- allow accessing all the metadata without accessing the payload
+- freeze params and underlying elements except for the mutable payload
+- provide access to raw payload after serialization
+- fixes a bug where a non-deserializable (error) params would be marked as deserialized after first unsuccessful deserialization attempt
+- fixes bug where karafka would mutate internal ruby-kafka state
+- fixes bug where topic name in metadata would not be mapped using topic mappers
+- simplifies the params and params batch API, before `#payload` usage, it won't be deserialized
+- removes the `#[]` API from params to prevent from accessing raw data in a different way than #raw_payload
+- makes the params batch operations consistent as params payload is deserialized only when accessed explicitly
+
+## 1.3.7 (2020-08-11)
+- #599 - Allow metadata access without deserialization attempt (rabotyaga)
+- Sync with ruby-kafka `1.2.0` api
+
+## 1.3.6 (2020-04-24)
+- #583 - Use Karafka.logger for CLI messages (prikha)
+- #582 - Cannot only define seed brokers in consumer groups
+
+## 1.3.5 (2020-04-02)
+- #578 - ThreadError: can't be called from trap context patch
+
+## 1.3.4 (2020-02-17)
+- `dry-configurable` upgrade (solnic)
+- Remove temporary `thor` patches that are no longer needed
+
+## 1.3.3 (2019-12-23)
+- Require `delegate` to fix missing dependency in `ruby-kafka`
+
+## 1.3.2 (2019-12-23)
+- #561 - Allow `thor` 1.0.x usage in Karafka
+- #567 - Ruby 2.7.0 support + unfreeze of a frozen string fix
+
+## 1.3.1 (2019-11-11)
+- #545 - Makes sure the log directory exists when is possible (robertomiranda)
+- Ruby 2.6.5 support
+- #551 - add support for DSA keys
+- #549 - Missing directories after `karafka install` (nijikon)
+
+## 1.3.0 (2019-09-09)
+- Drop support for Ruby 2.4
+- YARD docs tags cleanup
+
+## 1.3.0.rc1 (2019-07-31)
+- Drop support for Kafka 0.10 in favor of native support for Kafka 0.11.
+- Update ruby-kafka to the 0.7 version
+- Support messages headers receiving
+- Message bus unification
+- Parser available in metadata
+- Cleanup towards moving to a non-global state app management
+- Drop Ruby 2.3 support
+- Support for Ruby 2.6.3
+- `Karafka::Loader` has been removed in favor of Zeitwerk
+- Schemas are now contracts
+- #393 - Reorganize responders - removed `multiple_usage` constrain
+- #388 - ssl_client_cert_chain sync
+- #300 - Store value in a value key and replace its content with parsed version - without root merge
+- #331 - Disallow building groups without topics
+- #340 - Instrumentation unification. Better and more consistent naming
+- #340 - Procline instrumentation for a nicer process name
+- #342 - Change default for `fetcher_max_queue_size` from `100` to `10` to lower max memory usage
+- #345 - Cleanup exceptions names
+- #341 - Split connection delegator into batch delegator and single_delegator
+- #351 - Rename `#retrieve!` to `#parse!` on params and `#parsed` to `parse!` on params batch.
+- #351 - Adds '#first' for params_batch that returns parsed first element from the params_batch object.
+- #360 - Single params consuming mode automatically parses data specs
+- #359 - Divide mark_as_consumed into mark_as_consumed and mark_as_consumed!
+- #356 - Provide a `#values` for params_batch to extract only values of objects from the params_batch
+- #363 - Too shallow ruby-kafka version lock
+- #354 - Expose consumer heartbeat
+- #377 - Remove the persistent setup in favor of persistence
+- #375 - Sidekiq Backend parser mismatch
+- #369 - Single consumer can support more than one topic
+- #288 - Drop dependency on `activesupport` gem
+- #371 - SASL over SSL
+- #392 - Move params redundant data to metadata
+- #335 - Metadata access from within the consumer
+- #402 - Delayed reconnection upon critical failures
+- #405 - `reconnect_timeout` value is now being validated
+- #437 - Specs ensuring that the `#437` won't occur in the `1.3` release
+- #426 - ssl client cert key password
+- #444 - add certificate and private key validation
+- #460 - Decouple responder "parser" (generator?) from topic.parser (benissimo)
+- #463 - Split parsers into serializers / deserializers
+- #473 - Support SASL OAuthBearer Authentication
+- #475 - Disallow subscribing to the same topic with multiple consumers
+- #485 - Setting shutdown_timeout to nil kills the app without waiting for anything
+- #487 - Make listeners as instances
+- #29 - Consumer class names must have the word "Consumer" in it in order to work (Sidekiq backend)
+- #491 - irb is missing for console to work
+- #502 - Karafka process hangs when sending multiple sigkills
+- #506 - ssl_verify_hostname sync
+- #483 - Upgrade dry-validation before releasing 1.3
+- #492 - Use Zeitwerk for code reload in development
+- #508 - Reset the consumers instances upon reconnecting to a cluster
+- [#530](https://github.com/karafka/karafka/pull/530) - expose ruby and ruby-kafka version
+- [534](https://github.com/karafka/karafka/pull/534) - Allow to use headers in the deserializer object
+- [#319](https://github.com/karafka/karafka/pull/328) - Support for exponential backoff in pause
+
+## 1.2.11
+- [#470](https://github.com/karafka/karafka/issues/470) Karafka not working with dry-configurable 0.8
+
+## 1.2.10
+- [#453](https://github.com/karafka/karafka/pull/453) require `Forwardable` module
+
+## 1.2.9
+- Critical exceptions now will cause consumer to stop instead of retrying without a break
+- #412 - Fix dry-inflector dependency lock in gemspec
+- #414 - Backport to 1.2 the delayed retry upon failure
+- #437 - Raw message is no longer added to params after ParserError raised
+
+## 1.2.8
+- #408 - Responder Topic Lookup Bug on Heroku
+
+## 1.2.7
+- Unlock Ruby-kafka version with a warning
+
+## 1.2.6
+- Lock WaterDrop to 1.2.3
+- Lock Ruby-Kafka to 0.6.x (support for 0.7 will be added in Karafka 1.3)
+- #382 - Full logging with AR, etc for development mode when there is Rails integration
+
+## 1.2.5
+- #354 - Expose consumer heartbeat
+- #373 - Async producer not working properly with responders
+
+## 1.2.4
+- #332 - Fetcher for max queue size
+
+## 1.2.3
+- #313 - support PLAINTEXT and SSL for scheme
+- #288 - drop activesupport callbacks in favor of notifications
+- #320 - Pausing indefinitely with nil pause timeout doesn't work
+- #318 - Partition pausing doesn't work with custom topic mappers
+- Rename ConfigAdapter to ApiAdapter to better reflect what it does
+- #317 - Manual offset committing doesn't work with custom topic mappers
+
+## 1.2.2
+- #312 - Broken for ActiveSupport 5.2.0
+
+## 1.2.1
+- #304 - Unification of error instrumentation event details
+- #306 - Using file logger from within a trap context upon shutdown is impossible
+
+## 1.2.0
+- Spec improvements
+- #260 - Specs missing randomization
+- #251 - Shutdown upon non responding (unreachable) cluster is not possible
+- #258 - Investigate lowering requirements on activesupport
+- #246 - Alias consumer#mark_as_consumed on controller
+- #259 - Allow forcing key/partition key on responders
+- #267 - Styling inconsistency
+- #242 - Support setting the max bytes to fetch per request
+- #247 - Support SCRAM once released
+- #271 - Provide an after_init option to pass a configuration block
+- #262 - Error in the monitor code for NewRelic
+- #241 - Performance metrics
+- #274 - Rename controllers to consumers
+- #184 - Seek to
+- #284 - Dynamic Params parent class
+- #275 - ssl_ca_certs_from_system
+- #296 - Instrument forceful exit with an error
+- Replaced some of the activesupport parts with dry-inflector
+- Lower ActiveSupport dependency
+- Remove configurators in favor of the after_init block configurator
+- Ruby 2.5.0 support
+- Renamed Karafka::Connection::Processor to Karafka::Connection::Delegator to match incoming naming conventions
+- Renamed Karafka::Connection::Consumer to Karafka::Connection::Client due to #274
+- Removed HashWithIndifferentAccess in favor of a regular hash
+- JSON parsing defaults now to string keys
+- Lower memory usage due to less params data internal details
+- Support multiple ```after_init``` blocks in favor of a single one
+- Renamed ```received_at``` to ```receive_time``` to follow ruby-kafka and WaterDrop conventions
+- Adjust internal setup to easier map Ruby-Kafka config changes
+- System callbacks reorganization
+- Added ```before_fetch_loop``` configuration block for early client usage (```#seek```, etc)
+- Renamed ```after_fetched``` to ```after_fetch``` to normalize the naming convention
+- Instrumentation on a connection delegator level
+- Added ```params_batch#last``` method to retrieve last element after unparsing
+- All params keys are now strings
+
+## 1.1.2
+- #256 - Default kafka.seed_brokers configuration is created in invalid format
+
+## 1.1.1
+- #253 - Allow providing a global per app parser in config settings
+
+## 1.1.0
+- Gem bump
+- Switch from Celluloid to native Thread management
+- Improved shutdown process
+- Introduced optional fetch callbacks and moved current the ```after_received``` there as well
+- Karafka will raise Errors::InvalidPauseTimeout exception when trying to pause but timeout set to 0
+- Allow float for timeouts and other time based second settings
+- Renamed MessagesProcessor to Processor and MessagesConsumer to Consumer - we don't process and don't consumer anything else so it was pointless to keep this "namespace"
+- #232 - Remove unused ActiveSupport require
+- #214 - Expose consumer on a controller layer
+- #193 - Process shutdown callbacks
+- Fixed accessibility of ```#params_batch``` from the outside of the controller
+- connection_pool config options are no longer required
+- celluloid config options are no longer required
+- ```#perform``` is now renamed to ```#consume``` with warning level on using the old one (deprecated)
+- #235 - Rename perform to consume
+- Upgrade to ruby-kafka 0.5
+- Due to redesign of Waterdrop concurrency setting is no longer needed
+- #236 - Manual offset management
+- WaterDrop 1.0.0 support with async
+- Renamed ```batch_consuming``` option to ```batch_fetching``` as it is not a consumption (with processing) but a process of fetching messages from Kafka. The messages is considered consumed, when it is processed.
+- Renamed ```batch_processing``` to ```batch_consuming``` to resemble Kafka concept of consuming messages.
+- Renamed ```after_received``` to ```after_fetched``` to normalize the naming conventions.
+- Responders support the per topic ```async``` option.
+
 ## 1.0.1
 - #210 - LoadError: cannot load such file -- [...]/karafka.rb
 - Ruby 2.4.2 as a default (+travis integration)
@@ -48,7 +318,7 @@
 - Switch to multi json so everyone can use their favourite JSON parser
 - Added jruby support in general and in Travis
 - #196 - Topic mapper does not map topics when subscribing thanks to @webandtech
-- #96 - Karafka server -
+- #96 - Karafka server - possibility to run it only for a certain topics
 - ~~karafka worker cli option is removed (please use sidekiq directly)~~ - restored, bad idea
 - (optional) pausing upon processing failures ```pause_timeout```
 - Karafka console main process no longer intercepts irb errors
@@ -56,15 +326,15 @@
 - #204 - Long running controllers
 - Better internal API to handle multiple usage cases using ```Karafka::Controllers::Includer```
 - #207 - Rename before_enqueued to after_received
-- #147 -
+- #147 - De-attach Karafka from Sidekiq by extracting Sidekiq backend
 
 ### New features and improvements
 
-- batch processing thanks to ```#
+- batch processing thanks to ```#batch_consuming``` flag and ```#params_batch``` on controllers
 - ```#topic``` method on an controller instance to make a clear distinction in between params and route details
 - Changed routing model (still compatible with 0.5) to allow better resources management
 - Lower memory requirements due to object creation limitation (2-3 times less objects on each new message)
-- Introduced the ```#
+- Introduced the ```#batch_consuming``` config flag (config for #126) that can be set per each consumer_group
 - Added support for partition, offset and partition key in the params hash
 - ```name``` option in config renamed to ```client_id```
 - Long running controllers with ```persistent``` flag on a topic config level, to make controller instances persistent between messages batches (single controller instance per topic per partition no per messages batch) - turned on by default
@@ -77,7 +347,7 @@
 - ```start_from_beginning``` moved into kafka scope (```kafka.start_from_beginning```)
 - Router no longer checks for route uniqueness - now you can define same routes for multiple kafkas and do a lot of crazy stuff, so it's your responsibility to check uniqueness
 - Change in the way we identify topics in between Karafka and Sidekiq workers. If you upgrade, please make sure, all the jobs scheduled in Sidekiq are finished before the upgrade.
-- ```batch_mode``` renamed to ```
+- ```batch_mode``` renamed to ```batch_fetching```
 - Renamed content to value to better resemble ruby-kafka internal messages naming convention
 - When having a responder with ```required``` topics and not using ```#respond_with``` at all, it will raise an exception
 - Renamed ```inline_mode``` to ```inline_processing``` to resemble other settings conventions
@@ -141,7 +411,7 @@
 - Waterdrop 0.3.2.1 with kafka.hosts instead of kafka_hosts
 - #105 - Karafka::Monitor#caller_label not working with inherited monitors
 - #99 - Standalone mode (without Sidekiq)
-- #97 - Buffer responders single topics before send (
+- #97 - Buffer responders single topics before send (pre-validation)
 - Better control over consumer thanks to additional config options
 - #111 - Dynamic worker assignment based on the income params
 - Long shutdown time fix
@@ -149,7 +419,7 @@
 ## 0.5.0
 - Removed Zookeeper totally as dependency
 - Better group and partition rebalancing
-- Automatic thread management (no need for
+- Automatic thread management (no need for tuning) - each topic is a separate actor/thread
 - Moved from Poseidon into Ruby-Kafka
 - No more max_concurrency setting
 - After you define your App class and routes (and everything else) you need to add execute App.boot!
@@ -165,14 +435,14 @@
 - Ruby 2.2.* support dropped
 - Using App name as a Kafka client_id
 - Automatic Capistrano integration
-- Responders support for handling better responses
+- Responders support for handling better responses pipe-lining and better responses flow description and design (see README for more details)
 - Gem bump
 - Readme updates
 - karafka flow CLI command for printing the application flow
-- Some internal
+- Some internal refactoring
 
 ## 0.4.2
-- #87 -
+- #87 - Re-consume mode with crone for better Rails/Rack integration
 - Moved Karafka server related stuff into separate Karafka::Server class
 - Renamed Karafka::Runner into Karafka::Fetcher
 - Gem bump
@@ -184,7 +454,7 @@
 
 ## 0.4.1
 - Explicit throw(:abort) required to halt before_enqueue (like in Rails 5)
-- #61 -
+- #61 - autodiscovery of Kafka brokers based on Zookeeper data
 - #63 - Graceful shutdown with current offset state during data processing
 - #65 - Example of NewRelic monitor is outdated
 - #71 - Setup should be executed after user code is loaded
@@ -240,7 +510,7 @@
 - Added Karafka::Monitoring that allows to add custom logging and monitoring with external libraries and systems
 - Moved logging functionality into Karafka::Monitoring default monitoring
 - Added possibility to provide own monitoring as long as in responds to #notice and #notice_error
--
+- Standardized logging format for all logs
 
 ## 0.3.0
 - Switched from custom ParserError for each parser to general catching of Karafka::Errors::ParseError and its descendants
@@ -257,7 +527,7 @@
 
 ## 0.1.19
 - Internal call - schedule naming change
-- Enqueue to perform_async naming in controller to follow
+- Enqueue to perform_async naming in controller to follow Sidekiq naming convention
 - Gem bump
 
 ## 0.1.18
@@ -268,7 +538,7 @@
 - Changed Karafka::Connection::Cluster tp Karafka::Connection::ActorCluster to distinguish between a single thread actor cluster for multiple topic connection and a future feature that will allow process clusterization.
 - Add an ability to use user-defined parsers for a messages
 - Lazy load params for before callbacks
-- Automatic loading/
+- Automatic loading/initializing all workers classes during startup (so Sidekiq won't fail with unknown workers exception)
 - Params are now private to controller
 - Added bootstrap method to app.rb
 
@@ -309,7 +579,7 @@
 - Added worker logger
 
 ## 0.1.8
--
+- Dropped local env support in favour of [Envlogic](https://github.com/karafka/envlogic) - no changes in API
 
 ## 0.1.7
 - Karafka option for Redis hosts (not localhost only)
@@ -339,7 +609,7 @@
 
 ## 0.1.1
 - README updates
--
+- Rake tasks updates
 - Rake installation task
 - Changelog file added
 
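The changelog entries above describe a duck-typed monitoring contract: any object that responds to `#notice` and `#notice_error` can stand in for the default `Karafka::Monitoring`. A minimal sketch of such a monitor (the class name and the log format are illustrative, not taken from the changelog):

```ruby
# Hypothetical custom monitor satisfying the duck-typed contract described
# in the changelog: it only needs to respond to #notice and #notice_error.
class StdoutMonitor
  # caller_class - the class that emitted the event
  # options      - an event payload hash
  def notice(caller_class, options = {})
    puts "[#{caller_class}] #{options.inspect}"
  end

  # caller_class - the class that raised the error
  # error        - the exception instance
  def notice_error(caller_class, error)
    puts "[#{caller_class}] ERROR: #{error.message}"
  end
end

monitor = StdoutMonitor.new
monitor.notice(Object, event: :fetch)
monitor.notice_error(Object, StandardError.new("boom"))
```

How the monitor is wired in depends on the Karafka version in use (in the 0.x/1.x series it was assigned through the app configuration); consult the documentation for the exact setting name.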
data/CODE_OF_CONDUCT.md
CHANGED
@@ -34,7 +34,7 @@ This Code of Conduct applies both within project spaces and in public spaces whe
 
 ## Enforcement
 
-Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at maciej@
+Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at maciej@mensfeld.pl. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
 
 Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
 
data/CONTRIBUTING.md
CHANGED
@@ -4,12 +4,11 @@
 
 First, thank you for considering contributing to karafka! It's people like you that make the open source community such a great community! 😊
 
-We welcome any type of contribution, not only code. You can help with
+We welcome any type of contribution, not only code. You can help with:
 - **QA**: file bug reports, the more details you can give the better (e.g. screenshots with the console open)
 - **Marketing**: writing blog posts, howto's, printing stickers, ...
 - **Community**: presenting the project at meetups, organizing a dedicated meetup for the local community, ...
 - **Code**: take a look at the [open issues](issues). Even if you can't write code, commenting on them, showing that you care about a given issue matters. It helps us triage them.
-- **Money**: we welcome financial contributions in full transparency on our [open collective](https://opencollective.com/karafka).
 
 ## Your First Contribution
 
@@ -21,13 +20,13 @@ Any code change should be submitted as a pull request. The description should ex
 
 ## Code review process
 
-
-It is also always helpful to have some context for your pull request. What was the purpose? Why does it matter to you?
+Each pull request must pass all the rspec specs and meet our quality requirements.
 
-
+To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
 
-
-
+Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
+
+[![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/commit_builds)
 
 ## Questions
 