karafka 1.2.2 → 1.4.0.rc1
- checksums.yaml +4 -4
- checksums.yaml.gz.sig +2 -0
- data.tar.gz.sig +0 -0
- data/.coditsu/ci.yml +3 -0
- data/.console_irbrc +1 -3
- data/.diffend.yml +3 -0
- data/.github/FUNDING.yml +3 -0
- data/.github/ISSUE_TEMPLATE/bug_report.md +50 -0
- data/.github/ISSUE_TEMPLATE/feature_request.md +20 -0
- data/.github/workflows/ci.yml +52 -0
- data/.gitignore +1 -0
- data/.ruby-version +1 -1
- data/CHANGELOG.md +157 -13
- data/CODE_OF_CONDUCT.md +1 -1
- data/CONTRIBUTING.md +1 -1
- data/Gemfile +5 -2
- data/Gemfile.lock +95 -79
- data/README.md +15 -3
- data/bin/karafka +1 -1
- data/certs/mensfeld.pem +25 -0
- data/config/errors.yml +38 -5
- data/docker-compose.yml +17 -0
- data/karafka.gemspec +19 -13
- data/lib/karafka.rb +10 -16
- data/lib/karafka/app.rb +14 -6
- data/lib/karafka/attributes_map.rb +13 -18
- data/lib/karafka/base_consumer.rb +19 -30
- data/lib/karafka/base_responder.rb +51 -29
- data/lib/karafka/cli.rb +2 -2
- data/lib/karafka/cli/console.rb +11 -9
- data/lib/karafka/cli/flow.rb +9 -7
- data/lib/karafka/cli/info.rb +4 -2
- data/lib/karafka/cli/install.rb +30 -6
- data/lib/karafka/cli/server.rb +11 -6
- data/lib/karafka/code_reloader.rb +67 -0
- data/lib/karafka/connection/{config_adapter.rb → api_adapter.rb} +62 -21
- data/lib/karafka/connection/batch_delegator.rb +55 -0
- data/lib/karafka/connection/builder.rb +18 -0
- data/lib/karafka/connection/client.rb +40 -40
- data/lib/karafka/connection/listener.rb +26 -15
- data/lib/karafka/connection/message_delegator.rb +36 -0
- data/lib/karafka/consumers/batch_metadata.rb +10 -0
- data/lib/karafka/consumers/callbacks.rb +32 -15
- data/lib/karafka/consumers/includer.rb +31 -18
- data/lib/karafka/consumers/responders.rb +2 -2
- data/lib/karafka/contracts.rb +10 -0
- data/lib/karafka/contracts/config.rb +21 -0
- data/lib/karafka/contracts/consumer_group.rb +206 -0
- data/lib/karafka/contracts/consumer_group_topic.rb +19 -0
- data/lib/karafka/contracts/responder_usage.rb +54 -0
- data/lib/karafka/contracts/server_cli_options.rb +31 -0
- data/lib/karafka/errors.rb +17 -19
- data/lib/karafka/fetcher.rb +28 -30
- data/lib/karafka/helpers/class_matcher.rb +12 -2
- data/lib/karafka/helpers/config_retriever.rb +1 -1
- data/lib/karafka/helpers/inflector.rb +26 -0
- data/lib/karafka/helpers/multi_delegator.rb +0 -1
- data/lib/karafka/instrumentation/logger.rb +9 -6
- data/lib/karafka/instrumentation/monitor.rb +15 -9
- data/lib/karafka/instrumentation/proctitle_listener.rb +36 -0
- data/lib/karafka/instrumentation/stdout_listener.rb +140 -0
- data/lib/karafka/params/batch_metadata.rb +26 -0
- data/lib/karafka/params/builders/batch_metadata.rb +30 -0
- data/lib/karafka/params/builders/params.rb +38 -0
- data/lib/karafka/params/builders/params_batch.rb +25 -0
- data/lib/karafka/params/metadata.rb +20 -0
- data/lib/karafka/params/params.rb +50 -0
- data/lib/karafka/params/params_batch.rb +35 -21
- data/lib/karafka/patches/ruby_kafka.rb +21 -8
- data/lib/karafka/persistence/client.rb +15 -11
- data/lib/karafka/persistence/{consumer.rb → consumers.rb} +20 -13
- data/lib/karafka/persistence/topics.rb +48 -0
- data/lib/karafka/process.rb +0 -4
- data/lib/karafka/responders/builder.rb +1 -1
- data/lib/karafka/responders/topic.rb +6 -8
- data/lib/karafka/routing/builder.rb +36 -8
- data/lib/karafka/routing/consumer_group.rb +1 -1
- data/lib/karafka/routing/consumer_mapper.rb +9 -9
- data/lib/karafka/routing/proxy.rb +10 -1
- data/lib/karafka/routing/topic.rb +5 -3
- data/lib/karafka/routing/topic_mapper.rb +16 -18
- data/lib/karafka/serialization/json/deserializer.rb +27 -0
- data/lib/karafka/serialization/json/serializer.rb +31 -0
- data/lib/karafka/server.rb +34 -49
- data/lib/karafka/setup/config.rb +74 -40
- data/lib/karafka/setup/configurators/water_drop.rb +7 -3
- data/lib/karafka/setup/dsl.rb +0 -1
- data/lib/karafka/status.rb +7 -3
- data/lib/karafka/templates/{application_consumer.rb.example → application_consumer.rb.erb} +2 -1
- data/lib/karafka/templates/{application_responder.rb.example → application_responder.rb.erb} +0 -0
- data/lib/karafka/templates/karafka.rb.erb +92 -0
- data/lib/karafka/version.rb +1 -1
- metadata +97 -73
- metadata.gz.sig +4 -0
- data/.travis.yml +0 -13
- data/lib/karafka/callbacks.rb +0 -30
- data/lib/karafka/callbacks/config.rb +0 -22
- data/lib/karafka/callbacks/dsl.rb +0 -16
- data/lib/karafka/connection/delegator.rb +0 -46
- data/lib/karafka/instrumentation/listener.rb +0 -112
- data/lib/karafka/loader.rb +0 -28
- data/lib/karafka/params/dsl.rb +0 -156
- data/lib/karafka/parsers/json.rb +0 -38
- data/lib/karafka/patches/dry_configurable.rb +0 -35
- data/lib/karafka/persistence/topic.rb +0 -29
- data/lib/karafka/schemas/config.rb +0 -24
- data/lib/karafka/schemas/consumer_group.rb +0 -77
- data/lib/karafka/schemas/consumer_group_topic.rb +0 -18
- data/lib/karafka/schemas/responder_usage.rb +0 -39
- data/lib/karafka/schemas/server_cli_options.rb +0 -43
- data/lib/karafka/setup/configurators/base.rb +0 -29
- data/lib/karafka/setup/configurators/params.rb +0 -25
- data/lib/karafka/templates/karafka.rb.example +0 -54
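The listing above shows parsers being split into separate serializers and deserializers (`data/lib/karafka/serialization/json/deserializer.rb` and `serializer.rb`). A minimal sketch of what such a JSON deserializer might look like follows; the module nesting and the `#call` signature are assumptions inferred from the file paths, not the gem's exact API:

```ruby
require 'json'

# Sketch of a Karafka-1.3+-style JSON deserializer. The class layout and
# the #call(params) contract approximate the shape suggested by
# data/lib/karafka/serialization/json/deserializer.rb; treat them as
# assumptions rather than the gem's verbatim code.
module Serialization
  module Json
    class Deserializer
      # @param params [#raw_payload] object carrying the raw Kafka message payload
      # @return [Hash] deserialized payload
      def call(params)
        ::JSON.parse(params.raw_payload)
      end
    end
  end
end

# Hypothetical stand-in for a params object, for demonstration only
Message = Struct.new(:raw_payload)
Serialization::Json::Deserializer.new.call(Message.new('{"id":1}'))
```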
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: e1bc3a30364a87750723e5c338cdddffb6c6c4c2965e12a7379fbe96a6e7756c
+  data.tar.gz: 073fe6b2ffe1d7cbd53670330cd14780b48dda4f364a4ba160a42c13a1914c2d
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: d7bbc41af65bf2590d7e045e552b10121f202b7a031681c7ac5d20148e8c6c053eced4aa98ece0661a1ead6df1c163b8d03e06b19ca5f83346c4575ec10489c7
+  data.tar.gz: 522f1a73478633269333f04ebdfffb988e26a405b659e44bf44da5757b105dd24bc16f3214415d4258c325c1c4fa899f54ef322a335b31b859dda131a70e2021
checksums.yaml.gz.sig
ADDED
data.tar.gz.sig
ADDED
Binary file
data/.coditsu/ci.yml
ADDED
data/.console_irbrc
CHANGED
@@ -1,11 +1,9 @@
 # irbrc for Karafka console
-require 'karafka'
-require Karafka.boot_file

 IRB.conf[:AUTO_INDENT] = true
 IRB.conf[:SAVE_HISTORY] = 1000
 IRB.conf[:USE_READLINE] = true
-IRB.conf[:HISTORY_FILE] = "
+IRB.conf[:HISTORY_FILE] = ".irb-history"
 IRB.conf[:LOAD_MODULES] = [] unless IRB.conf.key?(:LOAD_MODULES)

 unless IRB.conf[:LOAD_MODULES].include?('irb/completion')
data/.diffend.yml
ADDED
data/.github/FUNDING.yml
ADDED
data/.github/ISSUE_TEMPLATE/bug_report.md
ADDED
@@ -0,0 +1,50 @@
+---
+name: Bug Report
+about: Report an issue with Karafka you've discovered.
+---
+
+*Be clear, concise and precise in your description of the problem.
+Open an issue with a descriptive title and a summary in grammatically correct,
+complete sentences.*
+
+*Use the template below when reporting bugs. Please, make sure that
+you're running the latest stable Karafka and that the problem you're reporting
+hasn't been reported (and potentially fixed) already.*
+
+*Before filing the ticket you should replace all text above the horizontal
+rule with your own words.*
+
+--------
+
+## Expected behavior
+
+Describe here how you expected Karafka to behave in this particular situation.
+
+## Actual behavior
+
+Describe here what actually happened.
+
+## Steps to reproduce the problem
+
+This is extremely important! Providing us with a reliable way to reproduce
+a problem will expedite its solution.
+
+## Your setup details
+
+Please provide kafka version and the output of `karafka info` or `bundle exec karafka info` if using Bundler.
+
+Here's an example:
+
+```
+$ [bundle exec] karafka info
+Karafka version: 1.3.0
+Ruby version: 2.6.3
+Ruby-kafka version: 0.7.9
+Application client id: karafka-local
+Backend: inline
+Batch fetching: true
+Batch consuming: true
+Boot file: /app/karafka/karafka.rb
+Environment: development
+Kafka seed brokers: ["kafka://kafka:9092"]
+```
data/.github/ISSUE_TEMPLATE/feature_request.md
ADDED
@@ -0,0 +1,20 @@
+---
+name: Feature Request
+about: Suggest new Karafka features or improvements to existing features.
+---
+
+## Is your feature request related to a problem? Please describe.
+
+A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
+
+## Describe the solution you'd like
+
+A clear and concise description of what you want to happen.
+
+## Describe alternatives you've considered
+
+A clear and concise description of any alternative solutions or features you've considered.
+
+## Additional context
+
+Add any other context or screenshots about the feature request here.
data/.github/workflows/ci.yml
ADDED
@@ -0,0 +1,52 @@
+name: ci
+
+on:
+  push:
+  schedule:
+    - cron: '0 1 * * *'
+
+jobs:
+  specs:
+    runs-on: ubuntu-latest
+    strategy:
+      fail-fast: false
+      matrix:
+        ruby:
+          - '2.7'
+          - '2.6'
+          - '2.5'
+        include:
+          - ruby: '2.7'
+            coverage: 'true'
+    steps:
+      - uses: actions/checkout@v2
+      - name: Install package dependencies
+        run: "[ -e $APT_DEPS ] || sudo apt-get install -y --no-install-recommends $APT_DEPS"
+      - name: Set up Ruby
+        uses: ruby/setup-ruby@v1
+        with:
+          ruby-version: ${{matrix.ruby}}
+      - name: Install latest bundler
+        run: |
+          gem install bundler --no-document
+          bundle config set without 'tools benchmarks docs'
+      - name: Bundle install
+        run: |
+          bundle config set without development
+          bundle install --jobs 4 --retry 3
+      - name: Run Kafka with docker-compose
+        run: docker-compose up -d
+      - name: Run all tests
+        env:
+          GITHUB_COVERAGE: ${{matrix.coverage}}
+        run: bundle exec rspec
+  coditsu:
+    runs-on: ubuntu-latest
+    strategy:
+      fail-fast: false
+    steps:
+      - uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+      - name: Run Coditsu
+        run: \curl -sSL https://api.coditsu.io/run/ci | bash
data/.gitignore
CHANGED
data/.ruby-version
CHANGED
@@ -1 +1 @@
-2.
+2.7.1
data/CHANGELOG.md
CHANGED
@@ -1,5 +1,149 @@
 # Karafka framework changelog

+## 1.4.0.rc1 (2020-08-25)
+- Rename `Karafka::Params::Metadata` to `Karafka::Params::BatchMetadata`
+- Rename consumer `#metadata` to `#batch_metadata`
+- Separate metadata (including Karafka native metadata) from the root of params
+- Remove metadata hash dependency
+- Remove params dependency on a hash in favour of PORO
+- Remove batch metadata dependency on a hash
+- Remove MultiJson in favour of JSON in the default deserializer
+- Allow accessing all the metadata without accessing the payload
+- Freeze params and underlying elements except for the mutable payload
+- Provide access to raw payload after serialization
+- Fix a bug where non-deserializable (error) params would be marked as deserialized after the first unsuccessful deserialization attempt
+- Fix a bug where Karafka would mutate internal ruby-kafka state
+- Fix a bug where the topic name in metadata would not be mapped using topic mappers
+- Simplify the params and params batch API; before `#payload` usage, the payload won't be deserialized
+- Remove the `#[]` API from params to prevent accessing raw data in a different way than `#raw_payload`
+- Make the params batch operations consistent, as the params payload is deserialized only when accessed explicitly
+
+## 1.3.7 (2020-08-11)
+- #599 - Allow metadata access without deserialization attempt (rabotyaga)
+- Sync with ruby-kafka `1.2.0` api
+
+## 1.3.6 (2020-04-24)
+- #583 - Use Karafka.logger for CLI messages (prikha)
+- #582 - Cannot only define seed brokers in consumer groups
+
+## 1.3.5 (2020-04-02)
+- #578 - ThreadError: can't be called from trap context patch
+
+## 1.3.4 (2020-02-17)
+- `dry-configurable` upgrade (solnic)
+- Remove temporary `thor` patches that are no longer needed
+
+## 1.3.3 (2019-12-23)
+- Require `delegate` to fix missing dependency in `ruby-kafka`
+
+## 1.3.2 (2019-12-23)
+- #561 - Allow `thor` 1.0.x usage in Karafka
+- #567 - Ruby 2.7.0 support + unfreeze of a frozen string fix
+
+## 1.3.1 (2019-11-11)
+- #545 - Makes sure the log directory exists when is possible (robertomiranda)
+- Ruby 2.6.5 support
+- #551 - add support for DSA keys
+- #549 - Missing directories after `karafka install` (nijikon)
+
+## 1.3.0 (2019-09-09)
+- Drop support for Ruby 2.4
+- YARD docs tags cleanup
+
+## 1.3.0.rc1 (2019-07-31)
+- Drop support for Kafka 0.10 in favor of native support for Kafka 0.11.
+- Update ruby-kafka to the 0.7 version
+- Support messages headers receiving
+- Message bus unification
+- Parser available in metadata
+- Cleanup towards moving to a non-global state app management
+- Drop Ruby 2.3 support
+- Support for Ruby 2.6.3
+- `Karafka::Loader` has been removed in favor of Zeitwerk
+- Schemas are now contracts
+- #393 - Reorganize responders - removed `multiple_usage` constrain
+- #388 - ssl_client_cert_chain sync
+- #300 - Store value in a value key and replace its content with parsed version - without root merge
+- #331 - Disallow building groups without topics
+- #340 - Instrumentation unification. Better and more consistent naming
+- #340 - Procline instrumentation for a nicer process name
+- #342 - Change default for `fetcher_max_queue_size` from `100` to `10` to lower max memory usage
+- #345 - Cleanup exceptions names
+- #341 - Split connection delegator into batch delegator and single_delegator
+- #351 - Rename `#retrieve!` to `#parse!` on params and `#parsed` to `parse!` on params batch.
+- #351 - Adds `#first` for params_batch that returns parsed first element from the params_batch object.
+- #360 - Single params consuming mode automatically parses data specs
+- #359 - Divide mark_as_consumed into mark_as_consumed and mark_as_consumed!
+- #356 - Provide a `#values` for params_batch to extract only values of objects from the params_batch
+- #363 - Too shallow ruby-kafka version lock
+- #354 - Expose consumer heartbeat
+- #377 - Remove the persistent setup in favor of persistence
+- #375 - Sidekiq Backend parser mismatch
+- #369 - Single consumer can support more than one topic
+- #288 - Drop dependency on `activesupport` gem
+- #371 - SASL over SSL
+- #392 - Move params redundant data to metadata
+- #335 - Metadata access from within the consumer
+- #402 - Delayed reconnection upon critical failures
+- #405 - `reconnect_timeout` value is now being validated
+- #437 - Specs ensuring that the `#437` won't occur in the `1.3` release
+- #426 - ssl client cert key password
+- #444 - add certificate and private key validation
+- #460 - Decouple responder "parser" (generator?) from topic.parser (benissimo)
+- #463 - Split parsers into serializers / deserializers
+- #473 - Support SASL OAuthBearer Authentication
+- #475 - Disallow subscribing to the same topic with multiple consumers
+- #485 - Setting shutdown_timeout to nil kills the app without waiting for anything
+- #487 - Make listeners as instances
+- #29 - Consumer class names must have the word "Consumer" in it in order to work (Sidekiq backend)
+- #491 - irb is missing for console to work
+- #502 - Karafka process hangs when sending multiple sigkills
+- #506 - ssl_verify_hostname sync
+- #483 - Upgrade dry-validation before releasing 1.3
+- #492 - Use Zeitwerk for code reload in development
+- #508 - Reset the consumers instances upon reconnecting to a cluster
+- [#530](https://github.com/karafka/karafka/pull/530) - expose ruby and ruby-kafka version
+- [#534](https://github.com/karafka/karafka/pull/534) - Allow to use headers in the deserializer object
+- [#319](https://github.com/karafka/karafka/pull/328) - Support for exponential backoff in pause
+
+## 1.2.11
+- [#470](https://github.com/karafka/karafka/issues/470) Karafka not working with dry-configurable 0.8
+
+## 1.2.10
+- [#453](https://github.com/karafka/karafka/pull/453) require `Forwardable` module
+
+## 1.2.9
+- Critical exceptions now will cause consumer to stop instead of retrying without a break
+- #412 - Fix dry-inflector dependency lock in gemspec
+- #414 - Backport to 1.2 the delayed retry upon failure
+- #437 - Raw message is no longer added to params after ParserError raised
+
+## 1.2.8
+- #408 - Responder Topic Lookup Bug on Heroku
+
+## 1.2.7
+- Unlock Ruby-kafka version with a warning
+
+## 1.2.6
+- Lock WaterDrop to 1.2.3
+- Lock Ruby-Kafka to 0.6.x (support for 0.7 will be added in Karafka 1.3)
+- #382 - Full logging with AR, etc for development mode when there is Rails integration
+
+## 1.2.5
+- #354 - Expose consumer heartbeat
+- #373 - Async producer not working properly with responders
+
+## 1.2.4
+- #332 - Fetcher for max queue size
+
+## 1.2.3
+- #313 - support PLAINTEXT and SSL for scheme
+- #288 - drop activesupport callbacks in favor of notifications
+- #320 - Pausing indefinitely with nil pause timeout doesn't work
+- #318 - Partition pausing doesn't work with custom topic mappers
+- Rename ConfigAdapter to ApiAdapter to better reflect what it does
+- #317 - Manual offset committing doesn't work with custom topic mappers
+
 ## 1.2.2
 - #312 - Broken for ActiveSupport 5.2.0

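The 1.4.0.rc1 entries above describe metadata being readable without touching the payload, params becoming frozen POROs, and the payload being deserialized only on explicit `#payload` access. A simplified PORO illustrating that lazy-deserialization idea follows; the class and accessor names are a sketch of the concept, not Karafka's actual internals:

```ruby
require 'json'

# Illustrative params object: metadata (topic/partition/offset) is readable
# without deserialization, and JSON parsing happens only on the first
# explicit #payload call, after which the result is memoized.
class Params
  attr_reader :topic, :partition, :offset, :raw_payload

  def initialize(topic:, partition:, offset:, raw_payload:)
    @topic = topic.freeze
    @partition = partition
    @offset = offset
    @raw_payload = raw_payload
    @deserialized = false
  end

  # True only after #payload has been called at least once
  def deserialized?
    @deserialized
  end

  # Lazily deserializes the raw payload and caches the result
  def payload
    return @payload if @deserialized

    @payload = JSON.parse(raw_payload)
    @deserialized = true
    @payload
  end
end

params = Params.new(topic: 'users', partition: 0, offset: 42,
                    raw_payload: '{"name":"jane"}')
params.topic          # metadata access without any deserialization
params.deserialized?  # false until #payload is accessed
params.payload        # triggers and caches JSON.parse
```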
@@ -123,7 +267,7 @@
 - Switch to multi json so everyone can use their favourite JSON parser
 - Added jruby support in general and in Travis
 - #196 - Topic mapper does not map topics when subscribing thanks to @webandtech
-- #96 - Karafka server -
+- #96 - Karafka server - possibility to run it only for a certain topics
 - ~~karafka worker cli option is removed (please use sidekiq directly)~~ - restored, bad idea
 - (optional) pausing upon processing failures ```pause_timeout```
 - Karafka console main process no longer intercepts irb errors
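Several changelog entries here and above concern topic mappers (#196, #317, #318). A topic mapper translates between Kafka topic names and the names used inside the application, e.g. stripping a hosting-provider prefix. A minimal sketch, with the two-method `incoming`/`outgoing` shape and the prefix value being illustrative assumptions:

```ruby
# Sketch of a custom topic mapper: strips a made-up 'myapp.' prefix on the
# way in (consuming) and re-adds it on the way out (producing).
class PrefixedTopicMapper
  PREFIX = 'myapp.'

  # Kafka topic name -> name used in the app's routing
  def incoming(topic)
    topic.to_s.sub(/\A#{Regexp.escape(PREFIX)}/, '')
  end

  # App-level topic name -> Kafka topic name used when producing
  def outgoing(topic)
    "#{PREFIX}#{topic}"
  end
end

mapper = PrefixedTopicMapper.new
mapper.incoming('myapp.users') # => "users"
mapper.outgoing('users')       # => "myapp.users"
```

The bugs referenced above were about Karafka internals forgetting to route topic names through such a mapper for pausing and offset management.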
@@ -131,7 +275,7 @@
 - #204 - Long running controllers
 - Better internal API to handle multiple usage cases using ```Karafka::Controllers::Includer```
 - #207 - Rename before_enqueued to after_received
-- #147 -
+- #147 - De-attach Karafka from Sidekiq by extracting Sidekiq backend

 ### New features and improvements

|
|
216
360
|
- Waterdrop 0.3.2.1 with kafka.hosts instead of kafka_hosts
|
217
361
|
- #105 - Karafka::Monitor#caller_label not working with inherited monitors
|
218
362
|
- #99 - Standalone mode (without Sidekiq)
|
219
|
-
- #97 - Buffer responders single topics before send (
|
363
|
+
- #97 - Buffer responders single topics before send (pre-validation)
|
220
364
|
- Better control over consumer thanks to additional config options
|
221
365
|
- #111 - Dynamic worker assignment based on the income params
|
222
366
|
- Long shutdown time fix
|
@@ -224,7 +368,7 @@
 ## 0.5.0
 - Removed Zookeeper totally as dependency
 - Better group and partition rebalancing
-- Automatic thread management (no need for
+- Automatic thread management (no need for tuning) - each topic is a separate actor/thread
 - Moved from Poseidon into Ruby-Kafka
 - No more max_concurrency setting
 - After you define your App class and routes (and everything else) you need to add execute App.boot!
@@ -240,14 +384,14 @@
 - Ruby 2.2.* support dropped
 - Using App name as a Kafka client_id
 - Automatic Capistrano integration
-- Responders support for handling better responses
+- Responders support for handling better responses pipe-lining and better responses flow description and design (see README for more details)
 - Gem bump
 - Readme updates
 - karafka flow CLI command for printing the application flow
-- Some internal
+- Some internal refactoring

 ## 0.4.2
-- #87 -
+- #87 - Re-consume mode with crone for better Rails/Rack integration
 - Moved Karafka server related stuff into separate Karafka::Server class
 - Renamed Karafka::Runner into Karafka::Fetcher
 - Gem bump
@@ -259,7 +403,7 @@

 ## 0.4.1
 - Explicit throw(:abort) required to halt before_enqueue (like in Rails 5)
-- #61 -
+- #61 - autodiscovery of Kafka brokers based on Zookeeper data
 - #63 - Graceful shutdown with current offset state during data processing
 - #65 - Example of NewRelic monitor is outdated
 - #71 - Setup should be executed after user code is loaded
@@ -315,7 +459,7 @@
 - Added Karafka::Monitoring that allows to add custom logging and monitoring with external libraries and systems
 - Moved logging functionality into Karafka::Monitoring default monitoring
 - Added possibility to provide own monitoring as long as in responds to #notice and #notice_error
--
+- Standardized logging format for all logs

 ## 0.3.0
 - Switched from custom ParserError for each parser to general catching of Karafka::Errors::ParseError and its descendants
@@ -332,7 +476,7 @@

 ## 0.1.19
 - Internal call - schedule naming change
-- Enqueue to perform_async naming in controller to follow
+- Enqueue to perform_async naming in controller to follow Sidekiq naming convention
 - Gem bump

 ## 0.1.18
@@ -343,7 +487,7 @@
 - Changed Karafka::Connection::Cluster tp Karafka::Connection::ActorCluster to distinguish between a single thread actor cluster for multiple topic connection and a future feature that will allow process clusterization.
 - Add an ability to use user-defined parsers for a messages
 - Lazy load params for before callbacks
-- Automatic loading/
+- Automatic loading/initializing all workers classes during startup (so Sidekiq won't fail with unknown workers exception)
 - Params are now private to controller
 - Added bootstrap method to app.rb
@@ -384,7 +528,7 @@
 - Added worker logger

 ## 0.1.8
--
+- Dropped local env support in favour of [Envlogic](https://github.com/karafka/envlogic) - no changes in API

 ## 0.1.7
 - Karafka option for Redis hosts (not localhost only)
@@ -414,7 +558,7 @@

 ## 0.1.1
 - README updates
--
+- Rake tasks updates
 - Rake installation task
 - Changelog file added

data/CODE_OF_CONDUCT.md
CHANGED
@@ -34,7 +34,7 @@ This Code of Conduct applies both within project spaces and in public spaces whe

 ## Enforcement

-Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at maciej@
+Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at maciej@mensfeld.pl. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

 Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
