karafka 1.0.0 → 1.2.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +5 -5
- data/.ruby-version +1 -1
- data/.travis.yml +3 -1
- data/CHANGELOG.md +90 -3
- data/CONTRIBUTING.md +5 -6
- data/Gemfile +1 -1
- data/Gemfile.lock +59 -64
- data/README.md +28 -57
- data/bin/karafka +13 -1
- data/config/errors.yml +6 -0
- data/karafka.gemspec +10 -9
- data/lib/karafka.rb +19 -10
- data/lib/karafka/app.rb +8 -15
- data/lib/karafka/attributes_map.rb +4 -4
- data/lib/karafka/backends/inline.rb +2 -3
- data/lib/karafka/base_consumer.rb +68 -0
- data/lib/karafka/base_responder.rb +41 -17
- data/lib/karafka/callbacks.rb +30 -0
- data/lib/karafka/callbacks/config.rb +22 -0
- data/lib/karafka/callbacks/dsl.rb +16 -0
- data/lib/karafka/cli/base.rb +2 -0
- data/lib/karafka/cli/flow.rb +1 -1
- data/lib/karafka/cli/info.rb +1 -2
- data/lib/karafka/cli/install.rb +2 -3
- data/lib/karafka/cli/server.rb +9 -12
- data/lib/karafka/connection/client.rb +117 -0
- data/lib/karafka/connection/config_adapter.rb +30 -14
- data/lib/karafka/connection/delegator.rb +46 -0
- data/lib/karafka/connection/listener.rb +22 -20
- data/lib/karafka/consumers/callbacks.rb +54 -0
- data/lib/karafka/consumers/includer.rb +51 -0
- data/lib/karafka/consumers/responders.rb +24 -0
- data/lib/karafka/{controllers → consumers}/single_params.rb +3 -3
- data/lib/karafka/errors.rb +19 -2
- data/lib/karafka/fetcher.rb +30 -28
- data/lib/karafka/helpers/class_matcher.rb +8 -8
- data/lib/karafka/helpers/config_retriever.rb +2 -2
- data/lib/karafka/instrumentation/listener.rb +112 -0
- data/lib/karafka/instrumentation/logger.rb +55 -0
- data/lib/karafka/instrumentation/monitor.rb +64 -0
- data/lib/karafka/loader.rb +0 -1
- data/lib/karafka/params/dsl.rb +156 -0
- data/lib/karafka/params/params_batch.rb +7 -2
- data/lib/karafka/patches/dry_configurable.rb +7 -7
- data/lib/karafka/patches/ruby_kafka.rb +34 -0
- data/lib/karafka/persistence/client.rb +25 -0
- data/lib/karafka/persistence/consumer.rb +38 -0
- data/lib/karafka/persistence/topic.rb +29 -0
- data/lib/karafka/process.rb +6 -5
- data/lib/karafka/responders/builder.rb +15 -14
- data/lib/karafka/responders/topic.rb +8 -1
- data/lib/karafka/routing/builder.rb +2 -2
- data/lib/karafka/routing/consumer_group.rb +1 -1
- data/lib/karafka/routing/consumer_mapper.rb +34 -0
- data/lib/karafka/routing/router.rb +1 -1
- data/lib/karafka/routing/topic.rb +5 -11
- data/lib/karafka/routing/{mapper.rb → topic_mapper.rb} +2 -2
- data/lib/karafka/schemas/config.rb +4 -5
- data/lib/karafka/schemas/consumer_group.rb +45 -24
- data/lib/karafka/schemas/consumer_group_topic.rb +18 -0
- data/lib/karafka/schemas/responder_usage.rb +1 -0
- data/lib/karafka/server.rb +39 -20
- data/lib/karafka/setup/config.rb +74 -51
- data/lib/karafka/setup/configurators/base.rb +6 -12
- data/lib/karafka/setup/configurators/params.rb +25 -0
- data/lib/karafka/setup/configurators/water_drop.rb +15 -14
- data/lib/karafka/setup/dsl.rb +22 -0
- data/lib/karafka/templates/{application_controller.rb.example → application_consumer.rb.example} +2 -3
- data/lib/karafka/templates/karafka.rb.example +18 -5
- data/lib/karafka/version.rb +1 -1
- metadata +87 -63
- data/.github/ISSUE_TEMPLATE.md +0 -2
- data/Rakefile +0 -7
- data/lib/karafka/base_controller.rb +0 -118
- data/lib/karafka/connection/messages_consumer.rb +0 -106
- data/lib/karafka/connection/messages_processor.rb +0 -59
- data/lib/karafka/controllers/includer.rb +0 -51
- data/lib/karafka/controllers/responders.rb +0 -19
- data/lib/karafka/logger.rb +0 -53
- data/lib/karafka/monitor.rb +0 -98
- data/lib/karafka/params/params.rb +0 -101
- data/lib/karafka/persistence.rb +0 -18
- data/lib/karafka/setup/configurators/celluloid.rb +0 -22
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
-metadata.gz:
-data.tar.gz:
+SHA256:
+  metadata.gz: 0bb0a1f72768ebf4bf720ebda57ebc26a0178275adabffd494b24e1612e9b38a
+  data.tar.gz: f586038a0498227e8a287cc173a23e77bb99b6cb8d453c39be55a220e2a0d361
 SHA512:
-metadata.gz:
-data.tar.gz:
+  metadata.gz: e2b862da6372bc91f76bc01c20eb21ccde9d61d67749ccce97eadd2f739484dbd68a149d08d6e275d94d95b6ce9610d86c2d01e85bb1841a80e048ed832810ac
+  data.tar.gz: 06fb89700e59f810ec984b54f0424cfdeb98c79e17b8629010f011e68edff5fa1e345576a7b28ed07aff440739ed11824016da6083601a92b8f83fcca68e576e
data/.ruby-version
CHANGED
@@ -1 +1 @@
-2.
+2.5.0
data/.travis.yml
CHANGED
data/CHANGELOG.md
CHANGED
@@ -1,5 +1,92 @@
 # Karafka framework changelog
 
+## 1.2.0
+- Spec improvements
+- #260 - Specs missing randomization
+- #251 - Shutdown upon non responding (unreachable) cluster is not possible
+- #258 - Investigate lowering requirements on activesupport
+- #246 - Alias consumer#mark_as_consumed on controller
+- #259 - Allow forcing key/partition key on responders
+- #267 - Styling inconsistency
+- #242 - Support setting the max bytes to fetch per request
+- #247 - Support SCRAM once released
+- #271 - Provide an after_init option to pass a configuration block
+- #262 - Error in the monitor code for NewRelic
+- #241 - Performance metrics
+- #274 - Rename controllers to consumers
+- #184 - Seek to
+- #284 - Dynamic Params parent class
+- #275 - ssl_ca_certs_from_system
+- #296 - Instrument forceful exit with an error
+- Replaced some of the activesupport parts with dry-inflector
+- Lower ActiveSupport dependency
+- Remove configurators in favor of the after_init block configurator
+- Ruby 2.5.0 support
+- Renamed Karafka::Connection::Processor to Karafka::Connection::Delegator to match incoming naming conventions
+- Renamed Karafka::Connection::Consumer to Karafka::Connection::Client due to #274
+- Removed HashWithIndifferentAccess in favor of a regular hash
+- JSON parsing defaults now to string keys
+- Lower memory usage due to less params data internal details
+- Support multiple ```after_init``` blocks in favor of a single one
+- Renamed ```received_at``` to ```receive_time``` to follow ruby-kafka and WaterDrop conventions
+- Adjust internal setup to easier map Ruby-Kafka config changes
+- System callbacks reorganization
+- Added ```before_fetch_loop``` configuration block for early client usage (```#seek```, etc)
+- Renamed ```after_fetched``` to ```after_fetch``` to normalize the naming convention
+- Instrumentation on a connection delegator level
+- Added ```params_batch#last``` method to retrieve last element after unparsing
+- All params keys are now strings
+
+## 1.1.2
+- #256 - Default kafka.seed_brokers configuration is created in invalid format
+
+## 1.1.1
+- #253 - Allow providing a global per app parser in config settings
+
+## 1.1.0
+- Gem bump
+- Switch from Celluloid to native Thread management
+- Improved shutdown process
+- Introduced optional fetch callbacks and moved current the ```after_received``` there as well
+- Karafka will raise Errors::InvalidPauseTimeout exception when trying to pause but timeout set to 0
+- Allow float for timeouts and other time based second settings
+- Renamed MessagesProcessor to Processor and MessagesConsumer to Consumer - we don't process and don't consumer anything else so it was pointless to keep this "namespace"
+- #232 - Remove unused ActiveSupport require
+- #214 - Expose consumer on a controller layer
+- #193 - Process shutdown callbacks
+- Fixed accessibility of ```#params_batch``` from the outside of the controller
+- connection_pool config options are no longer required
+- celluloid config options are no longer required
+- ```#perform``` is now renamed to ```#consume``` with warning level on using the old one (deprecated)
+- #235 - Rename perform to consume
+- Upgrade to ruby-kafka 0.5
+- Due to redesign of Waterdrop concurrency setting is no longer needed
+- #236 - Manual offset management
+- WaterDrop 1.0.0 support with async
+- Renamed ```batch_consuming``` option to ```batch_fetching``` as it is not a consumption (with processing) but a process of fetching messages from Kafka. The messages is considered consumed, when it is processed.
+- Renamed ```batch_processing``` to ```batch_consuming``` to resemble Kafka concept of consuming messages.
+- Renamed ```after_received``` to ```after_fetched``` to normalize the naming conventions.
+- Responders support the per topic ```async``` option.
+
+## 1.0.1
+- #210 - LoadError: cannot load such file -- [...]/karafka.rb
+- Ruby 2.4.2 as a default (+travis integration)
+- JRuby upgrade
+- Expanded persistence layer (moved to a namespace for easier future development)
+- #213 - Misleading error when non-existing dependency is required
+- #212 - Make params react to #topic, #partition, #offset
+- #215 - Consumer group route dynamic options are ignored
+- #217 - check RUBY_ENGINE constant if RUBY_VERSION is missing (#217)
+- #218 - add configuration setting to control Celluloid's shutdown timeout
+- Renamed Karafka::Routing::Mapper to Karafka::Routing::TopicMapper to match naming conventions
+- #219 - Allow explicit consumer group names, without prefixes
+- Fix to early removed pid upon shutdown of demonized process
+- max_wait_time updated to match https://github.com/zendesk/ruby-kafka/issues/433
+- #230 - Better uri validation for seed brokers (incompatibility as the kafka:// or kafka+ssl:// is required)
+- Small internal docs fixes
+- Dry::Validation::MissingMessageError: message for broker_schema? was not found
+- #238 - warning: already initialized constant Karafka::Schemas::URI_SCHEMES
+
 ## 1.0.0
 
 ### Closed issues:
@@ -41,11 +128,11 @@
 
 ### New features and improvements
 
-- batch processing thanks to ```#
+- batch processing thanks to ```#batch_consuming``` flag and ```#params_batch``` on controllers
 - ```#topic``` method on an controller instance to make a clear distinction in between params and route details
 - Changed routing model (still compatible with 0.5) to allow better resources management
 - Lower memory requirements due to object creation limitation (2-3 times less objects on each new message)
-- Introduced the ```#
+- Introduced the ```#batch_consuming``` config flag (config for #126) that can be set per each consumer_group
 - Added support for partition, offset and partition key in the params hash
 - ```name``` option in config renamed to ```client_id```
 - Long running controllers with ```persistent``` flag on a topic config level, to make controller instances persistent between messages batches (single controller instance per topic per partition no per messages batch) - turned on by default
@@ -58,7 +145,7 @@
 - ```start_from_beginning``` moved into kafka scope (```kafka.start_from_beginning```)
 - Router no longer checks for route uniqueness - now you can define same routes for multiple kafkas and do a lot of crazy stuff, so it's your responsibility to check uniqueness
 - Change in the way we identify topics in between Karafka and Sidekiq workers. If you upgrade, please make sure, all the jobs scheduled in Sidekiq are finished before the upgrade.
-- ```batch_mode``` renamed to ```
+- ```batch_mode``` renamed to ```batch_fetching```
 - Renamed content to value to better resemble ruby-kafka internal messages naming convention
 - When having a responder with ```required``` topics and not using ```#respond_with``` at all, it will raise an exception
 - Renamed ```inline_mode``` to ```inline_processing``` to resemble other settings conventions
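The 1.2.0 entries above rename controllers to consumers (#274) and `#perform` to `#consume`, alias `#mark_as_consumed` on the consumer (#246), and make all params keys strings. A rough sketch of a consumer under that naming, assuming the 1.2-era API; the class, topic data, and the `Event` model are hypothetical, not taken from this diff:

```ruby
# Hypothetical 1.2-style consumer. Consumers (formerly controllers)
# inherit from ApplicationConsumer and implement #consume (formerly
# #perform). Event is an illustrative application model.
class EventsConsumer < ApplicationConsumer
  def consume
    # params_batch yields each fetched message; as of 1.2 the
    # payload keys are plain strings, not symbols
    params_batch.each do |params|
      Event.create!(payload: params['value'])
    end

    # With manual offset management, the consumer-level alias (#246)
    # lets us commit after the whole batch is processed
    mark_as_consumed(params_batch.last)
  end
end
```

The design point behind the string-keys change (dropping HashWithIndifferentAccess) is lower memory usage per message, which the changelog calls out separately.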
data/CONTRIBUTING.md
CHANGED
@@ -9,7 +9,6 @@ We welcome any type of contribution, not only code. You can help with
 - **Marketing**: writing blog posts, howto's, printing stickers, ...
 - **Community**: presenting the project at meetups, organizing a dedicated meetup for the local community, ...
 - **Code**: take a look at the [open issues](issues). Even if you can't write code, commenting on them, showing that you care about a given issue matters. It helps us triage them.
-- **Money**: we welcome financial contributions in full transparency on our [open collective](https://opencollective.com/karafka).
 
 ## Your First Contribution
 
@@ -21,13 +20,13 @@ Any code change should be submitted as a pull request. The description should ex
 
 ## Code review process
 
-
-It is also always helpful to have some context for your pull request. What was the purpose? Why does it matter to you?
+Each pull request must pass all the rspec specs and meet our quality requirements.
 
-
+To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
 
-
-
+Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
+
+[](https://app.coditsu.io/karafka/commit_builds)
 
 ## Questions
 
data/Gemfile
CHANGED
data/Gemfile.lock
CHANGED
@@ -1,119 +1,114 @@
 PATH
   remote: .
   specs:
-    karafka (1.
-      activesupport (>=
-      celluloid
+    karafka (1.2.0)
+      activesupport (>= 4.0)
       dry-configurable (~> 0.7)
+      dry-inflector (~> 0.1.1)
+      dry-monitor (~> 0.1)
       dry-validation (~> 0.11)
       envlogic (~> 1.0)
       multi_json (>= 1.12)
       rake (>= 11.3)
       require_all (>= 1.4)
-      ruby-kafka (>= 0.
+      ruby-kafka (>= 0.5.3)
       thor (~> 0.19)
-      waterdrop (
+      waterdrop (~> 1.2)
 
 GEM
   remote: https://rubygems.org/
   specs:
-    activesupport (5.1.
+    activesupport (5.1.5)
       concurrent-ruby (~> 1.0, >= 1.0.2)
       i18n (~> 0.7)
       minitest (~> 5.1)
       tzinfo (~> 1.1)
-    celluloid (0.17.3)
-      celluloid-essentials
-      celluloid-extras
-      celluloid-fsm
-      celluloid-pool
-      celluloid-supervision
-      timers (>= 4.1.1)
-    celluloid-essentials (0.20.5)
-      timers (>= 4.1.1)
-    celluloid-extras (0.20.5)
-      timers (>= 4.1.1)
-    celluloid-fsm (0.20.5)
-      timers (>= 4.1.1)
-    celluloid-pool (0.20.5)
-      timers (>= 4.1.1)
-    celluloid-supervision (0.20.6)
-      timers (>= 4.1.1)
     concurrent-ruby (1.0.5)
-
+    delivery_boy (0.2.4)
+      king_konf (~> 0.1.8)
+      ruby-kafka (~> 0.5.1)
     diff-lcs (1.3)
-    docile (1.
+    docile (1.3.0)
     dry-configurable (0.7.0)
       concurrent-ruby (~> 1.0)
     dry-container (0.6.0)
       concurrent-ruby (~> 1.0)
       dry-configurable (~> 0.1, >= 0.1.3)
-    dry-core (0.
+    dry-core (0.4.5)
       concurrent-ruby (~> 1.0)
     dry-equalizer (0.2.0)
-    dry-
+    dry-events (0.1.0)
+      concurrent-ruby (~> 1.0)
+      dry-core (~> 0.4)
+      dry-equalizer (~> 0.2)
+    dry-inflector (0.1.1)
+    dry-logic (0.4.2)
       dry-container (~> 0.2, >= 0.2.6)
       dry-core (~> 0.2)
       dry-equalizer (~> 0.2)
-    dry-
+    dry-monitor (0.1.2)
+      dry-configurable (~> 0.5)
+      dry-equalizer (~> 0.2)
+      dry-events (~> 0.1)
+      rouge (~> 2.0, >= 2.2.1)
+    dry-types (0.12.2)
       concurrent-ruby (~> 1.0)
       dry-configurable (~> 0.1)
       dry-container (~> 0.3)
       dry-core (~> 0.2, >= 0.2.1)
       dry-equalizer (~> 0.2)
-      dry-logic (~> 0.4, >= 0.4.
+      dry-logic (~> 0.4, >= 0.4.2)
       inflecto (~> 0.0.0, >= 0.0.2)
-    dry-validation (0.11.
+    dry-validation (0.11.1)
       concurrent-ruby (~> 1.0)
       dry-configurable (~> 0.1, >= 0.1.3)
       dry-core (~> 0.2, >= 0.2.1)
       dry-equalizer (~> 0.2)
       dry-logic (~> 0.4, >= 0.4.0)
-      dry-types (~> 0.
-    envlogic (1.0
-
-
-
+      dry-types (~> 0.12.0)
+    envlogic (1.1.0)
+      dry-inflector (~> 0.1)
+    i18n (0.9.5)
+      concurrent-ruby (~> 1.0)
     inflecto (0.0.2)
     json (2.1.0)
-
-
-
-
-
-
-
-
-    rspec-
-
-    rspec-
-    rspec-
+    king_konf (0.1.10)
+    minitest (5.11.3)
+    multi_json (1.13.1)
+    null-logger (0.1.5)
+    rake (12.3.1)
+    require_all (2.0.0)
+    rouge (2.2.1)
+    rspec (3.7.0)
+      rspec-core (~> 3.7.0)
+      rspec-expectations (~> 3.7.0)
+      rspec-mocks (~> 3.7.0)
+    rspec-core (3.7.1)
+      rspec-support (~> 3.7.0)
+    rspec-expectations (3.7.0)
       diff-lcs (>= 1.2.0, < 2.0)
-      rspec-support (~> 3.
-    rspec-mocks (3.
+      rspec-support (~> 3.7.0)
+    rspec-mocks (3.7.0)
       diff-lcs (>= 1.2.0, < 2.0)
-      rspec-support (~> 3.
-    rspec-support (3.
-    ruby-kafka (0.4
-    simplecov (0.
-      docile (~> 1.1
+      rspec-support (~> 3.7.0)
+    rspec-support (3.7.1)
+    ruby-kafka (0.5.4)
+    simplecov (0.16.1)
+      docile (~> 1.1)
       json (>= 1.8, < 3)
       simplecov-html (~> 0.10.0)
     simplecov-html (0.10.2)
     thor (0.20.0)
     thread_safe (0.3.6)
     timecop (0.9.1)
-
-      hitimes
-    tzinfo (1.2.3)
+    tzinfo (1.2.5)
       thread_safe (~> 0.1)
-    waterdrop (
-
-
-      dry-
+    waterdrop (1.2.0)
+      delivery_boy (~> 0.2)
+      dry-configurable (~> 0.7)
+      dry-monitor (~> 0.1)
+      dry-validation (~> 0.11)
       null-logger
-      rake
-      ruby-kafka (~> 0.4)
 
 PLATFORMS
   ruby
@@ -125,4 +120,4 @@ DEPENDENCIES
   timecop
 
 BUNDLED WITH
-   1.
+   1.16.1
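For an application following this upgrade, a lockfile like the one above would typically be produced by a one-line dependency bump. A minimal sketch of such a Gemfile (the `source` line and constraint style are assumptions, not shown in this diff):

```ruby
# Hypothetical application Gemfile. The pessimistic constraint on
# karafka lets Bundler resolve waterdrop, dry-monitor, delivery_boy
# and ruby-kafka to the versions listed in the lockfile above.
source 'https://rubygems.org'

gem 'karafka', '~> 1.2'
```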
data/README.md
CHANGED
@@ -1,17 +1,26 @@
 
-[](#backers) [](#sponsors) [](https://gitter.im/karafka/karafka?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
+[](https://travis-ci.org/karafka/karafka)
 
 Framework used to simplify Apache Kafka based Ruby applications development.
 
-
+Karafka allows you to capture everything that happens in your systems in large scale, providing you with a seamless and stable core for consuming and processing this data, without having to focus on things that are not your business domain.
 
 Karafka not only handles incoming messages but also provides tools for building complex data-flow applications that receive and send messages.
 
+**Warning**: Wiki and all the docs refer to the 1.2.0.beta4. Sorry for the inconvenience. We will release the stable 1.2.0 version soon.
+
 ## How does it work
 
-Karafka provides a higher-level abstraction that allows you to focus on your business logic development, instead of focusing on
+Karafka provides a higher-level abstraction that allows you to focus on your business logic development, instead of focusing on implementing lower level abstraction layers. It provides developers with a set of tools that are dedicated for building multi-topic applications similarly to how Rails applications are being built.
+
+### Some things you might wonder about:
+
+- You can integrate Karafka with **any** Ruby based application.
+- Karafka does **not** require Sidekiq or any other third party software (apart from Kafka itself).
+- Karafka works with Ruby on Rails but it is a **standalone** framework that can work without it.
+- Karafka has a **minimal** set of dependencies, so adding it won't be a huge burden for your already existing applications.
+- Karafka processes can be executed for a **given subset** of consumer groups and/or topics, so you can fine tune it depending on your business logic.
 
 Karafka based applications can be easily deployed to any type of infrastructure, including those based on:
 
@@ -19,6 +28,12 @@ Karafka based applications can be easily deployed to any type of infrastructure,
 * Capistrano
 * Docker
 
+## Support
+
+Karafka has a [Wiki pages](https://github.com/karafka/karafka/wiki) for almost everything and a pretty decent [FAQ](https://github.com/karafka/karafka/wiki/FAQ). It covers the whole installation, setup and deployment along with other useful details on how to run Karafka.
+
+If you have any questions about using Karafka, feel free to join our [Gitter](https://gitter.im/karafka/karafka) chat channel.
+
 ## Getting started
 
 If you want to get started with Kafka and Karafka as fast as possible, then the best idea is to just clone our example repository:
@@ -40,20 +55,6 @@ and follow the instructions from the [example app Wiki](https://github.com/karaf
 
 If you need more details and know how on how to start Karafka with a clean installation, read the [Getting started page](https://github.com/karafka/karafka/wiki/Getting-started) section of our Wiki.
 
-## Support
-
-Karafka has a [Wiki pages](https://github.com/karafka/karafka/wiki) for almost everything. It covers the whole installation, setup and deployment along with other useful details on how to run Karafka.
-
-If you have any questions about using Karafka, feel free to join our [Gitter](https://gitter.im/karafka/karafka) chat channel.
-
-Karafka dev team also provides commercial support in following matters:
-
-- Additional programming services for integrating existing Ruby apps with Kafka and Karafka
-- Expertise and guidance on using Karafka within new and existing projects
-- Trainings on how to design and develop systems based on Apache Kafka and Karafka framework
-
-If you are interested in our commercial services, please contact [Maciej Mensfeld (maciej@coditsu.io)](mailto:maciej@coditsu.io) directly.
-
 ## Notice
 
 Karafka framework and Karafka team are __not__ related to Kafka streaming service called CloudKarafka in any matter. We don't recommend nor discourage usage of their platform.
@@ -64,55 +65,25 @@ Karafka framework and Karafka team are __not__ related to Kafka streaming servic
 * [Karafka Travis CI](https://travis-ci.org/karafka/karafka)
 * [Karafka Coditsu](https://app.coditsu.io/karafka/repositories/karafka)
 
-## Note on
+## Note on contributions
 
-Make your feature addition or bug fix.
-Add tests for it. This is important so we don't break it in a future versions unintentionally.
-Commit, do not mess with Rakefile, version, or history. (if you want to have your own version, that is fine but bump version in a commit by itself I can ignore when I pull). Send me a pull request. Bonus points for topic branches.
+First, thank you for considering contributing to Karafka! It's people like you that make the open source community such a great community!
 
-
+Each pull request must pass all the rspec specs and meet our quality requirements.
 
-
+To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
 
-
+Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
 
-```bash
-bundle exec rake
-```
-
-to check if everything is in order. After that you can submit a pull request.
+[](https://app.coditsu.io/karafka/commit_builds)
 
 ## Contributors
 
-This project exists thanks to all the people who contribute.
+This project exists thanks to all the people who contribute.
 <a href="https://github.com/karafka/karafka/graphs/contributors"><img src="https://opencollective.com/karafka/contributors.svg?width=890" /></a>
 
-## Backers
-
-Thank you to all our backers! 🙏 [[Become a backer](https://opencollective.com/karafka#backer)]
-
-<a href="https://opencollective.com/karafka#backers" target="_blank"><img src="https://opencollective.com/karafka/backers.svg?width=890"></a>
-
 ## Sponsors
 
-We are looking for sustainable sponsorship. If your company is relying on Karafka framework or simply want to see Karafka evolve faster to meet your requirements, please consider backing the project.
-
-Please contact [Maciej Mensfeld (maciej@coditsu.io)](mailto:maciej@coditsu.io) directly for more details.
-
-<a href="https://opencollective.com/karafka/sponsor/0/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/0/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/1/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/1/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/2/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/2/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/3/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/3/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/4/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/4/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/5/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/5/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/6/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/6/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/7/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/7/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/8/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/8/avatar.svg"></a>
-<a href="https://opencollective.com/karafka/sponsor/9/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/9/avatar.svg"></a>
-
+We are looking for sustainable sponsorship. If your company is relying on Karafka framework or simply want to see Karafka evolve faster to meet your requirements, please consider backing the project.
 
+Please contact [Maciej Mensfeld](mailto:maciej@coditsu.io) directly for more details.