karafka 1.0.0 → 1.2.0

Files changed (83)
  1. checksums.yaml +5 -5
  2. data/.ruby-version +1 -1
  3. data/.travis.yml +3 -1
  4. data/CHANGELOG.md +90 -3
  5. data/CONTRIBUTING.md +5 -6
  6. data/Gemfile +1 -1
  7. data/Gemfile.lock +59 -64
  8. data/README.md +28 -57
  9. data/bin/karafka +13 -1
  10. data/config/errors.yml +6 -0
  11. data/karafka.gemspec +10 -9
  12. data/lib/karafka.rb +19 -10
  13. data/lib/karafka/app.rb +8 -15
  14. data/lib/karafka/attributes_map.rb +4 -4
  15. data/lib/karafka/backends/inline.rb +2 -3
  16. data/lib/karafka/base_consumer.rb +68 -0
  17. data/lib/karafka/base_responder.rb +41 -17
  18. data/lib/karafka/callbacks.rb +30 -0
  19. data/lib/karafka/callbacks/config.rb +22 -0
  20. data/lib/karafka/callbacks/dsl.rb +16 -0
  21. data/lib/karafka/cli/base.rb +2 -0
  22. data/lib/karafka/cli/flow.rb +1 -1
  23. data/lib/karafka/cli/info.rb +1 -2
  24. data/lib/karafka/cli/install.rb +2 -3
  25. data/lib/karafka/cli/server.rb +9 -12
  26. data/lib/karafka/connection/client.rb +117 -0
  27. data/lib/karafka/connection/config_adapter.rb +30 -14
  28. data/lib/karafka/connection/delegator.rb +46 -0
  29. data/lib/karafka/connection/listener.rb +22 -20
  30. data/lib/karafka/consumers/callbacks.rb +54 -0
  31. data/lib/karafka/consumers/includer.rb +51 -0
  32. data/lib/karafka/consumers/responders.rb +24 -0
  33. data/lib/karafka/{controllers → consumers}/single_params.rb +3 -3
  34. data/lib/karafka/errors.rb +19 -2
  35. data/lib/karafka/fetcher.rb +30 -28
  36. data/lib/karafka/helpers/class_matcher.rb +8 -8
  37. data/lib/karafka/helpers/config_retriever.rb +2 -2
  38. data/lib/karafka/instrumentation/listener.rb +112 -0
  39. data/lib/karafka/instrumentation/logger.rb +55 -0
  40. data/lib/karafka/instrumentation/monitor.rb +64 -0
  41. data/lib/karafka/loader.rb +0 -1
  42. data/lib/karafka/params/dsl.rb +156 -0
  43. data/lib/karafka/params/params_batch.rb +7 -2
  44. data/lib/karafka/patches/dry_configurable.rb +7 -7
  45. data/lib/karafka/patches/ruby_kafka.rb +34 -0
  46. data/lib/karafka/persistence/client.rb +25 -0
  47. data/lib/karafka/persistence/consumer.rb +38 -0
  48. data/lib/karafka/persistence/topic.rb +29 -0
  49. data/lib/karafka/process.rb +6 -5
  50. data/lib/karafka/responders/builder.rb +15 -14
  51. data/lib/karafka/responders/topic.rb +8 -1
  52. data/lib/karafka/routing/builder.rb +2 -2
  53. data/lib/karafka/routing/consumer_group.rb +1 -1
  54. data/lib/karafka/routing/consumer_mapper.rb +34 -0
  55. data/lib/karafka/routing/router.rb +1 -1
  56. data/lib/karafka/routing/topic.rb +5 -11
  57. data/lib/karafka/routing/{mapper.rb → topic_mapper.rb} +2 -2
  58. data/lib/karafka/schemas/config.rb +4 -5
  59. data/lib/karafka/schemas/consumer_group.rb +45 -24
  60. data/lib/karafka/schemas/consumer_group_topic.rb +18 -0
  61. data/lib/karafka/schemas/responder_usage.rb +1 -0
  62. data/lib/karafka/server.rb +39 -20
  63. data/lib/karafka/setup/config.rb +74 -51
  64. data/lib/karafka/setup/configurators/base.rb +6 -12
  65. data/lib/karafka/setup/configurators/params.rb +25 -0
  66. data/lib/karafka/setup/configurators/water_drop.rb +15 -14
  67. data/lib/karafka/setup/dsl.rb +22 -0
  68. data/lib/karafka/templates/{application_controller.rb.example → application_consumer.rb.example} +2 -3
  69. data/lib/karafka/templates/karafka.rb.example +18 -5
  70. data/lib/karafka/version.rb +1 -1
  71. metadata +87 -63
  72. data/.github/ISSUE_TEMPLATE.md +0 -2
  73. data/Rakefile +0 -7
  74. data/lib/karafka/base_controller.rb +0 -118
  75. data/lib/karafka/connection/messages_consumer.rb +0 -106
  76. data/lib/karafka/connection/messages_processor.rb +0 -59
  77. data/lib/karafka/controllers/includer.rb +0 -51
  78. data/lib/karafka/controllers/responders.rb +0 -19
  79. data/lib/karafka/logger.rb +0 -53
  80. data/lib/karafka/monitor.rb +0 -98
  81. data/lib/karafka/params/params.rb +0 -101
  82. data/lib/karafka/persistence.rb +0 -18
  83. data/lib/karafka/setup/configurators/celluloid.rb +0 -22
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: 49921e358054ab6470899c0e122cc9c93f5398ea
- data.tar.gz: 183d5ff52e6f4ce09110287147d735929ff2a6e7
+ SHA256:
+ metadata.gz: 0bb0a1f72768ebf4bf720ebda57ebc26a0178275adabffd494b24e1612e9b38a
+ data.tar.gz: f586038a0498227e8a287cc173a23e77bb99b6cb8d453c39be55a220e2a0d361
  SHA512:
- metadata.gz: cd8ad63256d3133dbc1c4a4676b54c64815353047a8cb42c87e6ef343ff6af12099ff07d03d01b4a3b61fc648d79dc6b13c1f969a48f3ab04f1dc3a91799e99b
- data.tar.gz: 345addc070e2b13fc7d11f07faefccbcfbffeedad3ce106d98a335c0e54f5c29731fc9288c46b114aee4df62f7578c2e120e8b61b7a378c517881221b2254e4e
+ metadata.gz: e2b862da6372bc91f76bc01c20eb21ccde9d61d67749ccce97eadd2f739484dbd68a149d08d6e275d94d95b6ce9610d86c2d01e85bb1841a80e048ed832810ac
+ data.tar.gz: 06fb89700e59f810ec984b54f0424cfdeb98c79e17b8629010f011e68edff5fa1e345576a7b28ed07aff440739ed11824016da6083601a92b8f83fcca68e576e
data/.ruby-version CHANGED
@@ -1 +1 @@
- 2.4.1
+ 2.5.0
data/.travis.yml CHANGED
@@ -8,7 +8,9 @@ rvm:
  - 2.3.4
  - 2.4.0
  - 2.4.1
- - jruby-9.1.12.0
+ - 2.4.2
+ - 2.5.0
+ - jruby-head
  script: bundle exec rspec spec/
  env:
  global:
data/CHANGELOG.md CHANGED
@@ -1,5 +1,92 @@
  # Karafka framework changelog
 
+ ## 1.2.0
+ - Spec improvements
+ - #260 - Specs missing randomization
+ - #251 - Shutdown upon non responding (unreachable) cluster is not possible
+ - #258 - Investigate lowering requirements on activesupport
+ - #246 - Alias consumer#mark_as_consumed on controller
+ - #259 - Allow forcing key/partition key on responders
+ - #267 - Styling inconsistency
+ - #242 - Support setting the max bytes to fetch per request
+ - #247 - Support SCRAM once released
+ - #271 - Provide an after_init option to pass a configuration block
+ - #262 - Error in the monitor code for NewRelic
+ - #241 - Performance metrics
+ - #274 - Rename controllers to consumers
+ - #184 - Seek to
+ - #284 - Dynamic Params parent class
+ - #275 - ssl_ca_certs_from_system
+ - #296 - Instrument forceful exit with an error
+ - Replaced some of the activesupport parts with dry-inflector
+ - Lower ActiveSupport dependency
+ - Remove configurators in favor of the after_init block configurator
+ - Ruby 2.5.0 support
+ - Renamed Karafka::Connection::Processor to Karafka::Connection::Delegator to match incoming naming conventions
+ - Renamed Karafka::Connection::Consumer to Karafka::Connection::Client due to #274
+ - Removed HashWithIndifferentAccess in favor of a regular hash
+ - JSON parsing defaults now to string keys
+ - Lower memory usage due to less params data internal details
+ - Support multiple ```after_init``` blocks in favor of a single one
+ - Renamed ```received_at``` to ```receive_time``` to follow ruby-kafka and WaterDrop conventions
+ - Adjust internal setup to easier map Ruby-Kafka config changes
+ - System callbacks reorganization
+ - Added ```before_fetch_loop``` configuration block for early client usage (```#seek```, etc)
+ - Renamed ```after_fetched``` to ```after_fetch``` to normalize the naming convention
+ - Instrumentation on a connection delegator level
+ - Added ```params_batch#last``` method to retrieve last element after unparsing
+ - All params keys are now strings
+
+ ## 1.1.2
+ - #256 - Default kafka.seed_brokers configuration is created in invalid format
+
+ ## 1.1.1
+ - #253 - Allow providing a global per app parser in config settings
+
+ ## 1.1.0
+ - Gem bump
+ - Switch from Celluloid to native Thread management
+ - Improved shutdown process
+ - Introduced optional fetch callbacks and moved current the ```after_received``` there as well
+ - Karafka will raise Errors::InvalidPauseTimeout exception when trying to pause but timeout set to 0
+ - Allow float for timeouts and other time based second settings
+ - Renamed MessagesProcessor to Processor and MessagesConsumer to Consumer - we don't process and don't consumer anything else so it was pointless to keep this "namespace"
+ - #232 - Remove unused ActiveSupport require
+ - #214 - Expose consumer on a controller layer
+ - #193 - Process shutdown callbacks
+ - Fixed accessibility of ```#params_batch``` from the outside of the controller
+ - connection_pool config options are no longer required
+ - celluloid config options are no longer required
+ - ```#perform``` is now renamed to ```#consume``` with warning level on using the old one (deprecated)
+ - #235 - Rename perform to consume
+ - Upgrade to ruby-kafka 0.5
+ - Due to redesign of Waterdrop concurrency setting is no longer needed
+ - #236 - Manual offset management
+ - WaterDrop 1.0.0 support with async
+ - Renamed ```batch_consuming``` option to ```batch_fetching``` as it is not a consumption (with processing) but a process of fetching messages from Kafka. The messages is considered consumed, when it is processed.
+ - Renamed ```batch_processing``` to ```batch_consuming``` to resemble Kafka concept of consuming messages.
+ - Renamed ```after_received``` to ```after_fetched``` to normalize the naming conventions.
+ - Responders support the per topic ```async``` option.
+
+ ## 1.0.1
+ - #210 - LoadError: cannot load such file -- [...]/karafka.rb
+ - Ruby 2.4.2 as a default (+travis integration)
+ - JRuby upgrade
+ - Expanded persistence layer (moved to a namespace for easier future development)
+ - #213 - Misleading error when non-existing dependency is required
+ - #212 - Make params react to #topic, #partition, #offset
+ - #215 - Consumer group route dynamic options are ignored
+ - #217 - check RUBY_ENGINE constant if RUBY_VERSION is missing (#217)
+ - #218 - add configuration setting to control Celluloid's shutdown timeout
+ - Renamed Karafka::Routing::Mapper to Karafka::Routing::TopicMapper to match naming conventions
+ - #219 - Allow explicit consumer group names, without prefixes
+ - Fix to early removed pid upon shutdown of demonized process
+ - max_wait_time updated to match https://github.com/zendesk/ruby-kafka/issues/433
+ - #230 - Better uri validation for seed brokers (incompatibility as the kafka:// or kafka+ssl:// is required)
+ - Small internal docs fixes
+ - Dry::Validation::MissingMessageError: message for broker_schema? was not found
+ - #238 - warning: already initialized constant Karafka::Schemas::URI_SCHEMES
+
  ## 1.0.0
 
  ### Closed issues:
@@ -41,11 +128,11 @@
 
  ### New features and improvements
 
- - batch processing thanks to ```#batch_processing``` flag and ```#params_batch``` on controllers
+ - batch processing thanks to ```#batch_consuming``` flag and ```#params_batch``` on controllers
  - ```#topic``` method on an controller instance to make a clear distinction in between params and route details
  - Changed routing model (still compatible with 0.5) to allow better resources management
  - Lower memory requirements due to object creation limitation (2-3 times less objects on each new message)
- - Introduced the ```#batch_processing``` config flag (config for #126) that can be set per each consumer_group
+ - Introduced the ```#batch_consuming``` config flag (config for #126) that can be set per each consumer_group
  - Added support for partition, offset and partition key in the params hash
  - ```name``` option in config renamed to ```client_id```
  - Long running controllers with ```persistent``` flag on a topic config level, to make controller instances persistent between messages batches (single controller instance per topic per partition no per messages batch) - turned on by default
@@ -58,7 +145,7 @@
  - ```start_from_beginning``` moved into kafka scope (```kafka.start_from_beginning```)
  - Router no longer checks for route uniqueness - now you can define same routes for multiple kafkas and do a lot of crazy stuff, so it's your responsibility to check uniqueness
  - Change in the way we identify topics in between Karafka and Sidekiq workers. If you upgrade, please make sure, all the jobs scheduled in Sidekiq are finished before the upgrade.
- - ```batch_mode``` renamed to ```batch_consuming```
+ - ```batch_mode``` renamed to ```batch_fetching```
  - Renamed content to value to better resemble ruby-kafka internal messages naming convention
  - When having a responder with ```required``` topics and not using ```#respond_with``` at all, it will raise an exception
  - Renamed ```inline_mode``` to ```inline_processing``` to resemble other settings conventions
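
The 1.2.0 entries above rename controllers to consumers, drop HashWithIndifferentAccess in favor of string-keyed params, and expose helpers such as `params_batch#last` and `mark_as_consumed`. A minimal sketch of a consumer under the new naming — the class, topic and logging are hypothetical, and the explicit offset commit assumes manual offset management is enabled:

```ruby
# Hypothetical consumer under the 1.2.0 naming (what used to be a controller).
# Assumes an ApplicationConsumer base class like the one from the install template.
class EventsConsumer < ApplicationConsumer
  def consume
    # Params keys are plain strings in 1.2.0
    params_batch.each do |params|
      Karafka.logger.info "#{params['topic']}/#{params['partition']} @ #{params['offset']}"
    end

    # With manual offset management enabled, the offset can be committed explicitly;
    # params_batch#last returns the batch's last element (see the 1.2.0 notes above)
    mark_as_consumed params_batch.last
  end
end
```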
data/CONTRIBUTING.md CHANGED
@@ -9,7 +9,6 @@ We welcome any type of contribution, not only code. You can help with
  - **Marketing**: writing blog posts, howto's, printing stickers, ...
  - **Community**: presenting the project at meetups, organizing a dedicated meetup for the local community, ...
  - **Code**: take a look at the [open issues](issues). Even if you can't write code, commenting on them, showing that you care about a given issue matters. It helps us triage them.
- - **Money**: we welcome financial contributions in full transparency on our [open collective](https://opencollective.com/karafka).
 
  ## Your First Contribution
 
@@ -21,13 +20,13 @@ Any code change should be submitted as a pull request. The description should ex
 
  ## Code review process
 
- The bigger the pull request, the longer it will take to review and merge. Try to break down large pull requests in smaller chunks that are easier to review and merge.
- It is also always helpful to have some context for your pull request. What was the purpose? Why does it matter to you?
+ Each pull request must pass all the rspec specs and meet our quality requirements.
 
- ## Financial contributions
+ To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
 
- We also welcome financial contributions in full transparency on our [open collective](https://opencollective.com/karafka).
- Anyone can file an expense. If the expense makes sense for the development of the community, it will be "merged" in the ledger of our open collective by the core contributors and the person who filed the expense will be reimbursed.
+ Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
+
+ [![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/commit_builds)
 
  ## Questions
 
data/Gemfile CHANGED
@@ -5,7 +5,7 @@ source 'https://rubygems.org'
  gemspec
 
  group :development, :test do
- gem 'timecop'
  gem 'rspec'
  gem 'simplecov'
+ gem 'timecop'
  end
data/Gemfile.lock CHANGED
@@ -1,119 +1,114 @@
  PATH
  remote: .
  specs:
- karafka (1.0.0)
- activesupport (>= 5.0)
- celluloid
+ karafka (1.2.0)
+ activesupport (>= 4.0)
  dry-configurable (~> 0.7)
+ dry-inflector (~> 0.1.1)
+ dry-monitor (~> 0.1)
  dry-validation (~> 0.11)
  envlogic (~> 1.0)
  multi_json (>= 1.12)
  rake (>= 11.3)
  require_all (>= 1.4)
- ruby-kafka (>= 0.4)
+ ruby-kafka (>= 0.5.3)
  thor (~> 0.19)
- waterdrop (>= 0.4)
+ waterdrop (~> 1.2)
 
  GEM
  remote: https://rubygems.org/
  specs:
- activesupport (5.1.3)
+ activesupport (5.1.5)
  concurrent-ruby (~> 1.0, >= 1.0.2)
  i18n (~> 0.7)
  minitest (~> 5.1)
  tzinfo (~> 1.1)
- celluloid (0.17.3)
- celluloid-essentials
- celluloid-extras
- celluloid-fsm
- celluloid-pool
- celluloid-supervision
- timers (>= 4.1.1)
- celluloid-essentials (0.20.5)
- timers (>= 4.1.1)
- celluloid-extras (0.20.5)
- timers (>= 4.1.1)
- celluloid-fsm (0.20.5)
- timers (>= 4.1.1)
- celluloid-pool (0.20.5)
- timers (>= 4.1.1)
- celluloid-supervision (0.20.6)
- timers (>= 4.1.1)
  concurrent-ruby (1.0.5)
- connection_pool (2.2.1)
+ delivery_boy (0.2.4)
+ king_konf (~> 0.1.8)
+ ruby-kafka (~> 0.5.1)
  diff-lcs (1.3)
- docile (1.1.5)
+ docile (1.3.0)
  dry-configurable (0.7.0)
  concurrent-ruby (~> 1.0)
  dry-container (0.6.0)
  concurrent-ruby (~> 1.0)
  dry-configurable (~> 0.1, >= 0.1.3)
- dry-core (0.3.3)
+ dry-core (0.4.5)
  concurrent-ruby (~> 1.0)
  dry-equalizer (0.2.0)
- dry-logic (0.4.1)
+ dry-events (0.1.0)
+ concurrent-ruby (~> 1.0)
+ dry-core (~> 0.4)
+ dry-equalizer (~> 0.2)
+ dry-inflector (0.1.1)
+ dry-logic (0.4.2)
  dry-container (~> 0.2, >= 0.2.6)
  dry-core (~> 0.2)
  dry-equalizer (~> 0.2)
- dry-types (0.11.1)
+ dry-monitor (0.1.2)
+ dry-configurable (~> 0.5)
+ dry-equalizer (~> 0.2)
+ dry-events (~> 0.1)
+ rouge (~> 2.0, >= 2.2.1)
+ dry-types (0.12.2)
  concurrent-ruby (~> 1.0)
  dry-configurable (~> 0.1)
  dry-container (~> 0.3)
  dry-core (~> 0.2, >= 0.2.1)
  dry-equalizer (~> 0.2)
- dry-logic (~> 0.4, >= 0.4.0)
+ dry-logic (~> 0.4, >= 0.4.2)
  inflecto (~> 0.0.0, >= 0.0.2)
- dry-validation (0.11.0)
+ dry-validation (0.11.1)
  concurrent-ruby (~> 1.0)
  dry-configurable (~> 0.1, >= 0.1.3)
  dry-core (~> 0.2, >= 0.2.1)
  dry-equalizer (~> 0.2)
  dry-logic (~> 0.4, >= 0.4.0)
- dry-types (~> 0.11.0)
- envlogic (1.0.4)
- activesupport
- hitimes (1.2.6)
- i18n (0.8.6)
+ dry-types (~> 0.12.0)
+ envlogic (1.1.0)
+ dry-inflector (~> 0.1)
+ i18n (0.9.5)
+ concurrent-ruby (~> 1.0)
  inflecto (0.0.2)
  json (2.1.0)
- minitest (5.10.3)
- multi_json (1.12.2)
- null-logger (0.1.4)
- rake (12.0.0)
- require_all (1.4.0)
- rspec (3.6.0)
- rspec-core (~> 3.6.0)
- rspec-expectations (~> 3.6.0)
- rspec-mocks (~> 3.6.0)
- rspec-core (3.6.0)
- rspec-support (~> 3.6.0)
- rspec-expectations (3.6.0)
+ king_konf (0.1.10)
+ minitest (5.11.3)
+ multi_json (1.13.1)
+ null-logger (0.1.5)
+ rake (12.3.1)
+ require_all (2.0.0)
+ rouge (2.2.1)
+ rspec (3.7.0)
+ rspec-core (~> 3.7.0)
+ rspec-expectations (~> 3.7.0)
+ rspec-mocks (~> 3.7.0)
+ rspec-core (3.7.1)
+ rspec-support (~> 3.7.0)
+ rspec-expectations (3.7.0)
  diff-lcs (>= 1.2.0, < 2.0)
- rspec-support (~> 3.6.0)
- rspec-mocks (3.6.0)
+ rspec-support (~> 3.7.0)
+ rspec-mocks (3.7.0)
  diff-lcs (>= 1.2.0, < 2.0)
- rspec-support (~> 3.6.0)
- rspec-support (3.6.0)
- ruby-kafka (0.4.1)
- simplecov (0.15.0)
- docile (~> 1.1.0)
+ rspec-support (~> 3.7.0)
+ rspec-support (3.7.1)
+ ruby-kafka (0.5.4)
+ simplecov (0.16.1)
+ docile (~> 1.1)
  json (>= 1.8, < 3)
  simplecov-html (~> 0.10.0)
  simplecov-html (0.10.2)
  thor (0.20.0)
  thread_safe (0.3.6)
  timecop (0.9.1)
- timers (4.1.2)
- hitimes
- tzinfo (1.2.3)
+ tzinfo (1.2.5)
  thread_safe (~> 0.1)
- waterdrop (0.4.0)
- bundler
- connection_pool
- dry-configurable (~> 0.6)
+ waterdrop (1.2.0)
+ delivery_boy (~> 0.2)
+ dry-configurable (~> 0.7)
+ dry-monitor (~> 0.1)
+ dry-validation (~> 0.11)
  null-logger
- rake
- ruby-kafka (~> 0.4)
 
  PLATFORMS
  ruby
@@ -125,4 +120,4 @@ DEPENDENCIES
  timecop
 
  BUNDLED WITH
- 1.14.6
+ 1.16.1
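
The lock file above captures the dependency shift: Celluloid, connection_pool and hitimes drop out, while WaterDrop 1.2 (with delivery_boy) and ruby-kafka 0.5.x come in as karafka's own runtime dependencies. For an application tracking this release line, the Gemfile entry stays minimal — a sketch, with the version constraint as an example only:

```ruby
# Example Gemfile for an app following the 1.2.x line; the constraint is illustrative
source 'https://rubygems.org'

gem 'karafka', '~> 1.2'
```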
data/README.md CHANGED
@@ -1,17 +1,26 @@
  ![karafka logo](https://raw.githubusercontent.com/karafka/misc/master/logo/karafka_logotype_transparent2.png)
 
- [![Build Status](https://travis-ci.org/karafka/karafka.png)](https://travis-ci.org/karafka/karafka)
- [![Backers on Open Collective](https://opencollective.com/karafka/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/karafka/sponsors/badge.svg)](#sponsors) [![Join the chat at https://gitter.im/karafka/karafka](https://badges.gitter.im/karafka/karafka.svg)](https://gitter.im/karafka/karafka?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
+ [![Build Status](https://travis-ci.org/karafka/karafka.svg?branch=master)](https://travis-ci.org/karafka/karafka)
 
  Framework used to simplify Apache Kafka based Ruby applications development.
 
- It allows programmers to use approach similar to standard HTTP conventions (```params``` and ```params_batch```) when working with asynchronous Kafka messages.
+ Karafka allows you to capture everything that happens in your systems in large scale, providing you with a seamless and stable core for consuming and processing this data, without having to focus on things that are not your business domain.
 
  Karafka not only handles incoming messages but also provides tools for building complex data-flow applications that receive and send messages.
 
+ **Warning**: Wiki and all the docs refer to the 1.2.0.beta4. Sorry for the inconvenience. We will release the stable 1.2.0 version soon.
+
  ## How does it work
 
- Karafka provides a higher-level abstraction that allows you to focus on your business logic development, instead of focusing on implementing lower level abstration layers. It provides developers with a set of tools that are dedicated for building multi-topic applications similarly to how Rails applications are being built.
+ Karafka provides a higher-level abstraction that allows you to focus on your business logic development, instead of focusing on implementing lower level abstraction layers. It provides developers with a set of tools that are dedicated for building multi-topic applications similarly to how Rails applications are being built.
+
+ ### Some things you might wonder about:
+
+ - You can integrate Karafka with **any** Ruby based application.
+ - Karafka does **not** require Sidekiq or any other third party software (apart from Kafka itself).
+ - Karafka works with Ruby on Rails but it is a **standalone** framework that can work without it.
+ - Karafka has a **minimal** set of dependencies, so adding it won't be a huge burden for your already existing applications.
+ - Karafka processes can be executed for a **given subset** of consumer groups and/or topics, so you can fine tune it depending on your business logic.
 
  Karafka based applications can be easily deployed to any type of infrastructure, including those based on:
 
@@ -19,6 +28,12 @@ Karafka based applications can be easily deployed to any type of infrastructure,
  * Capistrano
  * Docker
 
+ ## Support
+
+ Karafka has a [Wiki pages](https://github.com/karafka/karafka/wiki) for almost everything and a pretty decent [FAQ](https://github.com/karafka/karafka/wiki/FAQ). It covers the whole installation, setup and deployment along with other useful details on how to run Karafka.
+
+ If you have any questions about using Karafka, feel free to join our [Gitter](https://gitter.im/karafka/karafka) chat channel.
+
  ## Getting started
 
  If you want to get started with Kafka and Karafka as fast as possible, then the best idea is to just clone our example repository:
@@ -40,20 +55,6 @@ and follow the instructions from the [example app Wiki](https://github.com/karaf
 
  If you need more details and know how on how to start Karafka with a clean installation, read the [Getting started page](https://github.com/karafka/karafka/wiki/Getting-started) section of our Wiki.
 
- ## Support
-
- Karafka has a [Wiki pages](https://github.com/karafka/karafka/wiki) for almost everything. It covers the whole installation, setup and deployment along with other useful details on how to run Karafka.
-
- If you have any questions about using Karafka, feel free to join our [Gitter](https://gitter.im/karafka/karafka) chat channel.
-
- Karafka dev team also provides commercial support in following matters:
-
- - Additional programming services for integrating existing Ruby apps with Kafka and Karafka
- - Expertise and guidance on using Karafka within new and existing projects
- - Trainings on how to design and develop systems based on Apache Kafka and Karafka framework
-
- If you are interested in our commercial services, please contact [Maciej Mensfeld (maciej@coditsu.io)](mailto:maciej@coditsu.io) directly.
-
  ## Notice
 
  Karafka framework and Karafka team are __not__ related to Kafka streaming service called CloudKarafka in any matter. We don't recommend nor discourage usage of their platform.
@@ -64,55 +65,25 @@ Karafka framework and Karafka team are __not__ related to Kafka streaming servic
  * [Karafka Travis CI](https://travis-ci.org/karafka/karafka)
  * [Karafka Coditsu](https://app.coditsu.io/karafka/repositories/karafka)
 
- ## Note on Patches/Pull Requests
+ ## Note on contributions
 
- Fork the project.
- Make your feature addition or bug fix.
- Add tests for it. This is important so we don't break it in a future versions unintentionally.
- Commit, do not mess with Rakefile, version, or history. (if you want to have your own version, that is fine but bump version in a commit by itself I can ignore when I pull). Send me a pull request. Bonus points for topic branches.
+ First, thank you for considering contributing to Karafka! It's people like you that make the open source community such a great community!
 
- [![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/repositories/karafka)
+ Each pull request must pass all the rspec specs and meet our quality requirements.
 
- Each pull request must pass our quality requirements. To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combinse multiple linters and code analyzers for both code and documentation.
+ To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
 
- Unfortunately, it does not yet support independent forks, however you should be fine by looking at what we require.
+ Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
 
- Please run:
-
- ```bash
- bundle exec rake
- ```
-
- to check if everything is in order. After that you can submit a pull request.
+ [![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/commit_builds)
 
  ## Contributors
 
- This project exists thanks to all the people who contribute. [[Contribute]](CONTRIBUTING.md).
+ This project exists thanks to all the people who contribute.
  <a href="https://github.com/karafka/karafka/graphs/contributors"><img src="https://opencollective.com/karafka/contributors.svg?width=890" /></a>
 
- ## Backers
-
- Thank you to all our backers! 🙏 [[Become a backer](https://opencollective.com/karafka#backer)]
-
- <a href="https://opencollective.com/karafka#backers" target="_blank"><img src="https://opencollective.com/karafka/backers.svg?width=890"></a>
-
-
  ## Sponsors
 
- We are looking for sustainable sponsorship. If your company is relying on Karafka framework or simply want to see Karafka evolve faster to meet your requirements, please consider backing the project. [[Become a sponsor](https://opencollective.com/karafka#sponsor)]
-
- Please contact [Maciej Mensfeld (maciej@coditsu.io)](mailto:maciej@coditsu.io) directly for more details.
-
-
- <a href="https://opencollective.com/karafka/sponsor/0/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/0/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/1/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/1/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/2/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/2/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/3/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/3/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/4/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/4/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/5/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/5/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/6/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/6/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/7/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/7/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/8/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/8/avatar.svg"></a>
- <a href="https://opencollective.com/karafka/sponsor/9/website" target="_blank"><img src="https://opencollective.com/karafka/sponsor/9/avatar.svg"></a>
-
+ We are looking for sustainable sponsorship. If your company is relying on Karafka framework or simply want to see Karafka evolve faster to meet your requirements, please consider backing the project.
 
+ Please contact [Maciej Mensfeld](mailto:maciej@coditsu.io) directly for more details.
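
To tie the README changes to the setup and routing conventions of this release, here is a minimal `karafka.rb` boot file sketch. Broker addresses, the application class, topic and consumer names are placeholders, and the callback block arguments are assumptions based on the 1.2 callbacks DSL mentioned in the changelog:

```ruby
# frozen_string_literal: true

# Hypothetical boot file for a 1.2-style standalone Karafka app.
require 'karafka'

class KarafkaApp < Karafka::App
  setup do |config|
    # kafka:// (or kafka+ssl://) prefixed seed brokers are required since 1.0.1
    config.kafka.seed_brokers = %w[kafka://127.0.0.1:9092]
    config.client_id = 'example_app'
  end

  # 1.2 supports multiple after_init blocks (e.g. for instrumentation subscriptions)
  after_init do |config|
    Karafka.logger.info "Booted #{config.client_id}"
  end

  # before_fetch_loop runs before polling starts and yields the client for early
  # usage such as #seek; here it only logs
  before_fetch_loop do |consumer_group, _client|
    Karafka.logger.info "Starting fetch loop for #{consumer_group.id}"
  end

  consumer_groups.draw do
    topic :events do
      # Routes point at consumer classes (previously controllers)
      consumer EventsConsumer
    end
  end
end

KarafkaApp.boot!
```

The process can then be started with `bundle exec karafka server`, optionally limited to a subset of consumer groups as the README notes.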