karafka 1.1.0 → 1.1.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 26ee2938706de62609da9faf4179b3071fdc9e62
- data.tar.gz: eedc149b1ade99a21d82f32ad3ec9009b065cc89
+ metadata.gz: 1fcea5db98bb786e35298cda7dcfc34354935106
+ data.tar.gz: 8da4bad16ba1d23a04b6bfc6906d8432ff3d715c
  SHA512:
- metadata.gz: 2e25b2453563d9330955248307eee910147899fefb15bc197313f79012c132f0970fede7f0a0fd50b6966267c9d7d48f82f50b75fea927baa8c01bf68ebee305
- data.tar.gz: 902fd68ad24eb5400976d9f6b3cc5d955617a70e0b14186e2b27e8292fcff2d59aa013231834ad679c9694eebd6809589cf98715151d89a1637d6e71e2f5a14e
+ metadata.gz: c775dc7525a5e3f0252b5d5fb90f37d31636d2aeec99de936650ff6de6ec45c93383b111bd600d2cc190808e3b5f941dd619d8c5d51ad95c53852217df38184f
+ data.tar.gz: c891f17541b7298ac7e9e310d3d16a1e3464d365db34fcfb461109ddb6d03e3f499ccc1bd1886539d62d2451754adf4a7e9a4763324b5d46bcda680ca60ed358
@@ -1,6 +1,9 @@
  # Karafka framework changelog
 
- ## 1.1.0 Unreleased
+ ## 1.1.1
+ - #253 - Allow providing a global per app parser in config settings
+
+ ## 1.1.0
  - Gem bump
  - Switch from Celluloid to native Thread management
  - Improved shutdown process
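Changelog entry #253 above introduces an app-wide parser setting. A minimal, hedged sketch of what wiring it up in a karafka.rb app file could look like (`CustomParser` is a hypothetical placeholder; the `parser` setting itself is added by the config change further down this diff):

```ruby
# Minimal sketch, assuming a standard karafka.rb application class;
# CustomParser stands in for any class exposing the parser interface.
class App < Karafka::App
  setup do |config|
    # Applies to all topics and responders unless explicitly overridden
    config.parser = CustomParser
  end
end
```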
@@ -14,7 +17,7 @@
  - Fixed accessibility of ```#params_batch``` from the outside of the controller
  - connection_pool config options are no longer required
  - celluloid config options are no longer required
- - ```#perform``` is not renamed to ```#consume``` with warning level on using the old one (deprecated)
+ - ```#perform``` is now renamed to ```#consume``` with warning level on using the old one (deprecated)
  - #235 - Rename perform to consume
  - Upgrade to ruby-kafka 0.5
  - Due to redesign of Waterdrop concurrency setting is no longer needed
@@ -23,6 +26,7 @@
  - Renamed ```batch_consuming``` option to ```batch_fetching``` as it is not a consumption (with processing) but a process of fetching messages from Kafka. The messages is considered consumed, when it is processed.
  - Renamed ```batch_processing``` to ```batch_consuming``` to resemble Kafka concept of consuming messages.
  - Renamed ```after_received``` to ```after_fetched``` to normalize the naming conventions.
+ - Responders support the per topic ```async``` option.
 
  ## 1.0.1
  - #210 - LoadError: cannot load such file -- [...]/karafka.rb
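The per topic ```async``` option mentioned in the 1.1.0 notes above applies to responder topic declarations. A hedged sketch, assuming the option is passed where the topic is declared (topic names and payload are hypothetical):

```ruby
# Minimal sketch of a responder using the per topic async option;
# videos_created is assumed to be delivered via WaterDrop's async
# producer, while videos_indexed keeps the default delivery.
class VideosResponder < Karafka::BaseResponder
  topic :videos_created, async: true
  topic :videos_indexed

  def respond(video)
    respond_to :videos_created, video
    respond_to :videos_indexed, video
  end
end
```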
@@ -9,7 +9,6 @@ We welcome any type of contribution, not only code. You can help with
  - **Marketing**: writing blog posts, howto's, printing stickers, ...
  - **Community**: presenting the project at meetups, organizing a dedicated meetup for the local community, ...
  - **Code**: take a look at the [open issues](issues). Even if you can't write code, commenting on them, showing that you care about a given issue matters. It helps us triage them.
- - **Money**: we welcome financial contributions in full transparency on our [open collective](https://opencollective.com/karafka).
 
  ## Your First Contribution
 
@@ -21,13 +20,13 @@ Any code change should be submitted as a pull request. The description should ex
 
  ## Code review process
 
- The bigger the pull request, the longer it will take to review and merge. Try to break down large pull requests in smaller chunks that are easier to review and merge.
- It is also always helpful to have some context for your pull request. What was the purpose? Why does it matter to you?
+ Each pull request must pass all the rspec specs and meet our quality requirements.
 
- ## Financial contributions
+ To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
 
- We also welcome financial contributions in full transparency on our [open collective](https://opencollective.com/karafka).
- Anyone can file an expense. If the expense makes sense for the development of the community, it will be "merged" in the ledger of our open collective by the core contributors and the person who filed the expense will be reimbursed.
+ Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
+
+ [![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/commit_builds)
 
  ## Questions
 
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- karafka (1.1.0)
+ karafka (1.1.1)
  activesupport (>= 5.0)
  dry-configurable (~> 0.7)
  dry-validation (~> 0.11)
@@ -64,7 +64,7 @@ GEM
  minitest (5.10.3)
  multi_json (1.12.2)
  null-logger (0.1.4)
- rake (12.2.1)
+ rake (12.3.0)
  require_all (1.4.0)
  rspec (3.7.0)
  rspec-core (~> 3.7.0)
@@ -108,4 +108,4 @@ DEPENDENCIES
  waterdrop
 
  BUNDLED WITH
- 1.15.4
+ 1.16.0
data/README.md CHANGED
@@ -62,26 +62,17 @@ Karafka framework and Karafka team are __not__ related to Kafka streaming servic
  * [Karafka Travis CI](https://travis-ci.org/karafka/karafka)
  * [Karafka Coditsu](https://app.coditsu.io/karafka/repositories/karafka)
 
- ## Note on Patches/Pull Requests
+ ## Note on contributions
 
- Fork the project.
- Make your feature addition or bug fix.
- Add tests for it. This is important so we don't break it in a future versions unintentionally.
- Commit, do not mess with Rakefile, version, or history. (if you want to have your own version, that is fine but bump version in a commit by itself I can ignore when I pull). Send me a pull request. Bonus points for topic branches.
+ First, thank you for considering contributing to Karafka! It's people like you that make the open source community such a great community!
 
- [![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/repositories/karafka)
+ Each pull request must pass all the rspec specs and meet our quality requirements.
 
- Each pull request must pass our quality requirements. To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation.
+ To check if everything is as it should be, we use [Coditsu](https://coditsu.io) that combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
 
- Unfortunately, it does not yet support independent forks, however you should be fine by looking at what we require.
+ Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/commit_builds) of Karafka organization.
 
- Please run:
-
- ```bash
- bundle exec rspec
- ```
-
- to check if everything is in order. After that you can submit a pull request.
+ [![coditsu](https://coditsu.io/assets/quality_bar.svg)](https://app.coditsu.io/karafka/commit_builds)
 
  ## Contributors
 
@@ -92,7 +92,7 @@ module Karafka
  # @param parser_class [Class] parser class that we can use to generate appropriate string
  # or nothing if we want to default to Karafka::Parsers::Json
  # @return [Karafka::BaseResponder] base responder descendant responder
- def initialize(parser_class = Karafka::Parsers::Json)
+ def initialize(parser_class = Karafka::App.config.parser)
  @parser_class = parser_class
  @messages_buffer = {}
  end
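With this change, a responder built without an explicit parser picks up ```Karafka::App.config.parser``` instead of a hardcoded JSON parser. A hedged usage sketch (`VideosResponder` and `MyXmlParser` are hypothetical):

```ruby
# Minimal sketch: the default parser now follows the app-wide config
# setting, while an explicit class still overrides it per instance.
VideosResponder.new.call(video_payload)               # uses Karafka::App.config.parser
VideosResponder.new(MyXmlParser).call(video_payload)  # per-instance override
```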
@@ -39,12 +39,6 @@ module Karafka
  @responder ||= Karafka::Responders::Builder.new(controller).build
  end
 
- # @return [Class] Parser class (not instance) that we want to use to unparse Kafka messages
- # @note If not provided - will use Json as default
- def parser
- @parser ||= Karafka::Parsers::Json
- end
-
  Karafka::AttributesMap.topic.each do |attribute|
  config_retriever_for(attribute)
  end
@@ -34,6 +34,8 @@ module Karafka
  # - #incoming - for remapping from the incoming message to our internal format
  # - #outgoing - for remapping from internal topic name into outgoing message
  setting :topic_mapper, -> { Routing::TopicMapper }
+ # Default parser for parsing and unparsing incoming and outgoing data
+ setting :parser, -> { Karafka::Parsers::Json }
  # If batch_fetching is true, we will fetch kafka messages in batches instead of 1 by 1
  # @note Fetching does not equal consuming, see batch_consuming description for details
  setting :batch_fetching, true
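The new ```parser``` setting above defaults to ```Karafka::Parsers::Json```. A hedged sketch of a custom parser that could be assigned to it instead (`MsgpackParser` is hypothetical and assumed to mirror the class-level parse/generate interface of the JSON parser):

```ruby
require 'msgpack'

# Hypothetical parser conforming to the same interface as
# Karafka::Parsers::Json (class-level parse and generate).
class MsgpackParser
  # Turns a raw Kafka message payload into a Ruby object
  def self.parse(content)
    MessagePack.unpack(content)
  end

  # Serializes outgoing data (used by responders)
  def self.generate(content)
    MessagePack.pack(content)
  end
end
```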
@@ -3,5 +3,5 @@
  # Main module namespace
  module Karafka
  # Current Karafka version
- VERSION = '1.1.0'
+ VERSION = '1.1.1'
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: karafka
  version: !ruby/object:Gem::Version
- version: 1.1.0
+ version: 1.1.1
  platform: ruby
  authors:
  - Maciej Mensfeld
@@ -10,7 +10,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-11-07 00:00:00.000000000 Z
+ date: 2017-11-20 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: activesupport
@@ -163,7 +163,6 @@ extensions: []
  extra_rdoc_files: []
  files:
  - ".console_irbrc"
- - ".github/ISSUE_TEMPLATE.md"
  - ".gitignore"
  - ".rspec"
  - ".ruby-gemset"
@@ -1,2 +0,0 @@
- <!-- Love karafka? Please consider supporting our collective:
- 👉 https://opencollective.com/karafka/donate -->