logstash-input-kafka 5.1.11 → 6.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 16e4afb2f078c43e0d2c2cec5e626a39a05763ff
- data.tar.gz: 548f91e907e2bc73a90ce449c87f8723652c1c78
+ metadata.gz: 2f20d67ea46283bd3aaf21569aab83e631c19921
+ data.tar.gz: e3da91a864987986ceef9acc22080f13a2845399
  SHA512:
- metadata.gz: 4906c8d3c5ea405220c027d5c757283ee4c555cbe626f1459e25231da0f6350524fb395630545f092cbff25c3b83dc75b1e585febd55400c4c53af76817e215f
- data.tar.gz: a4bda274415f4c3c6a42faf2568d8871d697686a16156deee49e4c50acd6efde90229b1155e45dc2c88b4e628377ad3147cf6c8aa26f941a63090c461613d9d6
+ metadata.gz: c8ddadc5b3151ee32602c2fed08698bf58459ab19590e6fe0c5a775ea74087649239870ed2aee591a6db1a938b675d5ab9cfd1f2a8bcf4cceeea48e612227f53
+ data.tar.gz: eb73a4f54c954c9b522a15c61ea6930ece83a3854a63c1cb3d82de7c08254a411795534a98ebc662ef0d0b73e21908c9d7adf48c0a95b25329b4b4eb18089de3
data/CHANGELOG.md CHANGED
@@ -1,42 +1,5 @@
- ## 5.1.11
- Documentation fixes
-
- ## 5.1.10
- Documentation fixes
-
- ## 5.1.9
- - Docs: Fix topic title.
-
- ## 5.1.8
- - Docs: Fix asciidoc problems.
-
- ## 5.1.7
- - Fix NPE when SASL_SSL+PLAIN (no Kerberos) options are specified.
-
- ## 5.1.6
- - fix: Client ID is no longer reused across multiple Kafka consumer instances
-
- ## 5.1.5
- - Fix a bug where consumer was not correctly setup when `SASL_SSL` option was specified.
-
- ## 5.1.4
- - Fix log4j bootstrapping for 2.4
-
- ## 5.1.3
- - Make error reporting clearer when connection fails
- - set kerberos only when using GSSAPI
-
- ## 5.1.2
- - Docs: Move info about security features out of the compatibility matrix and into the main text.
-
- ## 5.1.1
- - Docs: Clarify compatibility matrix and remove it from the changelog to avoid duplication.
-
- ## 5.1.0
- - Add Kerberos authentication support.
-
- ## 5.0.6
- - default `poll_timeout_ms` to 100ms
+ ## 6.0.0
+ - Breaking: Support for Kafka 0.10.1.0. Only supports brokers 0.10.1.x or later.

  ## 5.0.5
  - place setup_log4j for logging registration behind version check
data/README.md CHANGED
@@ -6,14 +6,16 @@ This is a plugin for [Logstash](https://github.com/elastic/logstash).

  It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

- ## Logging
+ ## Kafka Compatibility

- Kafka logs do not respect the Log4J2 root logger level and defaults to INFO, for other levels, you must explicitly set the log level in your Logstash deployment's `log4j2.properties` file, e.g.:
- ```
- logger.kafka.name=org.apache.kafka
- logger.kafka.appenderRef.console.ref=console
- logger.kafka.level=debug
- ```
+ Here's a table that describes the compatibility matrix for Kafka Broker support. Please remember that it is good advice to upgrade brokers before consumers/producers since brokers target backwards compatibility. The 0.9 broker will work with both the 0.8 consumer and 0.9 consumer APIs but not the other way around.
+
+ | Kafka Broker Version | Logstash Version | Input Plugin | Output Plugin | Why? |
+ |:---------------:|:------------------:|:--------------:|:---------------:|:------|
+ | 0.8 | 2.0 - 2.x | < 3.0.0 | <3.0.0 | Legacy, 0.8 is still popular |
+ | 0.9 | 2.0 - 2.3.x | 3.0.0 | 3.0.0 | Intermediate release before 0.10 that works with old Ruby Event API `[]` |
+ | 0.9 | 2.4, 5.0 | 4.0.0 | 4.0.0 | Intermediate release before 0.10 with new get/set API |
+ | 0.10 | 2.4, 5.0 | 5.0.0 | 5.0.0 | Track latest Kafka release. Not compatible with 0.9 broker |

  ## Documentation
 
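To make the matrix above concrete, here is a minimal usage sketch in the style of this gem's integration specs (shown later in this diff). The broker address, topic, and group id are placeholder values, and it assumes a broker version that matches the plugin per the table:

```ruby
# Minimal sketch: feed events from a Kafka topic into a plain Ruby queue.
require "logstash/inputs/kafka"

input = LogStash::Inputs::Kafka.new(
  "bootstrap_servers" => "localhost:9092",    # placeholder broker address
  "topics"            => ["logstash_topic_plain"],
  "group_id"          => "example_group",     # placeholder consumer group
  "auto_offset_reset" => "earliest"
)
input.register
queue = Queue.new
input.run(queue)  # blocks; decoded events are pushed onto queue
```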
data/lib/logstash/inputs/kafka.rb CHANGED
@@ -4,32 +4,25 @@ require 'stud/interval'
  require 'java'
  require 'logstash-input-kafka_jars.rb'

- # This input will read events from a Kafka topic. It uses the 0.10 version of the
- # consumer API provided by Kafka to read messages from the broker.
+ # This input will read events from a Kafka topic. It uses the newly designed
+ # 0.10 version of the consumer API provided by Kafka to read messages from the broker.
  #
  # Here's a compatibility matrix that shows the Kafka client versions that are compatible with each combination
  # of Logstash and the Kafka input plugin:
  #
  # [options="header"]
  # |==========================================================
- # |Kafka Client Version |Logstash Version |Plugin Version |Why?
- # |0.8 |2.0.0 - 2.x.x |<3.0.0 |Legacy, 0.8 is still popular
- # |0.9 |2.0.0 - 2.3.x | 3.x.x |Works with the old Ruby Event API (`event['product']['price'] = 10`)
- # |0.9 |2.4.x - 5.x.x | 4.x.x |Works with the new getter/setter APIs (`event.set('[product][price]', 10)`)
- # |0.10.0.x |2.4.x - 5.x.x | 5.x.x |Not compatible with the <= 0.9 broker
+ # |Kafka Client Version |Logstash Version |Plugin Version |Security Features |Why?
+ # |0.8 |2.0.0 - 2.x.x |<3.0.0 | |Legacy, 0.8 is still popular
+ # |0.9 |2.0.0 - 2.3.x | 3.x.x |Basic Auth, SSL |Works with the old Ruby Event API (`event['product']['price'] = 10`)
+ # |0.9 |2.4.0 - 5.0.x | 4.x.x |Basic Auth, SSL |Works with the new getter/setter APIs (`event.set('[product][price]', 10)`)
+ # |0.10 |2.4.0 - 5.0.x | 5.x.x |Basic Auth, SSL |Not compatible with the 0.9 broker
  # |==========================================================
  #
  # NOTE: We recommended that you use matching Kafka client and broker versions. During upgrades, you should
  # upgrade brokers before clients because brokers target backwards compatibility. For example, the 0.9 broker
  # is compatible with both the 0.8 consumer and 0.9 consumer APIs, but not the other way around.
  #
- # This input supports connecting to Kafka over:
- #
- # * SSL (requires plugin version 3.0.0 or later)
- # * Kerberos SASL (requires plugin version 5.1.0 or later)
- #
- # By default security is disabled but can be turned on as needed.
- #
  # The Logstash Kafka consumer handles group management and uses the default offset management
  # strategy using Kafka topics.
  #
@@ -46,6 +39,9 @@ require 'logstash-input-kafka_jars.rb'
  #
  # Kafka consumer configuration: http://kafka.apache.org/documentation.html#consumerconfigs
  #
+ # This version also adds support for SSL/TLS security connection to Kafka. By default SSL is
+ # disabled but can be turned on as needed.
+ #
  class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  config_name 'kafka'
 
@@ -147,51 +143,17 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  # The topics configuration will be ignored when using this configuration.
  config :topics_pattern, :validate => :string
  # Time kafka consumer will wait to receive new messages from topics
- config :poll_timeout_ms, :validate => :number, :default => 100
+ config :poll_timeout_ms, :validate => :number
  # Enable SSL/TLS secured communication to Kafka broker.
- config :ssl, :validate => :boolean, :default => false, :deprecated => "Use security_protocol => 'ssl'"
- # The truststore type.
- config :ssl_truststore_type, :validate => :string
+ config :ssl, :validate => :boolean, :default => false
  # The JKS truststore path to validate the Kafka broker's certificate.
  config :ssl_truststore_location, :validate => :path
  # The truststore password
  config :ssl_truststore_password, :validate => :password
- # The keystore type.
- config :ssl_keystore_type, :validate => :string
  # If client authentication is required, this setting stores the keystore path.
  config :ssl_keystore_location, :validate => :path
  # If client authentication is required, this setting stores the keystore password
  config :ssl_keystore_password, :validate => :password
- # The password of the private key in the key store file.
- config :ssl_key_password, :validate => :password
- # Security protocol to use, which can be either of PLAINTEXT,SSL,SASL_PLAINTEXT,SASL_SSL
- config :security_protocol, :validate => ["PLAINTEXT", "SSL", "SASL_PLAINTEXT", "SASL_SSL"], :default => "PLAINTEXT"
- # http://kafka.apache.org/documentation.html#security_sasl[SASL mechanism] used for client connections.
- # This may be any mechanism for which a security provider is available.
- # GSSAPI is the default mechanism.
- config :sasl_mechanism, :validate => :string, :default => "GSSAPI"
- # The Kerberos principal name that Kafka broker runs as.
- # This can be defined either in Kafka's JAAS config or in Kafka's config.
- config :sasl_kerberos_service_name, :validate => :string
- # The Java Authentication and Authorization Service (JAAS) API supplies user authentication and authorization
- # services for Kafka. This setting provides the path to the JAAS file. Sample JAAS file for Kafka client:
- # [source,java]
- # ----------------------------------
- # KafkaClient {
- # com.sun.security.auth.module.Krb5LoginModule required
- # useTicketCache=true
- # renewTicket=true
- # serviceName="kafka";
- # };
- # ----------------------------------
- #
- # Please note that specifying `jaas_path` and `kerberos_config` in the config file will add these
- # to the global JVM system properties. This means if you have multiple Kafka inputs, all of them would be sharing the same
- # `jaas_path` and `kerberos_config`. If this is not desirable, you would have to run separate instances of Logstash on
- # different JVM instances.
- config :jaas_path, :validate => :path
- # Optional path to kerberos config file. This is krb5.conf style as detailed in https://web.mit.edu/kerberos/krb5-1.12/doc/admin/conf_files/krb5_conf.html
- config :kerberos_config, :validate => :path
  # Option to add Kafka metadata like topic, message size to the event.
  # This will add a field named `kafka` to the logstash event containing the following attributes:
  # `topic`: The topic this message is associated with
@@ -204,12 +166,17 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base

  public
  def register
+ # Logstash 2.4
+ if defined?(LogStash::Logger) && LogStash::Logger.respond_to?(:setup_log4j)
+ LogStash::Logger.setup_log4j(@logger)
+ end
+
  @runner_threads = []
  end # def register

  public
  def run(logstash_queue)
- @runner_consumers = consumer_threads.times.map { |i| create_consumer("#{client_id}-#{i}") }
+ @runner_consumers = consumer_threads.times.map { || create_consumer }
  @runner_threads = @runner_consumers.map { |consumer| thread_runner(logstash_queue, consumer) }
  @runner_threads.each { |t| t.join }
  end # def run
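The new `run` above fans out one consumer per `consumer_threads`, wraps each in `thread_runner`, and joins the threads until shutdown. A self-contained sketch of that fan-out pattern, with stand-in strings in place of real Kafka consumers:

```ruby
# Fan-out sketch: one stand-in "consumer" per configured thread, each serviced
# by its own Ruby thread; join blocks until every worker finishes, as run does.
consumer_threads = 3
consumers = consumer_threads.times.map { |i| "consumer-#{i}" }  # stand-ins
queue = Queue.new
threads = consumers.map do |consumer|
  Thread.new { queue << "#{consumer} polled" }  # stand-in for thread_runner's poll loop
end
threads.each(&:join)
puts queue.length  # => 3
```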
@@ -219,11 +186,6 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  @runner_consumers.each { |c| c.wakeup }
  end

- public
- def kafka_consumers
- @runner_consumers
- end
-
  private
  def thread_runner(logstash_queue, consumer)
  Thread.new do
@@ -260,7 +222,7 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  end

  private
- def create_consumer(client_id)
+ def create_consumer
  begin
  props = java.util.Properties.new
  kafka = org.apache.kafka.clients.consumer.ConsumerConfig
@@ -290,47 +252,20 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  props.put(kafka::SESSION_TIMEOUT_MS_CONFIG, session_timeout_ms) unless session_timeout_ms.nil?
  props.put(kafka::VALUE_DESERIALIZER_CLASS_CONFIG, value_deserializer_class)

- props.put("security.protocol", security_protocol) unless security_protocol.nil?
+ if ssl
+ props.put("security.protocol", "SSL")
+ props.put("ssl.truststore.location", ssl_truststore_location)
+ props.put("ssl.truststore.password", ssl_truststore_password.value) unless ssl_truststore_password.nil?

- if security_protocol == "SSL"
- set_trustore_keystore_config(props)
- elsif security_protocol == "SASL_PLAINTEXT"
- set_sasl_config(props)
- elsif security_protocol == "SASL_SSL"
- set_trustore_keystore_config(props)
- set_sasl_config(props)
+ #Client auth stuff
+ props.put("ssl.keystore.location", ssl_keystore_location) unless ssl_keystore_location.nil?
+ props.put("ssl.keystore.password", ssl_keystore_password.value) unless ssl_keystore_password.nil?
  end

  org.apache.kafka.clients.consumer.KafkaConsumer.new(props)
  rescue => e
- logger.error("Unable to create Kafka consumer from given configuration",
- :kafka_error_message => e,
- :cause => e.respond_to?(:getCause) ? e.getCause() : nil)
+ @logger.error("Unable to create Kafka consumer from given configuration", :kafka_error_message => e)
  throw e
  end
  end
-
- def set_trustore_keystore_config(props)
- props.put("ssl.truststore.type", ssl_truststore_type) unless ssl_truststore_type.nil?
- props.put("ssl.truststore.location", ssl_truststore_location)
- props.put("ssl.truststore.password", ssl_truststore_password.value) unless ssl_truststore_password.nil?
-
- # Client auth stuff
- props.put("ssl.keystore.type", ssl_keystore_type) unless ssl_keystore_type.nil?
- props.put("ssl.key.password", ssl_key_password.value) unless ssl_key_password.nil?
- props.put("ssl.keystore.location", ssl_keystore_location) unless ssl_keystore_location.nil?
- props.put("ssl.keystore.password", ssl_keystore_password.value) unless ssl_keystore_password.nil?
- end
-
- def set_sasl_config(props)
- java.lang.System.setProperty("java.security.auth.login.config",jaas_path) unless jaas_path.nil?
- java.lang.System.setProperty("java.security.krb5.conf",kerberos_config) unless kerberos_config.nil?
-
- props.put("sasl.mechanism",sasl_mechanism)
- if sasl_mechanism == "GSSAPI" && sasl_kerberos_service_name.nil?
- raise LogStash::ConfigurationError, "sasl_kerberos_service_name must be specified when SASL mechanism is GSSAPI"
- end
-
- props.put("sasl.kerberos.service.name",sasl_kerberos_service_name) unless sasl_kerberos_service_name.nil?
- end
  end #class LogStash::Inputs::Kafka
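For reference, a hedged sketch of exercising the simplified SSL path that `create_consumer` now wires up; every path and password below is a placeholder, not a plugin default:

```ruby
# SSL sketch: run will call create_consumer, which maps these options onto
# security.protocol=SSL plus the truststore/keystore properties shown above.
require "logstash/inputs/kafka"

input = LogStash::Inputs::Kafka.new(
  "topics"                  => ["logstash_topic_plain"],
  "ssl"                     => true,
  "ssl_truststore_location" => "/etc/kafka/client.truststore.jks",  # placeholder
  "ssl_truststore_password" => "changeit",                          # placeholder
  # Only needed when the broker enforces client authentication:
  "ssl_keystore_location"   => "/etc/kafka/client.keystore.jks",    # placeholder
  "ssl_keystore_password"   => "changeit"                           # placeholder
)
input.register
```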
data/logstash-input-kafka.gemspec CHANGED
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-input-kafka'
- s.version = '5.1.11'
+ s.version = '6.0.0'
  s.licenses = ['Apache License (2.0)']
  s.summary = 'This input will read events from a Kafka topic. It uses the high level consumer API provided by Kafka to read messages from the broker'
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -18,8 +18,8 @@ Gem::Specification.new do |s|
  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { 'logstash_plugin' => 'true', 'group' => 'input'}

- s.requirements << "jar 'org.apache.kafka:kafka-clients', '0.10.0.1'"
- s.requirements << "jar 'org.apache.logging.log4j:log4j-slf4j-impl', '2.8.2'"
+ s.requirements << "jar 'org.apache.kafka:kafka-clients', '0.10.1.0'"
+ s.requirements << "jar 'org.slf4j:slf4j-log4j12', '1.7.21'"

  s.add_development_dependency 'jar-dependencies', '~> 0.3.2'
 
data/spec/integration/inputs/kafka_spec.rb CHANGED
@@ -10,9 +10,7 @@ describe "inputs/kafka", :integration => true do
  let(:group_id_1) {rand(36**8).to_s(36)}
  let(:group_id_2) {rand(36**8).to_s(36)}
  let(:group_id_3) {rand(36**8).to_s(36)}
- let(:group_id_4) {rand(36**8).to_s(36)}
  let(:plain_config) { { 'topics' => ['logstash_topic_plain'], 'codec' => 'plain', 'group_id' => group_id_1, 'auto_offset_reset' => 'earliest'} }
- let(:multi_consumer_config) { plain_config.merge({"group_id" => group_id_4, "client_id" => "spec", "consumer_threads" => 3}) }
  let(:snappy_config) { { 'topics' => ['logstash_topic_snappy'], 'codec' => 'plain', 'group_id' => group_id_1, 'auto_offset_reset' => 'earliest'} }
  let(:lz4_config) { { 'topics' => ['logstash_topic_lz4'], 'codec' => 'plain', 'group_id' => group_id_1, 'auto_offset_reset' => 'earliest'} }
  let(:pattern_config) { { 'topics_pattern' => 'logstash_topic_.*', 'group_id' => group_id_2, 'codec' => 'plain', 'auto_offset_reset' => 'earliest'} }
@@ -55,21 +53,11 @@ describe "inputs/kafka", :integration => true do
  wait(timeout_seconds).for { queue.length }.to eq(num_events)
  expect(queue.length).to eq(num_events)
  end
-
- it "should consumer all messages with multiple consumers" do
- kafka_input = LogStash::Inputs::Kafka.new(multi_consumer_config)
- queue = Array.new
- t = thread_it(kafka_input, queue)
- t.run
- wait(timeout_seconds).for { queue.length }.to eq(num_events)
- expect(queue.length).to eq(num_events)
- kafka_input.kafka_consumers.each_with_index do |consumer, i|
- expect(consumer.metrics.keys.first.tags["client-id"]).to eq("spec-#{i}")
- end
- end
+
  end

  describe "#kafka-topics-pattern" do
+
  def thread_it(kafka_input, queue)
  Thread.new do
  begin
@@ -77,7 +65,7 @@ describe "inputs/kafka", :integration => true do
  end
  end
  end
-
+
  it "should consume all messages from all 3 topics" do
  kafka_input = LogStash::Inputs::Kafka.new(pattern_config)
  queue = Array.new
@@ -85,7 +73,7 @@ describe "inputs/kafka", :integration => true do
  t.run
  wait(timeout_seconds).for { queue.length }.to eq(3*num_events)
  expect(queue.length).to eq(3*num_events)
- end
+ end
  end

  describe "#kafka-decorate" do
@@ -96,7 +84,7 @@ describe "inputs/kafka", :integration => true do
  end
  end
  end
-
+
  it "should show the right topic and group name in decorated kafka section" do
  kafka_input = LogStash::Inputs::Kafka.new(decorate_config)
  queue = Queue.new
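The spec hunks above only show the opening lines of the `thread_it` helper; a plausible completion, inferred from how the examples call it (a reconstruction, not part of this diff):

```ruby
# Reconstructed helper: run the input on a background thread so the examples
# can poll the queue with wait(timeout_seconds).for { queue.length }.
def thread_it(kafka_input, queue)
  Thread.new do
    begin
      kafka_input.run(queue)
    end
  end
end
```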
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-input-kafka
  version: !ruby/object:Gem::Version
- version: 5.1.11
+ version: 6.0.0
  platform: ruby
  authors:
  - Elasticsearch
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-08-21 00:00:00.000000000 Z
+ date: 2016-11-02 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -138,11 +138,11 @@ files:
  - logstash-input-kafka.gemspec
  - spec/integration/inputs/kafka_spec.rb
  - spec/unit/inputs/kafka_spec.rb
- - vendor/jar-dependencies/runtime-jars/kafka-clients-0.10.0.1.jar
- - vendor/jar-dependencies/runtime-jars/log4j-api-2.8.2.jar
- - vendor/jar-dependencies/runtime-jars/log4j-slf4j-impl-2.8.2.jar
+ - vendor/jar-dependencies/runtime-jars/kafka-clients-0.10.1.0.jar
+ - vendor/jar-dependencies/runtime-jars/log4j-1.2.17.jar
  - vendor/jar-dependencies/runtime-jars/lz4-1.3.0.jar
  - vendor/jar-dependencies/runtime-jars/slf4j-api-1.7.21.jar
+ - vendor/jar-dependencies/runtime-jars/slf4j-log4j12-1.7.21.jar
  - vendor/jar-dependencies/runtime-jars/snappy-java-1.1.2.6.jar
  homepage: http://www.elastic.co/guide/en/logstash/current/index.html
  licenses:
@@ -165,8 +165,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements:
- - jar 'org.apache.kafka:kafka-clients', '0.10.0.1'
- - jar 'org.apache.logging.log4j:log4j-slf4j-impl', '2.8.2'
+ - jar 'org.apache.kafka:kafka-clients', '0.10.1.0'
+ - jar 'org.slf4j:slf4j-log4j12', '1.7.21'
  rubyforge_project:
  rubygems_version: 2.4.8
  signing_key: