logstash-integration-kafka 10.5.3-java → 10.7.3-java

Files changed (29)
  1. checksums.yaml +4 -4
  2. data/CHANGELOG.md +17 -0
  3. data/README.md +1 -1
  4. data/docs/input-kafka.asciidoc +58 -7
  5. data/lib/logstash-integration-kafka_jars.rb +13 -4
  6. data/lib/logstash/inputs/kafka.rb +70 -7
  7. data/lib/logstash/outputs/kafka.rb +0 -1
  8. data/lib/logstash/plugin_mixins/common.rb +93 -0
  9. data/logstash-integration-kafka.gemspec +3 -1
  10. data/spec/integration/inputs/kafka_spec.rb +251 -13
  11. data/spec/unit/inputs/avro_schema_fixture_payment.asvc +8 -0
  12. data/spec/unit/inputs/kafka_spec.rb +38 -5
  13. data/vendor/jar-dependencies/com/github/luben/zstd-jni/1.4.4-7/zstd-jni-1.4.4-7.jar +0 -0
  14. data/vendor/jar-dependencies/io/confluent/common-config/5.5.1/common-config-5.5.1.jar +0 -0
  15. data/vendor/jar-dependencies/io/confluent/common-utils/5.5.1/common-utils-5.5.1.jar +0 -0
  16. data/vendor/jar-dependencies/io/confluent/kafka-avro-serializer/5.5.1/kafka-avro-serializer-5.5.1.jar +0 -0
  17. data/vendor/jar-dependencies/io/confluent/kafka-schema-registry-client/5.5.1/kafka-schema-registry-client-5.5.1.jar +0 -0
  18. data/vendor/jar-dependencies/io/confluent/kafka-schema-serializer/5.5.1/kafka-schema-serializer-5.5.1.jar +0 -0
  19. data/vendor/jar-dependencies/javax/ws/rs/javax.ws.rs-api/2.1.1/javax.ws.rs-api-2.1.1.jar +0 -0
  20. data/vendor/jar-dependencies/org/apache/avro/avro/1.9.2/avro-1.9.2.jar +0 -0
  21. data/vendor/jar-dependencies/org/apache/kafka/kafka-clients/{2.4.1/kafka-clients-2.4.1.jar → 2.5.1/kafka-clients-2.5.1.jar} +0 -0
  22. data/vendor/jar-dependencies/org/apache/kafka/kafka_2.12/2.5.1/kafka_2.12-2.5.1.jar +0 -0
  23. data/vendor/jar-dependencies/org/glassfish/jersey/core/jersey-common/2.33/jersey-common-2.33.jar +0 -0
  24. data/vendor/jar-dependencies/org/lz4/lz4-java/1.7.1/lz4-java-1.7.1.jar +0 -0
  25. data/vendor/jar-dependencies/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar +0 -0
  26. metadata +52 -6
  27. data/vendor/jar-dependencies/com/github/luben/zstd-jni/1.4.3-1/zstd-jni-1.4.3-1.jar +0 -0
  28. data/vendor/jar-dependencies/org/lz4/lz4-java/1.6.0/lz4-java-1.6.0.jar +0 -0
  29. data/vendor/jar-dependencies/org/slf4j/slf4j-api/1.7.28/slf4j-api-1.7.28.jar +0 -0
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 4383db6ec7c8fa26ef358d104c490f51620f615afb2f68359b6f6e98d4e58f8b
- data.tar.gz: 040637202d15cb1e5784104ff505f10a6610e91187a57f104a8f81cd2b24475a
+ metadata.gz: 1049dfcad573e64aed24bebc6114b0345ce6513e43c2104c54ac8e3ed6602e57
+ data.tar.gz: 41260002d0a2765b4beeac73bdd8d8f100e0b3e058c1e5dbac4f2a460a29a3d0
  SHA512:
- metadata.gz: 98da085bceebd241a6d45f9166aa4ff1a132551cd2cda8825ceab3c22ebbb5f78579f0d1a0b596aeaf3d504147d656cf2ef784570ef3fd9ab98c792ae6e15be4
- data.tar.gz: f552e5ec8d84f3ae7d85b3d4bc4ba4f7309321ea81efaa9bfb69a4a493141868b2576a626f0ee061bb0ebe6cbc8071993dd8592bc53030478baad2b5173e1086
+ metadata.gz: 744b4621e51c32ef882c7f5521be467c75805c2ba6ee8eaf225567412fe9e6bade35e60ae7fa85b87de50f68a191f5408b0504025e28071e11eb227ab5f428c8
+ data.tar.gz: feb75d5f0da3523e891253213cf52b49460881c125adf2fc82710f96c0feaa3eb4630468bbec7f1182c6a52e44b060362b5dd946b6194c066821a3de4c6e1314
data/CHANGELOG.md CHANGED
@@ -1,3 +1,20 @@
+ ## 10.7.3
+ - Changed `decorate_events` to also add Kafka headers [#78](https://github.com/logstash-plugins/logstash-integration-kafka/pull/78)
+
+ ## 10.7.2
+ - Update Jersey dependency to version 2.33 [#75](https://github.com/logstash-plugins/logstash-integration-kafka/pull/75)
+
+ ## 10.7.1
+ - Fix: dropped usage of SHUTDOWN event deprecated since Logstash 5.0 [#71](https://github.com/logstash-plugins/logstash-integration-kafka/pull/71)
+
+ ## 10.7.0
+ - Switched from Faraday to Manticore as the HTTP client library used to access the Schema Registry service,
+   to fix issue [#63](https://github.com/logstash-plugins/logstash-integration-kafka/pull/63)
+
+ ## 10.6.0
+ - Added functionality to the Kafka input to use the Avro deserializer when retrieving data from Kafka. The schema is retrieved
+   from an instance of Confluent's Schema Registry service [#51](https://github.com/logstash-plugins/logstash-integration-kafka/pull/51)
+
  ## 10.5.3
  - Fix: set (optional) truststore when endpoint id check disabled [#60](https://github.com/logstash-plugins/logstash-integration-kafka/pull/60).
    Since **10.1.0** disabling server host-name verification (`ssl_endpoint_identification_algorithm => ""`) did not allow
data/README.md CHANGED
@@ -1,6 +1,6 @@
  # Logstash Plugin

- [![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-integration-kafka.svg)](https://travis-ci.org/logstash-plugins/logstash-integration-kafka)
+ [![Travis Build Status](https://travis-ci.com/logstash-plugins/logstash-integration-kafka.svg)](https://travis-ci.com/logstash-plugins/logstash-integration-kafka)

  This is a plugin for [Logstash](https://github.com/elastic/logstash).

data/docs/input-kafka.asciidoc CHANGED
@@ -73,7 +73,7 @@ either when the record was created (default) or when it was received by the
  broker. See more about property log.message.timestamp.type at
  https://kafka.apache.org/{kafka_client_doc}/documentation.html#brokerconfigs

- Metadata is only added to the event if the `decorate_events` option is set to true (it defaults to false).
+ Metadata is only added to the event if the `decorate_events` option is set to `basic` or `extended` (it defaults to `none`).

  Please note that `@metadata` fields are not part of any of your events at output time. If you need this information to be
  inserted into your original event, you'll have to use the `mutate` filter to manually copy the required fields into your `event`.
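As an illustration, here is a minimal pipeline sketch of that copy step (the topic name and target field names are placeholders, not taken from this diff); it assumes `decorate_events` is enabled so the `@metadata` fields exist:

```
input {
  kafka {
    topics          => ["example_topic"]   # hypothetical topic name
    decorate_events => "basic"
  }
}
filter {
  mutate {
    # copy Kafka metadata into regular fields so it survives to the output
    add_field => {
      "[kafka][topic]"  => "%{[@metadata][kafka][topic]}"
      "[kafka][offset]" => "%{[@metadata][kafka][offset]}"
    }
  }
}
```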
@@ -99,7 +99,7 @@ See the https://kafka.apache.org/{kafka_client_doc}/documentation for more detai
  | <<plugins-{type}s-{plugin}-client_rack>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-connections_max_idle_ms>> |<<number,number>>|No
  | <<plugins-{type}s-{plugin}-consumer_threads>> |<<number,number>>|No
- | <<plugins-{type}s-{plugin}-decorate_events>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-decorate_events>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-enable_auto_commit>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-exclude_internal_topics>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-fetch_max_bytes>> |<<number,number>>|No
@@ -124,6 +124,10 @@ See the https://kafka.apache.org/{kafka_client_doc}/documentation for more detai
  | <<plugins-{type}s-{plugin}-sasl_jaas_config>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-sasl_kerberos_service_name>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-sasl_mechanism>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-schema_registry_key>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-schema_registry_proxy>> |<<uri,uri>>|No
+ | <<plugins-{type}s-{plugin}-schema_registry_secret>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-schema_registry_url>> |<<uri,uri>>|No
  | <<plugins-{type}s-{plugin}-security_protocol>> |<<string,string>>, one of `["PLAINTEXT", "SSL", "SASL_PLAINTEXT", "SASL_SSL"]`|No
  | <<plugins-{type}s-{plugin}-send_buffer_bytes>> |<<number,number>>|No
  | <<plugins-{type}s-{plugin}-session_timeout_ms>> |<<number,number>>|No
@@ -242,10 +246,16 @@ balance — more threads than partitions means that some threads will be idl
  [id="plugins-{type}s-{plugin}-decorate_events"]
  ===== `decorate_events`

- * Value type is <<boolean,boolean>>
- * Default value is `false`
-
- Option to add Kafka metadata like topic, message size to the event.
+ * Value type is <<string,string>>
+ * Accepted values are:
+   - `none`: no metadata is added
+   - `basic`: record's attributes are added
+   - `extended`: record's attributes and headers are added
+   - `false`: deprecated alias for `none`
+   - `true`: deprecated alias for `basic`
+ * Default value is `none`
+
+ Option to add Kafka metadata like topic, message size and header key values to the event.
  This will add a field named `kafka` to the logstash event containing the following attributes:

  * `topic`: The topic this message is associated with
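For reference, a minimal input sketch using the new `extended` mode (the broker address and topic below are placeholders, not from this diff); readable record headers then show up under `[@metadata][kafka][headers]`:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"    # placeholder broker address
    topics            => ["example_topic"]   # hypothetical topic name
    decorate_events   => "extended"          # record attributes plus UTF-8 header values
  }
}
```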
@@ -528,6 +538,44 @@ http://kafka.apache.org/documentation.html#security_sasl[SASL mechanism] used fo
  This may be any mechanism for which a security provider is available.
  GSSAPI is the default mechanism.

+ [id="plugins-{type}s-{plugin}-schema_registry_key"]
+ ===== `schema_registry_key`
+
+ * Value type is <<string,string>>
+ * There is no default value for this setting.
+
+ Set the username for basic authorization to access the remote Schema Registry.
+
+ [id="plugins-{type}s-{plugin}-schema_registry_proxy"]
+ ===== `schema_registry_proxy`
+
+ * Value type is <<uri,uri>>
+ * There is no default value for this setting.
+
+ Set the address of a forward HTTP proxy. An empty string is treated as if the proxy was not set.
+
+ [id="plugins-{type}s-{plugin}-schema_registry_secret"]
+ ===== `schema_registry_secret`
+
+ * Value type is <<string,string>>
+ * There is no default value for this setting.
+
+ Set the password for basic authorization to access the remote Schema Registry.
+
+ [id="plugins-{type}s-{plugin}-schema_registry_url"]
+ ===== `schema_registry_url`
+
+ * Value type is <<uri,uri>>
+
+ The URI that points to an instance of the
+ https://docs.confluent.io/current/schema-registry/index.html[Schema Registry] service,
+ used to manage Avro schemas. Be sure that the Avro schemas for deserializing the data from
+ the specified topics have been uploaded to the Schema Registry service.
+ The schemas must follow a naming convention with the pattern `<topic name>-value`.
+
+ Use either the Schema Registry config option or the
+ <<plugins-{type}s-{plugin}-value_deserializer_class>> config option, but not both.
+
  [id="plugins-{type}s-{plugin}-security_protocol"]
  ===== `security_protocol`

@@ -641,7 +689,10 @@ The topics configuration will be ignored when using this configuration.
  * Value type is <<string,string>>
  * Default value is `"org.apache.kafka.common.serialization.StringDeserializer"`

- Java Class used to deserialize the record's value
+ Java Class used to deserialize the record's value.
+ A custom value deserializer can be used only if you are not using a Schema Registry.
+ Use either the value_deserializer_class config option or the
+ <<plugins-{type}s-{plugin}-schema_registry_url>> config option, but not both.

  [id="plugins-{type}s-{plugin}-common-options"]
  include::{include_path}/{type}.asciidoc[]
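Putting the new options together, a minimal Schema-Registry-backed input sketch (the endpoint, topic, and credentials are placeholders); the registry must already hold a subject named `example_topic-value`, and neither `value_deserializer_class` nor `topics_pattern` may be combined with `schema_registry_url`:

```
input {
  kafka {
    bootstrap_servers      => "localhost:9092"          # placeholder broker address
    topics                 => ["example_topic"]         # subject "example_topic-value" must exist
    schema_registry_url    => "http://localhost:8081"   # placeholder registry endpoint
    schema_registry_key    => "registry_user"           # only needed when the registry uses basic auth
    schema_registry_secret => "registry_password"
  }
}
```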
data/lib/logstash-integration-kafka_jars.rb CHANGED
@@ -1,8 +1,17 @@
  # AUTOGENERATED BY THE GRADLE SCRIPT. DO NOT EDIT.

  require 'jar_dependencies'
- require_jar('org.apache.kafka', 'kafka-clients', '2.4.1')
- require_jar('com.github.luben', 'zstd-jni', '1.4.3-1')
- require_jar('org.slf4j', 'slf4j-api', '1.7.28')
- require_jar('org.lz4', 'lz4-java', '1.6.0')
+ require_jar('io.confluent', 'kafka-avro-serializer', '5.5.1')
+ require_jar('io.confluent', 'kafka-schema-serializer', '5.5.1')
+ require_jar('io.confluent', 'common-config', '5.5.1')
+ require_jar('org.apache.avro', 'avro', '1.9.2')
+ require_jar('io.confluent', 'kafka-schema-registry-client', '5.5.1')
+ require_jar('org.apache.kafka', 'kafka_2.12', '2.5.1')
+ require_jar('io.confluent', 'common-utils', '5.5.1')
+ require_jar('javax.ws.rs', 'javax.ws.rs-api', '2.1.1')
+ require_jar('org.glassfish.jersey.core', 'jersey-common', '2.33')
+ require_jar('org.apache.kafka', 'kafka-clients', '2.5.1')
+ require_jar('com.github.luben', 'zstd-jni', '1.4.4-7')
+ require_jar('org.slf4j', 'slf4j-api', '1.7.30')
+ require_jar('org.lz4', 'lz4-java', '1.7.1')
  require_jar('org.xerial.snappy', 'snappy-java', '1.1.7.3')
data/lib/logstash/inputs/kafka.rb CHANGED
@@ -4,6 +4,11 @@ require 'stud/interval'
  require 'java'
  require 'logstash-integration-kafka_jars.rb'
  require 'logstash/plugin_mixins/kafka_support'
+ require 'manticore'
+ require "json"
+ require "logstash/json"
+ require_relative '../plugin_mixins/common'
+ require 'logstash/plugin_mixins/deprecation_logger_support'

  # This input will read events from a Kafka topic. It uses the 0.10 version of
  # the consumer API provided by Kafka to read messages from the broker.
@@ -50,7 +55,11 @@ require 'logstash/plugin_mixins/kafka_support'
  #
  class LogStash::Inputs::Kafka < LogStash::Inputs::Base

+ DEFAULT_DESERIALIZER_CLASS = "org.apache.kafka.common.serialization.StringDeserializer"
+
  include LogStash::PluginMixins::KafkaSupport
+ include ::LogStash::PluginMixins::KafkaAvroSchemaRegistry
+ include LogStash::PluginMixins::DeprecationLoggerSupport

  config_name 'kafka'

@@ -167,7 +176,7 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  # and a rebalance operation is triggered for the group identified by `group_id`
  config :session_timeout_ms, :validate => :number, :default => 10_000 # (10s) Kafka default
  # Java Class used to deserialize the record's value
- config :value_deserializer_class, :validate => :string, :default => "org.apache.kafka.common.serialization.StringDeserializer"
+ config :value_deserializer_class, :validate => :string, :default => DEFAULT_DESERIALIZER_CLASS
  # A list of topics to subscribe to, defaults to ["logstash"].
  config :topics, :validate => :array, :default => ["logstash"]
  # A topic regex pattern to subscribe to.
@@ -226,21 +235,48 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  config :sasl_jaas_config, :validate => :string
  # Optional path to kerberos config file. This is krb5.conf style as detailed in https://web.mit.edu/kerberos/krb5-1.12/doc/admin/conf_files/krb5_conf.html
  config :kerberos_config, :validate => :path
- # Option to add Kafka metadata like topic, message size to the event.
- # This will add a field named `kafka` to the logstash event containing the following attributes:
+ # Option to add Kafka metadata like topic, message size and header key values to the event.
+ # With `basic` this will add a field named `kafka` to the logstash event containing the following attributes:
  # `topic`: The topic this message is associated with
  # `consumer_group`: The consumer group used to read in this event
  # `partition`: The partition this message is associated with
  # `offset`: The offset from the partition this message is associated with
  # `key`: A ByteBuffer containing the message key
  # `timestamp`: The timestamp of this message
- config :decorate_events, :validate => :boolean, :default => false
+ # While with `extended` it adds also all the key values present in the Kafka header if the key is valid UTF-8 else
+ # silently skip it.
+ config :decorate_events, :validate => %w(none basic extended false true), :default => "none"

+ attr_reader :metadata_mode

  public
  def register
  @runner_threads = []
- end # def register
+ @metadata_mode = extract_metadata_level(@decorate_events)
+ check_schema_registry_parameters
+ end
+
+ METADATA_NONE = Set[].freeze
+ METADATA_BASIC = Set[:record_props].freeze
+ METADATA_EXTENDED = Set[:record_props, :headers].freeze
+ METADATA_DEPRECATION_MAP = { 'true' => 'basic', 'false' => 'none' }
+
+ private
+ def extract_metadata_level(decorate_events_setting)
+ metadata_enabled = decorate_events_setting
+
+ if METADATA_DEPRECATION_MAP.include?(metadata_enabled)
+ canonical_value = METADATA_DEPRECATION_MAP[metadata_enabled]
+ deprecation_logger.deprecated("Deprecated value `#{decorate_events_setting}` for `decorate_events` option; use `#{canonical_value}` instead.")
+ metadata_enabled = canonical_value
+ end
+
+ case metadata_enabled
+ when 'none' then METADATA_NONE
+ when 'basic' then METADATA_BASIC
+ when 'extended' then METADATA_EXTENDED
+ end
+ end

  public
  def run(logstash_queue)
@@ -278,7 +314,14 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  for record in records do
  codec_instance.decode(record.value.to_s) do |event|
  decorate(event)
- if @decorate_events
+ if schema_registry_url
+ json = LogStash::Json.load(record.value.to_s)
+ json.each do |k, v|
+ event.set(k, v)
+ end
+ event.remove("message")
+ end
+ if @metadata_mode.include?(:record_props)
  event.set("[@metadata][kafka][topic]", record.topic)
  event.set("[@metadata][kafka][consumer_group]", @group_id)
  event.set("[@metadata][kafka][partition]", record.partition)
@@ -286,6 +329,15 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  event.set("[@metadata][kafka][key]", record.key)
  event.set("[@metadata][kafka][timestamp]", record.timestamp)
  end
+ if @metadata_mode.include?(:headers)
+ record.headers.each do |header|
+ s = String.from_java_bytes(header.value)
+ s.force_encoding(Encoding::UTF_8)
+ if s.valid_encoding?
+ event.set("[@metadata][kafka][headers]["+header.key+"]", s)
+ end
+ end
+ end
  logstash_queue << event
  end
  end
@@ -337,7 +389,18 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
  props.put(kafka::CLIENT_RACK_CONFIG, client_rack) unless client_rack.nil?

  props.put("security.protocol", security_protocol) unless security_protocol.nil?
-
+ if schema_registry_url
+ props.put(kafka::VALUE_DESERIALIZER_CLASS_CONFIG, Java::io.confluent.kafka.serializers.KafkaAvroDeserializer.java_class)
+ serdes_config = Java::io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig
+ props.put(serdes_config::SCHEMA_REGISTRY_URL_CONFIG, schema_registry_url.to_s)
+ if schema_registry_proxy && !schema_registry_proxy.empty?
+ props.put(serdes_config::PROXY_HOST, @schema_registry_proxy_host)
+ props.put(serdes_config::PROXY_PORT, @schema_registry_proxy_port)
+ end
+ if schema_registry_key && !schema_registry_key.empty?
+ props.put(serdes_config::USER_INFO_CONFIG, schema_registry_key + ":" + schema_registry_secret.value)
+ end
+ end
  if security_protocol == "SSL"
  set_trustore_keystore_config(props)
  elsif security_protocol == "SASL_PLAINTEXT"
data/lib/logstash/outputs/kafka.rb CHANGED
@@ -224,7 +224,6 @@ class LogStash::Outputs::Kafka < LogStash::Outputs::Base
  end

  events.each do |event|
- break if event == LogStash::SHUTDOWN
  @codec.encode(event)
  end

data/lib/logstash/plugin_mixins/common.rb ADDED
@@ -0,0 +1,93 @@
+ module LogStash
+ module PluginMixins
+ module KafkaAvroSchemaRegistry
+
+ def self.included(base)
+ base.extend(self)
+ base.setup_schema_registry_config
+ end
+
+ def setup_schema_registry_config
+ # Option to set key to access Schema Registry.
+ config :schema_registry_key, :validate => :string
+
+ # Option to set secret to access Schema Registry.
+ config :schema_registry_secret, :validate => :password
+
+ # Option to set the endpoint of the Schema Registry.
+ # This option permit the usage of Avro Kafka deserializer which retrieve the schema of the Avro message from an
+ # instance of schema registry. If this option has value `value_deserializer_class` nor `topics_pattern` could be valued
+ config :schema_registry_url, :validate => :uri
+
+ # Option to set the proxy of the Schema Registry.
+ # This option permits to define a proxy to be used to reach the schema registry service instance.
+ config :schema_registry_proxy, :validate => :uri
+ end
+
+ def check_schema_registry_parameters
+ if @schema_registry_url
+ check_for_schema_registry_conflicts
+ @schema_registry_proxy_host, @schema_registry_proxy_port = split_proxy_into_host_and_port(schema_registry_proxy)
+ check_for_key_and_secret
+ check_for_schema_registry_connectivity_and_subjects
+ end
+ end
+
+ private
+ def check_for_schema_registry_conflicts
+ if @value_deserializer_class != LogStash::Inputs::Kafka::DEFAULT_DESERIALIZER_CLASS
+ raise LogStash::ConfigurationError, 'Option schema_registry_url prohibit the customization of value_deserializer_class'
+ end
+ if @topics_pattern && !@topics_pattern.empty?
+ raise LogStash::ConfigurationError, 'Option schema_registry_url prohibit the customization of topics_pattern'
+ end
+ end
+
+ private
+ def check_for_schema_registry_connectivity_and_subjects
+ options = {}
+ if schema_registry_proxy && !schema_registry_proxy.empty?
+ options[:proxy] = schema_registry_proxy.to_s
+ end
+ if schema_registry_key and !schema_registry_key.empty?
+ options[:auth] = {:user => schema_registry_key, :password => schema_registry_secret.value}
+ end
+ client = Manticore::Client.new(options)
+
+ begin
+ response = client.get(@schema_registry_url.to_s + '/subjects').body
+ rescue Manticore::ManticoreException => e
+ raise LogStash::ConfigurationError.new("Schema registry service doesn't respond, error: #{e.message}")
+ end
+ registered_subjects = JSON.parse response
+ expected_subjects = @topics.map { |t| "#{t}-value"}
+ if (expected_subjects & registered_subjects).size != expected_subjects.size
+ undefined_topic_subjects = expected_subjects - registered_subjects
+ raise LogStash::ConfigurationError, "The schema registry does not contain definitions for required topic subjects: #{undefined_topic_subjects}"
+ end
+ end
+
+ def split_proxy_into_host_and_port(proxy_uri)
+ return nil unless proxy_uri && !proxy_uri.empty?
+
+ port = proxy_uri.port
+
+ host_spec = ""
+ host_spec << proxy_uri.scheme || "http"
+ host_spec << "://"
+ host_spec << "#{proxy_uri.userinfo}@" if proxy_uri.userinfo
+ host_spec << proxy_uri.host
+
+ [host_spec, port]
+ end
+
+ def check_for_key_and_secret
+ if schema_registry_key and !schema_registry_key.empty?
+ if !schema_registry_secret or schema_registry_secret.value.empty?
+ raise LogStash::ConfigurationError, "Setting `schema_registry_secret` is required when `schema_registry_key` is provided."
+ end
+ end
+ end
+ end
+ end
+ end
data/logstash-integration-kafka.gemspec CHANGED
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-integration-kafka'
- s.version = '10.5.3'
+ s.version = '10.7.3'
  s.licenses = ['Apache-2.0']
  s.summary = "Integration with Kafka - input and output plugins"
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline "+
@@ -46,6 +46,8 @@ Gem::Specification.new do |s|
  s.add_runtime_dependency 'logstash-codec-json'
  s.add_runtime_dependency 'logstash-codec-plain'
  s.add_runtime_dependency 'stud', '>= 0.0.22', '< 0.1.0'
+ s.add_runtime_dependency "manticore", '>= 0.5.4', '< 1.0.0'
+ s.add_runtime_dependency 'logstash-mixin-deprecation_logger_support', '~>1.0'

  s.add_development_dependency 'logstash-devutils'
  s.add_development_dependency 'rspec-wait'
data/spec/integration/inputs/kafka_spec.rb CHANGED
@@ -2,6 +2,9 @@
  require "logstash/devutils/rspec/spec_helper"
  require "logstash/inputs/kafka"
  require "rspec/wait"
+ require "stud/try"
+ require "manticore"
+ require "json"

  # Please run kafka_test_setup.sh prior to executing this integration test.
  describe "inputs/kafka", :integration => true do
@@ -33,7 +36,15 @@ describe "inputs/kafka", :integration => true do
  end
  let(:decorate_config) do
  { 'topics' => ['logstash_integration_topic_plain'], 'codec' => 'plain', 'group_id' => group_id_3,
- 'auto_offset_reset' => 'earliest', 'decorate_events' => true }
+ 'auto_offset_reset' => 'earliest', 'decorate_events' => 'true' }
+ end
+ let(:decorate_headers_config) do
+ { 'topics' => ['logstash_integration_topic_plain_with_headers'], 'codec' => 'plain', 'group_id' => group_id_3,
+ 'auto_offset_reset' => 'earliest', 'decorate_events' => 'extended' }
+ end
+ let(:decorate_bad_headers_config) do
+ { 'topics' => ['logstash_integration_topic_plain_with_headers_badly'], 'codec' => 'plain', 'group_id' => group_id_3,
+ 'auto_offset_reset' => 'earliest', 'decorate_events' => 'extended' }
  end
  let(:manual_commit_config) do
  { 'topics' => ['logstash_integration_topic_plain'], 'codec' => 'plain', 'group_id' => group_id_5,
@@ -42,6 +53,35 @@ describe "inputs/kafka", :integration => true do
  let(:timeout_seconds) { 30 }
  let(:num_events) { 103 }

+ before(:all) do
+ # Prepare message with headers with valid UTF-8 chars
+ header = org.apache.kafka.common.header.internals.RecordHeader.new("name", "John ανδρεα €".to_java_bytes)
+ record = org.apache.kafka.clients.producer.ProducerRecord.new(
+ "logstash_integration_topic_plain_with_headers", 0, "key", "value", [header])
+ send_message(record)
+
+ # Prepare message with headers with invalid UTF-8 chars
+ invalid = "日本".encode('Shift_JIS').force_encoding(Encoding::UTF_8).to_java_bytes
+ header = org.apache.kafka.common.header.internals.RecordHeader.new("name", invalid)
+ record = org.apache.kafka.clients.producer.ProducerRecord.new(
+ "logstash_integration_topic_plain_with_headers_badly", 0, "key", "value", [header])
+
+ send_message(record)
+ end
+
+ def send_message(record)
+ props = java.util.Properties.new
+ kafka = org.apache.kafka.clients.producer.ProducerConfig
+ props.put(kafka::BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
+ props.put(kafka::KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
+ props.put(kafka::VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
+
+ producer = org.apache.kafka.clients.producer.KafkaProducer.new(props)
+
+ producer.send(record)
+ producer.close
+ end
+
  describe "#kafka-topics" do

  it "should consume all messages from plain 3-partition topic" do
@@ -71,7 +111,7 @@ describe "inputs/kafka", :integration => true do

  context "#kafka-topics-pattern" do
  it "should consume all messages from all 3 topics" do
- total_events = num_events * 3
+ total_events = num_events * 3 + 2
  queue = consume_messages(pattern_config, timeout: timeout_seconds, event_count: total_events)
  expect(queue.length).to eq(total_events)
  end
@@ -88,6 +128,31 @@ describe "inputs/kafka", :integration => true do
  expect(event.get("[@metadata][kafka][timestamp]")).to be >= start
  end
  end
+
+ it "should show the right topic and group name in and kafka headers decorated kafka section" do
+ start = LogStash::Timestamp.now.time.to_i
+ consume_messages(decorate_headers_config, timeout: timeout_seconds, event_count: 1) do |queue, _|
+ expect(queue.length).to eq(1)
+ event = queue.shift
+ expect(event.get("[@metadata][kafka][topic]")).to eq("logstash_integration_topic_plain_with_headers")
+ expect(event.get("[@metadata][kafka][consumer_group]")).to eq(group_id_3)
+ expect(event.get("[@metadata][kafka][timestamp]")).to be >= start
+ expect(event.get("[@metadata][kafka][headers][name]")).to eq("John ανδρεα €")
+ end
+ end
+
+ it "should skip headers not encoded in UTF-8" do
+ start = LogStash::Timestamp.now.time.to_i
+ consume_messages(decorate_bad_headers_config, timeout: timeout_seconds, event_count: 1) do |queue, _|
+ expect(queue.length).to eq(1)
+ event = queue.shift
+ expect(event.get("[@metadata][kafka][topic]")).to eq("logstash_integration_topic_plain_with_headers_badly")
+ expect(event.get("[@metadata][kafka][consumer_group]")).to eq(group_id_3)
+ expect(event.get("[@metadata][kafka][timestamp]")).to be >= start
+
+ expect(event.include?("[@metadata][kafka][headers][name]")).to eq(false)
+ end
+ end
  end

  context "#kafka-offset-commit" do
@@ -120,20 +185,193 @@ describe "inputs/kafka", :integration => true do
  end
  end
  end
+ end
+
+ private
+
+ def consume_messages(config, queue: Queue.new, timeout:, event_count:)
+ kafka_input = LogStash::Inputs::Kafka.new(config)
+ kafka_input.register
+ t = Thread.new { kafka_input.run(queue) }
+ begin
+ t.run
+ wait(timeout).for { queue.length }.to eq(event_count) unless timeout.eql?(false)
+ block_given? ? yield(queue, kafka_input) : queue
+ ensure
+ t.kill
+ t.join(30_000)
+ end
+ end

- private

- def consume_messages(config, queue: Queue.new, timeout:, event_count:)
- kafka_input = LogStash::Inputs::Kafka.new(config)
- t = Thread.new { kafka_input.run(queue) }
- begin
- t.run
- wait(timeout).for { queue.length }.to eq(event_count) unless timeout.eql?(false)
- block_given? ? yield(queue, kafka_input) : queue
- ensure
- t.kill
- t.join(30_000)
+ describe "schema registry connection options" do
+ context "remote endpoint validation" do
+ it "should fail if not reachable" do
+ config = {'schema_registry_url' => 'http://localnothost:8081'}
+ kafka_input = LogStash::Inputs::Kafka.new(config)
+ expect { kafka_input.register }.to raise_error LogStash::ConfigurationError, /Schema registry service doesn't respond.*/
+ end
+
+ it "should fail if any topic is not matched by a subject on the schema registry" do
+ config = {
+ 'schema_registry_url' => 'http://localhost:8081',
+ 'topics' => ['temperature_stream']
+ }
+
+ kafka_input = LogStash::Inputs::Kafka.new(config)
+ expect { kafka_input.register }.to raise_error LogStash::ConfigurationError, /The schema registry does not contain definitions for required topic subjects: \["temperature_stream-value"\]/
+ end
+
+ context "register with subject present" do
+ SUBJECT_NAME = "temperature_stream-value"
+
+ before(:each) do
+ response = save_avro_schema_to_schema_registry(File.join(Dir.pwd, "spec", "unit", "inputs", "avro_schema_fixture_payment.asvc"), SUBJECT_NAME)
+ expect( response.code ).to be(200)
+ end
+
+ after(:each) do
+ schema_registry_client = Manticore::Client.new
+ delete_remote_schema(schema_registry_client, SUBJECT_NAME)
+ end
+
+ it "should correctly complete registration phase" do
+ config = {
+ 'schema_registry_url' => 'http://localhost:8081',
+ 'topics' => ['temperature_stream']
+ }
+ kafka_input = LogStash::Inputs::Kafka.new(config)
+ kafka_input.register
+ end
  end
  end
+ end

+ def save_avro_schema_to_schema_registry(schema_file, subject_name)
+ raw_schema = File.readlines(schema_file).map(&:chomp).join
+ raw_schema_quoted = raw_schema.gsub('"', '\"')
+ response = Manticore.post("http://localhost:8081/subjects/#{subject_name}/versions",
+ body: '{"schema": "' + raw_schema_quoted + '"}',
+ headers: {"Content-Type" => "application/vnd.schemaregistry.v1+json"})
+ response
  end
+
+ def delete_remote_schema(schema_registry_client, subject_name)
+ expect(schema_registry_client.delete("http://localhost:8081/subjects/#{subject_name}").code ).to be(200)
+ expect(schema_registry_client.delete("http://localhost:8081/subjects/#{subject_name}?permanent=true").code ).to be(200)
+ end
+
+ # AdminClientConfig = org.alpache.kafka.clients.admin.AdminClientConfig
+
+ describe "Schema registry API", :integration => true do
+
+ let(:schema_registry) { Manticore::Client.new }
+
+ context 'listing subject on clean instance' do
+ it "should return an empty set" do
+ subjects = JSON.parse schema_registry.get('http://localhost:8081/subjects').body
+ expect( subjects ).to be_empty
+ end
+ end
+
+ context 'send a schema definition' do
+ it "save the definition" do
+ response = save_avro_schema_to_schema_registry(File.join(Dir.pwd, "spec", "unit", "inputs", "avro_schema_fixture_payment.asvc"), "schema_test_1")
+ expect( response.code ).to be(200)
+ delete_remote_schema(schema_registry, "schema_test_1")
+ end
+
+ it "delete the schema just added" do
+ response = save_avro_schema_to_schema_registry(File.join(Dir.pwd, "spec", "unit", "inputs", "avro_schema_fixture_payment.asvc"), "schema_test_1")
+ expect( response.code ).to be(200)
+
+ expect( schema_registry.delete('http://localhost:8081/subjects/schema_test_1?permanent=false').code ).to be(200)
+ sleep(1)
+ subjects = JSON.parse schema_registry.get('http://localhost:8081/subjects').body
+ expect( subjects ).to be_empty
+ end
+ end
+
+ context 'use the schema to serialize' do
+ after(:each) do
+ expect( schema_registry.delete('http://localhost:8081/subjects/topic_avro-value').code ).to be(200)
+ sleep 1
+ expect( schema_registry.delete('http://localhost:8081/subjects/topic_avro-value?permanent=true').code ).to be(200)
+
+ Stud.try(3.times, [StandardError, RSpec::Expectations::ExpectationNotMetError]) do
+ wait(10).for do
+ subjects = JSON.parse schema_registry.get('http://localhost:8081/subjects').body
+ subjects.empty?
+ end.to be_truthy
+ end
+ end
+
+ let(:group_id_1) {rand(36**8).to_s(36)}
+
+ let(:avro_topic_name) { "topic_avro" }
+
+ let(:plain_config) do
+ { 'schema_registry_url' => 'http://localhost:8081',
+ 'topics' => [avro_topic_name],
+ 'codec' => 'plain',
+ 'group_id' => group_id_1,
+ 'auto_offset_reset' => 'earliest' }
+ end
+
+ def delete_topic_if_exists(topic_name)
+ props = java.util.Properties.new
+ props.put(Java::org.apache.kafka.clients.admin.AdminClientConfig::BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
+
+ admin_client = org.apache.kafka.clients.admin.AdminClient.create(props)
+ topics_list = admin_client.listTopics().names().get()
+ if topics_list.contains(topic_name)
+ result = admin_client.deleteTopics([topic_name])
+ result.values.get(topic_name).get()
+ end
+ end
+
+ def write_some_data_to(topic_name)
+ props = java.util.Properties.new
+ config = org.apache.kafka.clients.producer.ProducerConfig
+
+ serdes_config = Java::io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig
+ props.put(serdes_config::SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081")
+
+ props.put(config::BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
+ props.put(config::KEY_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringSerializer.java_class)
+ props.put(config::VALUE_SERIALIZER_CLASS_CONFIG, Java::io.confluent.kafka.serializers.KafkaAvroSerializer.java_class)
+
+ parser = org.apache.avro.Schema::Parser.new()
+ user_schema = '''{"type":"record",
+ "name":"myrecord",
+ "fields":[
+ {"name":"str_field", "type": "string"},
+ {"name":"map_field", "type": {"type": "map", "values": "string"}}
+ ]}'''
+ schema = parser.parse(user_schema)
+ avro_record = org.apache.avro.generic.GenericData::Record.new(schema)
+ avro_record.put("str_field", "value1")
+ avro_record.put("map_field", {"inner_field" => "inner value"})
+
+ producer = org.apache.kafka.clients.producer.KafkaProducer.new(props)
+ record = org.apache.kafka.clients.producer.ProducerRecord.new(topic_name, "avro_key", avro_record)
+ producer.send(record)
+ end
+
+ it "stored a new schema using Avro Kafka serdes" do
+ delete_topic_if_exists avro_topic_name
+ write_some_data_to avro_topic_name
+
+ subjects = JSON.parse schema_registry.get('http://localhost:8081/subjects').body
+ expect( subjects ).to contain_exactly("topic_avro-value")
+
+ num_events = 1
+ queue = consume_messages(plain_config, timeout: 30, event_count: num_events)
+ expect(queue.length).to eq(num_events)
+ elem = queue.pop
+ expect( elem.to_hash).not_to include("message")
+ expect( elem.get("str_field") ).to eq("value1")
+ expect( elem.get("map_field")["inner_field"] ).to eq("inner value")
+ end
+ end
+ end
data/spec/unit/inputs/avro_schema_fixture_payment.asvc ADDED
@@ -0,0 +1,8 @@
+ {"namespace": "io.confluent.examples.clients.basicavro",
+ "type": "record",
+ "name": "Payment",
+ "fields": [
+ {"name": "id", "type": "string"},
+ {"name": "amount", "type": "double"}
+ ]
+ }
data/spec/unit/inputs/kafka_spec.rb CHANGED
@@ -37,8 +37,41 @@ describe LogStash::Inputs::Kafka do
  expect { subject.register }.to_not raise_error
  end

+ context "register parameter verification" do
+ context "schema_registry_url" do
+ let(:config) do
+ { 'schema_registry_url' => 'http://localhost:8081', 'topics' => ['logstash'], 'consumer_threads' => 4 }
+ end
+
+ it "conflict with value_deserializer_class should fail" do
+ config['value_deserializer_class'] = 'my.fantasy.Deserializer'
+ expect { subject.register }.to raise_error LogStash::ConfigurationError, /Option schema_registry_url prohibit the customization of value_deserializer_class/
+ end
+
+ it "conflict with topics_pattern should fail" do
+ config['topics_pattern'] = 'topic_.*'
+ expect { subject.register }.to raise_error LogStash::ConfigurationError, /Option schema_registry_url prohibit the customization of topics_pattern/
+ end
+ end
+
+ context "decorate_events" do
+ let(:config) { { 'decorate_events' => 'extended'} }
+
+ it "should raise error for invalid value" do
+ config['decorate_events'] = 'avoid'
+ expect { subject.register }.to raise_error LogStash::ConfigurationError, /Something is wrong with your configuration./
+ end
+
+ it "should map old true boolean value to :record_props mode" do
+ config['decorate_events'] = "true"
+ subject.register
+ expect(subject.metadata_mode).to include(:record_props)
+ end
+ end
+ end
+
  context 'with client_rack' do
- let(:config) { super.merge('client_rack' => 'EU-R1') }
+ let(:config) { super().merge('client_rack' => 'EU-R1') }

  it "sets broker rack parameter" do
  expect(org.apache.kafka.clients.consumer.KafkaConsumer).
@@ -50,7 +83,7 @@ describe LogStash::Inputs::Kafka do
  end

  context 'string integer config' do
- let(:config) { super.merge('session_timeout_ms' => '25000', 'max_poll_interval_ms' => '345000') }
+ let(:config) { super().merge('session_timeout_ms' => '25000', 'max_poll_interval_ms' => '345000') }

  it "sets integer values" do
  expect(org.apache.kafka.clients.consumer.KafkaConsumer).
@@ -62,7 +95,7 @@ describe LogStash::Inputs::Kafka do
  end

  context 'integer config' do
- let(:config) { super.merge('session_timeout_ms' => 25200, 'max_poll_interval_ms' => 123_000) }
+ let(:config) { super().merge('session_timeout_ms' => 25200, 'max_poll_interval_ms' => 123_000) }

  it "sets integer values" do
  expect(org.apache.kafka.clients.consumer.KafkaConsumer).
@@ -74,7 +107,7 @@ describe LogStash::Inputs::Kafka do
  end
  end

  context 'string boolean config' do
- let(:config) { super.merge('enable_auto_commit' => 'false', 'check_crcs' => 'true') }
+ let(:config) { super().merge('enable_auto_commit' => 'false', 'check_crcs' => 'true') }

  it "sets parameters" do
  expect(org.apache.kafka.clients.consumer.KafkaConsumer).
@@ -87,7 +120,7 @@ describe LogStash::Inputs::Kafka do
  end
  end

  context 'boolean config' do
- let(:config) { super.merge('enable_auto_commit' => true, 'check_crcs' => false) }
+ let(:config) { super().merge('enable_auto_commit' => true, 'check_crcs' => false) }

  it "sets parameters" do
  expect(org.apache.kafka.clients.consumer.KafkaConsumer).
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-integration-kafka
  version: !ruby/object:Gem::Version
- version: 10.5.3
+ version: 10.7.3
  platform: java
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2020-10-21 00:00:00.000000000 Z
+ date: 2021-03-26 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -106,6 +106,40 @@ dependencies:
  - - "<"
  - !ruby/object:Gem::Version
  version: 0.1.0
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.5.4
+ - - "<"
+ - !ruby/object:Gem::Version
+ version: 1.0.0
+ name: manticore
+ prerelease: false
+ type: :runtime
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.5.4
+ - - "<"
+ - !ruby/object:Gem::Version
+ version: 1.0.0
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
+ name: logstash-mixin-deprecation_logger_support
+ prerelease: false
+ type: :runtime
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
  requirements:
@@ -183,17 +217,28 @@ files:
  - lib/logstash-integration-kafka_jars.rb
  - lib/logstash/inputs/kafka.rb
  - lib/logstash/outputs/kafka.rb
+ - lib/logstash/plugin_mixins/common.rb
  - lib/logstash/plugin_mixins/kafka_support.rb
  - logstash-integration-kafka.gemspec
  - spec/fixtures/trust-store_stub.jks
  - spec/integration/inputs/kafka_spec.rb
  - spec/integration/outputs/kafka_spec.rb
+ - spec/unit/inputs/avro_schema_fixture_payment.asvc
  - spec/unit/inputs/kafka_spec.rb
  - spec/unit/outputs/kafka_spec.rb
- - vendor/jar-dependencies/com/github/luben/zstd-jni/1.4.3-1/zstd-jni-1.4.3-1.jar
- - vendor/jar-dependencies/org/apache/kafka/kafka-clients/2.4.1/kafka-clients-2.4.1.jar
- - vendor/jar-dependencies/org/lz4/lz4-java/1.6.0/lz4-java-1.6.0.jar
- - vendor/jar-dependencies/org/slf4j/slf4j-api/1.7.28/slf4j-api-1.7.28.jar
+ - vendor/jar-dependencies/com/github/luben/zstd-jni/1.4.4-7/zstd-jni-1.4.4-7.jar
+ - vendor/jar-dependencies/io/confluent/common-config/5.5.1/common-config-5.5.1.jar
+ - vendor/jar-dependencies/io/confluent/common-utils/5.5.1/common-utils-5.5.1.jar
+ - vendor/jar-dependencies/io/confluent/kafka-avro-serializer/5.5.1/kafka-avro-serializer-5.5.1.jar
+ - vendor/jar-dependencies/io/confluent/kafka-schema-registry-client/5.5.1/kafka-schema-registry-client-5.5.1.jar
+ - vendor/jar-dependencies/io/confluent/kafka-schema-serializer/5.5.1/kafka-schema-serializer-5.5.1.jar
+ - vendor/jar-dependencies/javax/ws/rs/javax.ws.rs-api/2.1.1/javax.ws.rs-api-2.1.1.jar
+ - vendor/jar-dependencies/org/apache/avro/avro/1.9.2/avro-1.9.2.jar
+ - vendor/jar-dependencies/org/apache/kafka/kafka-clients/2.5.1/kafka-clients-2.5.1.jar
+ - vendor/jar-dependencies/org/apache/kafka/kafka_2.12/2.5.1/kafka_2.12-2.5.1.jar
+ - vendor/jar-dependencies/org/glassfish/jersey/core/jersey-common/2.33/jersey-common-2.33.jar
+ - vendor/jar-dependencies/org/lz4/lz4-java/1.7.1/lz4-java-1.7.1.jar
+ - vendor/jar-dependencies/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar
  - vendor/jar-dependencies/org/xerial/snappy/snappy-java/1.1.7.3/snappy-java-1.1.7.3.jar
  homepage: http://www.elastic.co/guide/en/logstash/current/index.html
  licenses:
@@ -227,5 +272,6 @@ test_files:
  - spec/fixtures/trust-store_stub.jks
  - spec/integration/inputs/kafka_spec.rb
  - spec/integration/outputs/kafka_spec.rb
+ - spec/unit/inputs/avro_schema_fixture_payment.asvc
  - spec/unit/inputs/kafka_spec.rb
  - spec/unit/outputs/kafka_spec.rb