logstash-integration-kafka 12.0.5-java → 12.1.0-java
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/CHANGELOG.md +9 -1
- data/docs/index.asciidoc +1 -1
- data/docs/input-kafka.asciidoc +3 -2
- data/docs/output-kafka.asciidoc +20 -2
- data/lib/logstash/inputs/kafka.rb +2 -0
- data/lib/logstash-integration-kafka_jars.rb +8 -8
- data/logstash-integration-kafka.gemspec +1 -1
- data/spec/fixtures/kafka_sasl_config.properties +35 -0
- data/spec/integration/inputs/kafka_spec.rb +56 -34
- data/spec/unit/inputs/kafka_spec.rb +24 -0
- data/vendor/jar-dependencies/io/confluent/common-utils/{8.1.1/common-utils-8.1.1.jar → 8.2.0/common-utils-8.2.0.jar} +0 -0
- data/vendor/jar-dependencies/io/confluent/kafka-avro-serializer/8.2.0/kafka-avro-serializer-8.2.0.jar +0 -0
- data/vendor/jar-dependencies/io/confluent/kafka-schema-registry-client/8.2.0/kafka-schema-registry-client-8.2.0.jar +0 -0
- data/vendor/jar-dependencies/io/confluent/kafka-schema-serializer/8.2.0/kafka-schema-serializer-8.2.0.jar +0 -0
- data/vendor/jar-dependencies/io/confluent/logredactor/{1.0.13/logredactor-1.0.13.jar → 1.0.17/logredactor-1.0.17.jar} +0 -0
- data/vendor/jar-dependencies/io/confluent/logredactor-metrics/1.0.17/logredactor-metrics-1.0.17.jar +0 -0
- data/vendor/jar-dependencies/org/apache/kafka/kafka-clients/{4.1.0/kafka-clients-4.1.0.jar → 4.2.0/kafka-clients-4.2.0.jar} +0 -0
- metadata +11 -9
- data/vendor/jar-dependencies/io/confluent/kafka-avro-serializer/8.1.1/kafka-avro-serializer-8.1.1.jar +0 -0
- data/vendor/jar-dependencies/io/confluent/kafka-schema-registry-client/8.1.1/kafka-schema-registry-client-8.1.1.jar +0 -0
- data/vendor/jar-dependencies/io/confluent/kafka-schema-serializer/8.1.1/kafka-schema-serializer-8.1.1.jar +0 -0
- data/vendor/jar-dependencies/io/confluent/logredactor-metrics/1.0.13/logredactor-metrics-1.0.13.jar +0 -0
checksums.yaml
CHANGED

```diff
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 7f5d9772f13ac08019642f4ead686f5bf0b0cbfef80c7a9ecb4115cd0c961051
+  data.tar.gz: '0935e12f42e2aa0d28725b41d16f9705e98edafe8d1977e62bf248f7f52912e4'
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 43a3515e8aa880cf55e5d94d114c42f8a91c2c702cc4e4ed472e61fe313c7d589e2c048c24844003c1be953bcebaab6cc3f06001a7a87a780b71ecb6ede4cf3d
+  data.tar.gz: 74bda533606959d6631ae5b18ed81aa2165b69880adb31038ae5dfe84199a80994c234c759b616e7d575fac1649d55b7da08ba7daf59bc0ef003a144f33dc338
```
data/CHANGELOG.md
CHANGED

```diff
@@ -1,3 +1,11 @@
+## 12.1.0
+- Update Kafka client to 4.2.0 [#243](https://github.com/logstash-plugins/logstash-integration-kafka/pull/243)
+- Remove explicit `lz4-java` dependency (now transitive from Kafka client)
+- Document `by_duration` offset reset strategy (available since Apache Kafka 4.0.0)
+
+## 12.0.6
+- [DOC] Add info about Kafka timestamp behavior [#240](https://github.com/logstash-plugins/logstash-integration-kafka/pull/240)
+
 ## 12.0.5
 - Redact `sasl_jaas_config` to prevent credentials from appearing in debug logs. [#232](https://github.com/logstash-plugins/logstash-integration-kafka/pull/232)
 
@@ -18,7 +26,7 @@
 - Breaking Change: partitioner options `default` and `uniform_sticky` are removed
 - `linger_ms` default value changed from 0 to 5
 - Add `group_protocols` options for configuring Kafka consumer rebalance protocol
-- Setting `
+- Setting `group_protocol => consumer` opts in to the new consumer group protocol
 
 ## 11.7.0
 - Add `reconnect_backoff_max_ms` option for configuring kafka client [#204](https://github.com/logstash-plugins/logstash-integration-kafka/pull/204)
```
data/docs/index.asciidoc
CHANGED
data/docs/input-kafka.asciidoc
CHANGED

```diff
@@ -2,8 +2,8 @@
 :plugin: kafka
 :type: input
 :default_codec: plain
-:kafka_client: 4.1.0
-:kafka_client_doc: 41
+:kafka_client: 4.2.0
+:kafka_client_doc: 42
 
 ///////////////////////////////////////////
 START - GENERATED VARIABLES, DO NOT EDIT!
@@ -211,6 +211,7 @@ What to do when there is no initial offset in Kafka or if an offset is out of range:
 
 * earliest: automatically reset the offset to the earliest offset
 * latest: automatically reset the offset to the latest offset
+* by_duration:<duration>: automatically reset the offset to a configured duration from the current timestamp. The duration must be specified in ISO 8601 format (PnDTnHnMn.nS). Negative durations are not allowed. For example, `by_duration:PT1H` resets to 1 hour ago. Available since Apache Kafka 4.0.0.
 * none: throw exception to the consumer if no previous offset is found for the consumer's group
 * anything else: throw exception to the consumer.
```
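The new `by_duration` strategy slots into the plugin configuration like the other `auto_offset_reset` values; a minimal sketch (broker address, topic, and group names are illustrative, not from this release):

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics            => ["example_topic"]
    group_id          => "example_group"
    # When the group has no committed offset, start from one hour ago
    auto_offset_reset => "by_duration:PT1H"
  }
}
```

Note the duration must be a non-negative ISO 8601 duration, so `by_duration:-PT1H` is rejected by the Kafka client at startup.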
data/docs/output-kafka.asciidoc
CHANGED

```diff
@@ -2,8 +2,8 @@
 :plugin: kafka
 :type: output
 :default_codec: plain
-:kafka_client: 4.1.0
-:kafka_client_doc: 41
+:kafka_client: 4.2.0
+:kafka_client_doc: 42
 
 ///////////////////////////////////////////
 START - GENERATED VARIABLES, DO NOT EDIT!
@@ -66,6 +66,24 @@ https://kafka.apache.org/{kafka_client_doc}/documentation.html#producerconfigs
 
 NOTE: This plugin does not support using a proxy when communicating to the Kafka broker.
 
+.Kafka timestamps and Logstash
+****
+* Kafka 3.6+ introduces stricter timestamp validation with the introduction of two new broker/topic-level properties: https://docs.confluent.io/platform/current/installation/configuration/topic-configs.html#message-timestamp-before-max-ms[log.message.timestamp.before.max.ms] and
+https://docs.confluent.io/platform/current/installation/configuration/topic-configs.html#message-timestamp-after-max-ms[log.message.timestamp.after.max.ms].
++
+These properties limit the time difference between the message timestamp (from Logstash) and the Kafka broker receive time.
+Messages can be rejected if the values are exceeded and `log.message.timestamp.type=CreateTime` is set.
++
+These checks are ignored if `log.message.timestamp.type=LogAppendTime` is set.
+
+* For Kafka version 0.10.0.0+ the message creation timestamp is set by Logstash and equals the initial timestamp of the event.
+This behavior affects Kafka's retention policy.
+For example, if a Logstash event was created two weeks ago and Kafka retention is set to seven days, then when the message arrives in Kafka today it may be discarded immediately, because its timestamp is older than seven days.
++
+You can change this behavior by setting timestamps on message arrival instead.
+The message is not discarded but kept for 7 more days. Set `log.message.timestamp.type` to `LogAppendTime` (default `CreateTime`) in your Kafka configuration.
+****
+
 [id="plugins-{type}s-{plugin}-aws_msk_iam_auth"]
 ==== AWS MSK IAM authentication
 If you use AWS MSK, the AWS MSK IAM access control enables you to handle both authentication and authorization for your MSK cluster with AWS IAM.
```
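The broker-side switch that the timestamp note recommends is a one-line property change. A sketch of the setting as it would appear in a broker's `server.properties` (broker-level shown; to the best of my knowledge the topic-level counterpart is named `message.timestamp.type`):

```
# server.properties -- stamp messages with broker arrival time instead of the
# producer-supplied timestamp, so the Kafka 3.6+ before/after bounds and the
# retention example above no longer apply (the default is CreateTime)
log.message.timestamp.type=LogAppendTime
```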
data/lib/logstash/inputs/kafka.rb
CHANGED

```diff
@@ -75,6 +75,8 @@ class LogStash::Inputs::Kafka < LogStash::Inputs::Base
   #
   # * earliest: automatically reset the offset to the earliest offset
   # * latest: automatically reset the offset to the latest offset
+  # * by_duration:<duration>: automatically reset the offset to a configured duration from the current
+  #   timestamp (ISO 8601 format, e.g. by_duration:PT1H). Available since Apache Kafka 4.0.0.
   # * none: throw exception to the consumer if no previous offset is found for the consumer's group
   # * anything else: throw exception to the consumer.
   config :auto_offset_reset, :validate => :string
```
data/lib/logstash-integration-kafka_jars.rb
CHANGED

```diff
@@ -1,17 +1,17 @@
 # AUTOGENERATED BY THE GRADLE SCRIPT. DO NOT EDIT.
 
 require 'jar_dependencies'
-require_jar('io.confluent', 'kafka-avro-serializer', '8.1.1')
-require_jar('org.apache.kafka', 'kafka-clients', '4.1.0')
-require_jar('
-require_jar('io.confluent', 'kafka-schema-serializer', '8.1.1')
-require_jar('io.confluent', 'kafka-schema-registry-client', '8.1.1')
+require_jar('io.confluent', 'kafka-avro-serializer', '8.2.0')
+require_jar('org.apache.kafka', 'kafka-clients', '4.2.0')
+require_jar('io.confluent', 'kafka-schema-serializer', '8.2.0')
+require_jar('io.confluent', 'kafka-schema-registry-client', '8.2.0')
 require_jar('org.apache.avro', 'avro', '1.12.1')
 require_jar('org.apache.commons', 'commons-compress', '1.28.0')
 require_jar('com.google.guava', 'guava', '32.0.1-jre')
-require_jar('io.confluent', 'logredactor', '1.0.13')
-require_jar('io.confluent', 'common-utils', '8.1.1')
+require_jar('io.confluent', 'logredactor', '1.0.17')
+require_jar('io.confluent', 'common-utils', '8.2.0')
 require_jar('com.github.luben', 'zstd-jni', '1.5.6-10')
+require_jar('at.yawk.lz4', 'lz4-java', '1.10.1')
 require_jar('org.xerial.snappy', 'snappy-java', '1.1.10.7')
 require_jar('org.apache.httpcomponents.client5', 'httpclient5', '5.5')
 require_jar('org.slf4j', 'slf4j-api', '2.0.17')
@@ -28,7 +28,7 @@ require_jar('org.checkerframework', 'checker-qual', '3.33.0')
 require_jar('com.google.errorprone', 'error_prone_annotations', '2.18.0')
 require_jar('com.google.j2objc', 'j2objc-annotations', '2.8')
 require_jar('com.google.re2j', 're2j', '1.6')
-require_jar('io.confluent', 'logredactor-metrics', '1.0.13')
+require_jar('io.confluent', 'logredactor-metrics', '1.0.17')
 require_jar('com.eclipsesource.minimal-json', 'minimal-json', '0.9.5')
 require_jar('commons-codec', 'commons-codec', '1.19.0')
 require_jar('commons-io', 'commons-io', '2.20.0')
```
data/logstash-integration-kafka.gemspec
CHANGED

```diff
@@ -1,6 +1,6 @@
 Gem::Specification.new do |s|
   s.name = 'logstash-integration-kafka'
-  s.version = '12.0.5'
+  s.version = '12.1.0'
   s.licenses = ['Apache-2.0']
   s.summary = "Integration with Kafka - input and output plugins"
   s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline "+
```
data/spec/fixtures/kafka_sasl_config.properties
ADDED

```diff
@@ -0,0 +1,35 @@
+# SASL Configuration for integration tests
+listeners=PLAINTEXT://:9092,CONTROLLER://:9093,SASL_PLAINTEXT://:9094,SASL_SSL://:9095
+advertised.listeners=PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9094,SASL_SSL://localhost:9095
+listener.security.protocol.map=PLAINTEXT:PLAINTEXT,CONTROLLER:PLAINTEXT,SASL_PLAINTEXT:SASL_PLAINTEXT,SASL_SSL:SASL_SSL
+
+# Enable multiple SASL mechanisms
+sasl.enabled.mechanisms=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
+sasl.mechanism.inter.broker.protocol=PLAIN
+
+# JAAS config for SASL_PLAINTEXT listener
+listener.name.sasl_plaintext.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
+  username="admin" \
+  password="admin-secret" \
+  user_admin="admin-secret" \
+  user_logstash="logstash-secret";
+listener.name.sasl_plaintext.scram-sha-256.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
+  username="admin" \
+  password="admin-secret";
+listener.name.sasl_plaintext.scram-sha-512.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
+  username="admin" \
+  password="admin-secret";
+
+# JAAS config for SASL_SSL listener
+listener.name.sasl_ssl.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
+  username="admin" \
+  password="admin-secret" \
+  user_admin="admin-secret" \
+  user_logstash="logstash-secret";
+listener.name.sasl_ssl.scram-sha-256.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
+  username="admin" \
+  password="admin-secret";
+listener.name.sasl_ssl.scram-sha-512.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
+  username="admin" \
+  password="admin-secret";
+
```
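For reference, a Logstash-side configuration matching this fixture's SASL_PLAINTEXT listener and `logstash` test user might look like this (a sketch assembled from the fixture and the integration spec, not a file shipped in this release):

```
input {
  kafka {
    bootstrap_servers => "localhost:9094"
    topics            => ["logstash_integration_sasl_topic"]
    security_protocol => "SASL_PLAINTEXT"
    sasl_mechanism    => "PLAIN"
    sasl_jaas_config  => 'org.apache.kafka.common.security.plain.PlainLoginModule required username="logstash" password="logstash-secret";'
  }
}
```

Since 12.0.5 the plugin redacts `sasl_jaas_config` in debug logs, so the password here is not echoed at startup.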
data/spec/integration/inputs/kafka_spec.rb
CHANGED

```diff
@@ -50,6 +50,18 @@ describe "inputs/kafka", :integration => true do
     { 'topics' => ['logstash_integration_topic_plain'], 'group_id' => group_id_5,
       'auto_offset_reset' => 'earliest', 'enable_auto_commit' => 'false' }
   end
+  let(:sasl_config) do
+    { 'topics' => ['logstash_integration_sasl_topic'], 'group_id' => group_id_6,
+      'bootstrap_servers' => 'localhost:9094',
+      'auto_offset_reset' => 'earliest', 'security_protocol' => 'SASL_PLAINTEXT' }
+  end
+  let(:sasl_ssl_config) do
+    { 'topics' => ['logstash_integration_sasl_topic'], 'group_id' => group_id_6,
+      'bootstrap_servers' => 'localhost:9095',
+      'auto_offset_reset' => 'earliest', 'security_protocol' => 'SASL_SSL',
+      'ssl_truststore_location' => File.join(Dir.pwd, 'tls_repository/clienttruststore.jks'),
+      'ssl_truststore_password' => 'changeit' }
+  end
   let(:timeout_seconds) { 30 }
   let(:num_events) { 103 }
@@ -264,53 +276,63 @@ describe "inputs/kafka", :integration => true do
     end
   end
 
-
-
-
-
-
-
-      }
+  context 'SASL authentication' do
+    shared_examples 'consumes all messages' do
+      it 'authenticates and consumes all messages' do
+        queue = consume_messages(consumer_config, timeout: timeout_seconds, event_count: num_events)
+        expect(queue.length).to eq(num_events)
+      end
     end
 
-
-
-
-
-
+    context 'over SASL_PLAINTEXT' do
+      context 'with PLAIN mechanism' do
+        let(:consumer_config) do
+          sasl_config.merge(
+            'sasl_mechanism' => 'PLAIN',
+            'sasl_jaas_config' => 'org.apache.kafka.common.security.plain.PlainLoginModule required username="logstash" password="logstash-secret";',
+          )
+        end
+
+        include_examples 'consumes all messages'
       end
 
-
-
-
-
+      context 'with SCRAM-SHA-256 mechanism' do
+        let(:consumer_config) do
+          sasl_config.merge(
+            'sasl_mechanism' => 'SCRAM-SHA-256',
+            'sasl_jaas_config' => 'org.apache.kafka.common.security.scram.ScramLoginModule required username="logstash" password="logstash-secret";',
+          )
+        end
+
+        include_examples 'consumes all messages'
       end
-      end
 
-
-
-
+      context 'with SCRAM-SHA-512 mechanism' do
+        let(:consumer_config) do
+          sasl_config.merge(
+            'sasl_mechanism' => 'SCRAM-SHA-512',
+            'sasl_jaas_config' => 'org.apache.kafka.common.security.scram.ScramLoginModule required username="logstash" password="logstash-secret";',
+          )
+        end
 
-
+        include_examples 'consumes all messages'
+      end
     end
 
-    context '
-
-
-
-
-      password="
-
-
-      JAAS
-      end
-      let(:consumer_config) { base_config.merge('sasl_jaas_config' => jaas_config_value) }
+    context 'over SASL_SSL' do
+      context 'with PLAIN mechanism' do
+        let(:consumer_config) do
+          sasl_ssl_config.merge(
+            'sasl_mechanism' => 'PLAIN',
+            'sasl_jaas_config' => 'org.apache.kafka.common.security.plain.PlainLoginModule required username="logstash" password="logstash-secret";',
+          )
+        end
 
-
+        include_examples 'consumes all messages'
+      end
     end
   end
 
-
   context "static membership 'group.instance.id' setting" do
     let(:base_config) do
       {
```
data/spec/unit/inputs/kafka_spec.rb
CHANGED

```diff
@@ -414,6 +414,30 @@ describe LogStash::Inputs::Kafka do
     end
   end
 
+  context 'auto_offset_reset config' do
+    let(:base_props) do
+      {
+        'bootstrap.servers' => 'localhost:9092',
+        'key.deserializer' => 'org.apache.kafka.common.serialization.StringDeserializer',
+        'value.deserializer' => 'org.apache.kafka.common.serialization.StringDeserializer'
+      }
+    end
+
+    %w[earliest latest none by_duration:PT1H by_duration:P1D by_duration:PT30S by_duration:PT15M].each do |value|
+      it "accepts '#{value}' as a valid auto.offset.reset value" do
+        props = base_props.merge('auto.offset.reset' => value)
+        expect { org.apache.kafka.clients.consumer.ConsumerConfig.new(props) }.to_not raise_error
+      end
+    end
+
+    %w[by_duration: by_duration:INVALID by_duration:-PT1H by_duration:abc].each do |value|
+      it "rejects '#{value}' as an invalid auto.offset.reset value" do
+        props = base_props.merge('auto.offset.reset' => value)
+        expect { org.apache.kafka.clients.consumer.ConsumerConfig.new(props) }.to raise_error(org.apache.kafka.common.config.ConfigException)
+      end
+    end
+  end
+
   context 'string integer config' do
     let(:config) { super().merge('session_timeout_ms' => '25000',
                                  'max_poll_interval_ms' => '345000',
```
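The accepted and rejected values in these specs are validated by the Kafka client's `ConsumerConfig`. As a rough illustration of the same rules outside the JVM, here is a simplified Ruby stand-in (the regex is an approximation for positive `PnDTnHnMn.nS` durations, not Kafka's actual parser):

```ruby
# Simplified stand-in for Kafka's auto.offset.reset validation (illustrative only).
# Accepts the fixed strategies plus by_duration:<ISO 8601 duration>; rejects
# empty, malformed, and negative durations, mirroring the spec cases above.
ISO8601_DURATION = /\AP(?!\z)(?:\d+D)?(?:T(?!\z)(?:\d+H)?(?:\d+M)?(?:\d+(?:\.\d+)?S)?)?\z/

def valid_offset_reset?(value)
  return true if %w[earliest latest none].include?(value)
  if value.start_with?('by_duration:')
    # A leading '-' (negative duration) can never match the pattern above.
    duration = value.delete_prefix('by_duration:')
    return !duration.match(ISO8601_DURATION).nil?
  end
  false
end
```

A real deployment should rely on the Kafka client's own validation (exercised by the specs above) rather than a regex like this.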
data/vendor/jar-dependencies/io/confluent/logredactor-metrics/1.0.17/logredactor-metrics-1.0.17.jar
ADDED

Binary file
metadata
CHANGED

```diff
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: logstash-integration-kafka
 version: !ruby/object:Gem::Version
-  version: 12.0.5
+  version: 12.1.0
 platform: java
 authors:
 - Elastic
 bindir: bin
 cert_chain: []
-date: 2026-03-
+date: 2026-03-18 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: logstash-core-plugin-api
@@ -236,6 +236,7 @@ files:
 - spec/check_docs_spec.rb
 - spec/fixtures/jaas.config
 - spec/fixtures/jaas3.config
+- spec/fixtures/kafka_sasl_config.properties
 - spec/fixtures/pwd
 - spec/fixtures/trust-store_stub.jks
 - spec/integration/inputs/kafka_spec.rb
@@ -260,12 +261,12 @@ files:
 - vendor/jar-dependencies/com/google/re2j/re2j/1.6/re2j-1.6.jar
 - vendor/jar-dependencies/commons-codec/commons-codec/1.19.0/commons-codec-1.19.0.jar
 - vendor/jar-dependencies/commons-io/commons-io/2.20.0/commons-io-2.20.0.jar
-- vendor/jar-dependencies/io/confluent/common-utils/8.1.1/common-utils-8.1.1.jar
-- vendor/jar-dependencies/io/confluent/kafka-avro-serializer/8.1.1/kafka-avro-serializer-8.1.1.jar
-- vendor/jar-dependencies/io/confluent/kafka-schema-registry-client/8.1.1/kafka-schema-registry-client-8.1.1.jar
-- vendor/jar-dependencies/io/confluent/kafka-schema-serializer/8.1.1/kafka-schema-serializer-8.1.1.jar
-- vendor/jar-dependencies/io/confluent/logredactor-metrics/1.0.13/logredactor-metrics-1.0.13.jar
-- vendor/jar-dependencies/io/confluent/logredactor/1.0.13/logredactor-1.0.13.jar
+- vendor/jar-dependencies/io/confluent/common-utils/8.2.0/common-utils-8.2.0.jar
+- vendor/jar-dependencies/io/confluent/kafka-avro-serializer/8.2.0/kafka-avro-serializer-8.2.0.jar
+- vendor/jar-dependencies/io/confluent/kafka-schema-registry-client/8.2.0/kafka-schema-registry-client-8.2.0.jar
+- vendor/jar-dependencies/io/confluent/kafka-schema-serializer/8.2.0/kafka-schema-serializer-8.2.0.jar
+- vendor/jar-dependencies/io/confluent/logredactor-metrics/1.0.17/logredactor-metrics-1.0.17.jar
+- vendor/jar-dependencies/io/confluent/logredactor/1.0.17/logredactor-1.0.17.jar
 - vendor/jar-dependencies/io/swagger/core/v3/swagger-annotations/2.2.29/swagger-annotations-2.2.29.jar
 - vendor/jar-dependencies/org/apache/avro/avro/1.12.1/avro-1.12.1.jar
 - vendor/jar-dependencies/org/apache/commons/commons-compress/1.28.0/commons-compress-1.28.0.jar
@@ -273,7 +274,7 @@ files:
 - vendor/jar-dependencies/org/apache/httpcomponents/client5/httpclient5/5.5/httpclient5-5.5.jar
 - vendor/jar-dependencies/org/apache/httpcomponents/core5/httpcore5-h2/5.3.4/httpcore5-h2-5.3.4.jar
 - vendor/jar-dependencies/org/apache/httpcomponents/core5/httpcore5/5.3.4/httpcore5-5.3.4.jar
-- vendor/jar-dependencies/org/apache/kafka/kafka-clients/4.1.0/kafka-clients-4.1.0.jar
+- vendor/jar-dependencies/org/apache/kafka/kafka-clients/4.2.0/kafka-clients-4.2.0.jar
 - vendor/jar-dependencies/org/checkerframework/checker-qual/3.33.0/checker-qual-3.33.0.jar
 - vendor/jar-dependencies/org/slf4j/slf4j-api/2.0.17/slf4j-api-2.0.17.jar
 - vendor/jar-dependencies/org/xerial/snappy/snappy-java/1.1.10.7/snappy-java-1.1.10.7.jar
@@ -307,6 +308,7 @@ test_files:
 - spec/check_docs_spec.rb
 - spec/fixtures/jaas.config
 - spec/fixtures/jaas3.config
+- spec/fixtures/kafka_sasl_config.properties
 - spec/fixtures/pwd
 - spec/fixtures/trust-store_stub.jks
 - spec/integration/inputs/kafka_spec.rb
```