logstash-integration-kafka 10.5.0-java → 10.5.1-java
- checksums.yaml +4 -4
- data/CHANGELOG.md +5 -0
- data/docs/index.asciidoc +7 -2
- data/docs/input-kafka.asciidoc +20 -8
- data/docs/output-kafka.asciidoc +23 -10
- data/logstash-integration-kafka.gemspec +1 -1
- metadata +2 -2
checksums.yaml CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 93e42aba43873cfb3c8165a31b184e8178f738787329a36bcde79b234d2f642a
+  data.tar.gz: 41cca0d58b15c3072d51d009399e01e6854dea4c1d0cc3791e52430365599d73
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: b40a479faa7568a326b53fb721f692bdb4f66e0c87e3553d39b29456b3e8b61f4849d0c325c21c3a8247989c16467d4c72268673b0c7e7ee41de7af1d939a523
+  data.tar.gz: 1df3fe42154e9b035145a159fbca4605f3c82d8d61765af3242ec6a2ff571b9b9d2bd536e484dfba766ca353722976c0c81b1e8f70676f083cec0ab3b6983a4e
data/CHANGELOG.md CHANGED

@@ -1,3 +1,8 @@
+## 10.5.1
+  - [DOC]Replaced plugin_header file with plugin_header-integration file. [#46](https://github.com/logstash-plugins/logstash-integration-kafka/pull/46)
+  - [DOC]Update kafka client version across kafka integration docs [#47](https://github.com/logstash-plugins/logstash-integration-kafka/pull/47)
+  - [DOC]Replace hard-coded kafka client and doc path version numbers with attributes to simplify doc maintenance [#48](https://github.com/logstash-plugins/logstash-integration-kafka/pull/48)
+
 ## 10.5.0
   - Changed: retry sending messages only for retriable exceptions [#27](https://github.com/logstash-plugins/logstash-integration-kafka/pull/29)
 
data/docs/index.asciidoc CHANGED

@@ -1,6 +1,7 @@
 :plugin: kafka
 :type: integration
 :no_codec:
+:kafka_client: 2.4
 
 ///////////////////////////////////////////
 START - GENERATED VARIABLES, DO NOT EDIT!
@@ -21,11 +22,15 @@ include::{include_path}/plugin_header.asciidoc[]
 
 ==== Description
 
-The Kafka Integration Plugin provides integrated plugins for working with the
+The Kafka Integration Plugin provides integrated plugins for working with the
+https://kafka.apache.org/[Kafka] distributed streaming platform.
 
 - {logstash-ref}/plugins-inputs-kafka.html[Kafka Input Plugin]
 - {logstash-ref}/plugins-outputs-kafka.html[Kafka Output Plugin]
 
-This plugin uses Kafka Client
+This plugin uses Kafka Client {kafka_client}. For broker compatibility, see the official
+https://cwiki.apache.org/confluence/display/KAFKA/Compatibility+Matrix[Kafka
+compatibility reference]. If the linked compatibility wiki is not up-to-date,
+please contact Kafka support/community to confirm compatibility.
 
 :no_codec!:
data/docs/input-kafka.asciidoc CHANGED

@@ -1,6 +1,9 @@
+:integration: kafka
 :plugin: kafka
 :type: input
 :default_codec: plain
+:kafka_client: 2.4
+:kafka_client_doc: 24
 
 ///////////////////////////////////////////
 START - GENERATED VARIABLES, DO NOT EDIT!
@@ -17,15 +20,20 @@ END - GENERATED VARIABLES, DO NOT EDIT!
 
 === Kafka input plugin
 
-include::{include_path}/plugin_header.asciidoc[]
+include::{include_path}/plugin_header-integration.asciidoc[]
 
 ==== Description
 
 This input will read events from a Kafka topic.
 
-This plugin uses Kafka Client
+This plugin uses Kafka Client {kafka_client}. For broker compatibility, see the
+official
+https://cwiki.apache.org/confluence/display/KAFKA/Compatibility+Matrix[Kafka
+compatibility reference]. If the linked compatibility wiki is not up-to-date,
+please contact Kafka support/community to confirm compatibility.
 
-If you require features not yet available in this plugin (including client
+If you require features not yet available in this plugin (including client
+version upgrades), please file an issue with details about what you need.
 
 This input supports connecting to Kafka over:
 
@@ -46,9 +54,9 @@ the same `group_id`.
 Ideally you should have as many threads as the number of partitions for a perfect balance --
 more threads than partitions means that some threads will be idle
 
-For more information see https://kafka.apache.org/
+For more information see https://kafka.apache.org/{kafka_client_doc}/documentation.html#theconsumer
 
-Kafka consumer configuration: https://kafka.apache.org/
+Kafka consumer configuration: https://kafka.apache.org/{kafka_client_doc}/documentation.html#consumerconfigs
 
 ==== Metadata fields
 
@@ -59,7 +67,11 @@ The following metadata from Kafka broker are added under the `[@metadata]` field
 * `[@metadata][kafka][partition]`: Partition info for this message.
 * `[@metadata][kafka][offset]`: Original record offset for this message.
 * `[@metadata][kafka][key]`: Record key, if any.
-* `[@metadata][kafka][timestamp]`: Timestamp in the Record.
+* `[@metadata][kafka][timestamp]`: Timestamp in the Record.
+Depending on your broker configuration, this can be
+either when the record was created (default) or when it was received by the
+broker. See more about property log.message.timestamp.type at
+https://kafka.apache.org/{kafka_client_doc}/documentation.html#brokerconfigs
 
 Metadata is only added to the event if the `decorate_events` option is set to true (it defaults to false).
 
@@ -73,7 +85,7 @@ This plugin supports these configuration options plus the <<plugins-{type}s-{plugin}-common-options>>
 
 NOTE: Some of these options map to a Kafka option. Defaults usually reflect the Kafka default setting,
 and might change if Kafka's consumer defaults change.
-See the https://kafka.apache.org/
+See the https://kafka.apache.org/{kafka_client_doc}/documentation for more details.
 
 [cols="<,<,<",options="header",]
 |=======================================================================
@@ -421,7 +433,7 @@ partition ownership amongst consumer instances, supported options are:
 * `sticky`
 * `cooperative_sticky`
 
-These map to Kafka's corresponding https://kafka.apache.org/
+These map to Kafka's corresponding https://kafka.apache.org/{kafka_client_doc}/javadoc/org/apache/kafka/clients/consumer/ConsumerPartitionAssignor.html[`ConsumerPartitionAssignor`]
 implementations.
 
 [id="plugins-{type}s-{plugin}-poll_timeout_ms"]
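The `decorate_events` and metadata-field behavior described above can be sketched in a minimal pipeline. This is an illustrative fragment, not from the plugin docs: the broker address, topic name, and the `kafka_topic` target field are hypothetical.

```ruby
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # assumed broker address
    topics => ["example-topic"]             # hypothetical topic name
    group_id => "logstash"
    decorate_events => true                 # required for the [@metadata][kafka] fields
  }
}

filter {
  mutate {
    # [@metadata] is not serialized by most outputs, so copy the source
    # topic into a regular field if you want to keep it on the event
    add_field => { "kafka_topic" => "%{[@metadata][kafka][topic]}" }
  }
}
```

With `decorate_events` left at its default of `false`, none of the `[@metadata][kafka]` fields are populated and the `mutate` above would emit a literal `%{...}` string.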
data/docs/output-kafka.asciidoc CHANGED

@@ -1,6 +1,9 @@
+:integration: kafka
 :plugin: kafka
 :type: output
 :default_codec: plain
+:kafka_client: 2.4
+:kafka_client_doc: 24
 
 ///////////////////////////////////////////
 START - GENERATED VARIABLES, DO NOT EDIT!
@@ -17,15 +20,20 @@ END - GENERATED VARIABLES, DO NOT EDIT!
 
 === Kafka output plugin
 
-include::{include_path}/plugin_header.asciidoc[]
+include::{include_path}/plugin_header-integration.asciidoc[]
 
 ==== Description
 
 Write events to a Kafka topic.
 
-This plugin uses Kafka Client
+This plugin uses Kafka Client {kafka_client}. For broker compatibility, see the
+official
+https://cwiki.apache.org/confluence/display/KAFKA/Compatibility+Matrix[Kafka
+compatibility reference]. If the linked compatibility wiki is not up-to-date,
+please contact Kafka support/community to confirm compatibility.
 
-If you require features not yet available in this plugin (including client
+If you require features not yet available in this plugin (including client
+version upgrades), please file an issue with details about what you need.
 
 This output supports connecting to Kafka over:
 
@@ -36,9 +44,12 @@ By default security is disabled but can be turned on as needed.
 
 The only required configuration is the topic_id.
 
-The default codec is plain. Logstash will encode your events with not only the
+The default codec is plain. Logstash will encode your events with not only the
+message field but also with a timestamp and hostname.
+
+If you want the full content of your events to be sent as json, you should set
+the codec in the output configuration like this:
 
-If you want the full content of your events to be sent as json, you should set the codec in the output configuration like this:
 [source,ruby]
     output {
       kafka {
@@ -47,9 +58,11 @@ If you want the full content of your events to be sent as json
       }
     }
 
-For more information see
+For more information see
+https://kafka.apache.org/{kafka_client_doc}/documentation.html#theproducer
 
-Kafka producer configuration:
+Kafka producer configuration:
+https://kafka.apache.org/{kafka_client_doc}/documentation.html#producerconfigs
 
 [id="plugins-{type}s-{plugin}-options"]
 ==== Kafka Output Configuration Options
@@ -58,7 +71,7 @@ This plugin supports the following configuration options plus the <<plugins-{type}s-{plugin}-common-options>>
 
 NOTE: Some of these options map to a Kafka option. Defaults usually reflect the Kafka default setting,
 and might change if Kafka's producer defaults change.
-See the https://kafka.apache.org/
+See the https://kafka.apache.org/{kafka_client_doc}/documentation for more details.
 
 [cols="<,<,<",options="header",]
 |=======================================================================
@@ -328,9 +341,9 @@ Kafka down, etc).
 A value less than zero is a configuration error.
 
 Starting with version 10.5.0, this plugin will only retry exceptions that are a subclass of
-https://kafka.apache.org/
+https://kafka.apache.org/{kafka_client_doc}/javadoc/org/apache/kafka/common/errors/RetriableException.html[RetriableException]
 and
-https://kafka.apache.org/
+https://kafka.apache.org/{kafka_client_doc}/javadoc/org/apache/kafka/common/errors/InterruptException.html[InterruptException].
 If producing a message throws any other exception, an error is logged and the message is dropped without retrying.
 This prevents the Logstash pipeline from hanging indefinitely.
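The json-codec and retry behavior changed by 10.5.0 can be combined in one output block. A minimal sketch; the broker address and topic name are placeholders, and `retries` here is illustrative rather than a recommended value:

```ruby
output {
  kafka {
    bootstrap_servers => "localhost:9092"  # assumed broker address
    topic_id => "example-topic"            # hypothetical topic; topic_id is the only required option
    codec => json                          # send the full event content as JSON
    retries => 3                           # since 10.5.0, retried only for RetriableException
                                           # and InterruptException; other errors drop the message
  }
}
```

Without `codec => json`, the default plain codec encodes only the message field plus a timestamp and hostname, as described above.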
data/logstash-integration-kafka.gemspec CHANGED

@@ -1,6 +1,6 @@
 Gem::Specification.new do |s|
   s.name = 'logstash-integration-kafka'
-  s.version = '10.5.0'
+  s.version = '10.5.1'
   s.licenses = ['Apache-2.0']
   s.summary = "Integration with Kafka - input and output plugins"
   s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline "+
metadata CHANGED

@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-integration-kafka
 version: !ruby/object:Gem::Version
-  version: 10.5.0
+  version: 10.5.1
 platform: java
 authors:
 - Elastic
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2020-
+date: 2020-08-20 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement