logstash-filter-kafka_time_machine 0.2.0 → 1.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 4afd292b610638a04cf07a7c272c59819c561d43941f3784a3db4d67a91d4e40
- data.tar.gz: 55b7b510230ff6549506b45a47fdd7457d7129a51188fd5820411bc76d1ccb5b
+ metadata.gz: f54f2603dd73f8dd42e4677f51fdb40942cdb8ac918dcca99bcec2f022e16db2
+ data.tar.gz: cd3ec446acde4a0c82890ea4d38416ec62ab4081fa6f301be1a12a40ca287ae5
  SHA512:
- metadata.gz: 511df4078b94243f61877b2d2391692ab4623b1dbff23f064cda7a14323ef5a84482d96755a75fb7f592834c9762cb4d3c9071596e88ffe1d850fd5b18a666ae
- data.tar.gz: e0c80cceba1e9f214cf481ab3d6aa65874007e5c80696401c610d1a519c79d6d7a97392f179b35461cef6b0b0b25c8ab59a030879af9280c29c10ddc57aa4bc1
+ metadata.gz: d65b9f86abd856b62dcb214eeba4a67074bfbb9bd67bdc2abdf3f99344df41bd788f029c7943d3d268c7eb88d05b5d7e5d606c86f0484a1558a8970a1299b07c
+ data.tar.gz: bb9dee15c942388e2261f85390f4ea0635723e5e6122162a5ebfa857676de0e312e683ba666d01326556310028b4fc0644e48e26e16fcaca79d092dbc542356b
data/README.md CHANGED
@@ -1,3 +1,258 @@
- # logstash-filter-kafka_time_machine
+ [![Gem Version](https://badge.fury.io/rb/logstash-filter-kafka_time_machine.svg)](https://badge.fury.io/rb/logstash-filter-kafka_time_machine)
 
- TBD
+ # Logstash Plugin: logstash-filter-kafka_time_machine
+
+ This is a filter plugin for [Logstash](https://github.com/elastic/logstash).
+
+ ## Description
+
+ This filter plugin adds additional fields to an existing event. These fields provide metadata for tracking log events that have traversed multiple Kafka and Logstash blocks for aggregation.
+ The typical flow for log events:
+
+ ```
+
+ Service Log ---> kafka_shipper <--- logstash_shipper ---> | ott_network_link | ---> kafka_indexer <--- logstash_indexer ---> elastic_search
+
+ ```
+
+ This filter leverages metadata inserted into the log event on both the `logstash_shipper` and `logstash_indexer` nodes to track the dwell time of log events through this pipeline. When used, the filter will add the following fields to the log event:
+
+ | Event Field                         | Output Type | Description                                                                               |
+ | ----------------------------------- | ----------- | ----------------------------------------------------------------------------------------- |
+ | [ktm][datacenter_shipper]           | string      | Echo of `kafka_datacenter_shipper`                                                         |
+ | [ktm][kafka_topic_shipper]          | string      | Echo of `kafka_topic_shipper`                                                              |
+ | [ktm][kafka_consumer_group_shipper] | string      | Echo of `kafka_consumer_group_shipper`                                                     |
+ | [ktm][kafka_topic_indexer]          | string      | Echo of `kafka_topic_indexer`                                                              |
+ | [ktm][kafka_consumer_group_indexer] | string      | Echo of `kafka_consumer_group_indexer`                                                     |
+ | [ktm][payload_size_bytes]           | number      | Size of the `payload` field in bytes, if present in the event; otherwise omitted           |
+ | [ktm][lag_shipper_ms]               | number      | Value of `logstash_kafka_read_time_shipper - kafka_append_time_shipper` in milliseconds    |
+ | [ktm][lag_indexer_ms]               | number      | Value of `logstash_kafka_read_time_indexer - kafka_append_time_indexer` in milliseconds    |
+ | [ktm][lag_total_ms]                 | number      | Value of `logstash_kafka_read_time_indexer - kafka_append_time_shipper` in milliseconds    |
+
+ What you do with the resulting `[ktm]` field is use-case specific. One approach currently in use is to index this data in the ES document for the log event it was generated against, which provides metrics at a per-document level in ES.
+
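+ For illustration, a fully populated `[ktm]` object might look like the following (all values are hypothetical):
+
+ ```
+ "ktm": {
+   "datacenter_shipper": "dc1",
+   "kafka_topic_shipper": "logs_dc1",
+   "kafka_consumer_group_shipper": "logstash_shipper_dc1",
+   "kafka_topic_indexer": "logs_aggregate",
+   "kafka_consumer_group_indexer": "logstash_indexer",
+   "payload_size_bytes": 1024,
+   "lag_shipper_ms": 250,
+   "lag_indexer_ms": 400,
+   "lag_total_ms": 3250,
+   "tags": ["ktm_lag_total"]
+ }
+ ```
+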
+ ## Kafka Time Machine Configuration Options
+
+ This plugin supports the following configuration options:
+
+ | Setting                                                                | Input Type | Required |
+ | ---------------------------------------------------------------------- | ---------- | -------- |
+ | [kafka_datacenter_shipper](#kafka_datacenter_shipper)                  | string     | Yes      |
+ | [kafka_topic_shipper](#kafka_topic_shipper)                            | string     | Yes      |
+ | [kafka_consumer_group_shipper](#kafka_consumer_group_shipper)          | string     | Yes      |
+ | [kafka_append_time_shipper](#kafka_append_time_shipper)                | string     | Yes      |
+ | [logstash_kafka_read_time_shipper](#logstash_kafka_read_time_shipper)  | string     | Yes      |
+ | [kafka_topic_indexer](#kafka_topic_indexer)                            | string     | Yes      |
+ | [kafka_consumer_group_indexer](#kafka_consumer_group_indexer)          | string     | Yes      |
+ | [kafka_append_time_indexer](#kafka_append_time_indexer)                | string     | Yes      |
+ | [logstash_kafka_read_time_indexer](#logstash_kafka_read_time_indexer)  | string     | Yes      |
+
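+ As a sketch, a filter block that sets every required option dynamically from event metadata might look like the following (the `[@metadata]` field names are illustrative; use whatever fields your pipeline populates):
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_datacenter_shipper         => "%{[@metadata][kafka_datacenter_shipper]}"
+     kafka_topic_shipper              => "%{[@metadata][kafka_topic_shipper]}"
+     kafka_consumer_group_shipper     => "%{[@metadata][kafka_consumer_group_shipper]}"
+     kafka_append_time_shipper        => "%{[@metadata][kafka_append_time_shipper]}"
+     logstash_kafka_read_time_shipper => "%{[@metadata][logstash_kafka_read_time_shipper]}"
+     kafka_topic_indexer              => "%{[@metadata][kafka_topic_indexer]}"
+     kafka_consumer_group_indexer     => "%{[@metadata][kafka_consumer_group_indexer]}"
+     kafka_append_time_indexer        => "%{[@metadata][kafka_append_time_indexer]}"
+     logstash_kafka_read_time_indexer => "%{[@metadata][logstash_kafka_read_time_indexer]}"
+   }
+ }
+ ```
+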
+ ### kafka_datacenter_shipper
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the datacenter the log event originated from, i.e. the datacenter `kafka_shipper` is in. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_datacenter_shipper => "static_field"
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_datacenter_shipper => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### kafka_topic_shipper
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the Kafka topic the log event was read from on the shipper. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_topic_shipper => "static_field"
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_topic_shipper => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### kafka_consumer_group_shipper
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the Kafka consumer group the log event was read from on the shipper. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_consumer_group_shipper => "static_field"
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_consumer_group_shipper => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### kafka_append_time_shipper
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the epoch time, in milliseconds, the log event was added to `kafka_shipper`. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_append_time_shipper => 1624394191000
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_append_time_shipper => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### logstash_kafka_read_time_shipper
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the epoch time, in milliseconds, the log event was read from `kafka_shipper`. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     logstash_kafka_read_time_shipper => 1624394191000
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     logstash_kafka_read_time_shipper => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### kafka_topic_indexer
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the Kafka topic the log event was read from on the indexer. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_topic_indexer => "static_field"
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_topic_indexer => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### kafka_consumer_group_indexer
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the Kafka consumer group the log event was read from on the indexer. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_consumer_group_indexer => "static_field"
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_consumer_group_indexer => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### kafka_append_time_indexer
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the epoch time, in milliseconds, the log event was added to `kafka_indexer`. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_append_time_indexer => 1624394191000
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     kafka_append_time_indexer => "%{[dynamic_field]}"
+   }
+ }
+ ```
+
+ ### logstash_kafka_read_time_indexer
+
+ - Value type is [string](https://www.elastic.co/guide/en/logstash/7.13/configuration-file-structure.html#string)
+ - There is no default value for this setting.
+
+ Provide the epoch time, in milliseconds, the log event was read from `kafka_indexer`. Field values can be static or dynamic:
+
+ ```
+ filter {
+   kafka_time_machine {
+     logstash_kafka_read_time_indexer => 1624394191000
+   }
+ }
+ ```
+
+ ```
+ filter {
+   kafka_time_machine {
+     logstash_kafka_read_time_indexer => "%{[dynamic_field]}"
+   }
+ }
+ ```
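+
+ How the `[@metadata]` timestamps get populated is deployment-specific. One possible sketch on the indexer side uses the Kafka input plugin's `decorate_events` option for the broker append time and a `ruby` filter for the read time (the topic, group, and `[@metadata]` target names here are illustrative):
+
+ ```
+ input {
+   kafka {
+     topics          => ["logs_aggregate"]
+     group_id        => "logstash_indexer"
+     decorate_events => true
+   }
+ }
+ filter {
+   mutate {
+     # Kafka record append time (epoch milliseconds) from the decorated event
+     copy => { "[@metadata][kafka][timestamp]" => "[@metadata][kafka_append_time_indexer]" }
+   }
+   ruby {
+     # Record when logstash read the event, in epoch milliseconds
+     code => "event.set('[@metadata][logstash_kafka_read_time_indexer]', (Time.now.to_f * 1000).to_i)"
+   }
+ }
+ ```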
data/lib/logstash/filters/kafka_time_machine.rb ADDED
@@ -0,0 +1,110 @@
+ # encoding: utf-8
+ require "logstash/filters/base"
+ require "logstash/namespace"
+ require "logstash/event"
+
+ class LogStash::Filters::KafkaTimeMachine < LogStash::Filters::Base
+
+   config_name "kafka_time_machine"
+
+   # Datacenter the kafka message originated from.
+   config :kafka_datacenter_shipper, :validate => :string, :required => true
+
+   # Kafka Topic on shipper datacenter
+   config :kafka_topic_shipper, :validate => :string, :required => true
+
+   # Kafka Consumer Group on shipper datacenter
+   config :kafka_consumer_group_shipper, :validate => :string, :required => true
+
+   # Time message was appended to kafka on shipper datacenter
+   config :kafka_append_time_shipper, :validate => :string, :required => true
+
+   # Time message read from kafka by logstash on shipper datacenter
+   config :logstash_kafka_read_time_shipper, :validate => :string, :required => true
+
+   # Kafka Topic on indexer datacenter
+   config :kafka_topic_indexer, :validate => :string, :required => true
+
+   # Kafka Consumer Group on indexer datacenter
+   config :kafka_consumer_group_indexer, :validate => :string, :required => true
+
+   # Time message was appended to kafka on indexer datacenter
+   config :kafka_append_time_indexer, :validate => :string, :required => true
+
+   # Time message read from kafka by logstash on indexer datacenter
+   config :logstash_kafka_read_time_indexer, :validate => :string, :required => true
+
+   public
+   def register
+
+   end
+
+   public
+   def filter(event)
+
+     # Extract shipper data and check for validity; note that kafka_datacenter_shipper is used for both shipper and indexer arrays
+     kafka_datacenter_shipper = event.sprintf(@kafka_datacenter_shipper)
+     kafka_topic_shipper = event.sprintf(@kafka_topic_shipper)
+     kafka_consumer_group_shipper = event.sprintf(@kafka_consumer_group_shipper)
+     kafka_append_time_shipper = Float(event.sprintf(@kafka_append_time_shipper)) rescue nil
+     logstash_kafka_read_time_shipper = Float(event.sprintf(@logstash_kafka_read_time_shipper)) rescue nil
+
+     kafka_shipper_array = Array[kafka_datacenter_shipper, kafka_topic_shipper, kafka_consumer_group_shipper, kafka_append_time_shipper, logstash_kafka_read_time_shipper]
+     @logger.debug("kafka_shipper_array: #{kafka_shipper_array}")
+
+     if (kafka_shipper_array.any? { |text| text.nil? || text.to_s.empty? })
+       @logger.debug("kafka_shipper_array invalid: Found null")
+       error_string_shipper = "Error in shipper data: #{kafka_shipper_array}"
+       shipper_valid = false
+     else
+       @logger.debug("kafka_shipper_array valid")
+       shipper_valid = true
+       logstash_kafka_read_time_shipper = logstash_kafka_read_time_shipper.to_i
+       kafka_append_time_shipper = kafka_append_time_shipper.to_i
+       kafka_shipper_lag_ms = logstash_kafka_read_time_shipper - kafka_append_time_shipper
+     end
+
+     # Extract indexer data and check for validity
+     kafka_topic_indexer = event.sprintf(@kafka_topic_indexer)
+     kafka_consumer_group_indexer = event.sprintf(@kafka_consumer_group_indexer)
+     kafka_append_time_indexer = Float(event.sprintf(@kafka_append_time_indexer)) rescue nil
+     logstash_kafka_read_time_indexer = Float(event.sprintf(@logstash_kafka_read_time_indexer)) rescue nil
+
+     kafka_indexer_array = Array[kafka_datacenter_shipper, kafka_topic_indexer, kafka_consumer_group_indexer, kafka_append_time_indexer, logstash_kafka_read_time_indexer]
+     @logger.debug("kafka_indexer_array: #{kafka_indexer_array}")
+
+     if (kafka_indexer_array.any? { |text| text.nil? || text.to_s.empty? })
+       @logger.debug("kafka_indexer_array invalid: Found null")
+       error_string_indexer = "Error in indexer data: #{kafka_indexer_array}"
+       indexer_valid = false
+     else
+       @logger.debug("kafka_indexer_array valid")
+       indexer_valid = true
+       logstash_kafka_read_time_indexer = logstash_kafka_read_time_indexer.to_i
+       kafka_append_time_indexer = kafka_append_time_indexer.to_i
+       kafka_indexer_lag_ms = logstash_kafka_read_time_indexer - kafka_append_time_indexer
+     end
+
+     if (shipper_valid == true && indexer_valid == true)
+       kafka_total_lag_ms = logstash_kafka_read_time_indexer - kafka_append_time_shipper
+       event.set("[ktm]", {"lag_total_ms" => kafka_total_lag_ms, "lag_indexer_ms" => kafka_indexer_lag_ms, "lag_shipper_ms" => kafka_shipper_lag_ms, "datacenter_shipper" => kafka_datacenter_shipper, "kafka_topic_indexer" => kafka_topic_indexer, "kafka_consumer_group_indexer" => kafka_consumer_group_indexer, "kafka_topic_shipper" => kafka_topic_shipper, "kafka_consumer_group_shipper" => kafka_consumer_group_shipper, "tags" => ["ktm_lag_total"] })
+     elsif (shipper_valid == true && indexer_valid == false)
+       event.set("[ktm]", {"lag_shipper_ms" => kafka_shipper_lag_ms, "datacenter_shipper" => kafka_datacenter_shipper, "kafka_topic_shipper" => kafka_topic_shipper, "kafka_consumer_group_shipper" => kafka_consumer_group_shipper, "tags" => ["ktm_lag_shipper"] })
+     elsif (indexer_valid == true && shipper_valid == false)
+       event.set("[ktm]", {"lag_indexer_ms" => kafka_indexer_lag_ms, "datacenter_shipper" => kafka_datacenter_shipper, "kafka_topic_indexer" => kafka_topic_indexer, "kafka_consumer_group_indexer" => kafka_consumer_group_indexer, "tags" => ["ktm_lag_indexer"] })
+     elsif (indexer_valid == false && shipper_valid == false)
+       @logger.debug("Error kafka_time_machine: Could not build valid response --> #{error_string_shipper}, #{error_string_indexer}")
+     end
+
+     # Add in the size of the payload field
+     if event.get("[payload]")
+       payload_bytesize = event.get("[payload]").bytesize
+       event.set("[ktm][payload_size_bytes]", payload_bytesize)
+     end
+
+     # filter_matched should go in the last line of our successful code
+     filter_matched(event)
+
+   end # def filter
+
+ end # class LogStash::Filters::KafkaTimeMachine
data/logstash-filter-kafka_time_machine.gemspec CHANGED
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
    s.name = 'logstash-filter-kafka_time_machine'
-   s.version = '0.2.0'
+   s.version = '1.0.0'
    s.licenses = ['Apache-2.0']
    s.summary = "Calculate total time of logstash event that traversed 2 Kafka queues from a shipper site to an indexer site"
    s.description = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-filter-kafka_time_machine
  version: !ruby/object:Gem::Version
-   version: 0.2.0
+   version: 1.0.0
  platform: ruby
  authors:
  - Chris Foster
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2021-06-15 00:00:00.000000000 Z
+ date: 2021-06-22 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: logstash-core-plugin-api
@@ -54,7 +54,7 @@ extra_rdoc_files: []
  files:
  - Gemfile
  - README.md
- - lib/logstash/filters/kafkatimemachine.rb
+ - lib/logstash/filters/kafka_time_machine.rb
  - logstash-filter-kafka_time_machine.gemspec
  homepage: http://www.elastic.co/guide/en/logstash/current/index.html
  licenses:
data/lib/logstash/filters/kafkatimemachine.rb DELETED
@@ -1,78 +0,0 @@
- # encoding: utf-8
- require "logstash/filters/base"
- require "logstash/namespace"
- require "logstash/event"
-
- class LogStash::Filters::KafkaTimeMachine < LogStash::Filters::Base
-
-   config_name "kafkatimemachine"
-
-   public
-   def register
-
-   end
-
-   public
-   def filter(event)
-
-     # Extract shipper data and check for validity; note that kafka_datacenter_shipper is used for both shipper and indexer arrays
-     kafka_datacenter_shipper = event.get("[@metadata][kafka_datacenter_shipper]")
-     kafka_topic_shipper = event.get("[@metadata][kafka_topic_shipper]")
-     kafka_consumer_group_shipper = event.get("[@metadata][kafka_consumer_group_shipper]")
-     kafka_append_time_shipper = Float(event.get("[@metadata][kafka_append_time_shipper]")) rescue nil
-     logstash_kafka_read_time_shipper = Float(event.get("[@metadata][logstash_kafka_read_time_shipper]")) rescue nil
-
-     kafka_shipper_array = Array[kafka_datacenter_shipper, kafka_topic_shipper, kafka_consumer_group_shipper, kafka_append_time_shipper, logstash_kafka_read_time_shipper]
-     @logger.debug("kafka_shipper_array: #{kafka_shipper_array}")
-
-     if (kafka_shipper_array.any? { |text| text.nil? || text.to_s.empty? })
-       @logger.debug("kafka_shipper_array invalid: Found null")
-       error_string_shipper = "Error in shipper data: #{kafka_shipper_array}"
-       shipper_valid = false
-     else
-       @logger.debug("kafka_shipper_array valid")
-       shipper_valid = true
-       logstash_kafka_read_time_shipper = logstash_kafka_read_time_shipper.to_i
-       kafka_append_time_shipper = kafka_append_time_shipper.to_i
-       kafka_shipper_lag_ms = logstash_kafka_read_time_shipper - kafka_append_time_shipper
-     end
-
-     # Extract indexer data and check for validity
-     kafka_topic_indexer = event.get("[@metadata][kafka_topic_indexer]")
-     kafka_consumer_group_indexer = event.get("[@metadata][kafka_consumer_group_indexer]")
-     kafka_append_time_indexer = Float(event.get("[@metadata][kafka_append_time_indexer]")) rescue nil
-     logstash_kafka_read_time_indexer = Float(event.get("[@metadata][logstash_kafka_read_time_indexer]")) rescue nil
-
-     kafka_indexer_array = Array[kafka_datacenter_shipper, kafka_topic_indexer, kafka_consumer_group_indexer, kafka_append_time_indexer, logstash_kafka_read_time_indexer]
-     @logger.debug("kafka_indexer_array: #{kafka_indexer_array}")
-
-     if (kafka_indexer_array.any? { |text| text.nil? || text.to_s.empty? })
-       @logger.debug("kafka_indexer_array invalid: Found null")
-       error_string_indexer = "Error in indexer data: #{kafka_indexer_array}"
-       indexer_valid = false
-     else
-       @logger.debug("kafka_indexer_array valid")
-       indexer_valid = true
-       logstash_kafka_read_time_indexer = logstash_kafka_read_time_indexer.to_i
-       kafka_append_time_indexer = kafka_append_time_indexer.to_i
-       kafka_indexer_lag_ms = logstash_kafka_read_time_indexer - kafka_append_time_indexer
-     end
-
-     if (shipper_valid == true && indexer_valid == true)
-       kafka_total_lag_ms = logstash_kafka_read_time_indexer - kafka_append_time_shipper
-       event.set("[_ktm]", {"lag_total" => kafka_total_lag_ms, "lag_indexer" => kafka_indexer_lag_ms, "lag_shipper" => kafka_shipper_lag_ms, "datacenter_shipper" => kafka_datacenter_shipper, "kafka_topic_indexer" => kafka_topic_indexer, "kafka_consumer_group_indexer" => kafka_consumer_group_indexer, "kafka_topic_shipper" => kafka_topic_shipper, "kafka_consumer_group_shipper" => kafka_consumer_group_shipper, "tags" => ["ktm_lag_complete"] })
-     elsif (shipper_valid == true && indexer_valid == false)
-       event.set("[_ktm]", {"lag_shipper" => kafka_shipper_lag_ms, "datacenter_shipper" => kafka_datacenter_shipper, "kafka_topic_shipper" => kafka_topic_shipper, "kafka_consumer_group_shipper" => kafka_consumer_group_shipper, "tags" => ["ktm_lag_shipper"] })
-     elsif (indexer_valid == true && shipper_valid == false)
-       event.set("[_ktm]", {"lag_indexer" => kafka_indexer_lag_ms, "datacenter_shipper" => kafka_datacenter_shipper, "kafka_topic_indexer" => kafka_topic_indexer, "kafka_consumer_group_indexer" => kafka_consumer_group_indexer, "tags" => ["ktm_lag_indexer"] })
-     elsif (indexer_valid == false && shipper_valid == false)
-       @logger.error("Error kafkatimemachine: Could not build valid response --> #{error_string_shipper}, #{error_string_indexer}")
-       # event.set("[_ktm]", {"error_shipper" => error_string_shipper, "error_indexer" => error_string_indexer, "datacenter_shipper" => kafka_datacenter_shipper, "tags" => ["ktm_error"] })
-     end
-
-     # filter_matched should go in the last line of our successful code
-     filter_matched(event)
-
-   end # def filter
-
- end # class LogStash::Filters::KafkaTimeMachine