logstash-filter-transaction_time 1.0.4 → 1.0.5

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
-   metadata.gz: '03089c0ec29302fde4807201a27eedabd29f9dc5'
-   data.tar.gz: aadea35e5b5cdd92321e1bbe7310faa0252c1a02
+   metadata.gz: 074c50d41cd992c872b665bb63497ceee820cf5a
+   data.tar.gz: eeab3271345ab338bddcce72683b744a04a6cf9c
  SHA512:
-   metadata.gz: f8b2558058d717b92ba30c3a4b0172997d138702169b12cf3cbc46af193d922d22fe11dd8c9fd8bb8493c3e808e3a85ff5a4a30d32fabc00544b83adf0dc3feb
-   data.tar.gz: 9a1e6678888c794ce841ad2d9a1dbbe6ba086ca72d2cd03ce6eea4c043a5c8ab1a88412996064d54c7d680dca26cd1d26e47196864e2f4d5b6b91d46759344ad
+   metadata.gz: f244fe40c2e89b9a7a9f7e95441b7ce6b5a9fe2006d3d1fc6ab77d892b095969888552aa40003a776fda376b244a2f8dd9980a9e5b5a1423c92574e3f2354537
+   data.tar.gz: e9c27b093beac560791750525774e74feac4fded750ac4c3a804efc9d99e69a7473a90f707129122a2585146970091a4f3970074af1fdcdf50b12d121c6fae06
data/README.md CHANGED
@@ -8,7 +8,122 @@ But instead of defining a start and an end for a transaction - only the unique i
  By default the transaction time is stored together with the unique identifier in a new event, which may be stored in the same or another index.
  The information from the first, last, oldest or newest event may be attached with the new transaction_time event.
 
-
+ # Usage
+
+ The TransactionTime filter measures the time between two events in a transaction.
+
+ This filter is meant to be used instead of logstash-filter-elapsed
+ when the order of the events in a transaction cannot be guaranteed,
+ which is most likely the case when multiple pipeline workers are used
+ and a large number of events enter the pipeline in rapid succession.
+
+ ## The configuration:
+ ```ruby
+ filter {
+   transaction_time {
+     uid_field => "Transaction-unique field"
+     ignore_uids => []
+     timeout => seconds
+     timestamp_tag => "name of timestamp"
+     replace_timestamp => ['keep', 'oldest', 'newest']
+     filter_tag => "transaction tag"
+     attach_event => ['first','last','oldest','newest','none']
+     release_expired => [true,false]
+     store_data_oldest => []
+     store_data_newest => []
+     periodic_flush => [true,false]
+   }
+ }
+ ```
+ - `uid_field`
+ The only required parameter is `uid_field`, which is used to identify
+ the events in a transaction. A transaction is considered complete
+ when two events with the same UID have been captured.
+ It is when a transaction completes that the transaction time is calculated.
+
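+ For example, a minimal configuration might look like this (the field name `transactionUID` is just an illustration):
+ ```ruby
+ filter {
+   transaction_time {
+     # The only required setting: the field holding the transaction UID
+     uid_field => "transactionUID"
+   }
+ }
+ ```
+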
+ - `ignore_uids`
+ The ignore_uids parameter takes an array of strings. These strings represent specific UIDs
+ that should be ignored. This can be useful for ignoring parsing errors.
+ Example:
+ ```ruby
+ ignore_uids => ["%{[transactionUID][0]}", ""]
+ ```
+ This will ignore events having an empty string or the literal string "%{[transactionUID][0]}" in the uid_field.
+
+ - `timeout`
+ The timeout parameter determines the maximum length of a transaction.
+ It is set to 300 (5 minutes) by default.
+ The transaction will not be recorded if the timeout duration is exceeded.
+ The value of this parameter also affects the memory footprint of the plugin:
+ the longer the timeout, the longer unmatched events are kept in memory.
+
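+ As a sketch, a pipeline tracking short-lived transactions might lower the timeout to reduce memory usage (the values and field name here are illustrative):
+ ```ruby
+ filter {
+   transaction_time {
+     uid_field => "transactionUID"
+     # Drop unmatched transactions after 60 seconds instead of the default 300
+     timeout => 60
+   }
+ }
+ ```
+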
+ - `timestamp_tag`
+ The timestamp_tag parameter may be used to select a specific field in the events to use
+ when calculating the transaction time. The default field is @timestamp.
+
+ - `replace_timestamp`
+ The new event created when a transaction completes may set its own timestamp
+ to when it completes (default) or it may use the timestamp of one of the events in the transaction.
+ The parameter replace_timestamp is used to specify this behaviour.
+
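+ For instance, to stamp the generated event with the timestamp of the oldest event in the transaction (the field name is illustrative):
+ ```ruby
+ filter {
+   transaction_time {
+     uid_field => "transactionUID"
+     # Use the oldest event's timestamp instead of the completion time
+     replace_timestamp => 'oldest'
+   }
+ }
+ ```
+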
+ - `filter_tag`
+ Since this plugin exclusively calculates the time between events in a transaction,
+ it may be wise to filter out the events that are in fact not transactions.
+ This will help reduce both the memory footprint and processing time of this plugin,
+ especially if the pipeline receives a lot of non-transactional events.
+ You could use grok and/or mutate to apply this filter like this:
+ ```ruby
+ filter {
+   grok {
+     match => { "message" => "(?<message_type>.*)\t(?<msgbody>.*)\t+UID:%{UUID:UID}" }
+   }
+   if [message_type] in ["MaterialIdentified","Recipe","Result","ReleaseMaterial"] {
+     mutate {
+       add_tag => "Transaction"
+     }
+   }
+   transaction_time {
+     uid_field => "UID"
+     filter_tag => "Transaction"
+   }
+ }
+ ```
+ In the example, grok is used to identify the message_type, and the tag "Transaction" is then added for a specific set of messages. This tag is used in transaction_time as filter_tag: only the messages carrying this tag will be evaluated.
+ > **Note**: Do not use the reserved tag name "_TransactionTime_", which is added to all events created by this plugin.
+
+ - `attach_event`
+ The attach_event parameter can be used to append information from one of the events to the
+ new transaction_time event. The default is to not attach anything.
+ The memory footprint is kept to a minimum by using the default value.
+
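+ For example, to copy information from the oldest event in the transaction onto the generated transaction_time event (the field name is illustrative):
+ ```ruby
+ filter {
+   transaction_time {
+     uid_field => "transactionUID"
+     # Attach the oldest event; 'none' (the default) keeps memory usage low
+     attach_event => 'oldest'
+   }
+ }
+ ```
+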
+ - `release_expired`
+ The release_expired parameter determines whether the first event in an expired transaction
+ should be released or not. Defaults to true.
+
+ - `store_data_oldest/store_data_newest`
+ The parameters store_data_oldest and store_data_newest are used to attach
+ specific fields from the oldest and newest events, respectively. An example of this could be:
+ ```ruby
+ store_data_oldest => ["@timestamp", "work_unit", "work_center", "message_type"]
+ store_data_newest => ["@timestamp", "work_unit", "work_center", "message_type"]
+ ```
+ This will result in the generated transaction event including the specified fields from the oldest and newest events, in hashes named "oldest" and "newest" under the hash named "transaction_data".
+ Example of output data:
+ ```
+ "transaction_data" => {
+     "oldest" => {
+         "message_type" => "MaterialIdentified",
+         "@timestamp" => 2018-10-31T07:36:23.072Z,
+         "work_unit" => "WT000743",
+         "work_center" => "WR000046"
+     },
+     "newest" => {
+         "message_type" => "Recipe",
+         "@timestamp" => 2018-10-31T07:36:28.188Z,
+         "work_unit" => "WT000743",
+         "work_center" => "WR000046"
+     }
+ }
+ ```
 
  # Logstash Plugin
 
@@ -14,6 +14,7 @@ require "logstash/namespace"
  # filter {
  #   transaction_time {
  #     uid_field => "Transaction-unique field"
+ #     ignore_uids => []
  #     timeout => seconds
  #     timestamp_tag => "name of timestamp"
  #     replace_timestamp => ['keep', 'oldest', 'newest']
@@ -31,6 +32,12 @@ require "logstash/namespace"
  # the events in a transaction. A transaction is considered complete
  # when two events with the same UID have been captured.
  # It is when a transaction completes that the transaction time is calculated.
+ #
+ # The ignore_uids parameter takes an array of strings. These strings represent specific UIDs
+ # that should be ignored. This can be useful for ignoring parsing errors.
+ # Example:
+ # ignore_uids => ["%{[transactionUID][0]}", ""]
+ # Will ignore events having an empty string or "%{[transactionUID][0]}" in the uid_field.
  #
  # The timeout parameter determines the maximum length of a transaction.
  # It is set to 300 (5 minutes) by default.
@@ -119,6 +126,8 @@ class LogStash::Filters::TransactionTime < LogStash::Filters::Base
 
  # The name of the UID-field used to identify transaction-pairs
  config :uid_field, :validate => :string, :required => true
+ # Array of UIDs to ignore (useful for ignoring parse-errors).
+ config :ignore_uids, :validate => :array, :default => []
  # The amount of time (in seconds) before a transaction is dropped. Defaults to 5 minutes
  config :timeout, :validate => :number, :default => 300
  # What tag to use as timestamp when calculating the elapsed transaction time. Defaults to @timestamp
@@ -177,17 +186,18 @@ class LogStash::Filters::TransactionTime < LogStash::Filters::Base
      (event.get("tags").nil? || !event.get("tags").include?(TRANSACTION_TIME_EXPIRED_TAG)) &&
      (@filter_tag.nil? || (!event.get("tags").nil? && event.get("tags").include?(@filter_tag))))
 
-     @mutex.synchronize do
-       if(!@transactions.has_key?(uid))
-         @transactions[uid] = LogStash::Filters::TransactionTime::Transaction.new(event, uid, @storeEvent)
-
-       else #End of transaction
-         @transactions[uid].addSecond(event,@storeEvent)
-         transaction_event = new_transactiontime_event(@transactions[uid], @attachData)
-         filter_matched(transaction_event)
-         yield transaction_event if block_given?
-         @transactions.delete(uid)
+     if not @ignore_uids.include?(uid)
+       @mutex.synchronize do
+         if(!@transactions.has_key?(uid))
+           @transactions[uid] = LogStash::Filters::TransactionTime::Transaction.new(event, uid, @storeEvent)
+
+         else #End of transaction
+           @transactions[uid].addSecond(event,@storeEvent)
+           transaction_event = new_transactiontime_event(@transactions[uid], @attachData)
+           filter_matched(transaction_event)
+           yield transaction_event if block_given?
+           @transactions.delete(uid)
+         end
        end
      end
    end
@@ -207,10 +217,14 @@ class LogStash::Filters::TransactionTime < LogStash::Filters::Base
      expired_elements = remove_expired_elements()
    end
 
-   expired_elements.each do |element|
-     filter_matched(element)
+   if @release_expired
+     expired_elements.each do |element|
+       filter_matched(element)
+     end
+     #print("Exp" + options.to_s + expired_elements.to_s)
+     return expired_elements
    end
-   return expired_elements
+   return []
    #yield expired_elements if block_given?
    #return create_expired_events_from(expired_elements)
  end
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
    s.name          = 'logstash-filter-transaction_time'
-   s.version       = '1.0.4'
+   s.version       = '1.0.5'
    s.licenses      = ['Apache-2.0','Apache License (2.0)']
    s.summary       = 'Writes the time difference between two events in a transaction to a new event'
    s.description   = 'This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program. Source-code and documentation available at github: https://github.com/AddinITAB/logstash-filter-transaction_time'
@@ -175,6 +175,26 @@ describe LogStash::Filters::TransactionTime do
      insist { @filter.transactions.size } == 1
    end
  end
+ describe "Setup release_expired = false" do
+   it "never releases any expired events when flush is called" do
+     config = {"release_expired" => false}
+     @config.merge!(config)
+
+     @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+     insist { @filter.transactions.size } == 1
+     @filter.filter(event("message" => "Log message", UID_FIELD => uid2, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+     insist { @filter.transactions.size } == 2
+
+     #Looks like flush doesn't have config-scope. Setting release_expired directly instead of by config. Will it work as intended when using only config?
+     @filter.release_expired = false
+     ((TIMEOUT/5)+1).times do
+       flushRes = @filter.flush({"from" => "test" })
+       insist { (flushRes.any?) } == false
+       #insist { @filter.flush().nil? }
+     end
+     insist { @filter.transactions.size } == 0
+   end
+ end
  end
 
  context "Testing Timestamp Override."
@@ -185,6 +205,7 @@ describe LogStash::Filters::TransactionTime do
    config = {"replace_timestamp" => 'oldest'}
    @config.merge!(config)
 
+
    @filter = LogStash::Filters::TransactionTime.new(@config)
    @filter.register
 
@@ -377,4 +398,34 @@ describe LogStash::Filters::TransactionTime do
      end
    end
  end
+ context "Testing ignore_uids." do
+   nokUid = "Erroneous UID"
+   uid = "9ACCA7B7-D0E9-4E52-A023-9D588E5BE42C"
+   describe "Config ignore_uids set" do
+     it "will not accept events with specified uid as transactions" do
+       config = {"ignore_uids" => ["Erroneous UID"]}
+       @config.merge!(config)
+
+       @filter = LogStash::Filters::TransactionTime.new(@config)
+       @filter.register
+
+       @filter.filter(event("message" => "first", UID_FIELD => nokUid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+       @filter.filter(event("message" => "last", UID_FIELD => nokUid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+         insist { new_event } == nil
+       end
+     end
+     it "will accept other events as transactions" do
+       config = {"ignore_uids" => ["Erroneous UID"]}
+       @config.merge!(config)
+
+       @filter = LogStash::Filters::TransactionTime.new(@config)
+       @filter.register
+
+       @filter.filter(event("message" => "first", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+       @filter.filter(event("message" => "last", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+         insist { new_event } != nil
+       end
+     end
+   end
+ end
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-filter-transaction_time
  version: !ruby/object:Gem::Version
-   version: 1.0.4
+   version: 1.0.5
  platform: ruby
  authors:
  - Tommy Welleby
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-10-31 00:00:00.000000000 Z
+ date: 2018-11-02 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement