logstash-filter-transaction_time 1.0.0

checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+   metadata.gz: df5ee7635060bbbad794fd0e0f833f461ba721b1
+   data.tar.gz: 7b6a091cf7af7fb18c113f7f51589c70d56c8536
+ SHA512:
+   metadata.gz: 85dfd46602f89ac8cfefdc04cf0884689df48521f1d36cfde35a9762881c7d399525128c5ab5b3eb2cfd725029ab1c3d201a789d0eea93bcfa5c1beb83b5a184
+   data.tar.gz: faa6f933e23971346e19b4724354f494c0bb66f0e23da2c10917d21cb5c8353ba217b19f3bb0b94b730eab54b060e58a9fc3947bde8e187ced12252f19ff336f
data/CHANGELOG.md ADDED
@@ -0,0 +1,2 @@
+ ## 0.1.0
+ - Plugin created with the logstash plugin generator
data/CONTRIBUTORS ADDED
@@ -0,0 +1,10 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Tommy Welleby - tommywelleby@gmail.com
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/DEVELOPER.md ADDED
@@ -0,0 +1,2 @@
+ # logstash-filter-transaction_time
+ Example filter plugin. This should help bootstrap your effort to write your own filter plugin!
data/Gemfile ADDED
@@ -0,0 +1,3 @@
+ source 'https://rubygems.org'
+ gemspec
+
data/LICENSE ADDED
@@ -0,0 +1,11 @@
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/README.md ADDED
@@ -0,0 +1,97 @@
+ # About
+ This plugin is a substitute for the logstash-filter-elapsed plugin.
+ The elapsed plugin requires the events of a transaction to arrive in a specified order and then decorates the last part of the transaction (or creates a new event) with the elapsed time.
+ The order in which the parts of a transaction are received cannot always be predicted when a pipeline uses multiple workers.
+ Hence the need for this plugin.
+ This plugin, like elapsed, uses a unique identifier to pair the events of a transaction.
+ But instead of defining a start and an end for a transaction, only the unique identifier is used.
+ This of course has some implications. The biggest one is that the last part of the transaction cannot be decorated, since it may or may not be the same type of event.
+ Instead the transaction time is stored together with the unique identifier, either in the same index or in another one.
+
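+ A minimal configuration sketch (the field name `transactionId` is illustrative, and the `timeout` and `attach_event` values shown differ from the defaults of 300 seconds and 'none'):
+
+ ```ruby
+ filter {
+   transaction_time {
+     uid_field => "transactionId"
+     timeout => 60
+     attach_event => "first"
+   }
+ }
+ ```
+
+ When a second event with the same `transactionId` arrives, a new event tagged `TransactionTime` is emitted with the elapsed time in the `transaction_time` field and the identifier in `transaction_uid`.
+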
+ # Logstash Plugin
+
+ This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+ It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
+
+ ## Documentation
+
+ Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).
+
+ - For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
+ - For more asciidoc formatting tips, see the excellent reference at https://github.com/elastic/docs#asciidoc-guide
+
+ ## Need Help?
+
+ Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+
+ ## Developing
+
+ ### 1. Plugin Development and Testing
+
+ #### Code
+ - To get started, you'll need JRuby with the Bundler gem installed.
+
+ - Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
+
+ - Install dependencies
+ ```sh
+ bundle install
+ ```
+
+ #### Test
+
+ - Update your dependencies
+
+ ```sh
+ bundle install
+ ```
+
+ - Run tests
+
+ ```sh
+ bundle exec rspec
+ ```
+
+ ### 2. Running your unpublished Plugin in Logstash
+
+ #### 2.1 Run in a local Logstash clone
+
+ - Edit Logstash `Gemfile` and add the local plugin path, for example:
+ ```ruby
+ gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+ ```
+ - Install the plugin
+ ```sh
+ bin/logstash-plugin install --no-verify
+ ```
+ - Run Logstash with your plugin
+ ```sh
+ bin/logstash -e 'filter {awesome {}}'
+ ```
+ At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+ #### 2.2 Run in an installed Logstash
+
+ You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+ - Build your plugin gem
+ ```sh
+ gem build logstash-filter-awesome.gemspec
+ ```
+ - Install the plugin from the Logstash home
+ ```sh
+ bin/logstash-plugin install /your/local/plugin/logstash-filter-awesome.gem
+ ```
+ - Start Logstash and proceed to test the plugin
+
+ ## Contributing
+
+ All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+ Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+ It is more important to the community that you are able to contribute.
+
+ For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/filters/transaction_time.rb ADDED
@@ -0,0 +1,214 @@
+ # encoding: utf-8
+ require "logstash/filters/base"
+ require "logstash/namespace"
+
+ # This filter measures the elapsed time between two events that share a
+ # unique identifier (a transaction) and emits that time in a new event.
+ class LogStash::Filters::TransactionTime < LogStash::Filters::Base
+
+   HOST_FIELD = "host"
+   TRANSACTION_TIME_TAG = "TransactionTime"
+   TRANSACTION_TIME_FIELD = "transaction_time"
+   TRANSACTION_UID_FIELD = "transaction_uid"
+   TIMESTAMP_START_FIELD = "timestamp_start"
+
+   config_name "transaction_time"
+
+   # The name of the UID field used to identify transaction pairs
+   config :uid_field, :validate => :string, :required => true
+   # The amount of time (in seconds) before a transaction is dropped. Defaults to 5 minutes
+   config :timeout, :validate => :number, :default => 300
+   # Which field to use as the timestamp when calculating the elapsed transaction time. Defaults to @timestamp
+   config :timestamp_tag, :validate => :string, :default => "@timestamp"
+   # Override the new event's timestamp with the oldest or newest timestamp, or keep the new one (set when Logstash has processed the event)
+   config :replace_timestamp, :validate => ['keep', 'oldest', 'newest'], :default => 'keep'
+   # Tag used to identify transactional events. If set, only events with the specified tag attached will be considered transactions and processed by the plugin
+   config :filter_tag, :validate => :string
+   # Which of the events in a transaction, if any, to attach to the output event.
+   # Defaults to 'none', which reduces the memory footprint by not adding the events to the transaction list.
+   config :attach_event, :validate => ['first','last','oldest','newest','none'], :default => 'none'
+
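+   # Events are paired on the value of uid_field: the first event seen with a
+   # given UID opens a transaction, and the next event with the same UID closes
+   # it and triggers the transaction-time event. Open transactions are aged by
+   # the periodic flush and dropped once they are older than `timeout` seconds.
+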
+   public
+   def register
+     # Add instance variables
+     @transactions = Hash.new
+     @mutex = Mutex.new
+     @storeEvent = !(@attach_event.eql?"none")
+     @@timestampTag = @timestamp_tag
+   end # def register
+
+   def transactions
+     @transactions
+   end
+   def self.timestampTag
+     @@timestampTag
+   end
+
+   public
+   def filter(event)
+
+     uid = event.get(@uid_field)
+     #return if uid.nil?
+
+     @logger.debug("Received UID", uid: uid)
+
+     if (@filter_tag.nil? || (!event.get("tags").nil? && event.get("tags").include?(@filter_tag)))
+       @mutex.synchronize do
+         if(!@transactions.has_key?(uid))
+           @transactions[uid] = LogStash::Filters::TransactionTime::Transaction.new(event, uid, @storeEvent)
+         else # End of transaction
+           @transactions[uid].addSecond(event,@storeEvent)
+           transaction_event = new_transactiontime_event(@transactions[uid])
+           filter_matched(transaction_event)
+           yield transaction_event if block_given?
+           @transactions.delete(uid)
+         end
+       end
+     end
+
+     event.set("uid_field", @uid_field)
+
+     # filter_matched should go in the last line of our successful code
+     filter_matched(event)
+   end # def filter
+
+
+   # Invoked by Logstash every 5 seconds. Each call ages all open transactions
+   # by 5 seconds and removes those that have exceeded the timeout.
+   def flush(options = {})
+     expired_elements = []
+
+     @mutex.synchronize do
+       increment_age_by(5)
+       expired_elements = remove_expired_elements()
+     end
+
+     #return create_expired_events_from(expired_elements)
+   end
+
+   private
+   def increment_age_by(seconds)
+     @transactions.each_pair do |key, transaction|
+       transaction.age += seconds
+     end
+   end
+
+   # Remove the expired transactions from the internal
+   # buffer and return them.
+   def remove_expired_elements()
+     expired = []
+     @transactions.delete_if do |key, transaction|
+       if(transaction.age >= @timeout)
+         expired << transaction
+         next true
+       end
+       next false
+     end
+     return expired
+   end
+
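+   # Builds the event that reports a completed transaction. Depending on
+   # attach_event it reuses one of the two paired events (or creates an empty
+   # one), then sets the host, the "TransactionTime" tag, transaction_time,
+   # transaction_uid and timestamp_start, and optionally overrides @timestamp.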
+   def new_transactiontime_event(transaction)
+     case @attach_event
+     when 'oldest'
+       event = transaction.getOldestEvent()
+     when 'first'
+       event = transaction.firstEvent
+     when 'newest'
+       event = transaction.getNewestEvent()
+     when 'last'
+       event = transaction.lastEvent
+     else
+       event = LogStash::Event.new
+     end
+     event.set(HOST_FIELD, Socket.gethostname)
+
+     event.tag(TRANSACTION_TIME_TAG)
+     event.set(TRANSACTION_TIME_FIELD, transaction.diff)
+     event.set(TRANSACTION_UID_FIELD, transaction.uid)
+     event.set(TIMESTAMP_START_FIELD, transaction.getOldestTimestamp())
+
+     if(@replace_timestamp.eql?'oldest')
+       event.set("@timestamp", transaction.getOldestTimestamp())
+     elsif (@replace_timestamp.eql?'newest')
+       event.set("@timestamp", transaction.getNewestTimestamp())
+     end
+
+     return event
+   end
+
+ end # class LogStash::Filters::TransactionTime
+
+
+ class LogStash::Filters::TransactionTime::Transaction
+   attr_accessor :firstEvent, :lastEvent, :firstTimestamp, :secondTimestamp, :uid, :age, :diff
+
+   def initialize(firstEvent, uid, storeEvent = false)
+     if(storeEvent)
+       @firstEvent = firstEvent
+     end
+     @firstTimestamp = firstEvent.get(LogStash::Filters::TransactionTime.timestampTag)
+     @uid = uid
+     @age = 0
+   end
+
+   def addSecond(lastEvent, storeEvent = false)
+     if(storeEvent)
+       @lastEvent = lastEvent
+     end
+     @secondTimestamp = lastEvent.get(LogStash::Filters::TransactionTime.timestampTag)
+     @diff = calculateDiff()
+   end
+
+   # Gets the oldest (based on timestamp) event
+   def getOldestEvent()
+     if invalidTransaction()
+       return nil
+     end
+
+     if(@firstTimestamp < @secondTimestamp)
+       return @firstEvent
+     else
+       return @lastEvent
+     end
+   end
+
+   def getOldestTimestamp()
+     return [@firstTimestamp,@secondTimestamp].min
+   end
+
+   # Gets the newest (based on timestamp) event
+   def getNewestEvent()
+     if invalidTransaction()
+       return nil
+     end
+     if(@firstTimestamp > @secondTimestamp)
+       return @firstEvent
+     else
+       return @lastEvent
+     end
+   end
+
+   def getNewestTimestamp()
+     return [@firstTimestamp,@secondTimestamp].max
+   end
+
+   def invalidTransaction()
+     return firstTimestamp.nil? || secondTimestamp.nil?
+   end
+
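+   # Elapsed time between the newest and oldest timestamp, in seconds
+   # (e.g. 0.999 for a 999 ms difference); nil if either timestamp is missing.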
+   def calculateDiff()
+     if invalidTransaction()
+       return nil
+     end
+
+     return getNewestTimestamp() - getOldestTimestamp()
+   end
+ end
data/logstash-filter-transaction_time.gemspec ADDED
@@ -0,0 +1,23 @@
+ Gem::Specification.new do |s|
+   s.name = 'logstash-filter-transaction_time'
+   s.version = '1.0.0'
+   s.licenses = ['Apache-2.0','Apache License (2.0)']
+   s.summary = 'Writes the time difference between two events in a transaction to a new event'
+   s.description = 'This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program'
+   s.homepage = 'http://addinit.se/'
+   s.authors = ['Tommy Welleby']
+   s.email = 'tommy.welleby@addinit.se'
+   s.require_paths = ['lib']
+
+   # Files
+   s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
+   # Tests
+   s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+   # Special flag to let us know this is actually a logstash plugin
+   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }
+
+   # Gem dependencies
+   s.add_runtime_dependency "logstash-core-plugin-api", "~> 2.0"
+   s.add_development_dependency 'logstash-devutils'
+ end
data/spec/filters/transaction_time_spec.rb ADDED
@@ -0,0 +1,319 @@
+ # encoding: utf-8
+ require_relative '../spec_helper'
+ require "logstash/filters/transaction_time"
+
+
+ describe LogStash::Filters::TransactionTime do
+   UID_FIELD = "uniqueIdField"
+   TIMEOUT = 30
+
+
+   describe "Basic configuration" do
+     let(:config) do <<-CONFIG
+       filter {
+         transaction_time {
+           timestamp_tag => "@timestamp"
+           uid_field => "uid"
+         }
+       }
+     CONFIG
+     end
+
+     sample("timestamp_tag" => "testing") do
+       expect(subject).to include("timestamp_tag")
+       expect(subject.get('timestamp_tag')).to eq('testing')
+     end
+
+     sample("uid_field" => "some text") do
+       expect(subject).to include("uid_field")
+       expect(subject.get('uid_field')).to eq('uid')
+     end
+   end
+
+   def event(data)
+     data["message"] ||= "Log message"
+     LogStash::Event.new(data)
+   end
+
+   before(:each) do
+     setup_filter()
+   end
+
+   def setup_filter(config = {})
+     @config = {"uid_field" => UID_FIELD, "timeout" => TIMEOUT, "attach_event" => 'first'}
+     @config.merge!(config)
+     @filter = LogStash::Filters::TransactionTime.new(@config)
+     @filter.register
+   end
+
+   context "Testing Hash with UID. " do
+     describe "Receiving" do
+       uid = "D7AF37D9-4F7F-4EFC-B481-06F65F75E8C0"
+       uid2 = "5DE49829-5CD3-4103-8062-781AC63BE4F5"
+       describe "one event" do
+         it "records the transaction" do
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid))
+           #insist { @filter.events[uid] } == "HEJ"
+           insist { @filter.transactions.size } == 1
+           insist { @filter.transactions[uid].firstEvent } != nil
+           insist { @filter.transactions[uid].lastEvent } == nil
+         end
+       end
+       describe "and events with the same UID" do
+         it "completes and removes transaction" do
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid))
+           insist { @filter.transactions.size } == 0
+           insist { @filter.transactions[uid] } == nil
+         end
+       end
+       describe "and events with different UID" do
+         it "increases the number of transactions to two" do
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid2))
+           insist { @filter.transactions.size } == 2
+           insist { @filter.transactions[uid].firstEvent } != nil
+           insist { @filter.transactions[uid].lastEvent } == nil
+           insist { @filter.transactions[uid2].firstEvent } != nil
+           insist { @filter.transactions[uid2].lastEvent } == nil
+         end
+       end
+     end
+   end
+
+   context "Testing TransactionTime. " do
+     describe "Receiving" do
+       uid = "D7AF37D9-4F7F-4EFC-B481-06F65F75E8CC"
+       describe "two events with the same UID in chronological order" do
+         it "calculates TransactionTime with second precision" do
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:21.000+0100"))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("transaction_time") } == 1.0
+           end
+           insist { @filter.transactions.size } == 0
+           insist { @filter.transactions[uid] } == nil
+         end
+         it "calculates TransactionTime with ms precision" do
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:21.001+0100"))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("transaction_time") } == 0.999
+           end
+           insist { @filter.transactions.size } == 0
+           insist { @filter.transactions[uid] } == nil
+         end
+       end
+       describe "two events with the same UID in REVERSED chronological order" do
+         it "calculates TransactionTime with second precision" do
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:21.000+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("transaction_time") } == 1.0
+           end
+           insist { @filter.transactions.size } == 0
+           insist { @filter.transactions[uid] } == nil
+         end
+         it "calculates TransactionTime with ms precision" do
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:21.001+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("transaction_time") } == 0.999
+           end
+           insist { @filter.transactions.size } == 0
+           insist { @filter.transactions[uid] } == nil
+         end
+       end
+     end
+   end # end context Testing TransactionTime
+
+   context "Testing flush. " do
+     uid = "D7AF37D9-4F7F-4EFC-B481-06F65F75E8CC"
+     uid2 = "C27BBC4C-6456-4581-982E-7497B4C7E754"
+     describe "Call flush enough times" do
+       it "flushes all old transactions" do
+         @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+         insist { @filter.transactions.size } == 1
+         @filter.filter(event("message" => "Log message", UID_FIELD => uid2, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+         insist { @filter.transactions.size } == 2
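+         # flush() is invoked by Logstash every 5 seconds and ages each open
+         # transaction by 5 seconds, so TIMEOUT/5 flushes are needed before a transaction expires.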
+         ((TIMEOUT/5)-1).times do
+           @filter.flush()
+         end
+         insist { @filter.transactions.size } == 2
+         @filter.flush()
+         insist { @filter.transactions.size } == 0
+       end
+       it "does not flush newer transactions" do
+         @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+         insist { @filter.transactions.size } == 1
+         ((TIMEOUT/5)-1).times do
+           @filter.flush()
+         end
+         @filter.filter(event("message" => "Log message", UID_FIELD => uid2, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+         insist { @filter.transactions.size } == 2
+         @filter.flush()
+         insist { @filter.transactions.size } == 1
+       end
+     end
+   end
+
+   context "Testing Timestamp Override." do
+     uid = "D7AF37D9-4F7F-4EFC-B481-06F65F75E8CC"
+     describe "Two events with the same UID" do
+       describe "When config set to replace_timestamp => oldest" do
+         it "sets the timestamp to the oldest" do
+           config = {"replace_timestamp" => 'oldest'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("@timestamp").to_s } == LogStash::Timestamp.parse_iso8601("2018-04-22T09:46:22.000+0100").to_s
+           end
+         end
+       end
+       describe "When config set to replace_timestamp => newest" do
+         it "sets the timestamp to the newest" do
+           config = {"replace_timestamp" => 'newest'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("@timestamp").to_s } == LogStash::Timestamp.parse_iso8601("2018-04-22T09:46:22.100+0100").to_s
+           end
+         end
+       end
+     end
+   end
+
+   context "Testing filter_tag." do
+     uid = "D7AF37D9-4F7F-4EFC-B481-06F65F75E8CC"
+     uid2 = "58C8B705-49C5-4269-92D9-2C959599534C"
+     describe "Incoming events with different UID" do
+       describe "only two tagged with specified 'filter_tag'" do
+         it "registers only two transactions" do
+           config = {"filter_tag" => 'transaction'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           insist { @filter.transactions.size } == 0
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100", "tags" => ['transaction']))
+           insist { @filter.transactions.size } == 1
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid2, "@timestamp" => "2018-04-22T09:46:22.100+0100"))
+           insist { @filter.transactions.size } == 1
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid2, "@timestamp" => "2018-04-22T09:46:22.100+0100", "tags" => ['unrelated']))
+           insist { @filter.transactions.size } == 1
+           @filter.filter(event("message" => "Log message", UID_FIELD => uid2, "@timestamp" => "2018-04-22T09:46:22.100+0100", "tags" => ['transaction']))
+           insist { @filter.transactions.size } == 2
+         end
+       end
+     end
+   end
+
+
+   context "Testing attach_event." do
+     uid = "9ACCA7B7-D0E9-4E52-A023-9D588E5BE42C"
+     describe "Config attach_event" do
+       describe "with 'first'" do
+         it "attaches info from first event in transaction" do
+           config = {"attach_event" => 'first'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           @filter.filter(event("message" => "first", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "last", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("message") } != nil
+             insist { new_event.get("message") } == "first"
+           end
+         end
+       end
+       describe "with 'last'" do
+         it "attaches info from last event in transaction" do
+           config = {"attach_event" => 'last'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           @filter.filter(event("message" => "first", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "last", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("message") } != nil
+             insist { new_event.get("message") } == "last"
+           end
+         end
+       end
+       describe "with 'oldest'" do
+         it "attaches info from oldest event in transaction" do
+           config = {"attach_event" => 'oldest'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           @filter.filter(event("message" => "oldest", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "newest", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("message") } != nil
+             insist { new_event.get("message") } == "oldest"
+           end
+         end
+       end
+       describe "with 'newest'" do
+         it "attaches info from newest event in transaction" do
+           config = {"attach_event" => 'newest'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           @filter.filter(event("message" => "oldest", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "newest", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("message") } != nil
+             insist { new_event.get("message") } == "newest"
+           end
+         end
+       end
+       describe "with 'none'" do
+         it "attaches no info from any event in transaction" do
+           config = {"attach_event" => 'none'}
+           @config.merge!(config)
+
+           @filter = LogStash::Filters::TransactionTime.new(@config)
+           @filter.register
+
+           @filter.filter(event("message" => "oldest", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.000+0100"))
+           @filter.filter(event("message" => "newest", UID_FIELD => uid, "@timestamp" => "2018-04-22T09:46:22.100+0100")) do |new_event|
+             insist { new_event } != nil
+             insist { new_event.get("tags").include?("TransactionTime") }
+             insist { new_event.get("message") } == nil
+           end
+         end
+       end
+     end
+   end
+
+ end
data/spec/spec_helper.rb ADDED
@@ -0,0 +1,2 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
metadata ADDED
@@ -0,0 +1,88 @@
+ --- !ruby/object:Gem::Specification
+ name: logstash-filter-transaction_time
+ version: !ruby/object:Gem::Version
+   version: 1.0.0
+ platform: ruby
+ authors:
+ - Tommy Welleby
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2018-04-26 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '2.0'
+   name: logstash-core-plugin-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '2.0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-devutils
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ description: This gem is a Logstash plugin required to be installed on top of the
+   Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This
+   gem is not a stand-alone program
+ email: tommy.welleby@addinit.se
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - CONTRIBUTORS
+ - DEVELOPER.md
+ - Gemfile
+ - LICENSE
+ - README.md
+ - lib/logstash/filters/transaction_time.rb
+ - logstash-filter-transaction_time.gemspec
+ - spec/filters/transaction_time_spec.rb
+ - spec/spec_helper.rb
+ homepage: http://addinit.se/
+ licenses:
+ - Apache-2.0
+ - Apache License (2.0)
+ metadata:
+   logstash_plugin: 'true'
+   logstash_group: filter
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.6.14.1
+ signing_key:
+ specification_version: 4
+ summary: Writes the time difference between two events in a transaction to a new event
+ test_files:
+ - spec/filters/transaction_time_spec.rb
+ - spec/spec_helper.rb