logstash-filter-dissect 1.0.6

checksums.yaml.gz ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+ metadata.gz: e8ae7e2bba795f607709e23084ba6c6603470909
+ data.tar.gz: d92b6cec5ce57c933e356b147d67d9d50d0f9602
+ SHA512:
+ metadata.gz: 449cafef68edfd28da3b5dc6c56ad5b93cd532261251868acb5d76f02c654b824120dfa6b937ed6b32085ffd25a4c89dc0a98752840be752580c2ba20e176b93
+ data.tar.gz: dcea450f73cf8ad4f3f9845ffe1d850feec89cc02683c2d0e3f72049141844d2550f701a3c9aab24c1620f6504217ed02e52a84f283287c1efccfe88f1c903e7
data/CHANGELOG.md ADDED
@@ -0,0 +1,5 @@
+ ## 1.0.6
+ - Relax constraint on logstash-core-plugin-api to >= 1.60 <= 2.99
+
+ ## 1.0.5
+ - Initial commit
data/CONTRIBUTORS ADDED
@@ -0,0 +1,10 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Guy Boertje [guyboertje]
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/DEVELOPER.md ADDED
@@ -0,0 +1,2 @@
+ # logstash-filter-dissect
+ Any developer-centric advice will appear here, as needed.
data/Gemfile ADDED
@@ -0,0 +1,6 @@
+ source 'https://rubygems.org'
+ gemspec
+
+ gem 'logstash-input-generator', '~> 3.0', '>= 3.0.1'
+ gem 'logstash-output-null', '~> 3.0', '>= 3.0.1'
+ gem 'logstash-filter-drop', '~> 3.0', '>= 3.0.1'
data/LICENSE ADDED
@@ -0,0 +1,13 @@
+ Copyright (c) 2012-2016 Elasticsearch <http://www.elastic.co>
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/NOTICE.TXT ADDED
@@ -0,0 +1,5 @@
+ Elasticsearch
+ Copyright 2012-2015 Elasticsearch
+
+ This product includes software developed by The Apache Software
+ Foundation (http://www.apache.org/).
data/README.md ADDED
@@ -0,0 +1,98 @@
+ # Logstash Plugin
+
+ [![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-filter-example.svg)](https://travis-ci.org/logstash-plugins/logstash-filter-example)
+
+ This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+ It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
+
+ ## Documentation
+
+ Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed in one [central location](http://www.elastic.co/guide/en/logstash/current/).
+
+ - For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
+ - For more asciidoc formatting tips, see the excellent reference at https://github.com/elastic/docs#asciidoc-guide
+
+ ## Need Help?
+
+ Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+
+ ## Developing
+
+ ### 1. Plugin Development and Testing
+
+ #### Code
+ - To get started, you'll need JRuby with the Bundler gem installed.
+
+ - Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
+
+ - Install dependencies
+ ```sh
+ bundle install
+ ```
+
+ #### Test
+
+ - Update your dependencies
+
+ ```sh
+ bundle install
+ ```
+
+ - Run tests
+
+ ```sh
+ bundle exec rspec
+ ```
+
+ ### 2. Running your unpublished Plugin in Logstash
+
+ #### 2.1 Run in a local Logstash clone
+
+ - Edit Logstash `Gemfile` and add the local plugin path, for example:
+ ```ruby
+ gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+ ```
+ - Install plugin
+ ```sh
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
+ bin/plugin install --no-verify
+
+ ```
+ - Run Logstash with your plugin
+ ```sh
+ bin/logstash -e 'filter {awesome {}}'
+ ```
+ At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+ #### 2.2 Run in an installed Logstash
+
+ You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+ - Build your plugin gem
+ ```sh
+ gem build logstash-filter-awesome.gemspec
+ ```
+ - Install the plugin from the Logstash home
+ ```sh
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
+ bin/plugin install --no-verify
+
+ ```
+ - Start Logstash and proceed to test the plugin
+
+ ## Contributing
+
+ All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+ Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+ It is more important to the community that you are able to contribute.
+
+ For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/filters/dissect.rb ADDED
@@ -0,0 +1,148 @@
+ # encoding: utf-8
+ require "logstash/filters/base"
+ require "logstash/namespace"
+
+ require "java"
+ require "jars/jruby-dissect-library.jar"
+ require "jruby_dissector"
+
+ # De-structures text
+ #
+ # The dissect filter is a kind of split operation.
+ # Unlike a regular split operation where a single delimiter is applied to the
+ # whole string, this operation applies a sequence of delimiters to an Event field's
+ # string value. This sequence is called a dissection.
+ # The dissection is created as a string using a %{} notation:
+ # ........
+ # delimiter suffix
+ # +---+ ++
+ # %{key1}/ -- %{+key2/1}: %{&key1}
+ # +-----+ | +--+
+ # field prefix key
+ # ........
+
+ # Note: delimiters can't contain the `%{` `}` characters.
+
+ # The config should look like this:
+ # [source, ruby]
+ # filter {
+ # dissect {
+ # mapping => {
+ # "message" => "%{timestamp} %{+timestamp} %{+timestamp} %{logsource} %{} %{program}[%{pid}]: %{msg}"
+ # }
+ # }
+ # }
+
+ # When dissecting a string, any text between the delimiters, a found value, will be stored
+ # in the Event using that field name.
+
+ # The Key:
+ # The key is the text between the `%{` and `}`, exclusive of the ?, +, & prefixes and the ordinal suffix.
+ # `%{?aaa}` - key is `aaa`
+ # `%{+bbb/3}` - key is `bbb`
+ # `%{&ccc}` - key is `ccc`
+
+ # Normal field notation
+ # The found value is added to the Event using the key.
+ # `%{some_field}` - a normal field
+
+ # Skip field notation
+ # The found value is recorded internally but not added to the Event.
+ # The key, if supplied, is prefixed with a `?`.
+ # `%{}` - an empty skip field
+ # `%{?some_field}` - a named skip field
+
+ # Append field notation
+ # The value is appended to another value, or stored if it's the first field seen.
+ # The key is prefixed with a `+`.
+ # The final value is stored in the Event using the key.
+ # The delimiter found before the field, or a space, is appended before the found value.
+ # `%{+some_field}` - an append field
+ # `%{+some_field/2}` - an append field with an order modifier.
+ # An order modifier, `/number`, allows one to reorder the append sequence.
+ # e.g. for a text of `1 2 3 go`, this `%{+a/2} %{+a/1} %{+a/4} %{+a/3}` will build a key/value of `a => 2 1 go 3`
+ # Append fields without an order modifier will append in declared order.
+ # e.g. for a text of `1 2 3 go`, this `%{a} %{b} %{+a}` will build two key/values of `a => 1 3 go, b => 2`
+
+ # Indirect field notation
+ # The found value is added to the Event using the found value of another field as the key.
+ # The key is prefixed with a `&`.
+ # `%{&some_field}` - an indirect field where the key is indirectly sourced from the value of `some_field`.
+ # e.g. for a text of `error: some_error, description`, this `error: %{?err}, %{&desc}` will build a key/value of `'some_error' => description`
+ # Hint: use a Skip field if you do not want the indirection key/value stored.
+ # e.g. for a text of `google: 77.98`, this `%{?a}: %{&a}` will build a key/value of `google => 77.98`.
+
+ # Note: for append and indirect fields the key can refer to a field that already exists in the event before dissection.
+ # Note: append and indirect cannot be combined. This will fail validation.
+ # `%{+&something}` - will add a value to the `&something` key, probably not the intended outcome.
+ # `%{&+something}` - will add a value to the `+something` key, again unintended.
+
+ # Delimiter repetition
+ # In the source text, if a field has variable width padded with delimiters, the padding will be ignored.
+ # e.g. for texts of:
+ # ........
+ # 00000043 ViewReceiver I
+ # 000000b3 Peer I
+ # ........
+ # and a dissection of `%{a} %{b} %{c}`; the padding is ignored.
+ #
+ # You probably want to put this filter in an if block to ensure that the event
+ # contains text with a suitable layout.
+ # [source, ruby]
+ # filter {
+ # if [type] == "syslog" or "syslog" in [tags] {
+ # dissect {
+ # mapping => {
+ # "message" => "%{timestamp} %{+timestamp} %{+timestamp} %{logsource} %{} %{program}[%{pid}]: %{msg}"
+ # }
+ # }
+ # }
+ # }
+
+ module LogStash module Filters class Dissect < LogStash::Filters::Base
+
+ config_name "dissect"
+
+ # A hash of dissections of field => value
+ # A later dissection can be done on an earlier one,
+ # or they can be independent.
+ #
+ # For example
+ # [source, ruby]
+ # filter {
+ # dissect {
+ # mapping => {
+ # "message" => "%{field1} %{field2} %{description}"
+ # "description" => "%{field3} %{field4} %{field5}"
+ # }
+ # }
+ # }
+ #
+ # This is useful if you want to keep the field `description` but also
+ # dissect it some more.
+ config :mapping, :validate => :hash, :default => {}
+
+ # TODO add docs
+ config :convert_datatype, :validate => :hash, :default => {}
+
+ # Append values to the `tags` field when dissection fails
+ config :tag_on_failure, :validate => :array, :default => ["_dissectfailure"]
+
+ public
+
+ def register
+ @dissector = LogStash::Dissector.new(@mapping)
+ end
+
+ def filter(event)
+ # all plugin functions happen in the JRuby extension:
+ # debug, warn and error logging, filter_matched, tagging etc.
+ @dissector.dissect(event, self)
+ end
+
+ def multi_filter(events)
+ LogStash::Util.set_thread_plugin(self)
+ @dissector.dissect_multi(events, self)
+ events
+ end
+ end end end
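The `%{}` notation documented in the comments above can be modeled in a few lines of plain Ruby. This is an illustrative sketch only: the plugin's real splitting happens in the bundled jruby-dissect-library extension, and `dissect_line` is an invented name. The sketch handles normal fields, empty and named skip fields, and repeated-delimiter padding, but not append or indirect fields.

```ruby
# Simplified model of a dissection: split the pattern into alternating
# delimiters and %{field} keys, then walk the input string with them.
# NOT the plugin's implementation, which lives in the Java extension.
def dissect_line(pattern, text)
  parts = pattern.split(/%\{(.*?)\}/, -1)   # even indices: delimiters, odd: keys
  pos = parts[0].length                     # skip any leading literal delimiter
  result = {}
  (1...parts.length).step(2) do |i|
    key   = parts[i]
    delim = parts[i + 1].to_s
    if delim.empty?                         # last field: take the rest of the line
      value = text[pos..-1]
      pos = text.length
    else
      stop  = text.index(delim, pos) || text.length
      value = text[pos...stop]
      pos   = stop + delim.length
      # delimiter repetition: padding made of repeated delimiters is ignored
      pos += delim.length while text[pos, delim.length] == delim
    end
    # %{} and %{?name} are skip fields: found but not stored
    result[key] = value unless key.empty? || key.start_with?("?")
  end
  result
end
```

For example, `dissect_line("%{a} %{b} %{c}", "00000043   ViewReceiver  I")` yields `{"a"=>"00000043", "b"=>"ViewReceiver", "c"=>"I"}`, matching the delimiter-repetition note above.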
data/logstash-filter-dissect.gemspec ADDED
@@ -0,0 +1,23 @@
+ Gem::Specification.new do |s|
+ s.name = 'logstash-filter-dissect'
+ s.version = '1.0.6'
+ s.licenses = ['Apache License (2.0)']
+ s.summary = "This dissect filter will destructurize text in multiple fields."
+ s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
+ s.authors = ["Elastic"]
+ s.email = 'info@elastic.co'
+ s.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
+ s.require_paths = ["lib"]
+
+ # Files
+ s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
+ # Tests
+ s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+ # Special flag to let us know this is actually a logstash plugin
+ s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }
+
+ # Gem dependencies
+ s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
+ s.add_development_dependency 'logstash-devutils', '~> 1.0.0'
+ end
data/spec/filters/dissect_spec.rb ADDED
@@ -0,0 +1,255 @@
+ # encoding: utf-8
+ require 'spec_helper'
+ require "logstash/filters/dissect"
+
+ describe LogStash::Filters::Dissect do
+ class LoggerMock
+ attr_reader :msgs, :hashes
+ def initialize()
+ @msgs = []
+ @hashes = []
+ end
+
+ def error(*msg)
+ @msgs.push(msg[0])
+ @hashes.push(msg[1])
+ end
+
+ def warn(*msg)
+ @msgs.push(msg[0])
+ @hashes.push(msg[1])
+ end
+
+ def debug?() true; end
+
+ def debug(*msg)
+ @msgs.push(msg[0])
+ @hashes.push(msg[1])
+ end
+ end
+
+ describe "Basic dissection" do
+ let(:config) do <<-CONFIG
+ filter {
+ dissect {
+ mapping => {
+ message => "[%{occurred_at}] %{code} %{service} %{ic} %{svc_message}"
+ }
+ }
+ }
+ CONFIG
+ end
+
+ sample("message" => "[25/05/16 09:10:38:425 BST] 00000001 SystemOut O java.lang:type=MemoryPool,name=class storage") do
+ expect(subject.get("occurred_at")).to eq("25/05/16 09:10:38:425 BST")
+ expect(subject.get("code")).to eq("00000001")
+ expect(subject.get("service")).to eq("SystemOut")
+ expect(subject.get("ic")).to eq("O")
+ expect(subject.get("svc_message")).to eq("java.lang:type=MemoryPool,name=class storage")
+ end
+ end
+
+ describe "Basic dissection with datatype conversion" do
+ let(:config) do <<-CONFIG
+ filter {
+ dissect {
+ mapping => {
+ message => "[%{occurred_at}] %{code} %{service} %{?ic}=%{&ic}% %{svc_message}"
+ }
+ convert_datatype => {
+ cpu => "float"
+ code => "int"
+ }
+ }
+ }
+ CONFIG
+ end
+
+ sample("message" => "[25/05/16 09:10:38:425 BST] 00000001 SystemOut cpu=95.43% java.lang:type=MemoryPool,name=class storage") do
+ expect(subject.get("occurred_at")).to eq("25/05/16 09:10:38:425 BST")
+ expect(subject.get("code")).to eq(1)
+ expect(subject.get("service")).to eq("SystemOut")
+ expect(subject.get("cpu")).to eq(95.43)
+ expect(subject.get("svc_message")).to eq("java.lang:type=MemoryPool,name=class storage")
+ end
+ end
+
+ describe "Basic dissection with failing datatype conversion" do
+ subject(:filter) { LogStash::Filters::Dissect.new(config) }
+
+ let(:message) { "[25/05/16 09:10:38:425 BST] 00000001 SystemOut cpu=95.43% java.lang:type=MemoryPool,name=class storage" }
+ let(:config) do
+ {
+ "mapping" => {"message" => "[%{occurred_at}] %{code} %{service} %{?ic}=%{&ic}% %{svc_message}"},
+ "convert_datatype" => {
+ "ccu" => "float", # ccu field -> nil
+ "code" => "integer", # only int is supported
+ "other" => "int" # other field -> hash - not coercible
+ }
+ }
+ end
+ let(:event) { LogStash::Event.new("message" => message, "other" => {}) }
+ let(:loggr) { LoggerMock.new }
+
+ before(:each) do
+ filter.logger = loggr
+ end
+
+ it "tags and log messages are created" do
+ filter.register
+ filter.filter(event)
+ expect(event.get("code")).to eq("00000001")
+ expect(event.get("tags")).to eq(["_dataconversionnullvalue_ccu_float", "_dataconversionmissing_code_integer", "_dataconversionuncoercible_other_int"])
+ expect(loggr.msgs).to eq(
+ [
+ "Event before dissection",
+ "Dissector datatype conversion, value cannot be coerced, key: ccu, value: null",
+ "Dissector datatype conversion, datatype not supported: integer",
+ "Dissector datatype conversion, value cannot be coerced, key: other, value: {}",
+ "Event after dissection"
+ ]
+ )
+ end
+ end
+
+ describe "dissect with skip and append" do
+ let(:config) do <<-CONFIG
+ filter {
+ dissect {
+ mapping => {
+ "message" => "%{timestamp} %{+timestamp} %{+timestamp} %{logsource} %{} %{program}[%{pid}]: %{msg}"
+ }
+ add_field => { favorite_filter => "why, dissect of course" }
+ }
+ }
+ CONFIG
+ end
+
+ sample("message" => "Mar 16 00:01:25 evita skip-this postfix/smtpd[1713]: connect from camomile.cloud9.net[168.100.1.3]") do
+ expect(subject.get("tags")).to be_nil
+ expect(subject.get("logsource")).to eq("evita")
+ expect(subject.get("timestamp")).to eq("Mar 16 00:01:25")
+ expect(subject.get("msg")).to eq("connect from camomile.cloud9.net[168.100.1.3]")
+ expect(subject.get("program")).to eq("postfix/smtpd")
+ expect(subject.get("pid")).to eq("1713")
+ expect(subject.get("favorite_filter")).to eq("why, dissect of course")
+ end
+ end
+
+ context "when mapping a key is not found" do
+ subject(:filter) { LogStash::Filters::Dissect.new(config) }
+
+ let(:message) { "very random message :-)" }
+ let(:config) { {"mapping" => {"blah-di-blah" => "%{timestamp} %{+timestamp}"}} }
+ let(:event) { LogStash::Event.new("message" => message) }
+ let(:loggr) { LoggerMock.new }
+
+ before(:each) do
+ filter.logger = loggr
+ end
+
+ it "does not raise any exceptions" do
+ expect{filter.register}.not_to raise_exception
+ end
+
+ it "dissect failure key missing is logged" do
+ filter.register
+ filter.filter(event)
+ expect(loggr.msgs).to eq(["Event before dissection", "Dissector mapping, key not found in event", "Event after dissection"])
+ end
+ end
+
+ describe "valid field format handling" do
+ subject(:filter) { LogStash::Filters::Dissect.new(config) }
+ let(:config) { {"mapping" => {"message" => "%{+timestamp/2} %{+timestamp/1} %{?no_name} %{&no_name} %{} %{program}[%{pid}]: %{msg}"}}}
+ let(:loggr) { LoggerMock.new }
+
+ before(:each) do
+ filter.logger = loggr
+ end
+
+ it "does not raise an error in register" do
+ expect{filter.register}.not_to raise_exception
+ end
+ end
+
+ describe "invalid field format handling" do
+ subject(:filter) { LogStash::Filters::Dissect.new(config) }
+ let(:loggr) { LoggerMock.new }
+
+ before(:each) do
+ filter.logger = loggr
+ end
+
+ context "when field is defined as Append and Indirect (+&)" do
+ let(:config) { {"mapping" => {"message" => "%{+&timestamp}"}}}
+ it "raises an error in register" do
+ msg = "org.logstash.dissect.InvalidFieldException: Field cannot prefix with both Append and Indirect Prefix (+&): +&timestamp"
+ expect{filter.register}.to raise_exception(LogStash::FieldFormatError, msg)
+ end
+ end
+
+ context "when field is defined as Indirect and Append (&+)" do
+ let(:config) { {"mapping" => {"message" => "%{&+timestamp}"}}}
+ it "raises an error in register" do
+ msg = "org.logstash.dissect.InvalidFieldException: Field cannot prefix with both Append and Indirect Prefix (&+): &+timestamp"
+ expect{filter.register}.to raise_exception(LogStash::FieldFormatError, msg)
+ end
+ end
+ end
+
+ describe "baseline performance test", :performance => true do
+ event_count = 1000000
+ min_rate = 30000
+
+ max_duration = event_count / min_rate
+ cfg_base = <<-CONFIG
+ input {
+ generator {
+ count => #{event_count}
+ message => "Mar 16 00:01:25 evita postfix/smtpd[1713]: connect from camomile.cloud9.net[168.100.1.3]"
+ }
+ }
+ output { null { } }
+ CONFIG
+
+ config(cfg_base)
+ start = Time.now.to_f
+ agent do
+ duration = (Time.now.to_f - start)
+ puts "\n\ninputs/generator baseline rate: #{"%02.0f/sec" % (event_count / duration)}, elapsed: #{duration}s\n\n"
+ insist { duration } < max_duration
+ end
+ end
+
+ describe "dissect performance test", :performance => true do
+ event_count = 1000000
+ min_rate = 30000
+ max_duration = event_count / min_rate
+
+ cfg_filter = <<-CONFIG
+ input {
+ generator {
+ count => #{event_count}
+ message => "Mar 16 00:01:25 evita postfix/smtpd[1713]: connect from camomile.cloud9.net[168.100.1.3]"
+ }
+ }
+ filter {
+ dissect {
+ mapping => {
+ "message" => "%{timestamp} %{+timestamp} %{+timestamp} %{logsource} %{program}[%{pid}]: %{msg}"
+ }
+ }
+ }
+ output { null { } }
+ CONFIG
+
+ config(cfg_filter)
+ start = Time.now.to_f
+ agent do
+ duration = (Time.now.to_f - start)
+ puts "\n\nfilters/dissect rate: #{"%02.0f/sec" % (event_count / duration)}, elapsed: #{duration}s\n\n"
+ insist { duration } < event_count / min_rate
+ end
+ end
+ end
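The failing-conversion spec above can be illuminated with a small model of the coercions involved. `coerce` is an invented helper for this sketch, not the plugin's API; the actual conversion and failure tagging happen inside the JRuby extension.

```ruby
# Sketch of the convert_datatype coercions the spec exercises.
# Only "int" and "float" are supported; anything else mirrors the
# "datatype not supported" log line (and the _dataconversionmissing tag).
def coerce(value, datatype)
  case datatype
  when "int"   then Integer(value.to_s, 10)  # base 10, so "00000001" is 1, not octal
  when "float" then Float(value)
  else raise ArgumentError, "datatype not supported: #{datatype}"
  end
end
```

Here `coerce("00000001", "int")` returns `1`, matching `event.get("code")` in the passing spec, while an uncoercible value such as the hash-valued `other` field raises, analogous to the `_dataconversionuncoercible` tag.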
data/spec/spec_helper.rb ADDED
@@ -0,0 +1,11 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
+
+ module LogStash::Environment
+ # running the grok code outside a logstash package means
+ # LOGSTASH_HOME will not be defined, so let's set it here
+ # before requiring the grok filter
+ unless self.const_defined?(:LOGSTASH_HOME)
+ LOGSTASH_HOME = File.expand_path("../../../", __FILE__)
+ end
+ end
metadata ADDED
@@ -0,0 +1,93 @@
+ --- !ruby/object:Gem::Specification
+ name: logstash-filter-dissect
+ version: !ruby/object:Gem::Version
+ version: 1.0.6
+ platform: ruby
+ authors:
+ - Elastic
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2016-07-14 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '1.60'
+ - - "<="
+ - !ruby/object:Gem::Version
+ version: '2.99'
+ name: logstash-core-plugin-api
+ prerelease: false
+ type: :runtime
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '1.60'
+ - - "<="
+ - !ruby/object:Gem::Version
+ version: '2.99'
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 1.0.0
+ name: logstash-devutils
+ prerelease: false
+ type: :development
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 1.0.0
+ description: This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program
+ email: info@elastic.co
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - CONTRIBUTORS
+ - DEVELOPER.md
+ - Gemfile
+ - LICENSE
+ - NOTICE.TXT
+ - README.md
+ - lib/jars/jruby-dissect-library.jar
+ - lib/logstash/filters/dissect.rb
+ - logstash-filter-dissect.gemspec
+ - spec/filters/dissect_spec.rb
+ - spec/spec_helper.rb
+ homepage: http://www.elastic.co/guide/en/logstash/current/index.html
+ licenses:
+ - Apache License (2.0)
+ metadata:
+ logstash_plugin: 'true'
+ logstash_group: filter
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.6.3
+ signing_key:
+ specification_version: 4
+ summary: This dissect filter will destructurize text in multiple fields.
+ test_files:
+ - spec/filters/dissect_spec.rb
+ - spec/spec_helper.rb