logstash-filter-grok 3.4.0 → 3.4.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: b1e652d5cf4cc9eff9c8a678a3c981b394c26de0
-  data.tar.gz: 6124306fb38c72fdad65262fb0e544ca5b51592b
+  metadata.gz: 20ccae49d2cac575daa26bbda6f8554f1f3abd22
+  data.tar.gz: c90ffb04ace29dbe59a48f3b7db70cb60b72bc68
 SHA512:
-  metadata.gz: 2fd4bb01edc17528e22cf0085b8a09b5f42624c4f843c1cef1ff089c3ec3f8a4ba79a3e5957a5cec0d819a257e389c6c0985bfb1065b305ad7c8128435f89916
-  data.tar.gz: 0fbc997375ccbc3e5406b36afe73d4ee38f5e618c5f7e5c2158a5407d030e5c1a924f9e7fc4e220eed5dbd7276f3e01e84ddb1780ce8c195f19eebb91cd288ec
+  metadata.gz: 81f30c18b7f29b68f915f554482a7f44bf5f9e315c7cc3c3d0b70462f00c92162d34fd51acfa13459982b3787369ee580d3153385a8056dd226441bfc08e0ada
+  data.tar.gz: 58fa40252932500b01561e802d6889908d7485aabe5d932460fc841edc849d48b98968e1331d577887fe360c85d08ad3845c81581b170699f3fadd0a8b488901
@@ -1,3 +1,6 @@
+## 3.4.1
+- Fix subdirectories in a pattern folder causing an exception in some cases
+
 ## 3.4.0
 - Add option to define patterns inline in the filter using `pattern_definitions` configuration.
 
@@ -0,0 +1,332 @@
+ :plugin: grok
+ :type: filter
+
+ ///////////////////////////////////////////
+ START - GENERATED VARIABLES, DO NOT EDIT!
+ ///////////////////////////////////////////
+ :version: %VERSION%
+ :release_date: %RELEASE_DATE%
+ :changelog_url: %CHANGELOG_URL%
+ :include_path: ../../../logstash/docs/include
+ ///////////////////////////////////////////
+ END - GENERATED VARIABLES, DO NOT EDIT!
+ ///////////////////////////////////////////
+
+ [id="plugins-{type}-{plugin}"]
+
+ === Grok
+
+ include::{include_path}/plugin_header.asciidoc[]
+
+ ==== Description
+
+ Parse arbitrary text and structure it.
+
+ Grok is currently the best way in logstash to parse crappy unstructured log
+ data into something structured and queryable.
+
+ This tool is perfect for syslog logs, apache and other webserver logs, mysql
+ logs, and in general, any log format that is generally written for humans
+ and not computer consumption.
+
+ Logstash ships with about 120 patterns by default. You can find them here:
+ <https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns>. You can add
+ your own trivially. (See the `patterns_dir` setting)
+
+ If you need help building patterns to match your logs, you will find the
+ <http://grokdebug.herokuapp.com> and <http://grokconstructor.appspot.com/> applications quite useful!
+
+ ==== Grok Basics
+
+ Grok works by combining text patterns into something that matches your
+ logs.
+
+ The syntax for a grok pattern is `%{SYNTAX:SEMANTIC}`
+
+ The `SYNTAX` is the name of the pattern that will match your text. For
+ example, `3.44` will be matched by the `NUMBER` pattern and `55.3.244.1` will
+ be matched by the `IP` pattern. The syntax is how you match.
+
+ The `SEMANTIC` is the identifier you give to the piece of text being matched.
+ For example, `3.44` could be the duration of an event, so you could call it
+ simply `duration`. Further, a string `55.3.244.1` might identify the `client`
+ making a request.
+
+ For the above example, your grok filter would look something like this:
+ [source,ruby]
+ %{NUMBER:duration} %{IP:client}
+
+ Optionally you can add a data type conversion to your grok pattern. By default
+ all semantics are saved as strings. If you wish to convert a semantic's data type,
+ for example change a string to an integer, then suffix it with the target data type.
+ For example `%{NUMBER:num:int}` converts the `num` semantic from a string to an
+ integer. Currently the only supported conversions are `int` and `float`.
+
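For illustration (not part of the diff above), the conversion suffix in a full filter might look like this; the field names here are made up:

```ruby
filter {
  grok {
    # "bytes" and "duration" are captured as strings by default;
    # ":int" and ":float" convert them on capture.
    match => { "message" => "%{NUMBER:bytes:int} %{NUMBER:duration:float}" }
  }
}
```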
+ .Examples:
+
+ With that idea of a syntax and semantic, we can pull out useful fields from a
+ sample log like this fictional http request log:
+ [source,ruby]
+ 55.3.244.1 GET /index.html 15824 0.043
+
+ The pattern for this could be:
+ [source,ruby]
+ %{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}
+
+ A more realistic example, let's read these logs from a file:
+ [source,ruby]
+ input {
+   file {
+     path => "/var/log/http.log"
+   }
+ }
+ filter {
+   grok {
+     match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
+   }
+ }
+
+ After the grok filter, the event will have a few extra fields in it:
+
+ * `client: 55.3.244.1`
+ * `method: GET`
+ * `request: /index.html`
+ * `bytes: 15824`
+ * `duration: 0.043`
+
+ ==== Regular Expressions
+
+ Grok sits on top of regular expressions, so any regular expressions are valid
+ in grok as well. The regular expression library is Oniguruma, and you can see
+ the full supported regexp syntax https://github.com/kkos/oniguruma/blob/master/doc/RE[on the Oniguruma
+ site].
+
+ ==== Custom Patterns
+
+ Sometimes logstash doesn't have a pattern you need. For this, you have
+ a few options.
+
+ First, you can use the Oniguruma syntax for named capture which will
+ let you match a piece of text and save it as a field:
+ [source,ruby]
+ (?<field_name>the pattern here)
+
+ For example, postfix logs have a `queue id` that is a 10- or 11-character
+ hexadecimal value. I can capture that easily like this:
+ [source,ruby]
+ (?<queue_id>[0-9A-F]{10,11})
+
+ Alternately, you can create a custom patterns file.
+
+ * Create a directory called `patterns` with a file in it called `extra`
+   (the file name doesn't matter, but name it meaningfully for yourself)
+ * In that file, write the pattern you need as the pattern name, a space, then
+   the regexp for that pattern.
+
+ For example, doing the postfix queue id example as above:
+ [source,ruby]
+ # contents of ./patterns/postfix:
+ POSTFIX_QUEUEID [0-9A-F]{10,11}
+
+ Then use the `patterns_dir` setting in this plugin to tell logstash where
+ your custom patterns directory is. Here's a full example with a sample log:
+ [source,ruby]
+ Jan 1 06:25:43 mailserver14 postfix/cleanup[21403]: BEF25A72965: message-id=<20130101142543.5828399CCAF@mailserver14.example.com>
+ [source,ruby]
+ filter {
+   grok {
+     patterns_dir => ["./patterns"]
+     match => { "message" => "%{SYSLOGBASE} %{POSTFIX_QUEUEID:queue_id}: %{GREEDYDATA:syslog_message}" }
+   }
+ }
+
+ The above will match and result in the following fields:
+
+ * `timestamp: Jan 1 06:25:43`
+ * `logsource: mailserver14`
+ * `program: postfix/cleanup`
+ * `pid: 21403`
+ * `queue_id: BEF25A72965`
+ * `syslog_message: message-id=<20130101142543.5828399CCAF@mailserver14.example.com>`
+
+ The `timestamp`, `logsource`, `program`, and `pid` fields come from the
+ `SYSLOGBASE` pattern which itself is defined by other patterns.
+
+ Another option is to define patterns _inline_ in the filter using `pattern_definitions`.
+ This is mostly for convenience and allows the user to define a pattern that can be used just in that
+ filter. Patterns defined in `pattern_definitions` will not be available outside of that particular `grok` filter.
+
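As an illustration (not part of the diff above), the inline-pattern option described here could be used like this; the pattern name and regexp are hypothetical:

```ruby
filter {
  grok {
    # MY_QUEUE_ID exists only inside this grok filter
    pattern_definitions => { "MY_QUEUE_ID" => "[0-9A-F]{10,11}" }
    match => { "message" => "%{MY_QUEUE_ID:queue_id}" }
  }
}
```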
+
+ [id="plugins-{type}s-{plugin}-options"]
+ ==== Grok Filter Configuration Options
+
+ This plugin supports the following configuration options plus the <<plugins-{type}s-common-options>> described later.
+
+ [cols="<,<,<",options="header",]
+ |=======================================================================
+ |Setting |Input type|Required
+ | <<plugins-{type}s-{plugin}-break_on_match>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-keep_empty_captures>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-match>> |<<hash,hash>>|No
+ | <<plugins-{type}s-{plugin}-named_captures_only>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-overwrite>> |<<array,array>>|No
+ | <<plugins-{type}s-{plugin}-pattern_definitions>> |<<hash,hash>>|No
+ | <<plugins-{type}s-{plugin}-patterns_dir>> |<<array,array>>|No
+ | <<plugins-{type}s-{plugin}-patterns_files_glob>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-tag_on_failure>> |<<array,array>>|No
+ | <<plugins-{type}s-{plugin}-tag_on_timeout>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-timeout_millis>> |<<number,number>>|No
+ |=======================================================================
+
+ Also see <<plugins-{type}s-common-options>> for a list of options supported by all
+ filter plugins.
+
+ &nbsp;
+
+ [id="plugins-{type}s-{plugin}-break_on_match"]
+ ===== `break_on_match`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `true`
+
+ Break on first match. The first successful match by grok will result in the
+ filter being finished. If you want grok to try all patterns (maybe you are
+ parsing different things), then set this to false.
+
+ [id="plugins-{type}s-{plugin}-keep_empty_captures"]
+ ===== `keep_empty_captures`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ If `true`, keep empty captures as event fields.
+
+ [id="plugins-{type}s-{plugin}-match"]
+ ===== `match`
+
+ * Value type is <<hash,hash>>
+ * Default value is `{}`
+
+ A hash of matches of field => value
+
+ For example:
+ [source,ruby]
+ filter {
+   grok { match => { "message" => "Duration: %{NUMBER:duration}" } }
+ }
+
+ If you need to match multiple patterns against a single field, the value can be an array of patterns
+ [source,ruby]
+ filter {
+   grok { match => { "message" => [ "Duration: %{NUMBER:duration}", "Speed: %{NUMBER:speed}" ] } }
+ }
+
+ [id="plugins-{type}s-{plugin}-named_captures_only"]
+ ===== `named_captures_only`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `true`
+
+ If `true`, only store named captures from grok.
+
+ [id="plugins-{type}s-{plugin}-overwrite"]
+ ===== `overwrite`
+
+ * Value type is <<array,array>>
+ * Default value is `[]`
+
+ The fields to overwrite.
+
+ This allows you to overwrite a value in a field that already exists.
+
+ For example, if you have a syslog line in the `message` field, you can
+ overwrite the `message` field with part of the match like so:
+ [source,ruby]
+ filter {
+   grok {
+     match => { "message" => "%{SYSLOGBASE} %{DATA:message}" }
+     overwrite => [ "message" ]
+   }
+ }
+
+ In this case, a line like `May 29 16:37:11 sadness logger: hello world`
+ will be parsed and `hello world` will overwrite the original message.
+
+ [id="plugins-{type}s-{plugin}-pattern_definitions"]
+ ===== `pattern_definitions`
+
+ * Value type is <<hash,hash>>
+ * Default value is `{}`
+
+ A hash of pattern-name and pattern tuples defining custom patterns to be used by
+ the current filter. Patterns matching existing names will override the pre-existing
+ definition. Think of these as inline patterns available just for this definition of
+ grok.
+
+ [id="plugins-{type}s-{plugin}-patterns_dir"]
+ ===== `patterns_dir`
+
+ * Value type is <<array,array>>
+ * Default value is `[]`
+
+ Logstash ships by default with a bunch of patterns, so you don't
+ necessarily need to define this yourself unless you are adding additional
+ patterns. You can point to multiple pattern directories using this setting.
+ Note that Grok will read all files in the directory matching `patterns_files_glob`
+ and assume each one is a pattern file (including any tilde backup files).
+ [source,ruby]
+ patterns_dir => ["/opt/logstash/patterns", "/opt/logstash/extra_patterns"]
+
+ Pattern files are plain text with format:
+ [source,ruby]
+ NAME PATTERN
+
+ For example:
+ [source,ruby]
+ NUMBER \d+
+
+ The patterns are loaded when the pipeline is created.
+
+ [id="plugins-{type}s-{plugin}-patterns_files_glob"]
+ ===== `patterns_files_glob`
+
+ * Value type is <<string,string>>
+ * Default value is `"*"`
+
+ Glob pattern, used to select the pattern files in the directories
+ specified by `patterns_dir`.
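For example (an illustrative sketch, not part of the diff), a glob can restrict loading to files with a `.pattern` extension so that backup files in the same directory are ignored:

```ruby
filter {
  grok {
    patterns_dir        => ["./patterns"]
    # only *.pattern files are treated as pattern files
    patterns_files_glob => "*.pattern"
    match => { "message" => "%{POSTFIX_QUEUEID:queue_id}" }
  }
}
```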
+
+ [id="plugins-{type}s-{plugin}-tag_on_failure"]
+ ===== `tag_on_failure`
+
+ * Value type is <<array,array>>
+ * Default value is `["_grokparsefailure"]`
+
+ Append values to the `tags` field when there has been no
+ successful match
+
+ [id="plugins-{type}s-{plugin}-tag_on_timeout"]
+ ===== `tag_on_timeout`
+
+ * Value type is <<string,string>>
+ * Default value is `"_groktimeout"`
+
+ Tag to apply if a grok regexp times out.
+
+ [id="plugins-{type}s-{plugin}-timeout_millis"]
+ ===== `timeout_millis`
+
+ * Value type is <<number,number>>
+ * Default value is `30000`
+
+ Attempt to terminate regexps after this amount of time.
+ This applies per pattern if multiple patterns are applied.
+ This will never time out early, but may take a little longer to time out.
+ The actual timeout is approximate, based on a 250ms quantization.
+ Set to `0` to disable timeouts.
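A sketch combining the two timeout settings (values and the log pattern are illustrative):

```ruby
filter {
  grok {
    match          => { "message" => "%{COMBINEDAPACHELOG}" }
    timeout_millis => 10000          # give up on each pattern after ~10s
    tag_on_timeout => "_groktimeout" # tag applied when that happens
  }
}
```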
+
+ include::{include_path}/{type}.asciidoc[]
@@ -390,7 +390,11 @@
 
     Dir.glob(path).each do |file|
       @logger.trace("Grok loading patterns from file", :path => file)
-      patternfiles << file
+      if File.directory?(file)
+        @logger.debug("Skipping path because it is a directory", :path => file)
+      else
+        patternfiles << file
+      end
     end
   end
   patternfiles
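The behavior of the fix above can be sketched as a standalone snippet: `Dir.glob` also matches subdirectories, so the loader now checks `File.directory?` before treating a match as a pattern file. The helper name, directory layout, and pattern content below are illustrative only:

```ruby
require "tmpdir"
require "fileutils"

# Collect pattern files the way the patched loader does: glob first,
# then skip anything that is a directory rather than a file.
def collect_pattern_files(glob)
  patternfiles = []
  Dir.glob(glob).each do |file|
    next if File.directory?(file) # the 3.4.1 fix: a subdirectory is not a pattern file
    patternfiles << file
  end
  patternfiles
end

dir = Dir.mktmpdir
File.write(File.join(dir, "grok.pattern"), 'WORD \b[0-1]\b')
Dir.mkdir(File.join(dir, "subdir")) # previously matched the glob and caused an exception
files = collect_pattern_files(File.join(dir, "*")).map { |f| File.basename(f) }
puts files.inspect # => ["grok.pattern"]
FileUtils.remove_entry(dir)
```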
@@ -11,7 +11,7 @@ class LogStash::Filters::Grok::TimeoutEnforcer
   # Stores running matches with their start time, this is used to cancel long running matches
   # Is a map of Thread => start_time
   @threads_to_start_time = {}
-  @state_lock = java.util.concurrent.locks.ReentrantLock.new
+  @state_lock = ReentrantLock.new
 end
 
 def grok_till_timeout(event, grok, field, value)
@@ -1,7 +1,7 @@
 Gem::Specification.new do |s|
 
   s.name = 'logstash-filter-grok'
-  s.version = '3.4.0'
+  s.version = '3.4.1'
   s.licenses = ['Apache License (2.0)']
   s.summary = "Parse arbitrary text and structure it."
   s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -11,7 +11,7 @@ Gem::Specification.new do |s|
   s.require_paths = ["lib"]
 
   # Files
-  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
+  s.files = Dir["lib/**/*","spec/**/*","*.gemspec","*.md","CONTRIBUTORS","Gemfile","LICENSE","NOTICE.TXT", "vendor/jar-dependencies/**/*.jar", "vendor/jar-dependencies/**/*.rb", "VERSION", "docs/**/*"]
 
   # Tests
   s.test_files = s.files.grep(%r{^(test|spec|features)/})
@@ -23,6 +23,7 @@ Gem::Specification.new do |s|
   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
 
   s.add_runtime_dependency 'jls-grok', '~> 0.11.3'
+  s.add_runtime_dependency 'stud', '~> 0.0.22'
   s.add_runtime_dependency 'logstash-patterns-core'
 
   s.add_development_dependency 'logstash-devutils'
@@ -1,6 +1,6 @@
 # encoding: utf-8
 require "logstash/devutils/rspec/spec_helper"
-
+require "stud/temporary"
 
 module LogStash::Environment
   # running the grok code outside a logstash package means
@@ -668,15 +668,13 @@ describe LogStash::Filters::Grok do
   end
 
   describe "patterns in the 'patterns/' dir override core patterns" do
-    require 'tmpdir'
-    require 'tempfile'
 
     let(:pattern_dir) { File.join(LogStash::Environment::LOGSTASH_HOME, "patterns") }
     let(:has_pattern_dir?) { Dir.exist?(pattern_dir) }
 
     before do
       FileUtils.mkdir(pattern_dir) unless has_pattern_dir?
-      @file = Tempfile.new('grok', pattern_dir)
+      @file = File.new(File.join(pattern_dir, 'grok.pattern'), 'w+')
       @file.write('WORD \b[2-5]\b')
       @file.close
     end
@@ -690,25 +688,23 @@ describe LogStash::Filters::Grok do
     end
 
     after do
-      @file.unlink
+      File.unlink @file
       FileUtils.rm_rf(pattern_dir) if has_pattern_dir?
     end
   end
 
   describe "patterns in custom dir override those in 'patterns/' dir" do
-    require 'tmpdir'
-    require 'tempfile'
 
-    let(:tmpdir) { Dir.mktmpdir }
+    let(:tmpdir) { Stud::Temporary.directory }
     let(:pattern_dir) { File.join(LogStash::Environment::LOGSTASH_HOME, "patterns") }
     let(:has_pattern_dir?) { Dir.exist?(pattern_dir) }
 
     before do
       FileUtils.mkdir(pattern_dir) unless has_pattern_dir?
-      @file1 = Tempfile.new('grok', pattern_dir)
+      @file1 = File.new(File.join(pattern_dir, 'grok.pattern'), 'w+')
       @file1.write('WORD \b[2-5]\b')
       @file1.close
-      @file2 = Tempfile.new('grok', tmpdir)
+      @file2 = File.new(File.join(tmpdir, 'grok.pattern'), 'w+')
       @file2.write('WORD \b[0-1]\b')
       @file2.close
     end
@@ -722,24 +718,22 @@ describe LogStash::Filters::Grok do
     end
 
     after do
-      @file1.unlink
-      @file2.unlink
+      File.unlink @file1
+      File.unlink @file2
       FileUtils.remove_entry tmpdir
       FileUtils.rm_rf(pattern_dir) unless has_pattern_dir?
     end
   end
 
   describe "patterns with file glob" do
-    require 'tmpdir'
-    require 'tempfile'
 
-    let(:tmpdir) { Dir.mktmpdir(nil, "/tmp") }
+    let(:tmpdir) { Stud::Temporary.directory }
 
     before do
-      @file3 = Tempfile.new(['grok', '.pattern'], tmpdir)
+      @file3 = File.new(File.join(tmpdir, 'grok.pattern'), 'w+')
       @file3.write('WORD \b[0-1]\b')
       @file3.close
-      @file4 = Tempfile.new(['grok', '.pattern.old'], tmpdir)
+      @file4 = File.new(File.join(tmpdir, 'grok.pattern.old'), 'w+')
       @file4.write('WORD \b[2-5]\b')
       @file4.close
     end
@@ -753,8 +747,33 @@ describe LogStash::Filters::Grok do
     end
 
     after do
-      @file3.unlink
-      @file4.unlink
+      File.unlink @file3
+      File.unlink @file4
+      FileUtils.remove_entry tmpdir
+    end
+  end
+
+  describe "patterns with file glob on directory that contains subdirectories" do
+
+    let(:tmpdir) { Stud::Temporary.directory }
+
+    before do
+      @file3 = File.new(File.join(tmpdir, 'grok.pattern'), 'w+')
+      @file3.write('WORD \b[0-1]\b')
+      @file3.close
+      Dir.mkdir(File.join(tmpdir, "subdir"))
+    end
+
+    let(:config) do
+      "filter { grok { patterns_dir => \"#{tmpdir}\" patterns_files_glob => \"*\" match => { \"message\" => \"%{WORD:word}\" } } }"
+    end
+
+    sample("message" => '0') do
+      insist { subject.get("tags") } == nil
+    end
+
+    after do
+      File.unlink @file3
       FileUtils.remove_entry tmpdir
     end
   end
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-filter-grok
 version: !ruby/object:Gem::Version
-  version: 3.4.0
+  version: 3.4.1
 platform: ruby
 authors:
 - Elastic
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2017-03-01 00:00:00.000000000 Z
+date: 2017-05-10 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
@@ -44,6 +44,20 @@ dependencies:
     - - "~>"
       - !ruby/object:Gem::Version
         version: 0.11.3
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: 0.0.22
+  name: stud
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: 0.0.22
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
@@ -84,6 +98,7 @@ files:
 - LICENSE
 - NOTICE.TXT
 - README.md
+- docs/index.asciidoc
 - lib/logstash/filters/grok.rb
 - lib/logstash/filters/grok/timeout_enforcer.rb
 - lib/logstash/filters/grok/timeout_exception.rb