pysysops-filter-date 2.0.3
- checksums.yaml +7 -0
- data/CHANGELOG.md +5 -0
- data/CONTRIBUTORS +32 -0
- data/Gemfile +2 -0
- data/LICENSE +13 -0
- data/NOTICE.TXT +5 -0
- data/README.md +89 -0
- data/lib/logstash/filters/date.rb +296 -0
- data/logstash-filter-date.gemspec +28 -0
- data/spec/filters/date_spec.rb +551 -0
- metadata +131 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA1:
  metadata.gz: 4a945e95fa172248c1620a80648b0434789a1b00
  data.tar.gz: c6946268ff371a5b393edb85f58d1d91ecbc4a1b
SHA512:
  metadata.gz: 56372dd4aa79a9e2c6241a6a79ee74e36e1784adf1268dc28bfcb63b60fad75ecc7d894506e37abb33eacd8890a0928586bfb53814894c1ba91b9c50835b355a
  data.tar.gz: 1eb72298f62f0d8c81de21c69e8aa1c8058aa1a9f5b3c6d6d621602251f3cc74a4f7948d82963628bf19224dc6544a34d5d426424745dcf7112bb647684277a6
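The digests above are the SHA1 and SHA512 hex digests of the gem's two internal archives, metadata.gz and data.tar.gz. As a minimal sketch (not part of the gem; the `archive_checksums` helper is hypothetical), they can be reproduced with Ruby's stdlib `Digest` module:

```ruby
# Reproduce RubyGems-style checksums for a blob of bytes, e.g.
# archive_checksums(File.binread("metadata.gz"))
require "digest"

def archive_checksums(bytes)
  {
    "SHA1"   => Digest::SHA1.hexdigest(bytes),
    "SHA512" => Digest::SHA512.hexdigest(bytes)
  }
end
```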
data/CHANGELOG.md
ADDED
@@ -0,0 +1,5 @@
## 2.0.0
- Plugins were updated to follow the new shutdown semantic. This mainly allows Logstash to instruct input plugins to terminate gracefully,
  instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895
- Dependency on logstash-core updated to 2.0
data/CONTRIBUTORS
ADDED
@@ -0,0 +1,32 @@
The following is a list of people who have contributed ideas, code, bug
reports, or in general have helped logstash along its way.

Contributors:
* Aaron Mildenstein (untergeek)
* Bob Corsaro (dokipen)
* Christian S. (squiddle)
* Colin Surprenant (colinsurprenant)
* Danny Berger (dpb587)
* James Turnbull (jamtur01)
* Jason Kendall (coolacid)
* John E. Vincent (lusis)
* Jonathan Van Eenwyk (jdve)
* Jordan Sissel (jordansissel)
* Kevin O'Connor (kjoconnor)
* Kurt Hurtado (kurtado)
* Mike Worth (MikeWorth)
* Nick Ethier (nickethier)
* Olivier Le Moal (olivierlemoal)
* Pete Fritchman (fetep)
* Philippe Weber (wiibaa)
* Pier-Hugues Pellerin (ph)
* Pierre Baillet (octplane)
* Ralph Meijer (ralphm)
* Richard Pijnenburg (electrical)
* Suyog Rao (suyograo)
* debadair

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
and we'll make sure you're here. Contributions from folks like you are what make
open source awesome.
data/Gemfile
ADDED
data/LICENSE
ADDED
@@ -0,0 +1,13 @@
Copyright (c) 2012-2015 Elasticsearch <http://www.elastic.co>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
data/NOTICE.TXT
ADDED
data/README.md
ADDED
@@ -0,0 +1,89 @@
# Logstash Plugin

[![Build Status](http://build-eu-00.elastic.co/view/LS%20Plugins/view/LS%20Filters/job/logstash-plugin-filter-date-unit/badge/icon)](http://build-eu-00.elastic.co/view/LS%20Plugins/view/LS%20Filters/job/logstash-plugin-filter-date-unit/)

This is a plugin for [Logstash](https://github.com/elastic/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want, in whatever way.

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies
```sh
bundle install
```

#### Test

- Update your dependencies

```sh
bundle install
```

- Run tests

```sh
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Install plugin
```sh
bin/plugin install --no-verify
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
bin/plugin install /your/local/plugin/logstash-filter-awesome.gem
```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/filters/date.rb
ADDED
@@ -0,0 +1,296 @@
# encoding: utf-8
require "logstash/filters/base"
require "logstash/namespace"
require "logstash/timestamp"

# The date filter is used for parsing dates from fields, and then using that
# date or timestamp as the logstash timestamp for the event.
#
# For example, syslog events usually have timestamps like this:
# [source,ruby]
#     "Apr 17 09:32:01"
#
# You would use the date format `MMM dd HH:mm:ss` to parse this.
#
# The date filter is especially important for sorting events and for
# backfilling old data. If you don't get the date correct in your
# event, then searching for them later will likely sort out of order.
#
# In the absence of this filter, logstash will choose a timestamp based on the
# first time it sees the event (at input time), if the timestamp is not already
# set in the event. For example, with file input, the timestamp is set to the
# time of each read.
class LogStash::Filters::Date < LogStash::Filters::Base
  if RUBY_ENGINE == "jruby"
    JavaException = java.lang.Exception
    UTC = org.joda.time.DateTimeZone.forID("UTC")
  end

  config_name "date"

  # Specify a time zone canonical ID to be used for date parsing.
  # The valid IDs are listed on the http://joda-time.sourceforge.net/timezones.html[Joda.org available time zones page].
  # This is useful in case the time zone cannot be extracted from the value,
  # and is not the platform default.
  # If this is not specified the platform default will be used.
  # Canonical ID is good as it takes care of daylight saving time for you.
  # For example, `America/Los_Angeles` or `Europe/Paris` are valid IDs.
  # This field can be dynamic and include parts of the event using the `%{field}` syntax.
  config :timezone, :validate => :string

  # Specify a locale to be used for date parsing using either an IETF-BCP47 or POSIX language tag.
  # Simple examples are `en`, `en-US` for BCP47 or `en_US` for POSIX.
  #
  # The locale is mostly necessary to be set for parsing month names (pattern with `MMM`) and
  # weekday names (pattern with `EEE`).
  #
  # If not specified, the platform default will be used, but for a non-english platform default
  # an english parser will also be used as a fallback mechanism.
  config :locale, :validate => :string

  # The date formats allowed are anything allowed by Joda-Time (java time
  # library). You can see the docs for this format here:
  #
  # http://joda-time.sourceforge.net/apidocs/org/joda/time/format/DateTimeFormat.html[joda.time.format.DateTimeFormat]
  #
  # An array with field name first, and format patterns following, `[ field,
  # formats... ]`
  #
  # If your time field has multiple possible formats, you can do this:
  # [source,ruby]
  #     match => [ "logdate", "MMM dd YYY HH:mm:ss",
  #                "MMM d YYY HH:mm:ss", "ISO8601" ]
  #
  # The above will match a syslog (rfc3164) or `iso8601` timestamp.
  #
  # There are a few special exceptions. The following format literals exist
  # to help you save time and ensure correctness of date parsing.
  #
  # * `ISO8601` - should parse any valid ISO8601 timestamp, such as
  #   `2011-04-19T03:44:01.103Z`
  # * `UNIX` - will parse *float or int* value expressing unix time in seconds since epoch like 1326149001.132 as well as 1326149001
  # * `UNIX_MS` - will parse **int** value expressing unix time in milliseconds since epoch like 1366125117000
  # * `TAI64N` - will parse tai64n time values
  #
  # For example, if you have a field `logdate`, with a value that looks like
  # `Aug 13 2010 00:03:44`, you would use this configuration:
  # [source,ruby]
  #     filter {
  #       date {
  #         match => [ "logdate", "MMM dd YYYY HH:mm:ss" ]
  #       }
  #     }
  #
  # If your field is nested in your structure, you can use the nested
  # syntax `[foo][bar]` to match its value. For more information, please refer to
  # <<logstash-config-field-references>>
  config :match, :validate => :array, :default => []

  # Store the matching timestamp into the given target field. If not provided,
  # default to updating the `@timestamp` field of the event.
  config :target, :validate => :string, :default => "@timestamp"

  # Append values to the `tags` field when there has been no
  # successful match
  config :tag_on_failure, :validate => :array, :default => ["_dateparsefailure"]

  # LOGSTASH-34
  DATEPATTERNS = %w{ y d H m s S }

  public
  def initialize(config = {})
    super

    @parsers = Hash.new { |h,k| h[k] = [] }
  end # def initialize

  public
  def register
    require "java"
    if @match.length < 2
      raise LogStash::ConfigurationError, I18n.t("logstash.agent.configuration.invalid_plugin_register",
        :plugin => "filter", :type => "date",
        :error => "The match setting should contains first a field name and at least one date format, current value is #{@match}")
    end

    locale = nil
    if @locale
      if @locale.include? '_'
        @logger.warn("Date filter now use BCP47 format for locale, replacing underscore with dash")
        @locale.gsub!('_','-')
      end
      locale = java.util.Locale.forLanguageTag(@locale)
    end

    @sprintf_timezone = @timezone && !@timezone.index("%{").nil?

    setupMatcher(@config["match"].shift, locale, @config["match"] )
  end

  def setupMatcher(field, locale, value)
    value.each do |format|
      parsers = []
      case format
        when "ISO8601"
          iso_parser = org.joda.time.format.ISODateTimeFormat.dateTimeParser
          if @timezone && !@sprintf_timezone
            iso_parser = iso_parser.withZone(org.joda.time.DateTimeZone.forID(@timezone))
          else
            iso_parser = iso_parser.withOffsetParsed
          end
          parsers << lambda { |date| iso_parser.parseMillis(date) }
          # Fallback solution for almost-ISO8601 date-times
          almostISOparsers = [
            org.joda.time.format.DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss.SSSZ").getParser(),
            org.joda.time.format.DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss.SSS").getParser(),
            org.joda.time.format.DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss,SSSZ").getParser(),
            org.joda.time.format.DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss,SSS").getParser()
          ].to_java(org.joda.time.format.DateTimeParser)
          joda_parser = org.joda.time.format.DateTimeFormatterBuilder.new.append( nil, almostISOparsers ).toFormatter()
          if @timezone && !@sprintf_timezone
            joda_parser = joda_parser.withZone(org.joda.time.DateTimeZone.forID(@timezone))
          else
            joda_parser = joda_parser.withOffsetParsed
          end
          parsers << lambda { |date| joda_parser.parseMillis(date) }
        when "UNIX" # unix epoch
          parsers << lambda do |date|
            raise "Invalid UNIX epoch value '#{date}'" unless /^\d+(?:\.\d+)?$/ === date || date.is_a?(Numeric)
            (date.to_f * 1000).to_i
          end
        when "UNIX_MS" # unix epoch in ms
          parsers << lambda do |date|
            raise "Invalid UNIX epoch value '#{date}'" unless /^\d+$/ === date || date.is_a?(Numeric)
            date.to_i
          end
        when "UNIX_NANO" # unix epoch in nanoseconds
          parsers << lambda do |date|
            raise "Invalid UNIX epoch value '#{date}'" unless /^\d+$/ === date || date.is_a?(Numeric)
            (date.to_i / 1000000).to_i
          end
        when "TAI64N" # TAI64 with nanoseconds, -10000 accounts for leap seconds
          parsers << lambda do |date|
            # Skip leading "@" if it is present (common in tai64n times)
            date = date[1..-1] if date[0, 1] == "@"
            return (date[1..15].hex * 1000 - 10000)+(date[16..23].hex/1000000)
          end
        else
          begin
            format_has_year = format.match(/y|Y/)
            joda_parser = org.joda.time.format.DateTimeFormat.forPattern(format)
            if @timezone && !@sprintf_timezone
              joda_parser = joda_parser.withZone(org.joda.time.DateTimeZone.forID(@timezone))
            else
              joda_parser = joda_parser.withOffsetParsed
            end
            if locale
              joda_parser = joda_parser.withLocale(locale)
            end
            if @sprintf_timezone
              parsers << lambda { |date , tz|
                joda_parser.withZone(org.joda.time.DateTimeZone.forID(tz)).parseMillis(date)
              }
            end
            parsers << lambda do |date|
              return joda_parser.parseMillis(date) if format_has_year
              now = Time.now
              now_month = now.month
              result = joda_parser.parseDateTime(date)
              event_month = result.month_of_year.get

              if (event_month == now_month)
                result.with_year(now.year)
              elsif (event_month == 12 && now_month == 1)
                result.with_year(now.year-1)
              elsif (event_month == 1 && now_month == 12)
                result.with_year(now.year+1)
              else
                result.with_year(now.year)
              end.get_millis
            end

            # Include a fallback parser to english when the default locale is non-english
            if !locale &&
               "en" != java.util.Locale.getDefault().getLanguage() &&
               (format.include?("MMM") || format.include?("E"))
              en_joda_parser = joda_parser.withLocale(java.util.Locale.forLanguageTag('en-US'))
              parsers << lambda { |date| en_joda_parser.parseMillis(date) }
            end
          rescue JavaException => e
            raise LogStash::ConfigurationError, I18n.t("logstash.agent.configuration.invalid_plugin_register",
              :plugin => "filter", :type => "date",
              :error => "#{e.message} for pattern '#{format}'")
          end
      end

      @logger.debug("Adding type with date config", :type => @type,
                    :field => field, :format => format)
      @parsers[field] << {
        :parser => parsers,
        :format => format
      }
    end
  end

  # def register

  public
  def filter(event)
    @logger.debug? && @logger.debug("Date filter: received event", :type => event["type"])

    @parsers.each do |field, fieldparsers|
      @logger.debug? && @logger.debug("Date filter looking for field",
                                      :type => event["type"], :field => field)
      next unless event.include?(field)

      fieldvalues = event[field]
      fieldvalues = [fieldvalues] if !fieldvalues.is_a?(Array)
      fieldvalues.each do |value|
        next if value.nil?
        begin
          epochmillis = nil
          success = false
          last_exception = RuntimeError.new "Unknown"
          fieldparsers.each do |parserconfig|
            parserconfig[:parser].each do |parser|
              begin
                if @sprintf_timezone
                  epochmillis = parser.call(value, event.sprintf(@timezone))
                else
                  epochmillis = parser.call(value)
                end
                success = true
                break # success
              rescue StandardError, JavaException => e
                last_exception = e
              end
            end # parserconfig[:parser].each
            break if success
          end # fieldparsers.each

          raise last_exception unless success

          # Convert joda DateTime to a ruby Time
          event[@target] = LogStash::Timestamp.at(epochmillis / 1000, (epochmillis % 1000) * 1000)

          @logger.debug? && @logger.debug("Date parsing done", :value => value, :timestamp => event[@target])
          filter_matched(event)
        rescue StandardError, JavaException => e
          @logger.warn("Failed parsing date from field", :field => field,
                       :value => value, :exception => e.message,
                       :config_parsers => fieldparsers.collect {|x| x[:format]}.join(','),
                       :config_locale => @locale ? @locale : "default="+java.util.Locale.getDefault().toString()
                      )
          # Tag this event if we can't parse it. We can use this later to
          # reparse+reindex logs if we improve the patterns given.
          @tag_on_failure.each do |tag|
            event["tags"] ||= []
            event["tags"] << tag unless event["tags"].include?(tag)
          end
        end # begin
      end # fieldvalues.each
    end # @parsers.each

    return event
  end # def filter
end # class LogStash::Filters::Date
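The TAI64N branch above decodes a label of the form optional `@`, 16 hex digits of TAI seconds (with the 2^62 marker bit set in the leading nibble), and 8 hex digits of nanoseconds; the `- 10000` subtracts 10 leap seconds, in milliseconds, to move from TAI to UTC. A minimal standalone sketch of the same arithmetic, assuming plain Ruby with no Joda dependency:

```ruby
require "time"

# Convert a TAI64N label (e.g. from multilog/daemontools) to UTC epoch
# milliseconds, mirroring the filter's lambda above.
def tai64n_to_epoch_millis(label)
  label = label[1..-1] if label.start_with?("@")  # drop the optional "@"
  seconds = label[1..15].hex   # skip the leading nibble that carries the 2^62 offset
  nanos   = label[16..23].hex
  seconds * 1000 - 10_000 + nanos / 1_000_000     # -10s for TAI->UTC leap seconds
end

millis = tai64n_to_epoch_millis("@4000000050d506482dbdf024")
puts Time.at(millis / 1000, (millis % 1000) * 1000).utc.iso8601(3)
# => 2012-12-22T01:00:46.767Z (the value asserted in date_spec.rb)
```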
data/logstash-filter-date.gemspec
ADDED
@@ -0,0 +1,28 @@
Gem::Specification.new do |s|

  s.name            = 'pysysops-filter-date'
  s.version         = '2.0.3'
  s.licenses        = ['Apache License (2.0)']
  s.summary         = "The date filter is used for parsing dates from fields, and then using that date or timestamp as the logstash timestamp for the event."
  s.description     = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
  s.authors         = ["Elastic"]
  s.email           = 'info@elastic.co'
  s.homepage        = "http://www.elastic.co/guide/en/logstash/current/index.html"
  s.require_paths   = ["lib"]

  # Files
  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']

  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }

  # Gem dependencies
  s.add_runtime_dependency "logstash-core", ">= 2.0.0.beta2", "< 3.0.0"
  s.add_runtime_dependency 'logstash-input-generator'
  s.add_runtime_dependency 'logstash-codec-json'
  s.add_runtime_dependency 'logstash-output-null'
  s.add_development_dependency 'logstash-devutils'
end
@@ -0,0 +1,551 @@
|
|
1
|
+
require "logstash/devutils/rspec/spec_helper"
|
2
|
+
require "logstash/filters/date"
|
3
|
+
|
4
|
+
puts "Skipping date performance tests because this ruby is not jruby" if RUBY_ENGINE != "jruby"
|
5
|
+
RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
6
|
+
|
7
|
+
describe "giving an invalid match config, raise a configuration error" do
|
8
|
+
config <<-CONFIG
|
9
|
+
filter {
|
10
|
+
date {
|
11
|
+
match => [ "mydate"]
|
12
|
+
locale => "en"
|
13
|
+
}
|
14
|
+
}
|
15
|
+
CONFIG
|
16
|
+
|
17
|
+
sample "not_really_important" do
|
18
|
+
insist {subject}.raises LogStash::ConfigurationError
|
19
|
+
end
|
20
|
+
|
21
|
+
end
|
22
|
+
|
23
|
+
describe "parsing with ISO8601" do
|
24
|
+
config <<-CONFIG
|
25
|
+
filter {
|
26
|
+
date {
|
27
|
+
match => [ "mydate", "ISO8601" ]
|
28
|
+
locale => "en"
|
29
|
+
timezone => "UTC"
|
30
|
+
}
|
31
|
+
}
|
32
|
+
CONFIG
|
33
|
+
|
34
|
+
times = {
|
35
|
+
"2001-01-01T00:00:00-0800" => "2001-01-01T08:00:00.000Z",
|
36
|
+
"1974-03-02T04:09:09-0800" => "1974-03-02T12:09:09.000Z",
|
37
|
+
"2010-05-03T08:18:18+00:00" => "2010-05-03T08:18:18.000Z",
|
38
|
+
"2004-07-04T12:27:27-00:00" => "2004-07-04T12:27:27.000Z",
|
39
|
+
"2001-09-05T16:36:36+0000" => "2001-09-05T16:36:36.000Z",
|
40
|
+
"2001-11-06T20:45:45-0000" => "2001-11-06T20:45:45.000Z",
|
41
|
+
"2001-12-07T23:54:54Z" => "2001-12-07T23:54:54.000Z",
|
42
|
+
|
43
|
+
# TODO: This test assumes PDT
|
44
|
+
#"2001-01-01T00:00:00.123" => "2001-01-01T08:00:00.123Z",
|
45
|
+
|
46
|
+
"2010-05-03T08:18:18.123+00:00" => "2010-05-03T08:18:18.123Z",
|
47
|
+
"2004-07-04T12:27:27.123-04:00" => "2004-07-04T16:27:27.123Z",
|
48
|
+
"2001-09-05T16:36:36.123+0700" => "2001-09-05T09:36:36.123Z",
|
49
|
+
"2001-11-06T20:45:45.123-0000" => "2001-11-06T20:45:45.123Z",
|
50
|
+
"2001-12-07T23:54:54.123Z" => "2001-12-07T23:54:54.123Z",
|
51
|
+
"2001-12-07T23:54:54,123Z" => "2001-12-07T23:54:54.123Z",
|
52
|
+
|
53
|
+
#Almost ISO8601 support, with timezone
|
54
|
+
|
55
|
+
"2001-11-06 20:45:45.123-0000" => "2001-11-06T20:45:45.123Z",
|
56
|
+
"2001-12-07 23:54:54.123Z" => "2001-12-07T23:54:54.123Z",
|
57
|
+
"2001-12-07 23:54:54,123Z" => "2001-12-07T23:54:54.123Z",
|
58
|
+
|
59
|
+
#Almost ISO8601 support, without timezone
|
60
|
+
|
61
|
+
"2001-11-06 20:45:45.123" => "2001-11-06T20:45:45.123Z",
|
62
|
+
"2001-11-06 20:45:45,123" => "2001-11-06T20:45:45.123Z",
|
63
|
+
|
64
|
+
}
|
65
|
+
|
66
|
+
times.each do |input, output|
|
67
|
+
sample("mydate" => input) do
|
68
|
+
begin
|
69
|
+
insist { subject["mydate"] } == input
|
70
|
+
insist { subject["@timestamp"].time } == Time.iso8601(output).utc
|
71
|
+
rescue
|
72
|
+
#require "pry"; binding.pry
|
73
|
+
raise
|
74
|
+
end
|
75
|
+
end
|
76
|
+
end # times.each
|
77
|
+
end
|
78
|
+
|
79
|
+
describe "parsing with java SimpleDateFormat syntax" do
|
80
|
+
config <<-CONFIG
|
81
|
+
filter {
|
82
|
+
date {
|
83
|
+
match => [ "mydate", "MMM dd HH:mm:ss Z" ]
|
84
|
+
locale => "en"
|
85
|
+
}
|
86
|
+
}
|
87
|
+
CONFIG
|
88
|
+
|
89
|
+
now = Time.now
|
90
|
+
year = now.year
|
91
|
+
require 'java'
|
92
|
+
|
93
|
+
times = {
|
94
|
+
"Nov 24 01:29:01 -0800" => "#{year}-11-24T09:29:01.000Z",
|
95
|
+
}
|
96
|
+
times.each do |input, output|
|
97
|
+
sample("mydate" => input) do
|
98
|
+
insist { subject["mydate"] } == input
|
99
|
+
insist { subject["@timestamp"].time } == Time.iso8601(output).utc
|
100
|
+
end
|
101
|
+
end # times.each
|
102
|
+
end
|
103
|
+
|
104
|
+
describe "parsing with UNIX" do
|
105
|
+
config <<-CONFIG
|
106
|
+
filter {
|
107
|
+
date {
|
108
|
+
match => [ "mydate", "UNIX" ]
|
109
|
+
locale => "en"
|
110
|
+
}
|
111
|
+
}
|
112
|
+
CONFIG
|
113
|
+
|
114
|
+
times = {
|
115
|
+
"0" => "1970-01-01T00:00:00.000Z",
|
116
|
+
"1000000000" => "2001-09-09T01:46:40.000Z",
|
117
|
+
|
118
|
+
# LOGSTASH-279 - sometimes the field is a number.
|
119
|
+
0 => "1970-01-01T00:00:00.000Z",
|
120
|
+
1000000000 => "2001-09-09T01:46:40.000Z"
|
121
|
+
}
|
122
|
+
times.each do |input, output|
|
123
|
+
sample("mydate" => input) do
|
124
|
+
insist { subject["mydate"] } == input
|
125
|
+
insist { subject["@timestamp"].time } == Time.iso8601(output).utc
|
126
|
+
end
|
127
|
+
end # times.each
|
128
|
+
|
129
|
+
#Invalid value should not be evaluated to zero (String#to_i madness)
|
130
|
+
sample("mydate" => "%{bad_value}") do
|
131
|
+
insist { subject["mydate"] } == "%{bad_value}"
|
132
|
+
insist { subject["@timestamp"] } != Time.iso8601("1970-01-01T00:00:00.000Z").utc
|
133
|
+
end
|
134
|
+
end
|
135
|
+
|
136
|
+
describe "parsing microsecond-precise times with UNIX (#213)" do
|
137
|
+
config <<-CONFIG
|
138
|
+
filter {
|
139
|
+
date {
|
140
|
+
match => [ "mydate", "UNIX" ]
|
141
|
+
locale => "en"
|
142
|
+
}
|
143
|
+
}
|
144
|
+
CONFIG
|
145
|
+
|
146
|
+
sample("mydate" => "1350414944.123456") do
|
147
|
+
# Joda time only supports milliseconds :\
|
148
|
+
insist { subject.timestamp.time } == Time.iso8601("2012-10-16T12:15:44.123-07:00").utc
|
149
|
+
end
|
150
|
+
|
151
|
+
#Support float values
|
152
|
+
sample("mydate" => 1350414944.123456) do
|
153
|
+
insist { subject["mydate"] } == 1350414944.123456
|
154
|
+
insist { subject["@timestamp"].time } == Time.iso8601("2012-10-16T12:15:44.123-07:00").utc
|
155
|
+
end
|
156
|
+
|
157
|
+
#Invalid value should not be evaluated to zero (String#to_i madness)
|
158
|
+
sample("mydate" => "%{bad_value}") do
|
159
|
+
insist { subject["mydate"] } == "%{bad_value}"
|
160
|
+
insist { subject["@timestamp"] } != Time.iso8601("1970-01-01T00:00:00.000Z").utc
|
161
|
+
end
|
162
|
+
end
|
163
|
+
|
164
|
+
describe "parsing with UNIX_MS" do
|
165
|
+
config <<-CONFIG
|
166
|
+
filter {
|
167
|
+
date {
|
168
|
+
match => [ "mydate", "UNIX_MS" ]
|
169
|
+
locale => "en"
|
170
|
+
}
|
171
|
+
}
|
172
|
+
CONFIG
|
173
|
+
|
174
|
+
times = {
|
175
|
+
"0" => "1970-01-01T00:00:00.000Z",
|
176
|
+
"456" => "1970-01-01T00:00:00.456Z",
|
177
|
+
"1000000000123" => "2001-09-09T01:46:40.123Z",
|
178
|
+
|
179
|
+
# LOGSTASH-279 - sometimes the field is a number.
|
180
|
+
0 => "1970-01-01T00:00:00.000Z",
|
181
|
+
456 => "1970-01-01T00:00:00.456Z",
|
182
|
+
1000000000123 => "2001-09-09T01:46:40.123Z"
|
183
|
+
}
|
184
|
+
times.each do |input, output|
|
185
|
+
sample("mydate" => input) do
|
186
|
+
insist { subject["mydate"] } == input
|
187
|
+
insist { subject["@timestamp"].time } == Time.iso8601(output)
|
188
|
+
end
|
189
|
+
end # times.each
|
190
|
+
end
|
191
|
+
|
192
|
+
describe "parsing with UNIX_NANO" do
|
193
|
+
config <<-CONFIG
|
194
|
+
filter {
|
195
|
+
date {
|
196
|
+
match => [ "mydate", "UNIX_NANO" ]
|
197
|
+
locale => "en"
|
198
|
+
}
|
199
|
+
}
|
200
|
+
CONFIG
|
201
|
+
|
202
|
+
times = {
|
203
|
+
"0" => "1970-01-01T00:00:00.000Z",
|
204
|
+
"4569999" => "1970-01-01T00:00:00.004Z",
|
205
|
+
"456999936" => "1970-01-01T00:00:00.456Z",
|
206
|
+
"1000000000123999936" => "2001-09-09T01:46:40.123Z",
|
207
|
+
|
208
|
+
# LOGSTASH-279 - sometimes the field is a number.
|
209
|
+
0 => "1970-01-01T00:00:00.000Z",
|
210
|
+
4569999 => "1970-01-01T00:00:00.004Z",
|
211
|
+
456999936 => "1970-01-01T00:00:00.456Z",
|
212
|
+
1000000000123999936 => "2001-09-09T01:46:40.123Z"
|
213
|
+
}
|
214
|
+
times.each do |input, output|
|
215
|
+
sample("mydate" => input) do
|
216
|
+
insist { subject["mydate"] } == input
|
217
|
+
insist { subject["@timestamp"].time } == Time.iso8601(output)
|
218
|
+
end
|
219
|
+
end # times.each
|
220
|
+
end
|
221
|
+
|
222
|
+
describe "failed parses should not cause a failure (LOGSTASH-641)" do
|
223
|
+
config <<-'CONFIG'
|
224
|
+
input {
|
225
|
+
generator {
|
226
|
+
lines => [
|
227
|
+
'{ "mydate": "this will not parse" }',
|
228
|
+
'{ }'
|
229
|
+
]
|
230
|
+
codec => json
|
231
|
+
type => foo
|
232
|
+
count => 1
|
233
|
+
}
|
234
|
+
}
|
235
|
+
filter {
|
236
|
+
date {
|
237
|
+
match => [ "mydate", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
|
238
|
+
locale => "en"
|
239
|
+
}
|
240
|
+
}
|
241
|
+
output {
|
242
|
+
null { }
|
243
|
+
}
|
244
|
+
CONFIG
|
245
|
+
|
246
|
+
agent do
|
247
|
+
# nothing to do, if this crashes it's an error..
|
248
|
+
end
|
249
|
+
end
|
250
|
+
|
251
|
+
describe "TAI64N support" do
|
252
|
+
config <<-'CONFIG'
|
253
|
+
filter {
|
254
|
+
date {
|
255
|
+
match => [ "t", TAI64N ]
|
256
|
+
locale => "en"
|
257
|
+
}
|
258
|
+
}
|
259
|
+
CONFIG
|
260
|
+
|
261
|
+
# Try without leading "@"
|
262
|
+
sample("t" => "4000000050d506482dbdf024") do
|
263
|
+
insist { subject.timestamp.time } == Time.iso8601("2012-12-22T01:00:46.767Z").utc
|
264
|
+
end
|
265
|
+
|
266
|
+
# Should still parse successfully if it's a full tai64n time (with leading
|
267
|
+
# '@')
|
268
|
+
sample("t" => "@4000000050d506482dbdf024") do
|
269
|
+
insist { subject.timestamp.time } == Time.iso8601("2012-12-22T01:00:46.767Z").utc
|
270
|
+
end
|
271
|
+
end

  describe "accept match config option with hash value (LOGSTASH-735)" do
    config <<-CONFIG
      filter {
        date {
          match => [ "mydate", "ISO8601" ]
          locale => "en"
        }
      }
    CONFIG

    time = "2001-09-09T01:46:40.000Z"

    sample("mydate" => time) do
      insist { subject["mydate"] } == time
      insist { subject["@timestamp"].time } == Time.iso8601(time).utc
    end
  end

  describe "support deep nested field access" do
    config <<-CONFIG
      filter {
        date {
          match => [ "[data][deep]", "ISO8601" ]
          locale => "en"
        }
      }
    CONFIG

    sample("data" => { "deep" => "2013-01-01T00:00:00.000Z" }) do
      insist { subject["@timestamp"].time } == Time.iso8601("2013-01-01T00:00:00.000Z").utc
    end
  end

  describe "failing to parse should not throw an exception" do
    config <<-CONFIG
      filter {
        date {
          match => [ "thedate", "yyyy/MM/dd" ]
          locale => "en"
        }
      }
    CONFIG

    sample("thedate" => "2013/Apr/21") do
      insist { subject["@timestamp"] } != "2013-04-21T00:00:00.000Z"
    end
  end

  describe "success to parse should apply on_success config(add_tag,add_field...)" do
    config <<-CONFIG
      filter {
        date {
          match => [ "thedate", "yyyy/MM/dd" ]
          add_tag => "tagged"
        }
      }
    CONFIG

    sample("thedate" => "2013/04/21") do
      insist { subject["@timestamp"] } != "2013-04-21T00:00:00.000Z"
      insist { subject["tags"] } == ["tagged"]
    end
  end

  describe "failing to parse should not apply on_success config(add_tag,add_field...)" do
    config <<-CONFIG
      filter {
        date {
          match => [ "thedate", "yyyy/MM/dd" ]
          add_tag => "tagged"
        }
      }
    CONFIG

    sample("thedate" => "2013/Apr/21") do
      insist { subject["@timestamp"] } != "2013-04-21T00:00:00.000Z"
      reject { subject["tags"] }.include? "tagged"
    end
  end

  describe "failing to parse should apply tag_on_failure" do
    config <<-CONFIG
      filter {
        date {
          match => [ "thedate", "yyyy/MM/dd" ]
          tag_on_failure => ["date_failed"]
        }
      }
    CONFIG

    sample("thedate" => "2013/Apr/21") do
      insist { subject["@timestamp"] } != "2013-04-21T00:00:00.000Z"
      insist { subject["tags"] }.include? "date_failed"
    end
  end

  describe "parsing with timezone parameter" do
    config <<-CONFIG
      filter {
        date {
          match => ["mydate", "yyyy MMM dd HH:mm:ss"]
          locale => "en"
          timezone => "America/Los_Angeles"
        }
      }
    CONFIG

    require 'java'
    times = {
      "2013 Nov 24 01:29:01" => "2013-11-24T09:29:01.000Z",
      "2013 Jun 24 01:29:01" => "2013-06-24T08:29:01.000Z",
    }
    times.each do |input, output|
      sample("mydate" => input) do
        insist { subject["mydate"] } == input
        insist { subject["@timestamp"].time } == Time.iso8601(output).utc
      end
    end # times.each
  end

  describe "parsing with timezone from event" do
    config <<-CONFIG
      filter {
        date {
          match => ["mydate", "yyyy MMM dd HH:mm:ss"]
          locale => "en"
          timezone => "%{mytz}"
        }
      }
    CONFIG

    require 'java'
    times = {
      "2013 Nov 24 01:29:01" => "2013-11-24T09:29:01.000Z",
      "2013 Jun 24 01:29:01" => "2013-06-24T08:29:01.000Z",
    }
    times.each do |input, output|
      sample("mydate" => input, "mytz" => "America/Los_Angeles") do
        insist { subject["mydate"] } == input
        insist { subject["@timestamp"].time } == Time.iso8601(output).utc
      end
    end # times.each
  end

  describe "LOGSTASH-34 - Default year should be this year" do
    config <<-CONFIG
      filter {
        date {
          match => [ "message", "EEE MMM dd HH:mm:ss" ]
          locale => "en"
        }
      }
    CONFIG

    sample "Sun Jun 02 20:38:03" do
      insist { subject["@timestamp"].year } == Time.now.year
    end
  end

  describe "fill last year if december events arrive in january" do
    config <<-CONFIG
      filter {
        date {
          match => [ "message", "MMM dd HH:mm:ss" ]
          locale => "en"
          timezone => "UTC"
        }
      }
    CONFIG

    sample "Dec 31 23:59:00" do
      logstash_time = Time.utc(2014,1,1,00,30,50)
      expect(Time).to receive(:now).twice.and_return(logstash_time)
      insist { subject["@timestamp"].year } == 2013
    end
  end

  describe "fill next year if january events arrive in december" do
    config <<-CONFIG
      filter {
        date {
          match => [ "message", "MMM dd HH:mm:ss" ]
          locale => "en"
          timezone => "UTC"
        }
      }
    CONFIG

    sample "Jan 01 01:00:00" do
      logstash_time = Time.utc(2013,12,31,23,59,50)
      expect(Time).to receive(:now).twice.and_return(logstash_time)
      insist { subject["@timestamp"].year } == 2014
    end
  end
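The two rollover specs above pin down what happens when a pattern carries no year: the filter normally assumes the current year, but picks the adjacent year when the event month and the wall clock sit on opposite sides of a year boundary. A minimal sketch of that heuristic (not the plugin's actual code, which lives in lib/logstash/filters/date.rb):

    # Pick the year that keeps a year-less event close to "now".
    def infer_year(event_month, now = Time.now.utc)
      if event_month == 12 && now.month == 1
        now.year - 1   # a December event seen in January is from last year
      elsif event_month == 1 && now.month == 12
        now.year + 1   # a January event seen in December is the coming year
      else
        now.year
      end
    end

    infer_year(12, Time.utc(2014, 1, 1))    # => 2013
    infer_year(1,  Time.utc(2013, 12, 31))  # => 2014

These are exactly the expectations the specs assert with the mocked Time.now values.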

  describe "Supporting locale only" do
    config <<-CONFIG
      filter {
        date {
          match => [ "message", "dd MMMM yyyy" ]
          locale => "fr"
          timezone => "UTC"
        }
      }
    CONFIG

    sample "14 juillet 1789" do
      insist { subject["@timestamp"].time } == Time.iso8601("1789-07-14T00:00:00.000Z").utc
    end
  end

  describe "Supporting locale+country in BCP47" do
    config <<-CONFIG
      filter {
        date {
          match => [ "message", "dd MMMM yyyy" ]
          locale => "fr-FR"
          timezone => "UTC"
        }
      }
    CONFIG

    sample "14 juillet 1789" do
      insist { subject["@timestamp"].time } == Time.iso8601("1789-07-14T00:00:00.000Z").utc
    end
  end

  describe "Supporting locale+country in POSIX (internally replace '_' by '-')" do
    config <<-CONFIG
      filter {
        date {
          match => [ "message", "dd MMMM yyyy" ]
          locale => "fr_FR"
          timezone => "UTC"
        }
      }
    CONFIG

    sample "14 juillet 1789" do
      insist { subject["@timestamp"].time } == Time.iso8601("1789-07-14T00:00:00.000Z").utc
    end
  end

  describe "http dates" do

    config <<-'CONFIG'
      filter {
        date {
          match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
          locale => "en"
        }
      }
    CONFIG

    sample("timestamp" => "25/Mar/2013:20:33:56 +0000") do
      insist { subject["@timestamp"].time } == Time.iso8601("2013-03-25T20:33:56.000Z")
    end
  end

  describe "Support fallback to english for non-english default locale" do
    default_locale = java.util.Locale.getDefault()
    # Override default locale with non-english
    java.util.Locale.setDefault(java.util.Locale.forLanguageTag('fr-FR'))
    config <<-CONFIG
      filter {
        date {
          match => [ "message", "dd MMMM yyyy" ]
          timezone => "UTC"
        }
      }
    CONFIG

    sample "01 September 2014" do
      insist { subject["@timestamp"].time } == Time.iso8601("2014-09-01T00:00:00.000Z").utc
    end
    # Restore default locale
    java.util.Locale.setDefault(default_locale)
  end
end
metadata
ADDED
@@ -0,0 +1,131 @@
--- !ruby/object:Gem::Specification
name: pysysops-filter-date
version: !ruby/object:Gem::Version
  version: 2.0.3
platform: ruby
authors:
- Elastic
autorequire:
bindir: bin
cert_chain: []
date: 2015-12-02 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: 2.0.0.beta2
    - - <
      - !ruby/object:Gem::Version
        version: 3.0.0
  name: logstash-core
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: 2.0.0.beta2
    - - <
      - !ruby/object:Gem::Version
        version: 3.0.0
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-input-generator
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-codec-json
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-output-null
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-devutils
  prerelease: false
  type: :development
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - '>='
      - !ruby/object:Gem::Version
        version: '0'
description: This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program
email: info@elastic.co
executables: []
extensions: []
extra_rdoc_files: []
files:
- CHANGELOG.md
- CONTRIBUTORS
- Gemfile
- LICENSE
- NOTICE.TXT
- README.md
- lib/logstash/filters/date.rb
- logstash-filter-date.gemspec
- spec/filters/date_spec.rb
homepage: http://www.elastic.co/guide/en/logstash/current/index.html
licenses:
- Apache License (2.0)
metadata:
  logstash_plugin: 'true'
  logstash_group: filter
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - '>='
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - '>='
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project:
rubygems_version: 2.4.5
signing_key:
specification_version: 4
summary: The date filter is used for parsing dates from fields, and then using that date or timestamp as the logstash timestamp for the event.
test_files:
- spec/filters/date_spec.rb