logstash-filter-date 3.1.15 → 3.2.0
- checksums.yaml +4 -4
- data/CHANGELOG.md +14 -0
- data/docs/index.asciidoc +32 -23
- data/lib/logstash/filters/date.rb +15 -7
- data/logstash-filter-date.gemspec +1 -1
- data/spec/filters/date_spec.rb +99 -370
- data/vendor/jar-dependencies/org/logstash/filters/logstash-filter-date/3.1.6/logstash-filter-date-3.1.6.jar +0 -0
- metadata +17 -20
checksums.yaml
CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 8ea85b9a739ee3c2f7074203a7a30d7db4d37615541b126b5372e2d118260d4b
+  data.tar.gz: 106e2c9d67c60fa1ce5eea4ca4e27654aa416a264ed704e06cbdafc5e054b1b4
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 8051731609e613b09430d809b077ffcf48e3151fcc5ab5affaed09dd8da3f1a8dc7b4c39a90468e8f32599609881900abd9f5c6f4dc3f94b1f891705fd7ccd7b
+  data.tar.gz: e1b8552ede171c1ea0dc77398395c9fdaeb1b79ece9e8dabe7d61c58a94c23f44d712f91950a0adfbb8c1bafce64c4bf7aafea8312c3d2b41e53f726639030a4
data/CHANGELOG.md
CHANGED

@@ -1,3 +1,17 @@
+## 3.2.0
+- Add `precision` setting to support nanosecond precision timestamps [#165](https://github.com/logstash-plugins/logstash-filter-date/pull/165)
+  - `ms` (default): timestamps are stored with millisecond precision
+    - keeps the same behavior as before for backward compatibility
+    - fractional seconds are truncated to 3 digits
+    - custom parsing formats use `joda-time` library
+  - `ns`: timestamps are stored with nanosecond precision
+    - fractional seconds support up to 9 digits
+    - custom parsing formats use `java.time`
+  - `ISO8601` now accepts up to 9 fractional-second digits
+
+## 3.1.16
+- Re-packaging the plugin [#163](https://github.com/logstash-plugins/logstash-filter-date/pull/163)
+
 ## 3.1.15
 - Build: review build to be more reliable/portable [#139](https://github.com/logstash-plugins/logstash-filter-date/pull/139)
   * cleaned up Java dependencies
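The `ms`-versus-`ns` behavior listed in the changelog can be sketched with plain Ruby `Time` arithmetic. This is only an illustration of the truncation rule (3 fractional digits for `ms`, up to 9 for `ns`); the plugin's actual truncation happens in its Java parser, not in Ruby.

```ruby
require 'time'

# A timestamp with nine fractional-second digits, as ISO8601 now accepts.
raw    = "2021-04-17T09:32:01.123456789Z"
parsed = Time.iso8601(raw)  # Ruby's Time keeps full sub-second resolution

# precision => "ms": fractional seconds are truncated to 3 digits.
ms = parsed.floor(3)
# precision => "ns": all 9 fractional digits are preserved.
ns = parsed

puts ms.strftime("%Y-%m-%dT%H:%M:%S.%LZ")
puts ns.strftime("%Y-%m-%dT%H:%M:%S.%9NZ")
```

`Time#floor(3)` truncates (rather than rounds) to the millisecond, which matches the "truncated to 3 digits" wording above.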
data/docs/index.asciidoc
CHANGED

@@ -20,23 +20,11 @@ include::{include_path}/plugin_header.asciidoc[]
 
 ==== Description
 
-The date filter is used for parsing dates from fields, and then using that
-
+The date filter is used for parsing dates from fields, and then using that date or timestamp as the logstash timestamp for the event.
+Timestamp is stored with millisecond precision. Set `precision => "ns"` to preserve nanoseconds.
 
-
-
-    "Apr 17 09:32:01"
-
-You would use the date format `MMM dd HH:mm:ss` to parse this.
-
-The date filter is especially important for sorting events and for
-backfilling old data. If you don't get the date correct in your
-event, then searching for them later will likely sort out of order.
-
-In the absence of this filter, logstash will choose a timestamp based on the
-first time it sees the event (at input time), if the timestamp is not already
-set in the event. For example, with file input, the timestamp is set to the
-time of each read.
+Custom parsing formats use the JVM's default locale and time zone.
+To override them, configure the `locale` and `timezone` settings.
 
 [id="plugins-{type}s-{plugin}-options"]
 ==== Date Filter Configuration Options

@@ -48,6 +36,7 @@ This plugin supports the following configuration options plus the <<plugins-{typ
 |Setting |Input type|Required
 | <<plugins-{type}s-{plugin}-locale>> |<<string,string>>|No
 | <<plugins-{type}s-{plugin}-match>> |<<array,array>>|No
+| <<plugins-{type}s-{plugin}-precision>> |<<string,string>>|No
 | <<plugins-{type}s-{plugin}-tag_on_failure>> |<<array,array>>|No
 | <<plugins-{type}s-{plugin}-target>> |<<string,string>>|No
 | <<plugins-{type}s-{plugin}-timezone>> |<<string,string>>|No

@@ -89,11 +78,14 @@ If your time field has multiple possible formats, you can do this:
 
 The above will match a syslog (rfc3164) or `iso8601` timestamp.
 
-
+By default, the custom formats use the deprecated https://www.joda.org/joda-time/key_format.html[joda-time] library for parsing and timestamp is stored with millisecond precision.
+If you set `precision => "ns"`, parsing is performed using https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/time/format/DateTimeFormatter.html[java.time], and timestamps are stored with nanosecond precision.
+Both libraries use similar syntax, but there are some differences in supported features. For example, `java.time` supports time zone IDs with `VV` while `joda-time` uses `ZZZ`.
+
 Besides custom formats, the following format literals exist
 to help you save time and ensure correctness of date parsing.
 
-* `ISO8601` - should parse any valid ISO8601 timestamp, such as
-`2011-04-19T03:44:01.103Z`
+* `ISO8601` - should parse any valid ISO8601 timestamp, such as `2011-04-19T03:44:01.123456789Z`
 * `UNIX` - will parse *float or int* value expressing unix time in seconds since epoch like 1326149001.132 as well as 1326149001
 * `UNIX_MS` - will parse **int** value expressing unix time in milliseconds since epoch like 1366125117000
 * `TAI64N` - will parse tai64n time values

@@ -147,10 +139,14 @@ s:: seconds of the minute (60 seconds per minute)
 ss::: two-digit seconds, zero-padded if needed. Example: `00`.
 
 S:: fraction of a second
-*Maximum precision is milliseconds (`SSS`). Beyond that, zeroes are appended.*
 S::: tenths of a second. Example: `0` for a subsecond value `012`
 SS::: hundredths of a second. Example: `01` for a subsecond value `01`
-SSS:::
+SSS::: milliseconds. Example: `012` for a subsecond value `012`
+SSSSSS::: microseconds. Example: `012345`
+SSSSSSSSS::: nanoseconds. Example: `012345678`
+
+V:: time-zone ID
+VV::: time zone ID. Example: `America/Los_Angeles`. Note: This is only supported by `java.time` parsing.
 
 Z:: time zone offset or identity
 Z::: Timezone offset structured as HHmm (hour and minutes offset from Zulu/UTC). Example: `-0700`.

@@ -174,10 +170,23 @@ E:: day of the week (text)
 For non-formatting syntax, you'll need to put single-quote characters around the value. For example, if you were parsing ISO8601 time, "2015-01-01T01:12:23" that little "T" isn't a valid time format, and you want to say "literally, a T", your format would be this: "yyyy-MM-dd'T'HH:mm:ss"
 
 Other less common date units, such as era (G), century \(C), am/pm (a), and # more, can be learned about on the
-
+https://www.joda.org/joda-time/key_format.html[joda-time documentation].
+
+[id="plugins-{type}s-{plugin}-precision"]
+===== `precision`
+
+* Value type is <<string,string>>
+* Valid values are `ms` and `ns`
+* Default value is `ms`
+
+Controls the sub-second precision of the stored timestamp.
+
+`ms` stores data in millisecond precision. Custom pattern formats use https://www.joda.org/joda-time/key_format.html[joda-time] parsing rules. For example, `yyyy-MM-dd HH:mm:ss ZZZ`.
+
+`ns` stores data in nanosecond precision. Custom pattern formats use https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/time/format/DateTimeFormatter.html[java.time] parsing rules. For example, `yyyy-MM-dd HH:mm:ss VV`.
 
 [id="plugins-{type}s-{plugin}-tag_on_failure"]
-===== `tag_on_failure`
+===== `tag_on_failure`
 
 * Value type is <<array,array>>
 * Default value is `["_dateparsefailure"]`
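The `precision` option documented in the hunks above slots into a `date` filter block like any other setting. A minimal sketch, written in the heredoc style this plugin's own specs use; the field name `logdate` and the pattern string are assumptions for illustration, not taken from the plugin docs:

```ruby
# Hypothetical pipeline fragment: parse a nanosecond-resolution field with
# a java.time pattern (note the `VV` zone ID, which joda-time would not accept).
config = <<-CONFIG
  filter {
    date {
      match     => [ "logdate", "yyyy-MM-dd HH:mm:ss.SSSSSSSSS VV" ]
      precision => "ns"
      target    => "@timestamp"
    }
  }
CONFIG
puts config
```

With `precision => "ns"` omitted, the same pattern would be handed to joda-time and fail on `VV`, which is the interoperability difference the docs call out.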
data/lib/logstash/filters/date.rb
CHANGED

@@ -57,9 +57,8 @@ class LogStash::Filters::Date < LogStash::Filters::Base
   # There are a few special exceptions. The following format literals exist
   # to help you save time and ensure correctness of date parsing.
   #
-  # * `ISO8601` - should parse any valid ISO8601 timestamp
-  #
-  # * `UNIX` - will parse *float or int* value expressing unix time in seconds since epoch like 1326149001.132 as well as 1326149001
+  # * `ISO8601` - should parse any valid ISO8601 timestamp with up to 9 fractional-second digits
+  # * `UNIX` - will parse *float or int* value expressing unix time in seconds since epoch like 1326149001.123456789 as well as 1326149001
   # * `UNIX_MS` - will parse **int** value expressing unix time in milliseconds since epoch like 1366125117000
   # * `TAI64N` - will parse tai64n time values
   #

@@ -112,10 +111,11 @@ class LogStash::Filters::Date < LogStash::Filters::Base
   # ss::: two-digit seconds, zero-padded if needed. Example: `00`.
   #
   # S:: fraction of a second
-  # *Maximum precision is milliseconds (`SSS`). Beyond that, zeroes are appended.*
   # S::: tenths of a second. Example: `0` for a subsecond value `012`
   # SS::: hundredths of a second. Example: `01` for a subsecond value `01`
-  # SSS:::
+  # SSS::: milliseconds. Example: `012` for a subsecond value `012`
+  # SSSSSS::: microseconds.
+  # SSSSSSSSS::: nanoseconds.
   #
   # Z:: time zone offset or identity
   # Z::: Timezone offset structured as HHmm (hour and minutes offset from Zulu/UTC). Example: `-0700`.

@@ -150,6 +150,14 @@ class LogStash::Filters::Date < LogStash::Filters::Base
   # successful match
   config :tag_on_failure, :validate => :array, :default => ["_dateparsefailure"]
 
+  # Controls the sub-second precision of the stored timestamp.
+  #
+  # `"ms"` (default):: Stores in millisecond precision. Custom-pattern formats use Joda-Time parsing rules.
+  # `"ns"`:: Stores in nanosecond precision. Custom-pattern formats use java.time parsing rules.
+  config :precision, :validate => [Java::OrgLogstashFiltersParser::TimestampParserFactory::PRECISION_MS,
+                                   Java::OrgLogstashFiltersParser::TimestampParserFactory::PRECISION_NS],
+                     :default => Java::OrgLogstashFiltersParser::TimestampParserFactory::PRECISION_MS
+
   def register
     # nothing
   end

@@ -179,13 +187,13 @@ class LogStash::Filters::Date < LogStash::Filters::Base
       metric.increment(:failures)
     end
 
-    @datefilter = org.logstash.filters.DateFilter.new(source, @target, @tag_on_failure, success_block, failure_block)
+    @datefilter = org.logstash.filters.DateFilter.new(source, @target, @tag_on_failure, @precision, success_block, failure_block)
 
     @match[1..-1].map do |format|
       @datefilter.accept_filter_config(format, @locale, @timezone)
 
       # Offer a fallback parser such that if the default system Locale is non-english and that no locale is set,
-      # we should try to parse english if the first local parsing fails
+      # we should try to parse english if the first local parsing fails.
       if !@locale && "en" != java.util.Locale.getDefault().getLanguage() && (format.include?("MMM") || format.include?("E"))
         @datefilter.accept_filter_config(format, "en-US", @timezone)
       end
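The `UNIX` and `UNIX_MS` literals described in the comments above are plain epoch arithmetic. A Ruby sketch of what each literal accepts; using `Rational` here is an implementation choice of this example (to avoid Float rounding of fractional seconds), not something the plugin itself does:

```ruby
require 'time'

# UNIX: seconds since epoch, float or int ("1326149001.132" or 1326149001).
unix_seconds = Rational(1_326_149_001_132, 1_000)
t = Time.at(unix_seconds).utc

# UNIX_MS: integer milliseconds since epoch (1366125117000).
unix_ms = 1_366_125_117_000
t_ms = Time.at(Rational(unix_ms, 1_000)).utc

puts t.iso8601(3)
puts t_ms.iso8601(3)
```

Round-tripping through `Rational` keeps the `.132` fractional part exact, which is why a naive `Time.at(1326149001.132)` can drift by a few nanoseconds.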
data/logstash-filter-date.gemspec
CHANGED

@@ -1,7 +1,7 @@
 Gem::Specification.new do |s|
 
   s.name = 'logstash-filter-date'
-  s.version = '3.
+  s.version = '3.2.0'
   s.licenses = ['Apache License (2.0)']
   s.summary = "Parses dates from fields to use as the Logstash timestamp for an event"
   s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
data/spec/filters/date_spec.rb
CHANGED
|
@@ -17,62 +17,6 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
17
17
|
end
|
|
18
18
|
end
|
|
19
19
|
|
|
20
|
-
describe "parsing with ISO8601" do
|
|
21
|
-
config <<-CONFIG
|
|
22
|
-
filter {
|
|
23
|
-
date {
|
|
24
|
-
match => [ "mydate", "ISO8601" ]
|
|
25
|
-
locale => "en"
|
|
26
|
-
timezone => "UTC"
|
|
27
|
-
}
|
|
28
|
-
}
|
|
29
|
-
CONFIG
|
|
30
|
-
|
|
31
|
-
times = {
|
|
32
|
-
"2001-01-01T00:00:00-0800" => "2001-01-01T08:00:00.000Z",
|
|
33
|
-
"1974-03-02T04:09:09-0800" => "1974-03-02T12:09:09.000Z",
|
|
34
|
-
"2010-05-03T08:18:18+00:00" => "2010-05-03T08:18:18.000Z",
|
|
35
|
-
"2004-07-04T12:27:27-00:00" => "2004-07-04T12:27:27.000Z",
|
|
36
|
-
"2001-09-05T16:36:36+0000" => "2001-09-05T16:36:36.000Z",
|
|
37
|
-
"2001-11-06T20:45:45-0000" => "2001-11-06T20:45:45.000Z",
|
|
38
|
-
"2001-12-07T23:54:54Z" => "2001-12-07T23:54:54.000Z",
|
|
39
|
-
|
|
40
|
-
# TODO: This test assumes PDT
|
|
41
|
-
#"2001-01-01T00:00:00.123" => "2001-01-01T08:00:00.123Z",
|
|
42
|
-
|
|
43
|
-
"2010-05-03T08:18:18.123+00:00" => "2010-05-03T08:18:18.123Z",
|
|
44
|
-
"2004-07-04T12:27:27.123-04:00" => "2004-07-04T16:27:27.123Z",
|
|
45
|
-
"2001-09-05T16:36:36.123+0700" => "2001-09-05T09:36:36.123Z",
|
|
46
|
-
"2001-11-06T20:45:45.123-0000" => "2001-11-06T20:45:45.123Z",
|
|
47
|
-
"2001-12-07T23:54:54.123Z" => "2001-12-07T23:54:54.123Z",
|
|
48
|
-
"2001-12-07T23:54:54,123Z" => "2001-12-07T23:54:54.123Z",
|
|
49
|
-
|
|
50
|
-
#Almost ISO8601 support, with timezone
|
|
51
|
-
|
|
52
|
-
"2001-11-06 20:45:45.123-0000" => "2001-11-06T20:45:45.123Z",
|
|
53
|
-
"2001-12-07 23:54:54.123Z" => "2001-12-07T23:54:54.123Z",
|
|
54
|
-
"2001-12-07 23:54:54,123Z" => "2001-12-07T23:54:54.123Z",
|
|
55
|
-
|
|
56
|
-
#Almost ISO8601 support, without timezone
|
|
57
|
-
|
|
58
|
-
"2001-11-06 20:45:45.123" => "2001-11-06T20:45:45.123Z",
|
|
59
|
-
"2001-11-06 20:45:45,123" => "2001-11-06T20:45:45.123Z",
|
|
60
|
-
|
|
61
|
-
}
|
|
62
|
-
|
|
63
|
-
times.each do |input, output|
|
|
64
|
-
sample("mydate" => input) do
|
|
65
|
-
begin
|
|
66
|
-
insist { subject.get("mydate") } == input
|
|
67
|
-
insist { subject.get("@timestamp").time } == Time.iso8601(output).utc
|
|
68
|
-
rescue
|
|
69
|
-
#require "pry"; binding.pry
|
|
70
|
-
raise
|
|
71
|
-
end
|
|
72
|
-
end
|
|
73
|
-
end # times.each
|
|
74
|
-
end
|
|
75
|
-
|
|
76
20
|
describe "parsing with java SimpleDateFormat syntax" do
|
|
77
21
|
config <<-CONFIG
|
|
78
22
|
filter {
|
|
@@ -91,7 +35,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
91
35
|
"Nov 24 01:29:01 -0800" => "#{year}-11-24T09:29:01.000Z",
|
|
92
36
|
}
|
|
93
37
|
times.each do |input, output|
|
|
94
|
-
sample("mydate" => input) do
|
|
38
|
+
sample({"mydate" => input}) do
|
|
95
39
|
insist { subject.get("mydate") } == input
|
|
96
40
|
insist { subject.get("@timestamp").time } == Time.iso8601(output).utc
|
|
97
41
|
end
|
|
@@ -119,55 +63,17 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
119
63
|
1478207457 => "2016-11-03T21:10:57.000Z"
|
|
120
64
|
}
|
|
121
65
|
times.each do |input, output|
|
|
122
|
-
sample("mydate" => input) do
|
|
66
|
+
sample({"mydate" => input}) do
|
|
123
67
|
insist { subject.get("mydate") } == input
|
|
124
68
|
insist { subject.get("@timestamp").time } == Time.iso8601(output).utc
|
|
125
69
|
end
|
|
126
70
|
end # times.each
|
|
127
71
|
|
|
128
72
|
#Invalid value should not be evaluated to zero (String#to_i madness)
|
|
129
|
-
sample("mydate" => "%{bad_value}") do
|
|
130
|
-
insist { subject.get("mydate") } == "%{bad_value}"
|
|
131
|
-
insist { subject.get("@timestamp").time } != Time.iso8601("1970-01-01T00:00:00.000Z").utc
|
|
132
|
-
end
|
|
133
|
-
end
|
|
134
|
-
|
|
135
|
-
describe "parsing microsecond-precise times with UNIX (#213)" do
|
|
136
|
-
config <<-CONFIG
|
|
137
|
-
filter {
|
|
138
|
-
date {
|
|
139
|
-
match => [ "mydate", "UNIX" ]
|
|
140
|
-
locale => "en"
|
|
141
|
-
}
|
|
142
|
-
}
|
|
143
|
-
CONFIG
|
|
144
|
-
|
|
145
|
-
sample("mydate" => "1350414944.123456") do
|
|
146
|
-
# Joda time only supports milliseconds :\
|
|
147
|
-
insist { subject.timestamp.time } == Time.iso8601("2012-10-16T12:15:44.123-07:00").utc
|
|
148
|
-
end
|
|
149
|
-
|
|
150
|
-
#Support float values
|
|
151
|
-
sample("mydate" => 1350414944.123456) do
|
|
152
|
-
insist { subject.get("mydate") } == 1350414944.123456
|
|
153
|
-
insist { subject.get("@timestamp").time } == Time.iso8601("2012-10-16T12:15:44.123-07:00").utc
|
|
154
|
-
end
|
|
155
|
-
|
|
156
|
-
#Invalid value should not be evaluated to zero (String#to_i madness)
|
|
157
|
-
sample("mydate" => "%{bad_value}") do
|
|
73
|
+
sample({"mydate" => "%{bad_value}"}) do
|
|
158
74
|
insist { subject.get("mydate") } == "%{bad_value}"
|
|
159
75
|
insist { subject.get("@timestamp").time } != Time.iso8601("1970-01-01T00:00:00.000Z").utc
|
|
160
76
|
end
|
|
161
|
-
|
|
162
|
-
# Regression test
|
|
163
|
-
# Support numeric values that come through the JSON parser. These numbers appear as BigDecimal
|
|
164
|
-
# instead of Float.
|
|
165
|
-
sample(LogStash::Json.load('{ "mydate": 1350414944.123456 }')) do
|
|
166
|
-
# its generally problematic to compare different Floating Point class implementations using equals
|
|
167
|
-
# because they can't always represent a value exactly.
|
|
168
|
-
insist { subject.get("mydate") } == BigDecimal.new("1350414944.123456")
|
|
169
|
-
insist { subject.get("@timestamp").time } == Time.iso8601("2012-10-16T12:15:44.123-07:00").utc
|
|
170
|
-
end
|
|
171
77
|
end
|
|
172
78
|
|
|
173
79
|
describe "parsing with UNIX_MS" do
|
|
@@ -191,7 +97,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
191
97
|
1000000000123 => "2001-09-09T01:46:40.123Z"
|
|
192
98
|
}
|
|
193
99
|
times.each do |input, output|
|
|
194
|
-
sample("mydate" => input) do
|
|
100
|
+
sample({"mydate" => input}) do
|
|
195
101
|
insist { subject.get("mydate") } == input
|
|
196
102
|
insist { subject.get("@timestamp").time } == Time.iso8601(output).utc
|
|
197
103
|
end
|
|
@@ -223,7 +129,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
223
129
|
1478207457.456 => "2016-11-03T21:10:57.456Z",
|
|
224
130
|
}
|
|
225
131
|
times.each do |input, output|
|
|
226
|
-
sample("mydate" => input) do
|
|
132
|
+
sample({"mydate" => input}) do
|
|
227
133
|
insist { subject.get("mydate") } == input
|
|
228
134
|
insist { subject.get("@timestamp").time } == Time.iso8601(output).utc
|
|
229
135
|
end
|
|
@@ -269,14 +175,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
269
175
|
}
|
|
270
176
|
CONFIG
|
|
271
177
|
|
|
272
|
-
|
|
273
|
-
sample("t" => "4000000050d506482dbdf024") do
|
|
274
|
-
insist { subject.timestamp.time } == Time.iso8601("2012-12-22T01:00:46.767Z").utc
|
|
275
|
-
end
|
|
276
|
-
|
|
277
|
-
# Should still parse successfully if it's a full tai64n time (with leading
|
|
278
|
-
# '@')
|
|
279
|
-
sample("t" => "@4000000050d506482dbdf024") do
|
|
178
|
+
sample({"t" => "4000000050d506482dbdf024"}) do
|
|
280
179
|
insist { subject.timestamp.time } == Time.iso8601("2012-12-22T01:00:46.767Z").utc
|
|
281
180
|
end
|
|
282
181
|
end
|
|
@@ -293,7 +192,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
293
192
|
|
|
294
193
|
time = "2001-09-09T01:46:40.000Z"
|
|
295
194
|
|
|
296
|
-
sample("mydate" => time) do
|
|
195
|
+
sample({"mydate" => time}) do
|
|
297
196
|
insist { subject.get("mydate") } == time
|
|
298
197
|
insist { subject.get("@timestamp").time } == Time.iso8601(time).utc
|
|
299
198
|
end
|
|
@@ -309,7 +208,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
309
208
|
}
|
|
310
209
|
CONFIG
|
|
311
210
|
|
|
312
|
-
sample("data" => { "deep" => "2013-01-01T00:00:00.000Z" }) do
|
|
211
|
+
sample({"data" => { "deep" => "2013-01-01T00:00:00.000Z" }}) do
|
|
313
212
|
insist { subject.get("@timestamp").time } == Time.iso8601("2013-01-01T00:00:00.000Z").utc
|
|
314
213
|
end
|
|
315
214
|
end
|
|
@@ -324,7 +223,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
324
223
|
}
|
|
325
224
|
CONFIG
|
|
326
225
|
|
|
327
|
-
sample("thedate" => "2013/Apr/21") do
|
|
226
|
+
sample({"thedate" => "2013/Apr/21"}) do
|
|
328
227
|
expected = Time.iso8601("2013-04-21T00:00:00.000Z").utc
|
|
329
228
|
expect(subject.get("@timestamp").time).not_to eq(expected)
|
|
330
229
|
end
|
|
@@ -341,7 +240,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
341
240
|
}
|
|
342
241
|
CONFIG
|
|
343
242
|
|
|
344
|
-
sample("thedate" => "2013/04/21") do
|
|
243
|
+
sample({"thedate" => "2013/04/21"}) do
|
|
345
244
|
expected = Time.iso8601("2013-04-21T00:00:00.000Z").utc
|
|
346
245
|
expect(subject.get("@timestamp").time).to eq(expected)
|
|
347
246
|
insist { subject.get("tags") } == ["tagged"]
|
|
@@ -359,7 +258,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
359
258
|
}
|
|
360
259
|
CONFIG
|
|
361
260
|
|
|
362
|
-
sample("thedate" => "2013/Apr/21") do
|
|
261
|
+
sample({"thedate" => "2013/Apr/21"}) do
|
|
363
262
|
expected = Time.iso8601("2013-04-21T00:00:00.000Z").utc
|
|
364
263
|
expect(subject.get("@timestamp").time).not_to eq(expected)
|
|
365
264
|
reject { subject.get("tags") }.include? "tagged"
|
|
@@ -377,7 +276,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
377
276
|
}
|
|
378
277
|
CONFIG
|
|
379
278
|
|
|
380
|
-
sample("thedate" => "2013/Apr/21") do
|
|
279
|
+
sample({"thedate" => "2013/Apr/21"}) do
|
|
381
280
|
expected = Time.iso8601("2013-04-21T00:00:00.000Z").utc
|
|
382
281
|
expect(subject.get("@timestamp").time).not_to eq(expected)
|
|
383
282
|
insist { subject.get("tags") }.include? "date_failed"
|
|
@@ -401,7 +300,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
401
300
|
"2013 Jun 24 01:29:01" => "2013-06-24T08:29:01.000Z",
|
|
402
301
|
}
|
|
403
302
|
times.each do |input, output|
|
|
404
|
-
sample("mydate" => input) do
|
|
303
|
+
sample({"mydate" => input}) do
|
|
405
304
|
insist { subject.get("mydate") } == input
|
|
406
305
|
insist { subject.get("@timestamp").time } == Time.iso8601(output).utc
|
|
407
306
|
end
|
|
@@ -425,7 +324,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
425
324
|
"2013 Jun 24 01:29:01" => "2013-06-24T08:29:01.000Z",
|
|
426
325
|
}
|
|
427
326
|
times.each do |input, output|
|
|
428
|
-
sample("mydate" => input, "mytz" => "America/Los_Angeles") do
|
|
327
|
+
sample({"mydate" => input, "mytz" => "America/Los_Angeles"}) do
|
|
429
328
|
insist { subject.get("mydate") } == input
|
|
430
329
|
insist { subject.get("@timestamp").time } == Time.iso8601(output).utc
|
|
431
330
|
end
|
|
@@ -446,204 +345,21 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
|
|
|
446
345
|
require 'java'
|
|
447
346
|
|
|
448
347
|
# Venezuela changed from -4:00 to -4:30 at 03:00 on Sun, 9 Dec 2007
|
|
449
|
-
sample("mydate" => "2007-12-09T01:00:00", "mytz" => "America/Caracas") do
|
|
348
|
+
sample({"mydate" => "2007-12-09T01:00:00", "mytz" => "America/Caracas"}) do
|
|
450
349
|
expect(subject.get("mydate")).to eq("2007-12-09T01:00:00")
|
|
451
350
|
expect(subject.get("@timestamp").time).to eq(Time.iso8601("2007-12-09T05:00:00.000Z").utc)
|
|
452
351
|
end
|
|
453
|
-
sample("mydate" => "2007-12-09T10:00:00", "mytz" => "America/Caracas") do
|
|
352
|
+
sample({"mydate" => "2007-12-09T10:00:00", "mytz" => "America/Caracas"}) do
|
|
454
353
|
expect(subject.get("mydate")).to eq("2007-12-09T10:00:00")
|
|
455
354
|
expect(subject.get("@timestamp").time).to eq(Time.iso8601("2007-12-09T14:30:00.000Z").utc)
|
|
456
355
|
end
|
|
457
356
|
# Venezuela changed from -4:30 to -4:00 at 02:30 on Sunday, 1 May 2016
|
|
458
|
-
sample("mydate" => "2016-05-01T08:18:18.123", "mytz" => "America/Caracas") do
|
|
357
|
+
sample({"mydate" => "2016-05-01T08:18:18.123", "mytz" => "America/Caracas"}) do
|
|
459
358
|
expect(subject.get("mydate")).to eq("2016-05-01T08:18:18.123")
|
|
460
359
|
expect(subject.get("@timestamp").time).to eq(Time.iso8601("2016-05-01T12:18:18.123Z").utc)
|
|
461
360
|
end
|
|
462
361
|
end
|
|
463
362
|
|
|
464
|
-
describe "don't fail on next years DST switchover in CET" do
|
|
465
|
-
config <<-CONFIG
|
|
466
|
-
filter {
|
|
467
|
-
date {
|
|
468
|
-
match => [ "message", "yyyy MMM dd HH:mm:ss" ]
|
|
469
|
-
locale => "en"
|
|
470
|
-
timezone => "CET"
|
|
471
|
-
}
|
|
472
|
-
}
|
|
473
|
-
CONFIG
|
|
474
|
-
|
|
475
|
-
before(:each) do
|
|
476
|
-
logstash_time = Time.utc(2016,03,29,23,59,50)
|
|
477
|
-
allow(Time).to receive(:now).and_return(logstash_time)
|
|
478
|
-
end
|
|
479
|
-
|
|
480
|
-
sample "2016 Mar 26 02:00:37" do
|
|
481
|
-
insist { subject.get("tags") } != ["_dateparsefailure"]
|
|
482
|
-
expect(subject.get("@timestamp")).to be_a_logstash_timestamp_equivalent_to("2016-03-26T01:00:37.000Z")
|
|
483
|
-
end
|
|
484
|
-
end
|
|
485
|
-
|
|
486
|
-
context "Default year handling when parsing with timezone from event" do
|
|
487
|
-
|
|
488
|
-
describe "LOGSTASH-34 - Default year should be this year" do
|
|
489
|
-
config <<-CONFIG
|
|
490
|
-
filter {
|
|
491
|
-
date {
|
|
492
|
-
match => [ "message", "EEE MMM dd HH:mm:ss" ]
|
|
493
|
-
locale => "en"
|
|
494
|
-
timezone => "%{mytz}"
|
|
495
|
-
}
|
|
496
|
-
}
|
|
497
|
-
CONFIG
|
|
498
|
-
|
|
499
|
-
sample("message" => "Sun Jun 02 20:38:03", "mytz" => "UTC") do
|
|
500
|
-
insist { subject.get("@timestamp").year } == Time.now.year
|
|
501
|
-
end
|
|
502
|
-
end
|
|
503
|
-
|
|
504
|
-
describe "fill last year if december events arrive in january" do
|
|
505
|
-
config <<-CONFIG
|
|
506
|
-
filter {
|
|
507
|
-
date {
|
|
508
|
-
match => [ "message", "MMM dd HH:mm:ss" ]
|
|
509
|
-
locale => "en"
|
|
510
|
-
timezone => "%{mytz}"
|
|
511
|
-
}
|
|
512
|
-
}
|
|
513
|
-
CONFIG
|
|
514
|
-
|
|
515
|
-
before(:each) do
|
|
516
|
-
logstash_time = Time.utc(2014,1,1,00,30,50)
|
|
517
|
-
allow(Time).to receive(:now).and_return(logstash_time)
|
|
518
|
-
org.logstash.filters.parser.JodaParser.setDefaultClock { org.joda.time.DateTime.new(2014,1,1,00,30,50, org.joda.time.DateTimeZone::UTC ) }
|
|
519
|
-
end
|
|
520
|
-
|
|
521
|
-
sample("message" => "Dec 31 23:59:00", "mytz" => "UTC") do
|
|
522
|
-
insist { subject.get("@timestamp").year } == 2013
|
|
523
|
-
end
|
|
524
|
-
end
|
|
525
|
-
|
|
526
|
-
describe "fill next year if january events arrive in december" do
|
|
527
|
-
config <<-CONFIG
|
|
528
|
-
filter {
|
|
529
|
-
date {
|
|
530
|
-
match => [ "message", "MMM dd HH:mm:ss" ]
|
|
531
|
-
locale => "en"
|
|
532
|
-
timezone => "%{mytz}"
|
|
533
|
-
}
|
|
534
|
-
}
|
|
535
|
-
CONFIG
|
|
536
|
-
|
|
537
|
-
before(:each) do
|
|
538
|
-
logstash_time = Time.utc(2013,12,31,23,59,50)
|
|
539
|
-
allow(Time).to receive(:now).and_return(logstash_time)
|
|
540
|
-
org.logstash.filters.parser.JodaParser.setDefaultClock { org.joda.time.DateTime.new(2013,12,31,23,59,50, org.joda.time.DateTimeZone::UTC ) }
|
|
541
|
-
end
|
|
542
|
-
|
|
543
|
-
sample( "message" => "Jan 01 01:00:00", "mytz" => "UTC") do
|
|
544
|
-
insist { subject.get("@timestamp").year } == 2014
|
|
545
|
-
end
|
|
546
|
-
end
|
|
547
|
-
|
|
548
|
-
describe "do fail on 2016 DST switchover in CET" do
|
|
549
|
-
# This test tries to parse a time that doesn't exist. '02:00:01' is a time that doesn't exist
|
|
550
|
-
# because this DST switch goes from 01:59:59 to 03:00:00, skipping 2am entirely. The last Sunday of March in 2016 was 27th.
|
|
551
|
-
# (Guy Boertje) Fixed the GuessYear logic
|
|
552
|
-
# Joda has a default year for DateTimeFormat of 2000
|
|
553
|
-
# meaning that the initial time parsed was Monday 2000-03-27 02:00:01 and the last Sunday of March in 2000 was the 26th
|
|
554
|
-
# then by adding the year of 2016 creates an invalid time
|
|
555
|
-
# The parser default now takes the year from the DefaultClock in the JodaParser
|
|
556
|
-
config <<-CONFIG
|
|
557
|
-
filter {
|
|
558
|
-
date {
|
|
559
|
-
match => [ "message", "MMM dd HH:mm:ss" ]
|
|
560
|
-
locale => "en"
|
|
561
|
-
timezone => "CET"
|
|
562
|
-
}
|
|
563
|
-
}
|
|
564
|
-
CONFIG
|
|
565
|
-
|
|
566
|
-
before(:each) do
|
|
567
|
-
logstash_time = Time.utc(2016,03,29,23,59,50)
|
|
568
|
-
allow(Time).to receive(:now).and_return(logstash_time)
|
|
569
|
-
org.logstash.filters.parser.JodaParser.setDefaultClock { org.joda.time.DateTime.new(2016,03,29,23,59,50, org.joda.time.DateTimeZone::UTC ) }
|
|
570
|
-
end
|
|
571
|
-
|
|
572
|
-
sample "Mar 27 01:59:59" do
|
|
573
|
-
expect(subject.get("tags")).to be_nil
|
|
574
|
-
expect(subject.get("@timestamp")).to be_a_logstash_timestamp_equivalent_to("2016-03-27T00:59:59.000Z")
|
|
575
|
-
end
|
|
576
|
-
|
|
577
|
-
sample "Mar 27 02:00:01" do
|
|
578
|
-
expect(subject.get("tags")).to eq ["_dateparsefailure"]
|
|
579
|
-
end
|
|
580
|
-
|
|
581
|
-
sample "Mar 27 03:00:01" do
|
|
582
|
-
expect(subject.get("tags")).to be_nil
|
|
583
|
-
         expect(subject.get("@timestamp")).to be_a_logstash_timestamp_equivalent_to("2016-03-27T01:00:01.000Z")
-      end
-    end
-  end
-
-  describe "LOGSTASH-34 - Default year should be this year" do
-    config <<-CONFIG
-      filter {
-        date {
-          match => [ "message", "EEE MMM dd HH:mm:ss" ]
-          locale => "en"
-        }
-      }
-    CONFIG
-
-    sample "Sun Jun 02 20:38:03" do
-      insist { subject.get("@timestamp").year } == Time.now.year
-    end
-  end
-
-  describe "fill last year if december events arrive in january" do
-    config <<-CONFIG
-      filter {
-        date {
-          match => [ "message", "MMM dd HH:mm:ss" ]
-          locale => "en"
-          timezone => "UTC"
-        }
-      }
-    CONFIG
-
-    before(:each) do
-      logstash_time = Time.utc(2014,1,1,00,30,50)
-      allow(Time).to receive(:now).and_return(logstash_time)
-      org.logstash.filters.parser.JodaParser.setDefaultClock { org.joda.time.DateTime.new(2014,1,1,00,30,50, org.joda.time.DateTimeZone::UTC ) }
-    end
-
-    sample "Dec 31 23:59:00" do
-      insist { subject.get("@timestamp").year } == 2013
-    end
-  end
-
-  describe "fill next year if january events arrive in december" do
-    config <<-CONFIG
-      filter {
-        date {
-          match => [ "message", "MMM dd HH:mm:ss" ]
-          locale => "en"
-          timezone => "UTC"
-        }
-      }
-    CONFIG
-
-    before(:each) do
-      logstash_time = Time.utc(2013,12,31,23,59,50)
-      allow(Time).to receive(:now).and_return(logstash_time)
-      org.logstash.filters.parser.JodaParser.setDefaultClock { org.joda.time.DateTime.new(2013,12,31,15,59,50, org.joda.time.DateTimeZone::UTC ) }
-    end
-
-    sample "Jan 01 01:00:00" do
-      insist { subject.get("@timestamp").year } == 2014
-    end
-  end
-
   describe "Supporting locale only" do
     config <<-CONFIG
       filter {
@@ -703,7 +419,7 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
       }
     CONFIG
 
-    sample("timestamp" => "25/Mar/2013:20:33:56 +0000") do
+    sample({"timestamp" => "25/Mar/2013:20:33:56 +0000"}) do
       insist { subject.get("@timestamp").time } == Time.iso8601("2013-03-25T20:33:56.000Z")
     end
   end
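The `sample("timestamp" => ...)` to `sample({"timestamp" => ...})` change most likely reflects Ruby 3's separation of positional and keyword arguments: a braceless trailing hash is now routed to keyword parameters instead of binding as a positional hash. A minimal sketch of the effect — the `sample` signature below is hypothetical, not the real logstash-devutils helper:

```ruby
# Hypothetical helper: one positional argument plus keyword options.
def sample(event, **opts)
  event
end

# In Ruby 3, a braceless trailing hash is sent to **opts, so the
# positional `event` argument is missing:
begin
  sample("timestamp" => "25/Mar/2013:20:33:56 +0000")
rescue ArgumentError => e
  puts "braceless hash raised #{e.class}"
end

# Explicit braces force the hash to bind as the positional argument:
event = sample({"timestamp" => "25/Mar/2013:20:33:56 +0000"})
puts event["timestamp"]  # => "25/Mar/2013:20:33:56 +0000"
```

Wrapping the hash in braces keeps the specs working on both Ruby 2.x and 3.x.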
@@ -731,74 +447,6 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
     end
   end
 
-  context "Default year handling when parsing with english fallback parser" do
-
-    around do |example|
-      default = java.util.Locale.getDefault
-      java.util.Locale.setDefault(java.util.Locale.forLanguageTag('fr-FR'))
-      example.run
-      java.util.Locale.setDefault(default)
-    end
-
-    puts "override locale"
-    describe "LOGSTASH-34 - Default year should be this year" do
-      config <<-CONFIG
-        filter {
-          date {
-            match => [ "message", "EEE MMM dd HH:mm:ss" ]
-            timezone => "UTC"
-          }
-        }
-      CONFIG
-
-      sample "Sun Jun 02 20:38:03" do
-        insist { subject.get("@timestamp").year } == Time.now.year
-      end
-    end
-
-    describe "fill last year if december events arrive in january" do
-      config <<-CONFIG
-        filter {
-          date {
-            match => [ "message", "MMM dd HH:mm:ss" ]
-            timezone => "UTC"
-          }
-        }
-      CONFIG
-
-      before(:each) do
-        logstash_time = Time.utc(2014,1,1,00,30,50)
-        allow(Time).to receive(:now).and_return(logstash_time)
-        org.logstash.filters.parser.JodaParser.setDefaultClock { org.joda.time.DateTime.new(2014,1,1,00,30,50, org.joda.time.DateTimeZone::UTC) }
-      end
-
-      sample "Dec 31 23:59:00" do
-        insist { subject.get("@timestamp").year } == 2013
-      end
-    end
-
-    describe "fill next year if january events arrive in december" do
-      config <<-CONFIG
-        filter {
-          date {
-            match => [ "message", "MMM dd HH:mm:ss" ]
-            timezone => "UTC"
-          }
-        }
-      CONFIG
-
-      before(:each) do
-        logstash_time = Time.utc(2013,12,31,23,59,50)
-        allow(Time).to receive(:now).and_return(logstash_time)
-        org.logstash.filters.parser.JodaParser.setDefaultClock { org.joda.time.DateTime.new(2013,12,31,23,59,50, org.joda.time.DateTimeZone::UTC) }
-      end
-
-      sample "Jan 01 01:00:00" do
-        insist { subject.get("@timestamp").year } == 2014
-      end
-    end
-  end
-
   describe "metric counters" do
     subject { described_class.new("match" => [ "message", "yyyy" ]) }
 
@@ -872,4 +520,85 @@ RUBY_ENGINE == "jruby" and describe LogStash::Filters::Date do
       end
     end
   end
+
+  describe "precision => ns, java.time with nanoseconds" do
+    config <<-CONFIG
+      filter {
+        date {
+          match => [ "mydate", "yyyy-MM-dd HH:mm:ss.SSSSSSSSS VV" ]
+          precision => "ns"
+        }
+      }
+    CONFIG
+
+    sample({"mydate" => "2016-11-03 21:10:57.123456789 America/New_York"}) do
+      insist { subject.get("@timestamp").time.to_i } == 1478221857
+      insist { subject.get("@timestamp").time.nsec } == 123456789
+    end
+  end
+
+  describe "precision => ns, ISO8601 with nanoseconds" do
+    config <<-CONFIG
+      filter {
+        date {
+          match => [ "mydate", "ISO8601" ]
+          precision => "ns"
+        }
+      }
+    CONFIG
+
+    sample({"mydate" => "2016-11-03T21:10:57.123456789Z"}) do
+      insist { subject.get("@timestamp").time.to_i } == 1478207457
+      insist { subject.get("@timestamp").time.nsec } == 123456789
+    end
+  end
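The expected values in the ISO8601 test can be cross-checked against Ruby's own standard library, which stores fractional seconds as a Rational rather than a lossy Float, so all nine nanosecond digits survive parsing:

```ruby
require "time"

t = Time.iso8601("2016-11-03T21:10:57.123456789Z")
puts t.to_i  # => 1478207457  (whole seconds since the UNIX epoch)
puts t.nsec  # => 123456789   (all nine fractional digits preserved)
```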
+
+  describe "precision => ns, UNIX epoch string with nanoseconds" do
+    config <<-CONFIG
+      filter {
+        date {
+          match => [ "mydate", "UNIX" ]
+          precision => "ns"
+        }
+      }
+    CONFIG
+
+    sample({"mydate" => "1478207457.123456789"}) do
+      insist { subject.get("@timestamp").time.to_i } == 1478207457
+      insist { subject.get("@timestamp").time.nsec } == 123456789
+    end
+  end
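Note that "1478207457.123456789" cannot round-trip through a 64-bit Float: a double carries roughly 15-17 significant digits and this value needs 19, so a nanosecond-preserving UNIX parser has to split the string rather than call `to_f`. A sketch of the idea only, not the plugin's actual implementation:

```ruby
# Split the epoch string instead of converting to Float, which would
# round away the last few nanosecond digits.
raw = "1478207457.123456789"
sec_str, frac = raw.split(".")
sec  = Integer(sec_str)
nsec = (frac || "").ljust(9, "0")[0, 9].to_i  # right-pad so "12" means 120ms

puts sec   # => 1478207457
puts nsec  # => 123456789

# For contrast, the Float path loses precision in the low digits:
puts ((raw.to_f % 1) * 1_000_000_000).round  # not reliably 123456789
```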
+
+  describe "precision => ns, TAI64N full nanoseconds" do
+    config <<-CONFIG
+      filter {
+        date {
+          match => [ "mydate", "TAI64N" ]
+          precision => "ns"
+        }
+      }
+    CONFIG
+
+    sample({"mydate" => "4000000050d506482dbdf024"}) do
+      # 0x2dbdf024 = 767422500 ns
+      insist { subject.get("@timestamp").time.to_i } == 1356138046
+      insist { subject.get("@timestamp").time.nsec } == 767422500
+    end
+  end
+
+  describe "precision => ms (default) still truncates to milliseconds" do
+    config <<-CONFIG
+      filter {
+        date {
+          match => [ "mydate", "TAI64N" ]
+          precision => "ms"
+        }
+      }
+    CONFIG
+
+    sample({"mydate" => "4000000050d506482dbdf024"}) do
+      insist { subject.get("@timestamp").time.to_i } == 1356138046
+      insist { subject.get("@timestamp").time.nsec } == 767000000
+    end
+  end
 end
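The TAI64N expectations above can be sanity-checked by hand: a TAI64N label is 8 bytes of seconds offset by 2^62, followed by 4 bytes of nanoseconds, all hex-encoded. The sketch below decodes "4000000050d506482dbdf024"; the 10-second adjustment shown is inferred from the values the tests assert, not taken from the plugin source:

```ruby
label = "4000000050d506482dbdf024"

tai_sec = label[0, 16].to_i(16) - (1 << 62)  # strip the 2^62 TAI64 offset
nsec    = label[16, 8].to_i(16)              # trailing 4 bytes are nanoseconds

puts tai_sec       # => 1356138056
puts nsec          # => 767422500
puts tai_sec - 10  # => 1356138046, the epoch seconds the tests expect

# The ms-precision test truncates the same value to whole milliseconds:
puts nsec / 1_000_000 * 1_000_000  # => 767000000
```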
Binary file

metadata CHANGED
@@ -1,16 +1,16 @@
 --- !ruby/object:Gem::Specification
 name: logstash-filter-date
 version: !ruby/object:Gem::Version
-  version: 3.
+  version: 3.2.0
 platform: ruby
 authors:
 - Elastic
-autorequire:
 bindir: bin
 cert_chain: []
-date:
+date: 2026-03-03 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
+  name: logstash-core-plugin-api
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
@@ -19,9 +19,8 @@ dependencies:
     - - "<="
       - !ruby/object:Gem::Version
         version: '2.99'
-  name: logstash-core-plugin-api
-  prerelease: false
   type: :runtime
+  prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
@@ -31,84 +30,84 @@ dependencies:
       - !ruby/object:Gem::Version
         version: '2.99'
 - !ruby/object:Gem::Dependency
+  name: logstash-input-generator
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
-  name: logstash-input-generator
-  prerelease: false
   type: :development
+  prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
 - !ruby/object:Gem::Dependency
+  name: logstash-codec-json
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
-  name: logstash-codec-json
-  prerelease: false
   type: :development
+  prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
      - !ruby/object:Gem::Version
        version: '0'
 - !ruby/object:Gem::Dependency
+  name: logstash-output-null
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
-  name: logstash-output-null
-  prerelease: false
   type: :development
+  prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
 - !ruby/object:Gem::Dependency
+  name: logstash-devutils
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '2.3'
-  name: logstash-devutils
-  prerelease: false
   type: :development
+  prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '2.3'
 - !ruby/object:Gem::Dependency
+  name: insist
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
-  name: insist
-  prerelease: false
   type: :development
+  prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
 - !ruby/object:Gem::Dependency
+  name: benchmark-ips
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
-  name: benchmark-ips
-  prerelease: false
   type: :development
+  prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - ">="
@@ -141,7 +140,6 @@ licenses:
 metadata:
   logstash_plugin: 'true'
   logstash_group: filter
-post_install_message:
 rdoc_options: []
 require_paths:
 - lib
@@ -157,8 +155,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
   - !ruby/object:Gem::Version
     version: '0'
 requirements: []
-rubygems_version: 3.
-signing_key:
+rubygems_version: 3.6.3
 specification_version: 4
 summary: Parses dates from fields to use as the Logstash timestamp for an event
 test_files: