fluent-plugin-elasticsearch 1.13.1 → 1.13.2
- checksums.yaml +5 -5
- data/History.md +3 -0
- data/README.md +15 -15
- data/fluent-plugin-elasticsearch.gemspec +2 -2
- data/lib/fluent/plugin/out_elasticsearch_dynamic.rb +2 -1
- data/test/plugin/test_out_elasticsearch.rb +14 -16
- data/test/plugin/test_out_elasticsearch_dynamic.rb +14 -14
- metadata +5 -5
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
-
-metadata.gz:
-data.tar.gz:
+SHA256:
+  metadata.gz: a09291d3314b725675ab4d940a2b71766d7576b2b145281545c4dfa5b6d6ec78
+  data.tar.gz: 4b44b618882f9ec85887b0c3f274d46c1ec7560e11567a69e6e0e7afb586be37
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 1198c5c8ad6232bdc4b1869d70d49016e3f9623c01a5c88e0bd6ed90ccfcb49731fffd430cf06b641eec011d7a0c671fb8cfabf7eef0cc5b9df736fafbbef294
+  data.tar.gz: a8f171967cde67dfd93409f31ed4cbeefa1d506da7020917daa03a78f349d5aad99114404f958290f0de530e0f51fab548d01e22a135a2f65796701a5b30f422
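The checksum block above records SHA256 and SHA512 digests of the metadata.gz and data.tar.gz archives packed inside the gem. A minimal sketch of computing such digests for a local file (the helper name and the file path are illustrative, not part of RubyGems):

```ruby
require 'digest'

# Compute SHA256/SHA512 hex digests of a file -- the same digest
# algorithms recorded in checksums.yaml (helper name is ours).
def file_checksums(path)
  data = File.binread(path)
  {
    'SHA256' => Digest::SHA256.hexdigest(data),
    'SHA512' => Digest::SHA512.hexdigest(data)
  }
end
```

Comparing such values against checksums.yaml is a quick integrity check after downloading an archive.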
data/History.md
CHANGED
@@ -4,6 +4,9 @@
 - Log ES response errors (#230)
 - Use latest elasticsearch-ruby (#240)
 
+### 1.13.2
+- backport preventing error when using template in elasticsearch_dynamic for elementally use case (#363)
+
 ### 1.13.1
 - backport adding config parameter to enable elasticsearch-ruby's transporter logging (#343)
 
data/README.md
CHANGED
@@ -7,7 +7,7 @@
 [![Issue Stats](http://issuestats.com/github/uken/fluent-plugin-elasticsearch/badge/pr)](http://issuestats.com/github/uken/fluent-plugin-elasticsearch)
 [![Issue Stats](http://issuestats.com/github/uken/fluent-plugin-elasticsearch/badge/issue)](http://issuestats.com/github/uken/fluent-plugin-elasticsearch)
 
-Send your logs to
+Send your logs to Elasticsearch (and search them with Kibana maybe?)
 
 Note: For Amazon Elasticsearch Service please consider using [fluent-plugin-aws-elasticsearch-service](https://github.com/atomita/fluent-plugin-aws-elasticsearch-service)
 
@@ -82,7 +82,7 @@ In your Fluentd configuration, use `@type elasticsearch`. Additional configurati
 
 ### Index templates
 
-This plugin creates
+This plugin creates Elasticsearch indices by merely writing to them. Consider using [Index Templates](https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html) to gain control of what get indexed and how. See [this example](https://github.com/uken/fluent-plugin-elasticsearch/issues/33#issuecomment-38693282) for a good starting point.
 
 ## Configuration
 
@@ -94,9 +94,9 @@ hosts host1:port1,host2:port2,host3:port3
 hosts https://customhost.com:443/path,https://username:password@host-failover.com:443
 ```
 
-You can specify multiple
+You can specify multiple Elasticsearch hosts with separator ",".
 
-If you specify multiple hosts, this plugin will load balance updates to
+If you specify multiple hosts, this plugin will load balance updates to Elasticsearch. This is an [elasticsearch-ruby](https://github.com/elasticsearch/elasticsearch-ruby) feature, the default strategy is round-robin.
 
 And this plugin will escape required URL encoded characters within `%{}` placeholders.
 
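The round-robin strategy the hunk above refers to can be pictured as a rotating pointer over the host list. A minimal sketch of the idea (our own illustrative class, not elasticsearch-ruby's actual selector code):

```ruby
# Minimal round-robin host rotation, illustrating the default
# connection-selection strategy mentioned in the README diff.
class RoundRobinHosts
  def initialize(hosts)
    @hosts = hosts
    @index = -1
  end

  # Each call returns the next host, wrapping around at the end,
  # so updates spread evenly across the cluster.
  def next_host
    @index = (@index + 1) % @hosts.size
    @hosts[@index]
  end
end

selector = RoundRobinHosts.new(%w[host1:9200 host2:9200 host3:9200])
selector.next_host  # => "host1:9200"
selector.next_host  # => "host2:9200"
```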
@@ -132,7 +132,7 @@ Specify `ssl_verify false` to skip ssl verification (defaults to true)
 logstash_format true # defaults to false
 ```
 
-This is meant to make writing data into
+This is meant to make writing data into Elasticsearch indices compatible to what [Logstash](https://www.elastic.co/products/logstash) calls them. By doing this, one could take advantage of [Kibana](https://www.elastic.co/products/kibana). See logstash_prefix and logstash_dateformat to customize this index name pattern. The index name will be `#{logstash_prefix}-#{formated_date}`
 
 ### include_timestamp
 
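The `#{logstash_prefix}-#{formated_date}` pattern above boils down to a strftime over the event time. A sketch with the plugin's documented defaults assumed (the helper name is ours):

```ruby
require 'time'

# Build a Logstash-style index name from an event time, mirroring
# the `#{logstash_prefix}-#{formated_date}` pattern in the README.
def logstash_index_name(time, prefix: 'logstash', dateformat: '%Y.%m.%d', separator: '-')
  "#{prefix}#{separator}#{time.getutc.strftime(dateformat)}"
end

logstash_index_name(Time.utc(2013, 1, 1))
# => "logstash-2013.01.01"
```

With `logstash_prefix myprefix` and `logstash_dateformat %Y.%m` this yields names like `myprefix-2013.01`, matching the test expectations further down in this diff.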
@@ -317,7 +317,7 @@ One of [template_file](#template_file) or [templates](#templates) must also be s
 
 You can specify HTTP request timeout.
 
-This is useful when
+This is useful when Elasticsearch cannot return response for bulk request within the default of 5 seconds.
 
 ```
 request_timeout 15s # defaults to 5s
@@ -325,7 +325,7 @@ request_timeout 15s # defaults to 5s
 
 ### reload_connections
 
-You can tune how the elasticsearch-transport host reloading feature works. By default it will reload the host list from the server every 10,000th request to spread the load. This can be an issue if your
+You can tune how the elasticsearch-transport host reloading feature works. By default it will reload the host list from the server every 10,000th request to spread the load. This can be an issue if your Elasticsearch cluster is behind a Reverse Proxy, as Fluentd process may not have direct network access to the Elasticsearch nodes.
 
 ```
 reload_connections false # defaults to true
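The "every 10,000th request" behaviour described above is just a counter that periodically signals a host-list refresh. A sketch of that mechanism (our own illustration, not elasticsearch-transport's code):

```ruby
# Sketch of periodic host-list reloading: the caller refreshes the
# host list whenever request! returns true.
class ReloadCounter
  def initialize(reload_after: 10_000)
    @reload_after = reload_after
    @count = 0
  end

  def request!
    @count += 1
    if @count >= @reload_after
      @count = 0
      true    # time to reload the host list
    else
      false
    end
  end
end
```

Setting `reload_connections false`, as in the README snippet, corresponds to never triggering this refresh.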
@@ -365,7 +365,7 @@ This will add the Fluentd tag in the JSON record. For instance, if you have a co
 </match>
 ```
 
-The record inserted into
+The record inserted into Elasticsearch would be
 
 ```
 {"_key":"my.logs", "name":"Johnny Doeie"}
@@ -377,9 +377,9 @@ The record inserted into ElasticSearch would be
 id_key request_id # use "request_id" field as a record id in ES
 ```
 
-By default, all records inserted into
+By default, all records inserted into Elasticsearch get a random _id. This option allows to use a field in the record as an identifier.
 
-This following record `{"name":"Johnny","request_id":"87d89af7daffad6"}` will trigger the following
+This following record `{"name":"Johnny","request_id":"87d89af7daffad6"}` will trigger the following Elasticsearch command
 
 ```
 { "index" : { "_index" : "logstash-2013.01.01, "_type" : "fluentd", "_id" : "87d89af7daffad6" } }
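The `id_key` behaviour above amounts to copying one record field into the bulk-request metadata. A sketch of that mapping (our own helper, not the plugin's exact code):

```ruby
# Build the bulk "index" metadata for a record, copying a configured
# id_key field into _id when the record carries it (sketch of the
# id_key behaviour described above).
def bulk_meta(record, index:, type: 'fluentd', id_key: nil)
  meta = { '_index' => index, '_type' => type }
  meta['_id'] = record[id_key] if id_key && record.key?(id_key)
  { 'index' => meta }
end

record = { 'name' => 'Johnny', 'request_id' => '87d89af7daffad6' }
bulk_meta(record, index: 'logstash-2013.01.01', id_key: 'request_id')
# => {"index"=>{"_index"=>"logstash-2013.01.01", "_type"=>"fluentd", "_id"=>"87d89af7daffad6"}}
```

Records without the configured field simply omit `_id`, so Elasticsearch falls back to generating a random one.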
@@ -397,7 +397,7 @@ If your input is
 { "name": "Johnny", "a_parent": "my_parent" }
 ```
 
-
+Elasticsearch command would be
 
 ```
 { "index" : { "_index" : "****", "_type" : "****", "_id" : "****", "_parent" : "my_parent" } }
@@ -482,12 +482,12 @@ with_transporter_log true
 
 ### Client/host certificate options
 
-Need to verify
+Need to verify Elasticsearch's certificate? You can use the following parameter to specify a CA instead of using an environment variable.
 ```
 ca_file /path/to/your/ca/cert
 ```
 
-Does your
+Does your Elasticsearch cluster want to verify client connections? You can specify the following parameters to use your client certificate, key, and key password for your connection.
 ```
 client_cert /path/to/your/client/cert
 client_key /path/to/your/private/key
@@ -560,7 +560,7 @@ Here is a sample config:
 
 We try to keep the scope of this plugin small and not add too many configuration options. If you think an option would be useful to others, feel free to open an issue or contribute a Pull Request.
 
-Alternatively, consider using [fluent-plugin-forest](https://github.com/tagomoris/fluent-plugin-forest). For example, to configure multiple tags to be sent to different
+Alternatively, consider using [fluent-plugin-forest](https://github.com/tagomoris/fluent-plugin-forest). For example, to configure multiple tags to be sent to different Elasticsearch indices:
 
 ```
 <match my.logs.*>
@@ -578,7 +578,7 @@ And yet another option is described in Dynamic Configuration section.
 
 ### Dynamic configuration
 
-If you want configurations to depend on information in messages, you can use `elasticsearch_dynamic`. This is an experimental variation of the
+If you want configurations to depend on information in messages, you can use `elasticsearch_dynamic`. This is an experimental variation of the Elasticsearch plugin allows configuration values to be specified in ways such as the below:
 
 ```
 <match my.logs.*>
data/fluent-plugin-elasticsearch.gemspec
CHANGED
@@ -3,10 +3,10 @@ $:.push File.expand_path('../lib', __FILE__)
 
 Gem::Specification.new do |s|
   s.name = 'fluent-plugin-elasticsearch'
-  s.version = '1.13.1'
+  s.version = '1.13.2'
   s.authors = ['diogo', 'pitr']
   s.email = ['pitr.vern@gmail.com', 'me@diogoterror.com']
-  s.description = %q{
+  s.description = %q{Elasticsearch output plugin for Fluent event collector}
   s.summary = s.description
   s.homepage = 'https://github.com/uken/fluent-plugin-elasticsearch'
   s.license = 'Apache-2.0'
data/lib/fluent/plugin/out_elasticsearch_dynamic.rb
CHANGED
@@ -32,7 +32,7 @@ class Fluent::ElasticsearchOutputDynamic < Fluent::ElasticsearchOutput
     {'id_key' => '_id', 'parent_key' => '_parent', 'routing_key' => '_routing'}
   end
 
-  def client(host)
+  def client(host = nil)
 
     # check here to see if we already have a client connection for the given host
     connection_options = get_connection_options(host)
@@ -266,6 +266,7 @@ class Fluent::ElasticsearchOutputDynamic < Fluent::ElasticsearchOutput
   def is_existing_connection(host)
     # check if the host provided match the current connection
     return false if @_es.nil?
+    return false if @current_config.nil?
     return false if host.length != @current_config.length
 
     for i in 0...host.length
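The backported fix above adds a nil guard on `@current_config` before its `length` is read; without it, checking for an existing connection before any connection has been recorded would raise a NoMethodError on nil. A sketch of the guard ordering (an illustrative stand-in class, not the plugin itself):

```ruby
# Sketch of the guarded connection check added in 1.13.2: each
# `return false` bails out before the next line can dereference nil.
class ConnectionCache
  def initialize
    @es = nil
    @current_config = nil
  end

  def remember(hosts)
    @es = Object.new            # stands in for the ES client object
    @current_config = hosts.dup
  end

  def existing_connection?(hosts)
    return false if @es.nil?
    return false if @current_config.nil?   # the backported guard (#363)
    return false if hosts.length != @current_config.length
    hosts == @current_config
  end
end

cache = ConnectionCache.new
cache.existing_connection?(['host1'])  # => false, no NoMethodError
```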
data/test/plugin/test_out_elasticsearch.rb
CHANGED
@@ -673,7 +673,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
   def test_writes_to_target_index_key_logstash
     driver.configure("target_index_key @target_index
                       logstash_format true")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     stub_elastic_ping
     stub_elastic
     driver.emit(sample_record.merge('@target_index' => 'local-override'), time.to_i)
@@ -684,7 +684,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
   def test_writes_to_target_index_key_logstash_uppercase
     driver.configure("target_index_key @target_index
                       logstash_format true")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     stub_elastic_ping
     stub_elastic
     driver.emit(sample_record.merge('@target_index' => 'Local-Override'), time.to_i)
@@ -716,7 +716,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
   def test_writes_to_target_index_key_fallack_logstash
     driver.configure("target_index_key @target_index\n
                       logstash_format true")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "logstash-#{time.getutc.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -923,7 +923,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
   def test_writes_to_logstash_index_with_specified_prefix
     driver.configure("logstash_format true
                       logstash_prefix myprefix")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "myprefix-#{time.getutc.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -937,7 +937,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
     driver.configure("logstash_format true
                       logstash_prefix_separator #{separator}
                       logstash_prefix myprefix")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "myprefix#{separator}#{time.getutc.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -949,7 +949,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
   def test_writes_to_logstash_index_with_specified_prefix_uppercase
     driver.configure("logstash_format true
                       logstash_prefix MyPrefix")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "myprefix-#{time.getutc.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -963,7 +963,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
   def test_writes_to_logstash_index_with_specified_dateformat
     driver.configure("logstash_format true
                       logstash_dateformat %Y.%m")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "logstash-#{time.getutc.strftime("%Y.%m")}"
     stub_elastic_ping
     stub_elastic
@@ -976,7 +976,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
     driver.configure("logstash_format true
                       logstash_prefix myprefix
                       logstash_dateformat %Y.%m")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "myprefix-#{time.getutc.strftime("%Y.%m")}"
     stub_elastic_ping
     stub_elastic
@@ -997,7 +997,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
     driver.configure("logstash_format true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.now.
+    ts = DateTime.now.iso8601
    driver.emit(sample_record)
    driver.run
    assert(index_cmds[1].has_key? '@timestamp')
@@ -1012,17 +1012,15 @@ class ElasticsearchOutput < Test::Unit::TestCase
     time = ts.to_time
     driver.emit(sample_record, time)
     driver.run
-    tf = "%Y-%m-%dT%H:%M:%S%:z"
-    timef = Fluent::TimeFormatter.new(tf, true, ENV["TZ"])
     assert(index_cmds[1].has_key? '@timestamp')
-    assert_equal(
+    assert_equal(index_cmds[1]['@timestamp'], ts.iso8601)
   end
 
   def test_uses_custom_timestamp_when_included_in_record
     driver.configure("logstash_format true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('@timestamp' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -1033,7 +1031,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
     driver.configure("include_timestamp true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('@timestamp' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -1045,7 +1043,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
                       time_key vtm\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('vtm' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -1087,7 +1085,7 @@ class ElasticsearchOutput < Test::Unit::TestCase
                       time_key_exclude_timestamp true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('vtm' => ts))
     driver.run
     assert(!index_cmds[1].key?('@timestamp'), '@timestamp should be messing')
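The recurring test change above swaps truncated date expressions for `Date.today.iso8601` and `DateTime#iso8601`, which produce unambiguous strings that `Time.parse` reconstructs exactly. A sketch of why that is a safe round-trip:

```ruby
require 'date'
require 'time'

# Date#iso8601 yields an unambiguous YYYY-MM-DD string, so
# Time.parse recovers exactly the midnight of that calendar day.
d = Date.new(2001, 2, 3)
d.iso8601                  # => "2001-02-03"

t = Time.parse(d.iso8601)
[t.year, t.month, t.day]   # => [2001, 2, 3]
```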
data/test/plugin/test_out_elasticsearch_dynamic.rb
CHANGED
@@ -351,7 +351,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
 
   def test_writes_to_logstash_index
     driver.configure("logstash_format true\n")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "logstash-#{time.getutc.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -363,7 +363,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
   def test_writes_to_logstash_utc_index
     driver.configure("logstash_format true
                       utc_index false")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     utc_index = "logstash-#{time.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -375,7 +375,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
   def test_writes_to_logstash_index_with_specified_prefix
     driver.configure("logstash_format true
                       logstash_prefix myprefix")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "myprefix-#{time.getutc.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -387,7 +387,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
   def test_writes_to_logstash_index_with_specified_prefix_uppercase
     driver.configure("logstash_format true
                       logstash_prefix MyPrefix")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "myprefix-#{time.getutc.strftime("%Y.%m.%d")}"
     stub_elastic_ping
     stub_elastic
@@ -399,7 +399,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
   def test_writes_to_logstash_index_with_specified_dateformat
     driver.configure("logstash_format true
                       logstash_dateformat %Y.%m")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "logstash-#{time.getutc.strftime("%Y.%m")}"
     stub_elastic_ping
     stub_elastic
@@ -412,7 +412,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
     driver.configure("logstash_format true
                       logstash_prefix myprefix
                       logstash_dateformat %Y.%m")
-    time = Time.parse Date.today.
+    time = Time.parse Date.today.iso8601
     logstash_index = "myprefix-#{time.getutc.strftime("%Y.%m")}"
     stub_elastic_ping
     stub_elastic
@@ -433,7 +433,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
     driver.configure("logstash_format true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.now.
+    ts = DateTime.now.iso8601
     driver.emit(sample_record)
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -444,7 +444,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
     driver.configure("include_timestamp true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('@timestamp' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -455,7 +455,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
     driver.configure("logstash_format true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('@timestamp' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -467,7 +467,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
                       time_key vtm\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('vtm' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -479,7 +479,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
                       time_key vtm\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('vtm' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -492,7 +492,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
                       time_key vtm\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('vtm' => ts))
     driver.run
     assert(index_cmds[1].has_key? '@timestamp')
@@ -506,7 +506,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
                       time_key_exclude_timestamp true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('vtm' => ts))
     driver.run
     assert(!index_cmds[1].key?('@timestamp'), '@timestamp should be missing')
@@ -518,7 +518,7 @@ class ElasticsearchOutputDynamic < Test::Unit::TestCase
                       time_key_exclude_timestamp true\n")
     stub_elastic_ping
     stub_elastic
-    ts = DateTime.new(2001,2,3).
+    ts = DateTime.new(2001,2,3).iso8601
     driver.emit(sample_record.merge!('vtm' => ts))
     driver.run
     assert(!index_cmds[1].key?('@timestamp'), '@timestamp should be missing')
metadata
CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: fluent-plugin-elasticsearch
 version: !ruby/object:Gem::Version
-  version: 1.13.1
+  version: 1.13.2
 platform: ruby
 authors:
 - diogo
@@ -9,7 +9,7 @@ authors:
 autorequire:
 bindir: bin
 cert_chain: []
-date:
+date: 2018-02-20 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: fluentd
@@ -123,7 +123,7 @@ dependencies:
   - - "~>"
     - !ruby/object:Gem::Version
       version: 2.3.5
-description:
+description: Elasticsearch output plugin for Fluent event collector
 email:
 - pitr.vern@gmail.com
 - me@diogoterror.com
@@ -176,10 +176,10 @@ required_rubygems_version: !ruby/object:Gem::Requirement
     version: '0'
 requirements: []
 rubyforge_project:
-rubygems_version: 2.
+rubygems_version: 2.7.3
 signing_key:
 specification_version: 4
-summary:
+summary: Elasticsearch output plugin for Fluent event collector
 test_files:
 - test/helper.rb
 - test/plugin/test_filter_elasticsearch_genid.rb