fluent-plugin-opensearch 1.0.2 → 1.0.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 6382f3ad5be4b3c2c5aaa425568e4421c761bcb8cc5de65369123d01830a8888
- data.tar.gz: 3e38aceba420423bf15065a0203ed6d696fa862d86d365a090d72d8da290a92a
+ metadata.gz: 217feaf328809b259c6eb716e968b02417df7b10075a9a1e278c6e337d8a660b
+ data.tar.gz: 351327bd18d5dd3aa7b6a360ebba4a4f574a86b65ed470cffdf15b38e5d16022
  SHA512:
- metadata.gz: 0bccfd9d20535f856b4dd115eb9126efc2cf2cfbe02d52fcb1f2e14db43747ee90de71f87873924540ea973f1d077d4618b87beb053e09e4d1c52d7b1b33bcc6
- data.tar.gz: 3d9a7ce81fbb624f4af5e42dcfef10e4059073523a5b410af67575cffb11bf7e243f11d66cc6bf87851ea5e5dd0f2f7d76100923d53cf386df82ede3beb0c3ad
+ metadata.gz: c374adb923d9533e50c1c11355b304ba04ca92112dee36dcacd886d53be0ec910481559c7fdb0f6cdb3b61c7c949b5b917c3540d4bbd2789a9d568015cb52cbb
+ data.tar.gz: 11259ac1ee5e4dfe0de5b18336fd2167b40b8c10c381a283068557da1d206e06c03d5d76ef46411a8803b00c333370f47c21fbfdf4331e528cb05a145803a9b4
data/History.md CHANGED
@@ -2,6 +2,13 @@
 
  ### [Unreleased]
 
+ ### 1.0.3
+ - Configurable unrecoverable record types (#40)
+ - Handle exceptions on retrieving AWS credentials (#39)
+ - Suppress emit error label events (#38)
+ - Provide suppress_type_name parameter (#29)
+ - Honor @time_key (data streams) (#28)
+
  ### 1.0.2
  - Honor @hosts parameter for Data Streams (#21)
  - Use template_file for Data Streams (#20)
@@ -17,6 +17,7 @@
  + [reload_on_failure](#reload_on_failure)
  + [resurrect_after](#resurrect_after)
  + [with_transporter_log](#with_transporter_log)
+ + [emit_error_label_event](#emit-error-label-event)
  + [Client/host certificate options](#clienthost-certificate-options)
  + [sniffer_class_name](#sniffer-class-name)
  + [custom_headers](#custom_headers)
@@ -190,6 +191,18 @@ We recommend to set this true if you start to debug this plugin.
  with_transporter_log true
  ```
 
+ ### emit_error_label_event
+
+ The default value of `emit_error_label_event` is `true`; emitting error label events is the default behavior.
+
+ With the following configuration, the OpenSearch plugin suppresses error events in its error handler:
+
+ ```aconf
+ emit_error_label_event false
+ ```
+
  ### Client/host certificate options
 
  Need to verify OpenSearch's certificate? You can use the following parameter to specify a CA instead of using an environment variable.
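As a usage illustration for the parameter documented above, a minimal (assumed) Fluentd configuration might pair `emit_error_label_event` with an `<label @ERROR>` section that catches the emitted error events. Only `emit_error_label_event` itself comes from this release; the tag pattern, host, port, and the `@ERROR` output are illustrative:

```aconf
<match app.**>
  @type opensearch
  host localhost
  port 9200
  emit_error_label_event true   # default; set to false to silence error label events
</match>

# Error events emitted by the plugin (e.g. timestamp parse failures, 400 rejections)
# are routed here and can be inspected or dead-lettered.
<label @ERROR>
  <match app.**>
    @type stdout
  </match>
</label>
```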
data/README.md CHANGED
@@ -76,6 +76,8 @@ Send your logs to OpenSearch (and search them with OpenSearch Dashboards maybe?)
  + [reload_after](#reload-after)
  + [validate_client_version](#validate-client-version)
  + [unrecoverable_error_types](#unrecoverable-error-types)
+ + [unrecoverable_record_types](#unrecoverable-record-types)
+ + [emit_error_label_event](#emit-error-label-event)
  + [verify os version at startup](#verify_os_version_at_startup)
  + [default_opensearch_version](#default_opensearch_version)
  + [custom_headers](#custom_headers)
@@ -356,6 +358,15 @@ utc_index true
 
  By default, the records inserted into index `logstash-YYMMDD` with UTC (Coordinated Universal Time). This option allows to use local time if you describe utc_index to false.
 
+
+ ### suppress_type_name
+
+ If the OpenSearch cluster complains about type removal warnings, they can be suppressed with:
+
+ ```
+ suppress_type_name true
+ ```
+
  ### target_index_key
 
  Tell this plugin to find the index name to write to in the record under this key in preference to other mechanisms. Key can be specified as path to nested record using dot ('.') as a separator.
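A minimal (assumed) match block showing the new `suppress_type_name` parameter in context; the tag pattern, host, and port are illustrative:

```aconf
<match app.**>
  @type opensearch
  host localhost
  port 9200
  index_name fluentd
  suppress_type_name true   # omit _type from the bulk request metadata
</match>
```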
@@ -1097,6 +1108,33 @@ Then, remove `rejected_execution_exception` from `unrecoverable_error_types` par
  unrecoverable_error_types ["out_of_memory_error"]
  ```
 
+
+ ### Unrecoverable Record Types
+
+ The default `unrecoverable_record_types` parameter contains only `json_parse_exception`, which is caused by an invalid JSON record.
+ Any other record-level error that should be treated as unrecoverable can be added to this parameter.
+
+ If you want to handle `security_exception` as an _unrecoverable_ record error as well, change `unrecoverable_record_types` from its default value:
+
+ ```
+ unrecoverable_record_types ["json_parse_exception", "security_exception"]
+ ```
+
+ If one of these error types appears in the OpenSearch response, the offending record is sent to the `@ERROR` data pipeline rather than retried.
+
+ ### emit_error_label_event
+
+ The default value of `emit_error_label_event` is `true`; emitting error label events is the default behavior.
+
+ With the following configuration, the OpenSearch plugin suppresses error events in its error handler:
+
+ ```aconf
+ emit_error_label_event false
+ ```
+
  ### verify_os_version_at_startup
 
  Because OpenSearch plugin will ought to change behavior each of OpenSearch major versions.
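A minimal (assumed) sketch combining the two parameters documented above; with `emit_error_label_event` left at its default, records rejected with one of these error types are emitted as error events and can be caught in an `<label @ERROR>` section as shown earlier. The tag pattern, host, and port are illustrative:

```aconf
<match app.**>
  @type opensearch
  host localhost
  port 9200
  # Do not retry records rejected with these error types; emit them to @ERROR instead.
  unrecoverable_record_types ["json_parse_exception", "security_exception"]
  emit_error_label_event true
</match>
```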
@@ -3,7 +3,7 @@ $:.push File.expand_path('../lib', __FILE__)
 
  Gem::Specification.new do |s|
    s.name = 'fluent-plugin-opensearch'
-   s.version = '1.0.2'
+   s.version = '1.0.3'
    s.authors = ['Hiroshi Hatake']
    s.email = ['cosmo0920.wp@gmail.com']
    s.description = %q{Opensearch output plugin for Fluent event collector}
@@ -73,6 +73,7 @@ module Fluent::Plugin
    config_param :ca_file, :string, :default => nil
    config_param :ssl_version, :enum, list: [:SSLv23, :TLSv1, :TLSv1_1, :TLSv1_2], :default => :TLSv1_2
    config_param :with_transporter_log, :bool, :default => false
+   config_param :emit_error_label_event, :bool, :default => true
    config_param :sniffer_class_name, :string, :default => nil
    config_param :custom_headers, :hash, :default => {}
    config_param :docinfo_fields, :array, :default => ['_index', '_type', '_id']
@@ -180,6 +181,13 @@ module Fluent::Plugin
      }
    end
 
+   def emit_error_label_event(&block)
+     # If `emit_error_label_event` is set to false, error label events are not emitted.
+     if emit_error_label_event
+       block.call
+     end
+   end
+
    def start
      super
 
@@ -224,7 +232,9 @@ module Fluent::Plugin
    def parse_time(value, event_time, tag)
      @timestamp_parser.call(value)
    rescue => e
-     router.emit_error_event(@timestamp_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @timestamp_key_format, 'value' => value}, e)
+     emit_error_label_event do
+       router.emit_error_event(@timestamp_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @timestamp_key_format, 'value' => value}, e)
+     end
      return Time.at(event_time).to_time
    end
 
@@ -49,8 +49,12 @@ class Fluent::Plugin::OpenSearchErrorHandler
      unrecoverable_error_types.include?(type)
    end
 
+   def unrecoverable_record_types
+     @plugin.unrecoverable_record_types
+   end
+
    def unrecoverable_record_error?(type)
-     ['json_parse_exception'].include?(type)
+     unrecoverable_record_types.include?(type)
    end
 
    def log_os_400_reason(&block)
@@ -61,6 +65,13 @@ class Fluent::Plugin::OpenSearchErrorHandler
      end
    end
 
+   def emit_error_label_event(&block)
+     # If `emit_error_label_event` is set to false, error label events are not emitted.
+     if @plugin.emit_error_label_event
+       block.call
+     end
+   end
+
    def handle_error(response, tag, chunk, bulk_message_count, extracted_values)
      items = response['items']
      if items.nil? || !items.is_a?(Array)
@@ -127,12 +138,16 @@ class Fluent::Plugin::OpenSearchErrorHandler
          reason += " [reason]: \'#{item[write_operation]['error']['reason']}\'"
        end
      end
-     @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("400 - Rejected by OpenSearch#{reason}"))
+     emit_error_label_event do
+       @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("400 - Rejected by OpenSearch#{reason}"))
+     end
    else
      if item[write_operation]['error'].is_a?(String)
        reason = item[write_operation]['error']
        stats[:errors_block_resp] += 1
-       @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{reason}"))
+       emit_error_label_event do
+         @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{reason}"))
+       end
        next
      elsif item[write_operation].has_key?('error') && item[write_operation]['error'].has_key?('type')
        type = item[write_operation]['error']['type']
@@ -141,7 +156,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
        raise OpenSearchRequestAbortError, "Rejected OpenSearch due to #{type}"
      end
      if unrecoverable_record_error?(type)
-       @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{type}: #{reason}"))
+       emit_error_label_event do
+         @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{type}: #{reason}"))
+       end
        next
      else
        retry_stream.add(time, rawrecord) unless unrecoverable_record_error?(type)
@@ -150,7 +167,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
        # When we don't have a type field, something changed in the API
        # expected return values.
        stats[:errors_bad_resp] += 1
-       @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - No error type provided in the response"))
+       emit_error_label_event do
+         @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - No error type provided in the response"))
+       end
        next
      end
      stats[type] += 1
@@ -110,6 +110,7 @@ module Fluent::Plugin
    config_param :logstash_prefix_separator, :string, :default => '-'
    config_param :logstash_dateformat, :string, :default => "%Y.%m.%d"
    config_param :utc_index, :bool, :default => true
+   config_param :suppress_type_name, :bool, :default => false
    config_param :index_name, :string, :default => "fluentd"
    config_param :id_key, :string, :default => nil
    config_param :write_operation, :string, :default => "index"
@@ -160,6 +161,8 @@ module Fluent::Plugin
    config_param :validate_client_version, :bool, :default => false
    config_param :prefer_oj_serializer, :bool, :default => false
    config_param :unrecoverable_error_types, :array, :default => ["out_of_memory_error", "rejected_execution_exception"]
+   config_param :unrecoverable_record_types, :array, :default => ["json_parse_exception"]
+   config_param :emit_error_label_event, :bool, :default => true
    config_param :verify_os_version_at_startup, :bool, :default => true
    config_param :default_opensearch_version, :integer, :default => DEFAULT_OPENSEARCH_VERSION
    config_param :log_os_400_reason, :bool, :default => false
@@ -219,8 +222,8 @@ module Fluent::Plugin
    if conf[:assume_role_arn].nil?
      aws_container_credentials_relative_uri = conf[:ecs_container_credentials_relative_uri] || ENV["AWS_CONTAINER_CREDENTIALS_RELATIVE_URI"]
      if aws_container_credentials_relative_uri.nil?
-       credentials = Aws::SharedCredentials.new({retries: 2}).credentials
-       credentials ||= Aws::InstanceProfileCredentials.new.credentials
+       credentials = Aws::SharedCredentials.new({retries: 2}).credentials rescue nil
+       credentials ||= Aws::InstanceProfileCredentials.new.credentials rescue nil
        credentials ||= Aws::ECSCredentials.new.credentials
      else
        credentials = Aws::ECSCredentials.new({
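The hunk above (changelog item #39) appends `rescue nil` to the first two credential lookups so that an exception from one provider falls through to the next instead of aborting startup. A standalone Ruby sketch of that fallback pattern, with a hypothetical helper name; note that in the plugin the final ECS lookup is deliberately not rescued, so a total failure still raises:

```ruby
# Hypothetical illustration of the rescue-nil fallback chain used above.
def resolve_credentials(providers)
  providers.each do |provider|
    creds = begin
      provider.call
    rescue
      nil # a failing provider yields nil, so we fall through to the next one
    end
    return creds if creds
  end
  nil # caller decides whether a missing result is fatal
end

credentials = resolve_credentials([
  -> { Aws::SharedCredentials.new({retries: 2}).credentials },
  -> { Aws::InstanceProfileCredentials.new.credentials },
  -> { Aws::ECSCredentials.new.credentials },
])
```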
@@ -462,6 +465,13 @@ module Fluent::Plugin
      placeholder_validities.include?(true)
    end
 
+   def emit_error_label_event(&block)
+     # If `emit_error_label_event` is set to false, error label events are not emitted.
+     if @emit_error_label_event
+       block.call
+     end
+   end
+
    def compression
      !(@compression_level == :no_compression)
    end
@@ -578,7 +588,9 @@ module Fluent::Plugin
    def parse_time(value, event_time, tag)
      @time_parser.call(value)
    rescue => e
-     router.emit_error_event(@time_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @time_key_format, 'value' => value}, e)
+     emit_error_label_event do
+       router.emit_error_event(@time_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @time_key_format, 'value' => value}, e)
+     end
      return Time.at(event_time).to_datetime
    end
 
@@ -870,7 +882,9 @@ module Fluent::Plugin
        end
      end
    rescue => e
-     router.emit_error_event(tag, time, record, e)
+     emit_error_label_event do
+       router.emit_error_event(tag, time, record, e)
+     end
    end
  end
 
@@ -978,8 +992,12 @@ module Fluent::Plugin
        end
      end
 
-     # OpenSearch only supports "_doc".
-     target_type = DEFAULT_TYPE_NAME
+     if @suppress_type_name
+       target_type = nil
+     else
+       # OpenSearch only supports "_doc".
+       target_type = DEFAULT_TYPE_NAME
+     end
 
      meta.clear
      meta["_index".freeze] = target_index
@@ -191,12 +191,22 @@ module Fluent::Plugin
      tag = chunk.metadata.tag
      chunk.msgpack_each do |time, record|
        next unless record.is_a? Hash
-
        begin
-         record.merge!({"@timestamp" => Time.at(time).iso8601(@time_precision)})
+         if record.has_key?(TIMESTAMP_FIELD)
+           rts = record[TIMESTAMP_FIELD]
+           dt = parse_time(rts, time, tag)
+         elsif record.has_key?(@time_key)
+           rts = record[@time_key]
+           dt = parse_time(rts, time, tag)
+         else
+           dt = Time.at(time).to_datetime
+         end
+         record.merge!({"@timestamp" => dt.iso8601(@time_precision)})
          bulk_message = append_record_to_messages(CREATE_OP, {}, headers, record, bulk_message)
        rescue => e
-         router.emit_error_event(tag, time, record, e)
+         emit_error_label_event do
+           router.emit_error_event(tag, time, record, e)
+         end
        end
      end
 
@@ -35,14 +35,18 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
    attr_reader :log
    attr_reader :error_events
    attr_accessor :unrecoverable_error_types
+   attr_accessor :unrecoverable_record_types
    attr_accessor :log_os_400_reason
    attr_accessor :write_operation
+   attr_accessor :emit_error_label_event
    def initialize(log, log_os_400_reason = false)
      @log = log
      @write_operation = 'index'
      @error_events = []
      @unrecoverable_error_types = ["out_of_memory_error", "rejected_execution_exception"]
+     @unrecoverable_record_types = ["json_parse_exception"]
      @log_os_400_reason = log_os_400_reason
+     @emit_error_label_event = true
    end
 
    def router
@@ -135,6 +139,44 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
      end
    end
 
+   class TEST400ResponseReasonWithoutErrorLog < self
+     def setup
+       Fluent::Test.setup
+       @log_device = Fluent::Test::DummyLogDevice.new
+       dl_opts = {:log_level => ServerEngine::DaemonLogger::DEBUG}
+       logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
+       @log = Fluent::Log.new(logger)
+       @plugin = TestPlugin.new(@log)
+       @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
+       @plugin.emit_error_label_event = false
+     end
+
+     def test_400_responses_reason_log
+       records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
+       response = parse_response(%({
+         "took" : 0,
+         "errors" : true,
+         "items" : [
+           {
+             "create" : {
+               "_index" : "foo",
+               "status" : 400,
+               "error" : {
+                 "type" : "mapper_parsing_exception",
+                 "reason" : "failed to parse"
+               }
+             }
+           }
+         ]
+       }))
+       chunk = MockChunk.new(records)
+       dummy_extracted_values = []
+       @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
+       assert_equal(0, @plugin.error_events.size)
+       assert_true(@plugin.error_events.empty?)
+     end
+   end
+
    class TEST400ResponseReasonNoDebug < self
      def setup
        Fluent::Test.setup
@@ -177,6 +219,45 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
      end
    end
 
+   class TEST400ResponseReasonNoDebugAndNoErrorLog < self
+     def setup
+       Fluent::Test.setup
+       @log_device = Fluent::Test::DummyLogDevice.new
+       dl_opts = {:log_level => ServerEngine::DaemonLogger::INFO}
+       logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
+       @log = Fluent::Log.new(logger)
+       @plugin = TestPlugin.new(@log)
+       @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
+       @plugin.log_os_400_reason = true
+       @plugin.emit_error_label_event = false
+     end
+
+     def test_400_responses_reason_log
+       records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
+       response = parse_response(%({
+         "took" : 0,
+         "errors" : true,
+         "items" : [
+           {
+             "create" : {
+               "_index" : "foo",
+               "status" : 400,
+               "error" : {
+                 "type" : "mapper_parsing_exception",
+                 "reason" : "failed to parse"
+               }
+             }
+           }
+         ]
+       }))
+       chunk = MockChunk.new(records)
+       dummy_extracted_values = []
+       @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
+       assert_equal(0, @plugin.error_events.size)
+       assert_true(@plugin.error_events.empty?)
+     end
+   end
+
    def test_nil_items_responses
      records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
      response = parse_response(%({
@@ -2320,23 +2320,11 @@ class OpenSearchOutputTest < Test::Unit::TestCase
    end
 
    data(
-     "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc"},
+     "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc", "suppress_type" => false},
+     "Suppressed type" => {"os_version" => 1, "_type" => nil, "suppress_type" => true},
    )
    def test_writes_to_speficied_type(data)
-     driver('', data["os_version"]).configure("")
-     stub_opensearch
-     stub_opensearch_info
-     driver.run(default_tag: 'test') do
-       driver.feed(sample_record)
-     end
-     assert_equal(data['_type'], index_cmds.first['index']['_type'])
-   end
-
-   data(
-     "OpenSearch 1"=> {"os_version" => 1, "_type" => "_doc"},
-   )
-   def test_writes_to_speficied_type_with_placeholders(data)
-     driver('', data["os_version"]).configure("")
+     driver('', data["os_version"]).configure("suppress_type_name #{data['suppress_type']}")
      stub_opensearch
      stub_opensearch_info
      driver.run(default_tag: 'test') do
@@ -9,7 +9,7 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
    include FlexMock::TestCase
    include Fluent::Test::Helpers
 
-   attr_accessor :bulk_records
+   attr_accessor :bulk_records, :index_cmds
 
    OPENSEARCH_DATA_STREAM_TYPE = "opensearch_data_stream"
 
@@ -96,12 +96,14 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
      # {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
      # {"@timestamp": ...}
      @bulk_records += req.body.split("\n").size / 2
+     @index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
    end
    stub_request(:post, "http://#{url}#{template_name}/_bulk").with do |req|
      # bulk data must be pair of OP and records
      # {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
      # {"@timestamp": ...}
      @bulk_records += req.body.split("\n").size / 2
+     @index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
    end
  end
 
@@ -591,4 +593,72 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
 
    assert_equal(4, connection_resets)
  end
+
+ def test_uses_custom_time_key
+   stub_default
+   stub_bulk_feed
+   conf = config_element(
+     'ROOT', '', {
+       '@type' => OPENSEARCH_DATA_STREAM_TYPE,
+       'data_stream_name' => 'foo',
+       'data_stream_template_name' => 'foo_tpl',
+       'time_key' => 'vtm'
+     })
+
+   ts = DateTime.new(2021,2,3).iso8601(9)
+   record = {
+     'vtm' => ts,
+     'message' => 'Sample Record'
+   }
+
+   driver(conf).run(default_tag: 'test') do
+     driver.feed(record)
+   end
+   assert(index_cmds[1].has_key? '@timestamp')
+   assert_equal(ts, index_cmds[1]['@timestamp'])
+ end
+
+ def test_uses_custom_time_key_with_format
+   stub_default
+   stub_bulk_feed
+   conf = config_element(
+     'ROOT', '', {
+       '@type' => OPENSEARCH_DATA_STREAM_TYPE,
+       'data_stream_name' => 'foo',
+       'data_stream_template_name' => 'foo_tpl',
+       'time_key' => 'vtm',
+       'time_key_format' => '%Y-%m-%d %H:%M:%S.%N%z'
+     })
+   ts = "2021-02-03 13:14:01.673+02:00"
+   record = {
+     'vtm' => ts,
+     'message' => 'Sample Record'
+   }
+   driver(conf).run(default_tag: 'test') do
+     driver.feed(record)
+   end
+   assert(index_cmds[1].has_key? '@timestamp')
+   assert_equal(DateTime.parse(ts).iso8601(9), index_cmds[1]['@timestamp'])
+ end
+
+ def test_record_no_timestamp
+   stub_default
+   stub_bulk_feed
+   stub_default
+   stub_bulk_feed
+   conf = config_element(
+     'ROOT', '', {
+       '@type' => OPENSEARCH_DATA_STREAM_TYPE,
+       'data_stream_name' => 'foo',
+       'data_stream_template_name' => 'foo_tpl'
+     })
+   record = {
+     'message' => 'Sample Record'
+   }
+   driver(conf).run(default_tag: 'test') do
+     driver.feed(record)
+   end
+   assert(index_cmds[1].has_key? '@timestamp')
+ end
+
  end
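The tests added above exercise the `@time_key` handling from changelog item #28. A matching (assumed) output configuration, using the `vtm` field name and data stream names from the test fixtures; the match pattern, host, and port are illustrative:

```aconf
<match app.**>
  @type opensearch_data_stream
  host localhost
  port 9200
  data_stream_name foo
  data_stream_template_name foo_tpl
  # Take @timestamp from this record field instead of the Fluentd event time.
  time_key vtm
  time_key_format %Y-%m-%d %H:%M:%S.%N%z
</match>
```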
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: fluent-plugin-opensearch
  version: !ruby/object:Gem::Version
-   version: 1.0.2
+   version: 1.0.3
  platform: ruby
  authors:
  - Hiroshi Hatake
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2022-02-17 00:00:00.000000000 Z
+ date: 2022-03-25 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: fluentd