fluent-plugin-opensearch 1.0.2 → 1.0.5

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: 6382f3ad5be4b3c2c5aaa425568e4421c761bcb8cc5de65369123d01830a8888
4
- data.tar.gz: 3e38aceba420423bf15065a0203ed6d696fa862d86d365a090d72d8da290a92a
3
+ metadata.gz: 0dc7a8e74a02c8940f19855c7377b7022e62ba0a9645e63eb2df366b0e085bb1
4
+ data.tar.gz: 482ed0c66522385c60a0765fbd0852e6158050ed4f803b7fed8a7045faf67beb
5
5
  SHA512:
6
- metadata.gz: 0bccfd9d20535f856b4dd115eb9126efc2cf2cfbe02d52fcb1f2e14db43747ee90de71f87873924540ea973f1d077d4618b87beb053e09e4d1c52d7b1b33bcc6
7
- data.tar.gz: 3d9a7ce81fbb624f4af5e42dcfef10e4059073523a5b410af67575cffb11bf7e243f11d66cc6bf87851ea5e5dd0f2f7d76100923d53cf386df82ede3beb0c3ad
6
+ metadata.gz: ddab26f5bd880a43a29a345db493e7ceb39aafd5d6a583fd895975808041ee2856b125d579ffa64e8716f46e340e6a0e8d4079bc790b9a8b60f3ebcba1f53168
7
+ data.tar.gz: 4e2e566e012edcab487b66c6e5ca2cfe0e0e87d4470bbc79704d830ee561789a16e537967d549a1d0ea687cf59a7cf86781de1ae7dfdd15928934a2fc86ecae0
data/History.md CHANGED
@@ -2,6 +2,20 @@
2
2
 
3
3
  ### [Unreleased]
4
4
 
5
+ ### 1.0.5
6
+ - Use if clause to detect requirements for emitting error label events (#57)
7
+ - opensearch_data_stream: Align lowercases for data_stream and its template names (#50)
8
+
9
+ ### 1.0.4
10
+ - Automatically create data streams (#44)
11
+
12
+ ### 1.0.3
13
+ - Configurable unrecoverable record types (#40)
14
+ - Handle exceptions on retrieving AWS credentials (#39)
15
+ - Suppress emit error label events (#38)
16
+ - Provide suppress_type_name parameter (#29)
17
+ - Honor @time_key (data streams) (#28)
18
+
5
19
  ### 1.0.2
6
20
  - Honor @hosts parameter for Data Streams (#21)
7
21
  - Use template_file for Data Streams (#20)
@@ -17,6 +17,7 @@
17
17
  + [reload_on_failure](#reload_on_failure)
18
18
  + [resurrect_after](#resurrect_after)
19
19
  + [with_transporter_log](#with_transporter_log)
20
+ + [emit_error_label_event](#emit-error-label-event)
20
21
  + [Client/host certificate options](#clienthost-certificate-options)
21
22
  + [sniffer_class_name](#sniffer-class-name)
22
23
  + [custom_headers](#custom_headers)
@@ -190,6 +191,18 @@ We recommend to set this true if you start to debug this plugin.
190
191
  with_transporter_log true
191
192
  ```
192
193
 
194
+ ### emit_error_label_event
195
+
196
+ The default value of `emit_error_label_event` is `true`.
197
+
198
+ Emitting error label events is the default behavior.
199
+
200
+ With the following configuration, the OpenSearch plugin suppresses error events in its error handler:
201
+
202
+ ```aconf
203
+ emit_error_label_event false
204
+ ```
205
+
193
206
  ### Client/host certificate options
194
207
 
195
208
  Need to verify OpenSearch's certificate? You can use the following parameter to specify a CA instead of using an environment variable.
data/README.md CHANGED
@@ -76,6 +76,8 @@ Send your logs to OpenSearch (and search them with OpenSearch Dashboards maybe?)
76
76
  + [reload_after](#reload-after)
77
77
  + [validate_client_version](#validate-client-version)
78
78
  + [unrecoverable_error_types](#unrecoverable-error-types)
79
+ + [unrecoverable_record_types](#unrecoverable-record-types)
80
+ + [emit_error_label_event](#emit-error-label-event)
79
81
  + [verify os version at startup](#verify_os_version_at_startup)
80
82
  + [default_opensearch_version](#default_opensearch_version)
81
83
  + [custom_headers](#custom_headers)
@@ -356,6 +358,15 @@ utc_index true
356
358
 
357
359
  By default, records are inserted into the `logstash-YYMMDD` index using UTC (Coordinated Universal Time). Setting `utc_index` to `false` makes the plugin use local time instead.
358
360
 
361
+
362
+ ### suppress_type_name
363
+
364
+ If the OpenSearch cluster complains with type-removal warnings, they can be suppressed with:
365
+
366
+ ```
367
+ suppress_type_name true
368
+ ```
369
+
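+ Concretely, this setting changes the per-record bulk metadata: with types suppressed, the plugin stops emitting a `_type` key. A minimal standalone sketch of that effect (not the plugin's actual code; `bulk_meta` is an illustrative helper):
+
+ ```ruby
+ # Illustrative sketch: how suppress_type_name affects the bulk metadata
+ # emitted for each record. OpenSearch only supports the "_doc" type.
+ DEFAULT_TYPE_NAME = "_doc"
+
+ def bulk_meta(target_index, suppress_type_name)
+   meta = { "_index" => target_index }
+   # With suppress_type_name true, no "_type" key is sent at all,
+   # so the cluster has nothing to warn about.
+   meta["_type"] = DEFAULT_TYPE_NAME unless suppress_type_name
+   meta
+ end
+
+ p bulk_meta("fluentd", false) # => {"_index"=>"fluentd", "_type"=>"_doc"}
+ p bulk_meta("fluentd", true)  # => {"_index"=>"fluentd"}
+ ```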
359
370
  ### target_index_key
360
371
 
361
372
  Tell this plugin to find the index name to write to in the record under this key in preference to other mechanisms. Key can be specified as path to nested record using dot ('.') as a separator.
@@ -1036,11 +1047,11 @@ $ fluentd -r $sniffer [AND YOUR OTHER OPTIONS]
1036
1047
  The default selector used by the `OpenSearch::Transport` class works well for round-robin and random node selection. It does not work well when Fluentd should fall back from an exhausted OpenSearch cluster to a healthy one.
1037
1048
  The parameter `selector_class_name` lets you provide your own selector class to implement whatever node-selection logic you require.
1038
1049
 
1039
- The below configuration is using plugin built-in `ElasticseatchFallbackSelector`:
1050
+ The configuration below uses the plugin's built-in `OpenSearchFallbackSelector`:
1040
1051
 
1041
1052
  ```
1042
1053
  hosts exhausted-host:9201,normal-host:9200
1043
- selector_class_name "Fluent::Plugin::ElasticseatchFallbackSelector"
1054
+ selector_class_name "Fluent::Plugin::OpenSearchFallbackSelector"
1044
1055
  ```
1045
1056
 
1046
1057
  #### Tips
@@ -1097,6 +1108,33 @@ Then, remove `rejected_execution_exception` from `unrecoverable_error_types` par
1097
1108
  unrecoverable_error_types ["out_of_memory_error"]
1098
1109
  ```
1099
1110
 
1111
+
1112
+ ### Unrecoverable Record Types
1113
+
1114
+ The default `unrecoverable_record_types` parameter is deliberately minimal.
1115
+ It contains only `json_parse_exception`, which is caused by an invalid JSON record.
1116
+ Any other record-level error you consider unrecoverable should be added to this parameter.
1117
+
1118
+ If you want to treat `security_exception` as an _unrecoverable exception_, change the `unrecoverable_record_types` parameter from its default value:
1119
+
1120
+ ```
1121
+ unrecoverable_record_types ["json_parse_exception", "security_exception"]
1122
+ ```
1123
+
1124
+ If one of these error types appears in the OpenSearch response, the invalid record is sent to the `@ERROR` data pipeline.
1125
+
1126
+ ### emit_error_label_event
1127
+
1128
+ The default value of `emit_error_label_event` is `true`.
1129
+
1130
+ Emitting error label events is the default behavior.
1131
+
1132
+ With the following configuration, the OpenSearch plugin suppresses error events in its error handler:
1133
+
1134
+ ```aconf
1135
+ emit_error_label_event false
1136
+ ```
1137
+
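+ Records suppressed this way are the ones that would otherwise be routed to Fluentd's built-in `@ERROR` label. If you prefer to keep them rather than cut them, leave `emit_error_label_event` at `true` and capture the label; a minimal sketch (the file path is illustrative):
+
+ ```aconf
+ <label @ERROR>
+   <match **>
+     @type file
+     path /var/log/fluent/opensearch-errors
+   </match>
+ </label>
+ ```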
1100
1138
  ### verify_os_version_at_startup
1101
1139
 
1102
1140
  The OpenSearch plugin may need to change its behavior with each OpenSearch major version.
@@ -3,7 +3,7 @@ $:.push File.expand_path('../lib', __FILE__)
3
3
 
4
4
  Gem::Specification.new do |s|
5
5
  s.name = 'fluent-plugin-opensearch'
6
- s.version = '1.0.2'
6
+ s.version = '1.0.5'
7
7
  s.authors = ['Hiroshi Hatake']
8
8
  s.email = ['cosmo0920.wp@gmail.com']
9
9
  s.description = %q{Opensearch output plugin for Fluent event collector}
@@ -73,6 +73,7 @@ module Fluent::Plugin
73
73
  config_param :ca_file, :string, :default => nil
74
74
  config_param :ssl_version, :enum, list: [:SSLv23, :TLSv1, :TLSv1_1, :TLSv1_2], :default => :TLSv1_2
75
75
  config_param :with_transporter_log, :bool, :default => false
76
+ config_param :emit_error_label_event, :bool, :default => true
76
77
  config_param :sniffer_class_name, :string, :default => nil
77
78
  config_param :custom_headers, :hash, :default => {}
78
79
  config_param :docinfo_fields, :array, :default => ['_index', '_type', '_id']
@@ -180,6 +181,13 @@ module Fluent::Plugin
180
181
  }
181
182
  end
182
183
 
184
+ def emit_error_label_event(&block)
185
+ # If `emit_error_label_event` is set to false, error events are not emitted.
186
+ if emit_error_label_event
187
+ block.call
188
+ end
189
+ end
190
+
183
191
  def start
184
192
  super
185
193
 
@@ -224,7 +232,9 @@ module Fluent::Plugin
224
232
  def parse_time(value, event_time, tag)
225
233
  @timestamp_parser.call(value)
226
234
  rescue => e
227
- router.emit_error_event(@timestamp_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @timestamp_key_format, 'value' => value}, e)
235
+ emit_error_label_event do
236
+ router.emit_error_event(@timestamp_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @timestamp_key_format, 'value' => value}, e)
237
+ end
228
238
  return Time.at(event_time).to_time
229
239
  end
230
240
 
@@ -49,8 +49,12 @@ class Fluent::Plugin::OpenSearchErrorHandler
49
49
  unrecoverable_error_types.include?(type)
50
50
  end
51
51
 
52
+ def unrecoverable_record_types
53
+ @plugin.unrecoverable_record_types
54
+ end
55
+
52
56
  def unrecoverable_record_error?(type)
53
- ['json_parse_exception'].include?(type)
57
+ unrecoverable_record_types.include?(type)
54
58
  end
55
59
 
56
60
  def log_os_400_reason(&block)
@@ -61,6 +65,10 @@ class Fluent::Plugin::OpenSearchErrorHandler
61
65
  end
62
66
  end
63
67
 
68
+ def emit_error_label_event?
69
+ !!@plugin.emit_error_label_event
70
+ end
71
+
64
72
  def handle_error(response, tag, chunk, bulk_message_count, extracted_values)
65
73
  items = response['items']
66
74
  if items.nil? || !items.is_a?(Array)
@@ -127,12 +135,16 @@ class Fluent::Plugin::OpenSearchErrorHandler
127
135
  reason += " [reason]: \'#{item[write_operation]['error']['reason']}\'"
128
136
  end
129
137
  end
130
- @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("400 - Rejected by OpenSearch#{reason}"))
138
+ if emit_error_label_event?
139
+ @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("400 - Rejected by OpenSearch#{reason}"))
140
+ end
131
141
  else
132
142
  if item[write_operation]['error'].is_a?(String)
133
143
  reason = item[write_operation]['error']
134
144
  stats[:errors_block_resp] += 1
135
- @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{reason}"))
145
+ if emit_error_label_event?
146
+ @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{reason}"))
147
+ end
136
148
  next
137
149
  elsif item[write_operation].has_key?('error') && item[write_operation]['error'].has_key?('type')
138
150
  type = item[write_operation]['error']['type']
@@ -141,7 +153,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
141
153
  raise OpenSearchRequestAbortError, "Rejected OpenSearch due to #{type}"
142
154
  end
143
155
  if unrecoverable_record_error?(type)
144
- @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{type}: #{reason}"))
156
+ if emit_error_label_event?
157
+ @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{type}: #{reason}"))
158
+ end
145
159
  next
146
160
  else
147
161
  retry_stream.add(time, rawrecord) unless unrecoverable_record_error?(type)
@@ -150,7 +164,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
150
164
  # When we don't have a type field, something changed in the API
151
165
  # expected return values.
152
166
  stats[:errors_bad_resp] += 1
153
- @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - No error type provided in the response"))
167
+ if emit_error_label_event?
168
+ @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - No error type provided in the response"))
169
+ end
154
170
  next
155
171
  end
156
172
  stats[type] += 1
@@ -62,10 +62,10 @@ module Fluent::OpenSearchIndexTemplate
62
62
  client.transport.transport.host_unreachable_exceptions
63
63
  end
64
64
 
65
- def retry_operate(max_retries, fail_on_retry_exceed = true, catch_trasport_exceptions = true)
65
+ def retry_operate(max_retries, fail_on_retry_exceed = true, catch_transport_exceptions = true)
66
66
  return unless block_given?
67
67
  retries = 0
68
- transport_errors = OpenSearch::Transport::Transport::Errors.constants.map{ |c| OpenSearch::Transport::Transport::Errors.const_get c } if catch_trasport_exceptions
68
+ transport_errors = OpenSearch::Transport::Transport::Errors.constants.map{ |c| OpenSearch::Transport::Transport::Errors.const_get c } if catch_transport_exceptions
69
69
  begin
70
70
  yield
71
71
  rescue *host_unreachable_exceptions, *transport_errors, Timeout::Error => e
@@ -110,6 +110,7 @@ module Fluent::Plugin
110
110
  config_param :logstash_prefix_separator, :string, :default => '-'
111
111
  config_param :logstash_dateformat, :string, :default => "%Y.%m.%d"
112
112
  config_param :utc_index, :bool, :default => true
113
+ config_param :suppress_type_name, :bool, :default => false
113
114
  config_param :index_name, :string, :default => "fluentd"
114
115
  config_param :id_key, :string, :default => nil
115
116
  config_param :write_operation, :string, :default => "index"
@@ -160,6 +161,8 @@ module Fluent::Plugin
160
161
  config_param :validate_client_version, :bool, :default => false
161
162
  config_param :prefer_oj_serializer, :bool, :default => false
162
163
  config_param :unrecoverable_error_types, :array, :default => ["out_of_memory_error", "rejected_execution_exception"]
164
+ config_param :unrecoverable_record_types, :array, :default => ["json_parse_exception"]
165
+ config_param :emit_error_label_event, :bool, :default => true
163
166
  config_param :verify_os_version_at_startup, :bool, :default => true
164
167
  config_param :default_opensearch_version, :integer, :default => DEFAULT_OPENSEARCH_VERSION
165
168
  config_param :log_os_400_reason, :bool, :default => false
@@ -219,8 +222,8 @@ module Fluent::Plugin
219
222
  if conf[:assume_role_arn].nil?
220
223
  aws_container_credentials_relative_uri = conf[:ecs_container_credentials_relative_uri] || ENV["AWS_CONTAINER_CREDENTIALS_RELATIVE_URI"]
221
224
  if aws_container_credentials_relative_uri.nil?
222
- credentials = Aws::SharedCredentials.new({retries: 2}).credentials
223
- credentials ||= Aws::InstanceProfileCredentials.new.credentials
225
+ credentials = Aws::SharedCredentials.new({retries: 2}).credentials rescue nil
226
+ credentials ||= Aws::InstanceProfileCredentials.new.credentials rescue nil
224
227
  credentials ||= Aws::ECSCredentials.new.credentials
225
228
  else
226
229
  credentials = Aws::ECSCredentials.new({
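The `rescue nil` guards added above turn each credential lookup into a soft failure so the chain can fall through to the next provider. The same pattern in isolation, with stand-in providers rather than the AWS SDK classes:

```ruby
# Sketch of the credentials fallback chain: try each source in order,
# treating a raised exception the same as "no credentials found".
def first_available(*providers)
  providers.each do |provider|
    creds = begin
      provider.call
    rescue
      nil # a failing provider must not abort the whole chain
    end
    return creds if creds
  end
  nil
end

shared_file      = -> { raise "no shared credentials file" } # stand-ins
instance_profile = -> { nil }                                # empty result
ecs              = -> { "ecs-credentials" }

puts first_available(shared_file, instance_profile, ecs)
```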
@@ -462,6 +465,10 @@ module Fluent::Plugin
462
465
  placeholder_validities.include?(true)
463
466
  end
464
467
 
468
+ def emit_error_label_event?
469
+ !!@emit_error_label_event
470
+ end
471
+
465
472
  def compression
466
473
  !(@compression_level == :no_compression)
467
474
  end
@@ -578,7 +585,9 @@ module Fluent::Plugin
578
585
  def parse_time(value, event_time, tag)
579
586
  @time_parser.call(value)
580
587
  rescue => e
581
- router.emit_error_event(@time_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @time_key_format, 'value' => value}, e)
588
+ if emit_error_label_event?
589
+ router.emit_error_event(@time_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @time_key_format, 'value' => value}, e)
590
+ end
582
591
  return Time.at(event_time).to_datetime
583
592
  end
584
593
 
@@ -870,7 +879,9 @@ module Fluent::Plugin
870
879
  end
871
880
  end
872
881
  rescue => e
873
- router.emit_error_event(tag, time, record, e)
882
+ if emit_error_label_event?
883
+ router.emit_error_event(tag, time, record, e)
884
+ end
874
885
  end
875
886
  end
876
887
 
@@ -978,8 +989,12 @@ module Fluent::Plugin
978
989
  end
979
990
  end
980
991
 
981
- # OpenSearch only supports "_doc".
982
- target_type = DEFAULT_TYPE_NAME
992
+ if @suppress_type_name
993
+ target_type = nil
994
+ else
995
+ # OpenSearch only supports "_doc".
996
+ target_type = DEFAULT_TYPE_NAME
997
+ end
983
998
 
984
999
  meta.clear
985
1000
  meta["_index".freeze] = target_index
@@ -36,7 +36,6 @@ module Fluent::Plugin
36
36
  @fail_on_putting_template_retry_exceed,
37
37
  @catch_transport_exception_on_retry) do
38
38
  create_index_template(@data_stream_name, @data_stream_template_name)
39
- create_data_stream(@data_stream_name)
40
39
  end
41
40
  rescue => e
42
41
  raise Fluent::ConfigError, "Failed to create data stream: <#{@data_stream_name}> #{e.message}"
@@ -105,19 +104,6 @@ module Fluent::Plugin
105
104
  end
106
105
  end
107
106
 
108
- def create_data_stream(datastream_name, host = nil)
109
- return if data_stream_exist?(datastream_name, host)
110
- params = {
111
- name: datastream_name
112
- }
113
- retry_operate(@max_retry_putting_template,
114
- @fail_on_putting_template_retry_exceed,
115
- @catch_transport_exception_on_retry) do
116
- # TODO: Use X-Pack equivalent performing DataStream operation method on the following line
117
- client(host).perform_request('PUT', "/_data_stream/#{datastream_name}", {}, params)
118
- end
119
- end
120
-
121
107
  def template_exists?(name, host = nil)
122
108
  if @use_legacy_template
123
109
  client(host).indices.get_template(:name => name)
@@ -171,12 +157,11 @@ module Fluent::Plugin
171
157
  else
172
158
  extract_placeholders(@host, chunk)
173
159
  end
174
- data_stream_name = extract_placeholders(@data_stream_name, chunk)
175
- data_stream_template_name = extract_placeholders(@data_stream_template_name, chunk)
160
+ data_stream_name = extract_placeholders(@data_stream_name, chunk).downcase
161
+ data_stream_template_name = extract_placeholders(@data_stream_template_name, chunk).downcase
176
162
  unless @data_stream_names.include?(data_stream_name)
177
163
  begin
178
164
  create_index_template(data_stream_name, data_stream_template_name, host)
179
- create_data_stream(data_stream_name, host)
180
165
  @data_stream_names << data_stream_name
181
166
  rescue => e
182
167
  raise Fluent::ConfigError, "Failed to create data stream: <#{data_stream_name}> #{e.message}"
@@ -191,12 +176,22 @@ module Fluent::Plugin
191
176
  tag = chunk.metadata.tag
192
177
  chunk.msgpack_each do |time, record|
193
178
  next unless record.is_a? Hash
194
-
195
179
  begin
196
- record.merge!({"@timestamp" => Time.at(time).iso8601(@time_precision)})
180
+ if record.has_key?(TIMESTAMP_FIELD)
181
+ rts = record[TIMESTAMP_FIELD]
182
+ dt = parse_time(rts, time, tag)
183
+ elsif record.has_key?(@time_key)
184
+ rts = record[@time_key]
185
+ dt = parse_time(rts, time, tag)
186
+ else
187
+ dt = Time.at(time).to_datetime
188
+ end
189
+ record.merge!({"@timestamp" => dt.iso8601(@time_precision)})
197
190
  bulk_message = append_record_to_messages(CREATE_OP, {}, headers, record, bulk_message)
198
191
  rescue => e
199
- router.emit_error_event(tag, time, record, e)
192
+ emit_error_label_event do
193
+ router.emit_error_event(tag, time, record, e)
194
+ end
200
195
  end
201
196
  end
202
197
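The timestamp selection above follows a simple precedence: an existing `@timestamp` field wins, then the configured `time_key`, then the Fluentd event time. A standalone sketch of that rule (illustrative only; the plugin's real `parse_time` also honors `time_key_format`):

```ruby
require 'date'
require 'time'

TIMESTAMP_FIELD = "@timestamp"

# Sketch: resolve the record timestamp in the same precedence order as
# the diff above: record's @timestamp, then time_key, then event time
# (a Unix epoch here).
def resolve_timestamp(record, time_key, event_time)
  if record.key?(TIMESTAMP_FIELD)
    DateTime.parse(record[TIMESTAMP_FIELD])
  elsif record.key?(time_key)
    DateTime.parse(record[time_key])
  else
    Time.at(event_time).to_datetime
  end
end

rec = { "vtm" => "2021-02-03T00:00:00+00:00", "message" => "Sample Record" }
puts resolve_timestamp(rec, "vtm", 0).iso8601 # => 2021-02-03T00:00:00+00:00
```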
 
@@ -35,14 +35,18 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
35
35
  attr_reader :log
36
36
  attr_reader :error_events
37
37
  attr_accessor :unrecoverable_error_types
38
+ attr_accessor :unrecoverable_record_types
38
39
  attr_accessor :log_os_400_reason
39
40
  attr_accessor :write_operation
41
+ attr_accessor :emit_error_label_event
40
42
  def initialize(log, log_os_400_reason = false)
41
43
  @log = log
42
44
  @write_operation = 'index'
43
45
  @error_events = []
44
46
  @unrecoverable_error_types = ["out_of_memory_error", "rejected_execution_exception"]
47
+ @unrecoverable_record_types = ["json_parse_exception"]
45
48
  @log_os_400_reason = log_os_400_reason
49
+ @emit_error_label_event = true
46
50
  end
47
51
 
48
52
  def router
@@ -135,6 +139,44 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
135
139
  end
136
140
  end
137
141
 
142
+ class TEST400ResponseReasonWithoutErrorLog < self
143
+ def setup
144
+ Fluent::Test.setup
145
+ @log_device = Fluent::Test::DummyLogDevice.new
146
+ dl_opts = {:log_level => ServerEngine::DaemonLogger::DEBUG}
147
+ logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
148
+ @log = Fluent::Log.new(logger)
149
+ @plugin = TestPlugin.new(@log)
150
+ @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
151
+ @plugin.emit_error_label_event = false
152
+ end
153
+
154
+ def test_400_responses_reason_log
155
+ records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
156
+ response = parse_response(%({
157
+ "took" : 0,
158
+ "errors" : true,
159
+ "items" : [
160
+ {
161
+ "create" : {
162
+ "_index" : "foo",
163
+ "status" : 400,
164
+ "error" : {
165
+ "type" : "mapper_parsing_exception",
166
+ "reason" : "failed to parse"
167
+ }
168
+ }
169
+ }
170
+ ]
171
+ }))
172
+ chunk = MockChunk.new(records)
173
+ dummy_extracted_values = []
174
+ @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
175
+ assert_equal(0, @plugin.error_events.size)
176
+ assert_true(@plugin.error_events.empty?)
177
+ end
178
+ end
179
+
138
180
  class TEST400ResponseReasonNoDebug < self
139
181
  def setup
140
182
  Fluent::Test.setup
@@ -177,6 +219,45 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
177
219
  end
178
220
  end
179
221
 
222
+ class TEST400ResponseReasonNoDebugAndNoErrorLog < self
223
+ def setup
224
+ Fluent::Test.setup
225
+ @log_device = Fluent::Test::DummyLogDevice.new
226
+ dl_opts = {:log_level => ServerEngine::DaemonLogger::INFO}
227
+ logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
228
+ @log = Fluent::Log.new(logger)
229
+ @plugin = TestPlugin.new(@log)
230
+ @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
231
+ @plugin.log_os_400_reason = true
232
+ @plugin.emit_error_label_event = false
233
+ end
234
+
235
+ def test_400_responses_reason_log
236
+ records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
237
+ response = parse_response(%({
238
+ "took" : 0,
239
+ "errors" : true,
240
+ "items" : [
241
+ {
242
+ "create" : {
243
+ "_index" : "foo",
244
+ "status" : 400,
245
+ "error" : {
246
+ "type" : "mapper_parsing_exception",
247
+ "reason" : "failed to parse"
248
+ }
249
+ }
250
+ }
251
+ ]
252
+ }))
253
+ chunk = MockChunk.new(records)
254
+ dummy_extracted_values = []
255
+ @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
256
+ assert_equal(0, @plugin.error_events.size)
257
+ assert_true(@plugin.error_events.empty?)
258
+ end
259
+ end
260
+
180
261
  def test_nil_items_responses
181
262
  records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
182
263
  response = parse_response(%({
@@ -2320,23 +2320,11 @@ class OpenSearchOutputTest < Test::Unit::TestCase
2320
2320
  end
2321
2321
 
2322
2322
  data(
2323
- "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc"},
2323
+ "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc", "suppress_type" => false},
2324
+ "Suppressed type" => {"os_version" => 1, "_type" => nil, "suppress_type" => true},
2324
2325
  )
2325
2326
  def test_writes_to_speficied_type(data)
2326
- driver('', data["os_version"]).configure("")
2327
- stub_opensearch
2328
- stub_opensearch_info
2329
- driver.run(default_tag: 'test') do
2330
- driver.feed(sample_record)
2331
- end
2332
- assert_equal(data['_type'], index_cmds.first['index']['_type'])
2333
- end
2334
-
2335
- data(
2336
- "OpenSearch 1"=> {"os_version" => 1, "_type" => "_doc"},
2337
- )
2338
- def test_writes_to_speficied_type_with_placeholders(data)
2339
- driver('', data["os_version"]).configure("")
2327
+ driver('', data["os_version"]).configure("suppress_type_name #{data['suppress_type']}")
2340
2328
  stub_opensearch
2341
2329
  stub_opensearch_info
2342
2330
  driver.run(default_tag: 'test') do
@@ -9,7 +9,7 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
9
9
  include FlexMock::TestCase
10
10
  include Fluent::Test::Helpers
11
11
 
12
- attr_accessor :bulk_records
12
+ attr_accessor :bulk_records, :index_cmds
13
13
 
14
14
  OPENSEARCH_DATA_STREAM_TYPE = "opensearch_data_stream"
15
15
 
@@ -96,12 +96,14 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
96
96
  # {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
97
97
  # {"@timestamp": ...}
98
98
  @bulk_records += req.body.split("\n").size / 2
99
+ @index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
99
100
  end
100
101
  stub_request(:post, "http://#{url}#{template_name}/_bulk").with do |req|
101
102
  # bulk data must be pair of OP and records
102
103
  # {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
103
104
  # {"@timestamp": ...}
104
105
  @bulk_records += req.body.split("\n").size / 2
106
+ @index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
105
107
  end
106
108
  end
107
109
 
@@ -437,6 +439,23 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
437
439
  assert_equal 1, @bulk_records
438
440
  end
439
441
 
442
+ def test_placeholder_with_capital_letters
443
+ dsname = "foo_test.capital_letters"
444
+ tplname = "foo_tpl_test.capital_letters"
445
+ stub_default(dsname, tplname)
446
+ stub_bulk_feed(dsname, tplname)
447
+ conf = config_element(
448
+ 'ROOT', '', {
449
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
450
+ 'data_stream_name' => 'foo_${tag}',
451
+ 'data_stream_template_name' => "foo_tpl_${tag}"
452
+ })
453
+ driver(conf).run(default_tag: 'TEST.CAPITAL_LETTERS') do
454
+ driver.feed(sample_record)
455
+ end
456
+ assert_equal 1, @bulk_records
457
+ end
458
+
440
459
  def test_placeholder_params_unset
441
460
  dsname = "foo_test"
442
461
  tplname = "foo_test_template"
@@ -591,4 +610,72 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
591
610
 
592
611
  assert_equal(4, connection_resets)
593
612
  end
613
+
614
+ def test_uses_custom_time_key
615
+ stub_default
616
+ stub_bulk_feed
617
+ conf = config_element(
618
+ 'ROOT', '', {
619
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
620
+ 'data_stream_name' => 'foo',
621
+ 'data_stream_template_name' => 'foo_tpl',
622
+ 'time_key' => 'vtm'
623
+ })
624
+
625
+ ts = DateTime.new(2021,2,3).iso8601(9)
626
+ record = {
627
+ 'vtm' => ts,
628
+ 'message' => 'Sample Record'
629
+ }
630
+
631
+ driver(conf).run(default_tag: 'test') do
632
+ driver.feed(record)
633
+ end
634
+ assert(index_cmds[1].has_key? '@timestamp')
635
+ assert_equal(ts, index_cmds[1]['@timestamp'])
636
+ end
637
+
638
+ def test_uses_custom_time_key_with_format
639
+ stub_default
640
+ stub_bulk_feed
641
+ conf = config_element(
642
+ 'ROOT', '', {
643
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
644
+ 'data_stream_name' => 'foo',
645
+ 'data_stream_template_name' => 'foo_tpl',
646
+ 'time_key' => 'vtm',
647
+ 'time_key_format' => '%Y-%m-%d %H:%M:%S.%N%z'
648
+ })
649
+ ts = "2021-02-03 13:14:01.673+02:00"
650
+ record = {
651
+ 'vtm' => ts,
652
+ 'message' => 'Sample Record'
653
+ }
654
+ driver(conf).run(default_tag: 'test') do
655
+ driver.feed(record)
656
+ end
657
+ assert(index_cmds[1].has_key? '@timestamp')
658
+ assert_equal(DateTime.parse(ts).iso8601(9), index_cmds[1]['@timestamp'])
659
+ end
660
+
661
+ def test_record_no_timestamp
662
+ stub_default
663
+ stub_bulk_feed
664
+ stub_default
665
+ stub_bulk_feed
666
+ conf = config_element(
667
+ 'ROOT', '', {
668
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
669
+ 'data_stream_name' => 'foo',
670
+ 'data_stream_template_name' => 'foo_tpl'
671
+ })
672
+ record = {
673
+ 'message' => 'Sample Record'
674
+ }
675
+ driver(conf).run(default_tag: 'test') do
676
+ driver.feed(record)
677
+ end
678
+ assert(index_cmds[1].has_key? '@timestamp')
679
+ end
680
+
594
681
  end
metadata CHANGED
@@ -1,14 +1,14 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: fluent-plugin-opensearch
3
3
  version: !ruby/object:Gem::Version
4
- version: 1.0.2
4
+ version: 1.0.5
5
5
  platform: ruby
6
6
  authors:
7
7
  - Hiroshi Hatake
8
8
  autorequire:
9
9
  bindir: bin
10
10
  cert_chain: []
11
- date: 2022-02-17 00:00:00.000000000 Z
11
+ date: 2022-05-25 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
14
  name: fluentd
@@ -242,7 +242,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
242
242
  - !ruby/object:Gem::Version
243
243
  version: '0'
244
244
  requirements: []
245
- rubygems_version: 3.2.30
245
+ rubygems_version: 3.2.32
246
246
  signing_key:
247
247
  specification_version: 4
248
248
  summary: Opensearch output plugin for Fluent event collector