fluent-plugin-opensearch 1.0.0 → 1.0.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 56dc0da995d5f55c59e13afd75452b46e824f8337de6b074655dc8c92926eba8
- data.tar.gz: dffab1c953b558fe1001df5c59844f1d71f8a2847199afade1b74ff9a05f739a
+ metadata.gz: 217feaf328809b259c6eb716e968b02417df7b10075a9a1e278c6e337d8a660b
+ data.tar.gz: 351327bd18d5dd3aa7b6a360ebba4a4f574a86b65ed470cffdf15b38e5d16022
  SHA512:
- metadata.gz: da67be6908b8000b6904655b0492cfedbedddae45dd7b408a41698e7c02ea1778b1fb8c64be4097b5cbfed1e58601c30a57892d07706de4782526cb0a6edd0c4
- data.tar.gz: f002f9ccec9f8dd7c292ef3d88d8425e0f05e3651cf831eb86204059726643db43502c3285b4ad8410840e7bb29d814f2e243c1b6671c24909abb1f1b7b14824
+ metadata.gz: c374adb923d9533e50c1c11355b304ba04ca92112dee36dcacd886d53be0ec910481559c7fdb0f6cdb3b61c7c949b5b917c3540d4bbd2789a9d568015cb52cbb
+ data.tar.gz: 11259ac1ee5e4dfe0de5b18336fd2167b40b8c10c381a283068557da1d206e06c03d5d76ef46411a8803b00c333370f47c21fbfdf4331e528cb05a145803a9b4
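The checksum bump above is what RubyGems verifies when installing the new gem artifacts. A minimal sketch of reproducing such digests in Ruby (the file contents here are placeholders, not the actual gem archives):

```ruby
require 'digest'

# Compute the SHA256/SHA512 hex digests the way checksums.yaml records
# them for metadata.gz and data.tar.gz.
def gem_checksums(data)
  {
    "SHA256" => Digest::SHA256.hexdigest(data),
    "SHA512" => Digest::SHA512.hexdigest(data),
  }
end
```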
@@ -1,37 +1,29 @@
  ---
  name: Bug Report
- about: Create a report to help us improve. If you have questions about ES plugin on kubernetes, please direct these to https://discuss.kubernetes.io/ before sumbit kubernetes related issue.
+ about: Create a report to help us improve. If you have questions about the OpenSearch plugin on Kubernetes, please direct these to https://discuss.kubernetes.io/ before submitting a Kubernetes-related issue.

  ---

  (check apply)
- - [ ] read [the contribution guideline](https://github.com/uken/fluent-plugin-elasticsearch/blob/master/CONTRIBUTING.md)
+ - [ ] read [the contribution guideline](https://github.com/fluent/fluent-plugin-opensearch/blob/master/CONTRIBUTING.md)
  - [ ] (optional) already reported 3rd party upstream repository or mailing list if you use k8s addon or helm charts.

- #### Problem
-
- ...
-
  #### Steps to replicate

- Either clone and modify https://gist.github.com/pitr/9a518e840db58f435911
-
- **OR**
-
  Provide example config and message

  #### Expected Behavior or What you need to ask

  ...

- #### Using Fluentd and ES plugin versions
+ #### Using Fluentd and OpenSearch plugin versions

  * OS version
  * Bare Metal or within Docker or Kubernetes or others?
- * Fluentd v0.12 or v0.14/v1.0
+ * Fluentd v1.0 or later
  * paste result of ``fluentd --version`` or ``td-agent --version``
- * ES plugin 3.x.y/2.x.y or 1.x.y
+ * OpenSearch plugin version
  * paste boot log of fluentd or td-agent
  * paste result of ``fluent-gem list``, ``td-agent-gem list`` or your Gemfile.lock
- * ES version (optional)
- * ES template(s) (optional)
+ * OpenSearch version (optional)
+ * OpenSearch template(s) (optional)
data/History.md CHANGED
@@ -2,5 +2,22 @@

  ### [Unreleased]

+ ### 1.0.3
+ - Configurable unrecoverable record types (#40)
+ - Handle exceptions on retrieving AWS credentials (#39)
+ - Suppress emit error label events (#38)
+ - Provide suppress_type_name parameter (#29)
+ - Honor @time_key (data streams) (#28)
+
+ ### 1.0.2
+ - Honor @hosts parameter for Data Streams (#21)
+ - Use template_file for Data Streams (#20)
+ - Specify host argument if needed (#11)
+
+ ### 1.0.1
+ - Add testcases for hosts parameter (#10)
+ - Permit to handle @hosts parameter (#9)
+ - Retry creating data stream/template when OpenSearch is unreachable (#8)
+
  ### 1.0.0
  - Initial public gem release.
@@ -17,6 +17,7 @@
    + [reload_on_failure](#reload_on_failure)
    + [resurrect_after](#resurrect_after)
    + [with_transporter_log](#with_transporter_log)
+   + [emit_error_label_event](#emit-error-label-event)
    + [Client/host certificate options](#clienthost-certificate-options)
    + [sniffer_class_name](#sniffer-class-name)
    + [custom_headers](#custom_headers)
@@ -190,6 +191,18 @@ We recommend to set this true if you start to debug this plugin.
  with_transporter_log true
  ```

+ ### emit_error_label_event
+
+ Default `emit_error_label_event` value is `true`.
+
+ Emitting error label events is the default behavior.
+
+ With the following configuration, the OpenSearch plugin does not emit error events from its error handler:
+
+ ```aconf
+ emit_error_label_event false
+ ```
+
  ### Client/host certificate options

  Need to verify OpenSearch's certificate? You can use the following parameter to specify a CA instead of using an environment variable.
data/README.md CHANGED
@@ -6,7 +6,7 @@
  ![Testing on Ubuntu](https://github.com/fluent/fluent-plugin-opensearch/workflows/Testing%20on%20Ubuntu/badge.svg?branch=main)
  [![Coverage Status](https://coveralls.io/repos/github/fluent/fluent-plugin-opensearch/badge.svg?branch=upload-coverage-into-coveralls)](https://coveralls.io/github/fluent/fluent-plugin-opensearch?branch=main)

- Send your logs to OpenSearch (and search them with OpenSearch Dashboard maybe?)
+ Send your logs to OpenSearch (and search them with OpenSearch Dashboards maybe?)

  * [Installation](#installation)
  * [Usage](#usage)
@@ -76,6 +76,8 @@ Send your logs to OpenSearch (and search them with OpenSearch Dashboard maybe?)
    + [reload_after](#reload-after)
    + [validate_client_version](#validate-client-version)
    + [unrecoverable_error_types](#unrecoverable-error-types)
+   + [unrecoverable_record_types](#unrecoverable-record-types)
+   + [emit_error_label_event](#emit-error-label-event)
    + [verify os version at startup](#verify_os_version_at_startup)
    + [default_opensearch_version](#default_opensearch_version)
    + [custom_headers](#custom_headers)
@@ -356,6 +358,15 @@ utc_index true

  By default, the records inserted into index `logstash-YYMMDD` with UTC (Coordinated Universal Time). This option allows to use local time if you describe utc_index to false.

+
+ ### suppress_type_name
+
+ If the OpenSearch cluster complains with type removal warnings, they can be suppressed with:
+
+ ```
+ suppress_type_name true
+ ```
+
  ### target_index_key

  Tell this plugin to find the index name to write to in the record under this key in preference to other mechanisms. Key can be specified as path to nested record using dot ('.') as a separator.
@@ -1097,6 +1108,33 @@ Then, remove `rejected_execution_exception` from `unrecoverable_error_types` par
  unrecoverable_error_types ["out_of_memory_error"]
  ```

+
+ ### Unrecoverable Record Types
+
+ By default, the `unrecoverable_record_types` parameter contains only `json_parse_exception`, which is caused by an invalid JSON record.
+ Any other record error that should be treated as unrecoverable can be added to this parameter.
+
+ If you want to handle `security_exception` as an _unrecoverable exception_, change `unrecoverable_record_types` from its default value:
+
+ ```
+ unrecoverable_record_types ["json_parse_exception", "security_exception"]
+ ```
+
+ If one of these error types appears in an OpenSearch response, the invalid record is sent to the `@ERROR` data pipeline.
+
+ ### emit_error_label_event
+
+ Default `emit_error_label_event` value is `true`.
+
+ Emitting error label events is the default behavior.
+
+ With the following configuration, the OpenSearch plugin does not emit error events from its error handler:
+
+ ```aconf
+ emit_error_label_event false
+ ```
+
  ### verify_os_version_at_startup

  Because OpenSearch plugin will ought to change behavior each of OpenSearch major versions.
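The `emit_error_label_event` behavior documented above amounts to a guard around the router's error emission. A minimal standalone sketch of that gate (the class and method names here are illustrative, not the plugin's actual API):

```ruby
# Illustrative model of the emit_error_label_event gate: the block that
# wraps an error-event emission only runs when the flag is true.
class ErrorGate
  attr_reader :emitted

  def initialize(emit_error_label_event)
    @emit_error_label_event = emit_error_label_event
    @emitted = []
  end

  # Run the given block only when error label events are enabled.
  def emit_error_label_event
    yield if @emit_error_label_event
  end

  def report(record)
    emit_error_label_event { @emitted << record }
  end
end
```

With the flag off, `report` becomes a no-op, which is exactly what the `emit_error_label_event false` configuration achieves.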
@@ -3,7 +3,7 @@ $:.push File.expand_path('../lib', __FILE__)

  Gem::Specification.new do |s|
    s.name = 'fluent-plugin-opensearch'
-   s.version = '1.0.0'
+   s.version = '1.0.3'
    s.authors = ['Hiroshi Hatake']
    s.email = ['cosmo0920.wp@gmail.com']
    s.description = %q{Opensearch output plugin for Fluent event collector}
@@ -73,6 +73,7 @@ module Fluent::Plugin
    config_param :ca_file, :string, :default => nil
    config_param :ssl_version, :enum, list: [:SSLv23, :TLSv1, :TLSv1_1, :TLSv1_2], :default => :TLSv1_2
    config_param :with_transporter_log, :bool, :default => false
+   config_param :emit_error_label_event, :bool, :default => true
    config_param :sniffer_class_name, :string, :default => nil
    config_param :custom_headers, :hash, :default => {}
    config_param :docinfo_fields, :array, :default => ['_index', '_type', '_id']
@@ -180,6 +181,13 @@ module Fluent::Plugin
      }
    end

+   def emit_error_label_event(&block)
+     # If `emit_error_label_event` is set to false, error events are not emitted.
+     if @emit_error_label_event
+       block.call
+     end
+   end
+
    def start
      super

@@ -224,7 +232,9 @@ module Fluent::Plugin
    def parse_time(value, event_time, tag)
      @timestamp_parser.call(value)
    rescue => e
-     router.emit_error_event(@timestamp_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @timestamp_key_format, 'value' => value}, e)
+     emit_error_label_event do
+       router.emit_error_event(@timestamp_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @timestamp_key_format, 'value' => value}, e)
+     end
      return Time.at(event_time).to_time
    end

@@ -49,8 +49,12 @@ class Fluent::Plugin::OpenSearchErrorHandler
      unrecoverable_error_types.include?(type)
    end

+   def unrecoverable_record_types
+     @plugin.unrecoverable_record_types
+   end
+
    def unrecoverable_record_error?(type)
-     ['json_parse_exception'].include?(type)
+     unrecoverable_record_types.include?(type)
    end

    def log_os_400_reason(&block)
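The hunk above turns a hard-coded list into one read from the plugin's configuration. Sketched in isolation, with a plain array standing in for the plugin's `unrecoverable_record_types` config:

```ruby
# Classify a bulk-item error type against a configurable list, as the
# handler change above does via @plugin.unrecoverable_record_types.
def unrecoverable_record_error?(type, unrecoverable_record_types)
  unrecoverable_record_types.include?(type)
end
```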
@@ -61,6 +65,13 @@ class Fluent::Plugin::OpenSearchErrorHandler
      end
    end

+   def emit_error_label_event(&block)
+     # If `emit_error_label_event` is set to false, error events are not emitted.
+     if @plugin.emit_error_label_event
+       block.call
+     end
+   end
+
    def handle_error(response, tag, chunk, bulk_message_count, extracted_values)
      items = response['items']
      if items.nil? || !items.is_a?(Array)
@@ -127,12 +138,16 @@ class Fluent::Plugin::OpenSearchErrorHandler
            reason += " [reason]: \'#{item[write_operation]['error']['reason']}\'"
          end
        end
-       @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("400 - Rejected by OpenSearch#{reason}"))
+       emit_error_label_event do
+         @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("400 - Rejected by OpenSearch#{reason}"))
+       end
      else
        if item[write_operation]['error'].is_a?(String)
          reason = item[write_operation]['error']
          stats[:errors_block_resp] += 1
-         @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{reason}"))
+         emit_error_label_event do
+           @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{reason}"))
+         end
          next
        elsif item[write_operation].has_key?('error') && item[write_operation]['error'].has_key?('type')
          type = item[write_operation]['error']['type']
@@ -141,7 +156,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
            raise OpenSearchRequestAbortError, "Rejected OpenSearch due to #{type}"
          end
          if unrecoverable_record_error?(type)
-           @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{type}: #{reason}"))
+           emit_error_label_event do
+             @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{type}: #{reason}"))
+           end
            next
          else
            retry_stream.add(time, rawrecord) unless unrecoverable_record_error?(type)
@@ -150,7 +167,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
          # When we don't have a type field, something changed in the API
          # expected return values.
          stats[:errors_bad_resp] += 1
-         @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - No error type provided in the response"))
+         emit_error_label_event do
+           @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - No error type provided in the response"))
+         end
          next
        end
        stats[type] += 1
@@ -110,6 +110,7 @@ module Fluent::Plugin
    config_param :logstash_prefix_separator, :string, :default => '-'
    config_param :logstash_dateformat, :string, :default => "%Y.%m.%d"
    config_param :utc_index, :bool, :default => true
+   config_param :suppress_type_name, :bool, :default => false
    config_param :index_name, :string, :default => "fluentd"
    config_param :id_key, :string, :default => nil
    config_param :write_operation, :string, :default => "index"
@@ -160,6 +161,8 @@ module Fluent::Plugin
    config_param :validate_client_version, :bool, :default => false
    config_param :prefer_oj_serializer, :bool, :default => false
    config_param :unrecoverable_error_types, :array, :default => ["out_of_memory_error", "rejected_execution_exception"]
+   config_param :unrecoverable_record_types, :array, :default => ["json_parse_exception"]
+   config_param :emit_error_label_event, :bool, :default => true
    config_param :verify_os_version_at_startup, :bool, :default => true
    config_param :default_opensearch_version, :integer, :default => DEFAULT_OPENSEARCH_VERSION
    config_param :log_os_400_reason, :bool, :default => false
@@ -219,8 +222,8 @@ module Fluent::Plugin
    if conf[:assume_role_arn].nil?
      aws_container_credentials_relative_uri = conf[:ecs_container_credentials_relative_uri] || ENV["AWS_CONTAINER_CREDENTIALS_RELATIVE_URI"]
      if aws_container_credentials_relative_uri.nil?
-       credentials = Aws::SharedCredentials.new({retries: 2}).credentials
-       credentials ||= Aws::InstanceProfileCredentials.new.credentials
+       credentials = Aws::SharedCredentials.new({retries: 2}).credentials rescue nil
+       credentials ||= Aws::InstanceProfileCredentials.new.credentials rescue nil
        credentials ||= Aws::ECSCredentials.new.credentials
      else
        credentials = Aws::ECSCredentials.new({
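The `rescue nil` additions above implement a first-success fallback chain over credential providers: one that raises no longer aborts the chain. A hedged sketch of that pattern with stand-in lambdas (the AWS credential classes themselves are not modeled here):

```ruby
# Try each provider in order; a provider that raises or returns nil is
# skipped, mirroring the `X.new.credentials rescue nil` chain above.
def resolve_credentials(providers)
  providers.each do |provider|
    creds =
      begin
        provider.call
      rescue StandardError
        nil
      end
    return creds if creds
  end
  nil
end
```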
@@ -462,6 +465,13 @@ module Fluent::Plugin
      placeholder_validities.include?(true)
    end

+   def emit_error_label_event(&block)
+     # If `emit_error_label_event` is set to false, error events are not emitted.
+     if @emit_error_label_event
+       block.call
+     end
+   end
+
    def compression
      !(@compression_level == :no_compression)
    end
@@ -578,7 +588,9 @@ module Fluent::Plugin
    def parse_time(value, event_time, tag)
      @time_parser.call(value)
    rescue => e
-     router.emit_error_event(@time_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @time_key_format, 'value' => value}, e)
+     emit_error_label_event do
+       router.emit_error_event(@time_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @time_key_format, 'value' => value}, e)
+     end
      return Time.at(event_time).to_datetime
    end

@@ -870,7 +882,9 @@ module Fluent::Plugin
        end
      end
    rescue => e
-     router.emit_error_event(tag, time, record, e)
+     emit_error_label_event do
+       router.emit_error_event(tag, time, record, e)
+     end
    end
  end

@@ -978,8 +992,12 @@ module Fluent::Plugin
      end
    end

-   # OpenSearch only supports "_doc".
-   target_type = DEFAULT_TYPE_NAME
+   if @suppress_type_name
+     target_type = nil
+   else
+     # OpenSearch only supports "_doc".
+     target_type = DEFAULT_TYPE_NAME
+   end

    meta.clear
    meta["_index".freeze] = target_index
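The branch above decides whether a `_type` field appears in each bulk metadata line. Modeled minimally (the hash shape is assumed from the surrounding code, not copied from it):

```ruby
DEFAULT_TYPE_NAME = "_doc"

# Build bulk metadata for one record; with suppress_type_name the
# deprecated _type field is omitted entirely instead of set to "_doc".
def build_bulk_meta(target_index, suppress_type_name)
  meta = { "_index" => target_index }
  meta["_type"] = DEFAULT_TYPE_NAME unless suppress_type_name
  meta
end
```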
@@ -29,12 +29,15 @@ module Fluent::Plugin
      @data_stream_names = []
    end

-   @client = client
    unless @use_placeholder
      begin
        @data_stream_names = [@data_stream_name]
-       create_index_template(@data_stream_name, @data_stream_template_name, @host)
-       create_data_stream(@data_stream_name)
+       retry_operate(@max_retry_putting_template,
+                     @fail_on_putting_template_retry_exceed,
+                     @catch_transport_exception_on_retry) do
+         create_index_template(@data_stream_name, @data_stream_template_name)
+         create_data_stream(@data_stream_name)
+       end
      rescue => e
        raise Fluent::ConfigError, "Failed to create data stream: <#{@data_stream_name}> #{e.message}"
      end
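The creation calls above are now wrapped in the plugin's retry helper so that an unreachable OpenSearch at startup no longer fails immediately. A hypothetical sketch of such a `retry_operate`-style loop (argument names mirror the call site; the actual implementation is assumed, not copied):

```ruby
# Retry the block up to max_retries extra attempts; re-raise the last
# error only when fail_on_retry_exceed is true.
def retry_operate(max_retries, fail_on_retry_exceed = true)
  attempts = 0
  begin
    yield
  rescue StandardError => e
    attempts += 1
    retry if attempts <= max_retries
    raise e if fail_on_retry_exceed
  end
end
```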
@@ -43,7 +46,7 @@ module Fluent::Plugin

    def validate_data_stream_parameters
      {"data_stream_name" => @data_stream_name,
-      "data_stream_template_name"=> @data_stream_template_name}.each do |parameter, value|
+      "data_stream_template_name" => @data_stream_template_name}.each do |parameter, value|
        unless valid_data_stream_parameters?(value)
          unless start_with_valid_characters?(value)
            if not_dots?(value)
@@ -65,30 +68,36 @@ module Fluent::Plugin
      end
    end

-   def create_index_template(datastream_name, template_name, host)
-     return if data_stream_exist?(datastream_name) or template_exists?(template_name, host)
-     body = {
-       "index_patterns" => ["#{datastream_name}*"],
-       "data_stream" => {},
-     }
-     params = {
-       name: template_name,
-       body: body
-     }
-     retry_operate(@max_retry_putting_template,
-                   @fail_on_putting_template_retry_exceed,
-                   @catch_transport_exception_on_retry) do
-       @client.indices.put_index_template(params)
+   def create_index_template(datastream_name, template_name, host = nil)
+     # Create index template from file
+     if @template_file
+       template_installation_actual(template_name, @customize_template, @application_name, datastream_name, host)
+     else # Create default index template
+       return if data_stream_exist?(datastream_name, host) or template_exists?(template_name, host)
+       body = {
+         "index_patterns" => ["#{datastream_name}*"],
+         "data_stream" => {},
+       }
+
+       params = {
+         name: template_name,
+         body: body
+       }
+       retry_operate(@max_retry_putting_template,
+                     @fail_on_putting_template_retry_exceed,
+                     @catch_transport_exception_on_retry) do
+         client(host).indices.put_index_template(params)
+       end
      end
    end

-   def data_stream_exist?(datastream_name)
+   def data_stream_exist?(datastream_name, host = nil)
      params = {
        name: datastream_name
      }
      begin
        # TODO: Use X-Pack equivalent performing DataStream operation method on the following line
-       response = @client.perform_request('GET', "/_data_stream/#{datastream_name}", {}, params)
+       response = client(host).perform_request('GET', "/_data_stream/#{datastream_name}", {}, params)
        return (not response.is_a?(OpenSearch::Transport::Transport::Errors::NotFound))
      rescue OpenSearch::Transport::Transport::Errors::NotFound => e
        log.info "Specified data stream does not exist. Will be created: <#{e}>"
@@ -96,8 +105,8 @@ module Fluent::Plugin
      end
    end

-   def create_data_stream(datastream_name)
-     return if data_stream_exist?(datastream_name)
+   def create_data_stream(datastream_name, host = nil)
+     return if data_stream_exist?(datastream_name, host)
      params = {
        name: datastream_name
      }
@@ -105,7 +114,7 @@
                    @fail_on_putting_template_retry_exceed,
                    @catch_transport_exception_on_retry) do
        # TODO: Use X-Pack equivalent performing DataStream operation method on the following line
-       @client.perform_request('PUT', "/_data_stream/#{datastream_name}", {}, params)
+       client(host).perform_request('PUT', "/_data_stream/#{datastream_name}", {}, params)
      end
    end

@@ -155,14 +164,19 @@ module Fluent::Plugin
    def write(chunk)
      data_stream_name = @data_stream_name
      data_stream_template_name = @data_stream_template_name
-     host = @host
+     host = nil
      if @use_placeholder
+       host = if @hosts
+                extract_placeholders(@hosts, chunk)
+              else
+                extract_placeholders(@host, chunk)
+              end
        data_stream_name = extract_placeholders(@data_stream_name, chunk)
        data_stream_template_name = extract_placeholders(@data_stream_template_name, chunk)
        unless @data_stream_names.include?(data_stream_name)
          begin
            create_index_template(data_stream_name, data_stream_template_name, host)
-           create_data_stream(data_stream_name)
+           create_data_stream(data_stream_name, host)
            @data_stream_names << data_stream_name
          rescue => e
            raise Fluent::ConfigError, "Failed to create data stream: <#{data_stream_name}> #{e.message}"
@@ -177,12 +191,22 @@ module Fluent::Plugin
      tag = chunk.metadata.tag
      chunk.msgpack_each do |time, record|
        next unless record.is_a? Hash
-
        begin
-         record.merge!({"@timestamp" => Time.at(time).iso8601(@time_precision)})
+         if record.has_key?(TIMESTAMP_FIELD)
+           rts = record[TIMESTAMP_FIELD]
+           dt = parse_time(rts, time, tag)
+         elsif record.has_key?(@time_key)
+           rts = record[@time_key]
+           dt = parse_time(rts, time, tag)
+         else
+           dt = Time.at(time).to_datetime
+         end
+         record.merge!({"@timestamp" => dt.iso8601(@time_precision)})
          bulk_message = append_record_to_messages(CREATE_OP, {}, headers, record, bulk_message)
        rescue => e
-         router.emit_error_event(tag, time, record, e)
+         emit_error_label_event do
+           router.emit_error_event(tag, time, record, e)
+         end
        end
      end

@@ -191,7 +215,7 @@ module Fluent::Plugin
        body: bulk_message
      }
      begin
-       response = @client.bulk(params)
+       response = client(host).bulk(params)
        if response['errors']
          log.error "Could not bulk insert to Data Stream: #{data_stream_name} #{response}"
        end
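The new branch above honors an existing `@timestamp` field first, then the configured `@time_key`, and only then falls back to the event time. Sketched standalone, with `DateTime.parse` standing in for the plugin's `parse_time`:

```ruby
require 'date'

TIMESTAMP_FIELD = "@timestamp"

# Pick the record timestamp the way the data stream writer now does:
# record["@timestamp"] wins, then record[time_key], then the event time.
def pick_timestamp(record, event_time, time_key)
  if record.key?(TIMESTAMP_FIELD)
    DateTime.parse(record[TIMESTAMP_FIELD])
  elsif record.key?(time_key)
    DateTime.parse(record[time_key])
  else
    Time.at(event_time).to_datetime
  end
end
```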
@@ -0,0 +1,4 @@
+ {
+   "index_patterns": ["foo*"],
+   "data_stream": {}
+ }
@@ -35,14 +35,18 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
    attr_reader :log
    attr_reader :error_events
    attr_accessor :unrecoverable_error_types
+   attr_accessor :unrecoverable_record_types
    attr_accessor :log_os_400_reason
    attr_accessor :write_operation
+   attr_accessor :emit_error_label_event
    def initialize(log, log_os_400_reason = false)
      @log = log
      @write_operation = 'index'
      @error_events = []
      @unrecoverable_error_types = ["out_of_memory_error", "rejected_execution_exception"]
+     @unrecoverable_record_types = ["json_parse_exception"]
      @log_os_400_reason = log_os_400_reason
+     @emit_error_label_event = true
    end

    def router
@@ -135,6 +139,44 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
      end
    end

+   class TEST400ResponseReasonWithoutErrorLog < self
+     def setup
+       Fluent::Test.setup
+       @log_device = Fluent::Test::DummyLogDevice.new
+       dl_opts = {:log_level => ServerEngine::DaemonLogger::DEBUG}
+       logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
+       @log = Fluent::Log.new(logger)
+       @plugin = TestPlugin.new(@log)
+       @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
+       @plugin.emit_error_label_event = false
+     end
+
+     def test_400_responses_reason_log
+       records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
+       response = parse_response(%({
+         "took" : 0,
+         "errors" : true,
+         "items" : [
+           {
+             "create" : {
+               "_index" : "foo",
+               "status" : 400,
+               "error" : {
+                 "type" : "mapper_parsing_exception",
+                 "reason" : "failed to parse"
+               }
+             }
+           }
+         ]
+       }))
+       chunk = MockChunk.new(records)
+       dummy_extracted_values = []
+       @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
+       assert_equal(0, @plugin.error_events.size)
+       assert_true(@plugin.error_events.empty?)
+     end
+   end
+
    class TEST400ResponseReasonNoDebug < self
      def setup
        Fluent::Test.setup
@@ -177,6 +219,45 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
      end
    end

+   class TEST400ResponseReasonNoDebugAndNoErrorLog < self
+     def setup
+       Fluent::Test.setup
+       @log_device = Fluent::Test::DummyLogDevice.new
+       dl_opts = {:log_level => ServerEngine::DaemonLogger::INFO}
+       logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
+       @log = Fluent::Log.new(logger)
+       @plugin = TestPlugin.new(@log)
+       @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
+       @plugin.log_os_400_reason = true
+       @plugin.emit_error_label_event = false
+     end
+
+     def test_400_responses_reason_log
+       records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
+       response = parse_response(%({
+         "took" : 0,
+         "errors" : true,
+         "items" : [
+           {
+             "create" : {
+               "_index" : "foo",
+               "status" : 400,
+               "error" : {
+                 "type" : "mapper_parsing_exception",
+                 "reason" : "failed to parse"
+               }
+             }
+           }
+         ]
+       }))
+       chunk = MockChunk.new(records)
+       dummy_extracted_values = []
+       @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
+       assert_equal(0, @plugin.error_events.size)
+       assert_true(@plugin.error_events.empty?)
+     end
+   end
+
    def test_nil_items_responses
      records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
      response = parse_response(%({
@@ -2320,23 +2320,11 @@ class OpenSearchOutputTest < Test::Unit::TestCase
    end

    data(
-     "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc"},
+     "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc", "suppress_type" => false},
+     "Suppressed type" => {"os_version" => 1, "_type" => nil, "suppress_type" => true},
    )
    def test_writes_to_speficied_type(data)
-     driver('', data["os_version"]).configure("")
-     stub_opensearch
-     stub_opensearch_info
-     driver.run(default_tag: 'test') do
-       driver.feed(sample_record)
-     end
-     assert_equal(data['_type'], index_cmds.first['index']['_type'])
-   end
-
-   data(
-     "OpenSearch 1"=> {"os_version" => 1, "_type" => "_doc"},
-   )
-   def test_writes_to_speficied_type_with_placeholders(data)
-     driver('', data["os_version"]).configure("")
+     driver('', data["os_version"]).configure("suppress_type_name #{data['suppress_type']}")
      stub_opensearch
      stub_opensearch_info
      driver.run(default_tag: 'test') do
@@ -9,7 +9,7 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
    include FlexMock::TestCase
    include Fluent::Test::Helpers

-   attr_accessor :bulk_records
+   attr_accessor :bulk_records, :index_cmds

    OPENSEARCH_DATA_STREAM_TYPE = "opensearch_data_stream"

@@ -61,58 +61,77 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
    DUPLICATED_DATA_STREAM_EXCEPTION = {"error": {}, "status": 400}
    NONEXISTENT_DATA_STREAM_EXCEPTION = {"error": {}, "status": 404}

-   def stub_index_template(name="foo_tpl")
-     stub_request(:put, "http://localhost:9200/_index_template/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
+   def stub_index_template(name="foo_tpl", url="http://localhost:9200")
+     stub_request(:put, "#{url}/_index_template/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
    end

-   def stub_data_stream(name="foo")
-     stub_request(:put, "http://localhost:9200/_data_stream/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
+   def stub_data_stream(name="foo", url="localhost:9200")
+     stub_request(:put, "#{url}/_data_stream/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
    end

-   def stub_existent_data_stream?(name="foo")
-     stub_request(:get, "http://localhost:9200/_data_stream/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
+   def stub_existent_data_stream?(name="foo", url="localhost:9200")
+     stub_request(:get, "#{url}/_data_stream/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
    end

-   def stub_existent_template?(name="foo_tpl")
-     stub_request(:get, "http://localhost:9200/_index_template/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
+   def stub_existent_template?(name="foo_tpl", url="localhost:9200")
+     stub_request(:get, "#{url}/_index_template/#{name}").to_return(:status => [200, RESPONSE_ACKNOWLEDGED])
    end

-   def stub_nonexistent_data_stream?(name="foo")
-     stub_request(:get, "http://localhost:9200/_data_stream/#{name}").to_return(:status => [404, OpenSearch::Transport::Transport::Errors::NotFound])
+   def stub_nonexistent_data_stream?(name="foo", url="localhost:9200")
+     stub_request(:get, "#{url}/_data_stream/#{name}").to_return(:status => [404, OpenSearch::Transport::Transport::Errors::NotFound])
    end

-   def stub_nonexistent_template?(name="foo_tpl")
-     stub_request(:get, "http://localhost:9200/_index_template/#{name}").to_return(:status => [404, OpenSearch::Transport::Transport::Errors::NotFound])
+   def stub_nonexistent_template?(name="foo_tpl", url="http://localhost:9200")
+     stub_request(:get, "#{url}/_index_template/#{name}").to_return(:status => [404, OpenSearch::Transport::Transport::Errors::NotFound])
    end

-   def stub_bulk_feed(datastream_name="foo", template_name="foo_tpl")
-     stub_request(:post, "http://localhost:9200/#{datastream_name}/_bulk").with do |req|
+   def stub_nonexistent_template_retry?(name="foo_tpl", url="http://localhost:9200")
+     stub_request(:get, "#{url}/_index_template/#{name}").
+       to_return({ status: 500, body: 'Internal Server Error' }, { status: 404, body: '{}' })
+   end
+
+   def stub_bulk_feed(datastream_name="foo", template_name="foo_tpl", url="http://localhost:9200")
+     stub_request(:post, "#{url}/#{datastream_name}/_bulk").with do |req|
        # bulk data must be pair of OP and records
        # {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
        # {"@timestamp": ...}
        @bulk_records += req.body.split("\n").size / 2
+       @index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
      end
-     stub_request(:post, "http://localhost:9200/#{template_name}/_bulk").with do |req|
+     stub_request(:post, "#{url}/#{template_name}/_bulk").with do |req|
        # bulk data must be pair of OP and records
        # {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
        # {"@timestamp": ...}
        @bulk_records += req.body.split("\n").size / 2
+       @index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
      end
    end

-   def stub_elastic_info(url="http://localhost:9200/", version="1.2.2")
+   def stub_opensearch_info(url="http://localhost:9200/", version="1.2.2", headers={})
      body ="{\"version\":{\"number\":\"#{version}\", \"distribution\":\"opensearch\"},\"tagline\":\"The OpenSearch Project: https://opensearch.org/\"}"
-     stub_request(:get, url).to_return({:status => 200, :body => body, :headers => { 'Content-Type' => 'json' } })
+     stub_request(:get, url).to_return({:status => 200, :body => body, :headers => { 'Content-Type' => 'json' }.merge(headers) })
    end

    def stub_default(datastream_name="foo", template_name="foo_tpl", host="http://localhost:9200")
-     stub_elastic_info(host)
+     stub_opensearch_info(host)
      stub_nonexistent_template?(template_name)
      stub_index_template(template_name)
      stub_nonexistent_data_stream?(datastream_name)
      stub_data_stream(datastream_name)
    end

+   def stub_opensearch_with_store_index_command_counts(url="http://localhost:9200/_bulk")
+     if @index_command_counts == nil
+       @index_command_counts = {}
+       @index_command_counts.default = 0
+     end
+
+     stub_request(:post, url).with do |req|
+       index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
+       @index_command_counts[url] += index_cmds.size
+     end
+   end
+
    # ref. https://opensearch.org/docs/latest/opensearch/data-streams/
    class DataStreamNameTest < self

@@ -315,11 +334,71 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
315
334
  assert_equal "foo", driver(conf).instance.data_stream_name
316
335
  end
317
336
 
337
+ def test_hosts_list_configure
338
+ config = %{
339
+ hosts https://john:password@host1:443/elastic/,http://host2
340
+ path /default_path
341
+ user default_user
342
+ password default_password
343
+ data_stream_name default
344
+ }
345
+ stub_opensearch_info("https://host1:443/elastic//", "1.2.2",
346
+ {'Authorization'=>"Basic #{Base64.encode64('john:password').split.first}"})
347
+ stub_opensearch_info("http://host2/default_path/_data_stream/default", "1.2.2",
348
+ {'Authorization'=>"Basic #{Base64.encode64('john:password').split.first}"})
349
+ stub_existent_data_stream?("default", "https://host1/elastic/")
350
+ instance = driver(config).instance
351
+
352
+ assert_equal 2, instance.get_connection_options[:hosts].length
353
+ host1, host2 = instance.get_connection_options[:hosts]
354
+
355
+ assert_equal 'host1', host1[:host]
356
+ assert_equal 443, host1[:port]
357
+ assert_equal 'https', host1[:scheme]
358
+ assert_equal 'john', host1[:user]
359
+ assert_equal 'password', host1[:password]
360
+ assert_equal '/elastic/', host1[:path]
361
+
362
+ assert_equal 'host2', host2[:host]
363
+ assert_equal 'http', host2[:scheme]
364
+ assert_equal 'default_user', host2[:user]
365
+ assert_equal 'default_password', host2[:password]
366
+ assert_equal '/default_path', host2[:path]
367
+ end
368
+
369
+ def test_datastream_configure_retry
370
+ stub_opensearch_info
371
+ stub_nonexistent_template_retry?
372
+ stub_index_template
373
+ stub_nonexistent_data_stream?
374
+ stub_data_stream
375
+ conf = config_element(
376
+ 'ROOT', '', {
377
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
378
+ 'data_stream_name' => 'foo',
379
+ 'data_stream_template_name' => "foo_tpl"
380
+ })
381
+ assert_equal "foo", driver(conf).instance.data_stream_name
382
+ end
383
+
384
+ def test_template_file
385
+ stub_default
386
+ cwd = File.dirname(__FILE__)
387
+ conf = config_element(
388
+ 'ROOT', '', {
389
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
390
+ 'data_stream_name' => 'foo',
391
+ 'data_stream_template_name' => "foo_tpl",
392
+ 'template_file' => File.join(cwd, 'datastream_template.json')
393
+ })
394
+ assert_equal "foo", driver(conf).instance.data_stream_name
395
+ end
396
+
318
397
  def test_existent_data_stream
319
398
  stub_index_template
320
399
  stub_existent_data_stream?
321
400
  stub_data_stream
322
- stub_elastic_info
401
+ stub_opensearch_info
323
402
  conf = config_element(
324
403
  'ROOT', '', {
325
404
  '@type' => OPENSEARCH_DATA_STREAM_TYPE,
@@ -333,7 +412,7 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
333
412
  stub_index_template
334
413
  stub_existent_data_stream?
335
414
  stub_data_stream
336
- stub_elastic_info
415
+ stub_opensearch_info
337
416
  conf = config_element(
338
417
  'ROOT', '', {
339
418
  '@type' => OPENSEARCH_DATA_STREAM_TYPE,
@@ -432,13 +511,56 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
432
511
  '@type' => OPENSEARCH_DATA_STREAM_TYPE,
433
512
  'data_stream_name' => 'foo',
434
513
  'data_stream_template_name' => 'foo_tpl'
435
- })
514
+ })
436
515
  driver(conf).run(default_tag: 'test') do
437
516
  driver.feed(sample_record)
438
517
  end
439
518
  assert_equal 1, @bulk_records
440
519
  end
441
520
 
521
+ def test_placeholder_writes_to_multi_hosts
522
+ stub_default("foo_bar", "foo_tpl_bar")
523
+ hosts = [['192.168.33.50', 9201], ['192.168.33.51', 9201], ['192.168.33.52', 9201]]
524
+ hosts_string = hosts.map {|x| "#{x[0]}:#{x[1]}"}.compact.join(',')
525
+ hosts.each do |host_info|
526
+ host, port = host_info
527
+ stub_opensearch_with_store_index_command_counts("http://#{host}:#{port}/foo_bar/_bulk")
528
+ stub_opensearch_info("http://#{host}:#{port}/")
529
+ stub_request(:get, "http://#{host}:#{port}/_data_stream/foo_bar").
530
+ to_return(status: 200, body: "", headers: {})
531
+ end
532
+
533
+ conf = config_element(
534
+ 'ROOT', '', {
535
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
536
+ 'data_stream_name' => 'foo_${key1}',
537
+ 'data_stream_template_name' => 'foo_tpl_${key1}',
538
+ 'hosts' => "#{hosts_string}"
539
+ }, [config_element('buffer', 'tag,key1', {
540
+ 'timekey' => '1d'
541
+ }, [])])
542
+ driver(conf).run(default_tag: 'test') do
543
+ hashes = {
544
+ 'age' => rand(100),
545
+ 'key1' => 'bar'
546
+ }
547
+ 1000.times do
548
+ driver.feed(sample_record.merge(hashes))
549
+ end
550
+ end
551
+
552
+ # @note: we cannot make multi chunks with options (flush_interval, buffer_chunk_limit)
553
+ # it's Fluentd test driver's constraint
554
+ # so @index_command_counts.size is always 1
555
+ assert(@index_command_counts.size > 0, "not working with hosts options")
556
+
557
+ total = 0
558
+ @index_command_counts.each do |_, count|
559
+ total += count
560
+ end
561
+ assert_equal(2000, total)
562
+ end
563
+
442
564
  def test_template_retry_install_fails
443
565
  cwd = File.dirname(__FILE__)
444
566
  template_file = File.join(cwd, 'test_index_template.json')
@@ -463,7 +585,7 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
463
585
  connection_resets += 1
464
586
  raise Faraday::ConnectionFailed, "Test message"
465
587
  end
466
- stub_elastic_info("https://logs.google.com:778/")
588
+ stub_opensearch_info("https://logs.google.com:778/")
467
589
 
468
590
  assert_raise(Fluent::Plugin::OpenSearchError::RetryableOperationExhaustedFailure) do
469
591
  driver(config)
@@ -471,4 +593,72 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
471
593
 
472
594
  assert_equal(4, connection_resets)
473
595
  end
596
+
597
+ def test_uses_custom_time_key
598
+ stub_default
599
+ stub_bulk_feed
600
+ conf = config_element(
601
+ 'ROOT', '', {
602
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
603
+ 'data_stream_name' => 'foo',
604
+ 'data_stream_template_name' => 'foo_tpl',
605
+ 'time_key' => 'vtm'
606
+ })
607
+
608
+ ts = DateTime.new(2021,2,3).iso8601(9)
609
+ record = {
610
+ 'vtm' => ts,
611
+ 'message' => 'Sample Record'
612
+ }
613
+
614
+ driver(conf).run(default_tag: 'test') do
615
+ driver.feed(record)
616
+ end
617
+ assert(index_cmds[1].has_key? '@timestamp')
618
+ assert_equal(ts, index_cmds[1]['@timestamp'])
619
+ end
620
+
621
+ def test_uses_custom_time_key_with_format
622
+ stub_default
623
+ stub_bulk_feed
624
+ conf = config_element(
625
+ 'ROOT', '', {
626
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
627
+ 'data_stream_name' => 'foo',
628
+ 'data_stream_template_name' => 'foo_tpl',
629
+ 'time_key' => 'vtm',
630
+ 'time_key_format' => '%Y-%m-%d %H:%M:%S.%N%z'
631
+ })
632
+ ts = "2021-02-03 13:14:01.673+02:00"
633
+ record = {
634
+ 'vtm' => ts,
635
+ 'message' => 'Sample Record'
636
+ }
637
+ driver(conf).run(default_tag: 'test') do
638
+ driver.feed(record)
639
+ end
640
+ assert(index_cmds[1].has_key? '@timestamp')
641
+ assert_equal(DateTime.parse(ts).iso8601(9), index_cmds[1]['@timestamp'])
642
+ end
643
+
644
+ def test_record_no_timestamp
645
+ stub_default
646
+ stub_bulk_feed
647
+ stub_default
648
+ stub_bulk_feed
649
+ conf = config_element(
650
+ 'ROOT', '', {
651
+ '@type' => OPENSEARCH_DATA_STREAM_TYPE,
652
+ 'data_stream_name' => 'foo',
653
+ 'data_stream_template_name' => 'foo_tpl'
654
+ })
655
+ record = {
656
+ 'message' => 'Sample Record'
657
+ }
658
+ driver(conf).run(default_tag: 'test') do
659
+ driver.feed(record)
660
+ end
661
+ assert(index_cmds[1].has_key? '@timestamp')
662
+ end
663
+
474
664
  end
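
As context for the stubs above: `stub_bulk_feed` derives `@bulk_records` and `@index_cmds` from the fact that a data-stream `_bulk` body is newline-delimited JSON in which every record is a pair of lines, an action line (`{"create": {}}`) followed by the document itself. A minimal standalone sketch (plain Ruby with the stdlib `json` library; `body`, `bulk_records`, and `index_cmds` are illustrative names mirroring the helper's instance variables, not part of the gem):

```ruby
require 'json'

# A hypothetical bulk payload for a data stream: action line, then document,
# repeated once per record.
body = <<~BULK
  {"create": {}}
  {"@timestamp": "2021-02-03T00:00:00.000000000+00:00", "message": "Sample Record"}
  {"create": {}}
  {"@timestamp": "2021-02-03T00:00:01.000000000+00:00", "message": "Another Record"}
BULK

lines = body.split("\n")
# Each record is an action/document pair, so record count = line count / 2,
# exactly as @bulk_records is accumulated in stub_bulk_feed.
bulk_records = lines.size / 2
# Parsing every line yields alternating action and document hashes, as @index_cmds
# does; index_cmds[1] is therefore the first document, which is why the timestamp
# tests above assert on index_cmds[1]['@timestamp'].
index_cmds = lines.map { |r| JSON.parse(r) }

puts bulk_records  # 2
```

This also explains why `test_record_no_timestamp` can still assert on `@timestamp`: the plugin injects one into the document half of the pair before the bulk request is built.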
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: fluent-plugin-opensearch
  version: !ruby/object:Gem::Version
- version: 1.0.0
+ version: 1.0.3
  platform: ruby
  authors:
  - Hiroshi Hatake
- autorequire:
+ autorequire: 
  bindir: bin
  cert_chain: []
- date: 2022-01-13 00:00:00.000000000 Z
+ date: 2022-03-25 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: fluentd
@@ -208,6 +208,7 @@ files:
  - lib/fluent/plugin/out_opensearch.rb
  - lib/fluent/plugin/out_opensearch_data_stream.rb
  - test/helper.rb
+ - test/plugin/datastream_template.json
  - test/plugin/test_alias_template.json
  - test/plugin/test_filter_opensearch_genid.rb
  - test/plugin/test_in_opensearch.rb
@@ -226,7 +227,7 @@ licenses:
  - Apache-2.0
  metadata:
  changelog_uri: https://github.com/fluent/fluent-plugin-opensearch/blob/master/History.md
- post_install_message:
+ post_install_message: 
  rdoc_options: []
  require_paths:
  - lib
@@ -241,12 +242,13 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.2.22
- signing_key:
+ rubygems_version: 3.2.30
+ signing_key: 
  specification_version: 4
  summary: Opensearch output plugin for Fluent event collector
  test_files:
  - test/helper.rb
+ - test/plugin/datastream_template.json
  - test/plugin/test_alias_template.json
  - test/plugin/test_filter_opensearch_genid.rb
  - test/plugin/test_in_opensearch.rb