fluent-plugin-opensearch 1.0.1 → 1.0.4
- checksums.yaml +4 -4
- data/History.md +15 -0
- data/README.OpenSearchInput.md +13 -0
- data/README.md +40 -2
- data/fluent-plugin-opensearch.gemspec +1 -1
- data/lib/fluent/plugin/in_opensearch.rb +11 -1
- data/lib/fluent/plugin/opensearch_error_handler.rb +24 -5
- data/lib/fluent/plugin/opensearch_index_template.rb +2 -2
- data/lib/fluent/plugin/out_opensearch.rb +24 -6
- data/lib/fluent/plugin/out_opensearch_data_stream.rb +40 -45
- data/test/plugin/datastream_template.json +4 -0
- data/test/plugin/test_opensearch_error_handler.rb +81 -0
- data/test/plugin/test_out_opensearch.rb +3 -15
- data/test/plugin/test_out_opensearch_data_stream.rb +142 -4
- metadata +5 -3
checksums.yaml CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 6d2bf385e2b57f561be89d6ccb74da8a95870668368643321c9217a83683d3ca
+  data.tar.gz: a5e622af13e62991eac0845ef4e0d33e12252f18c5073d335b78b8dc7055c2f1
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 88ad2347a77fdd8a95851f53514def3396e424634a1e906c7700b875000881e5cd6825d8441cc765912e838dc9b2068702949d6722343fac0d476b9993fb3c35
+  data.tar.gz: b0f852ea5f3b42778ebfcad0d32a657a554077806e3e500dc9eb4bb7625cc95188aded8437d6ba7ca43eeafdecfa4dd40f01220f2956d78b9eb9b476d2e6ece9
data/History.md CHANGED

@@ -2,6 +2,21 @@
 
 ### [Unreleased]
 
+### 1.0.4
+- Automatically create data streams (#44)
+
+### 1.0.3
+- Configurable unrecoverable record types (#40)
+- Handle exceptions on retrieving AWS credentials (#39)
+- Suppress emit error label events (#38)
+- Provide suppress_type_name parameter (#29)
+- Honor @time_key (data streams) (#28)
+
+### 1.0.2
+- Honor @hosts parameter for Data Streams (#21)
+- Use template_file for Data Streams (#20)
+- Specify host argument if needed (#11)
+
 ### 1.0.1
 - Add testcases for hosts parameter (#10)
 - Permit to handle @hosts parameter (#9)
data/README.OpenSearchInput.md CHANGED

@@ -17,6 +17,7 @@
 + [reload_on_failure](#reload_on_failure)
 + [resurrect_after](#resurrect_after)
 + [with_transporter_log](#with_transporter_log)
++ [emit_error_label_event](#emit-error-label-event)
 + [Client/host certificate options](#clienthost-certificate-options)
 + [sniffer_class_name](#sniffer-class-name)
 + [custom_headers](#custom_headers)

@@ -190,6 +191,18 @@ We recommend to set this true if you start to debug this plugin.
 with_transporter_log true
 ```
 
+### emit_error_label_event
+
+The default value of `emit_error_label_event` is `true`.
+
+Emitting error label events is the default behavior.
+
+With the following configuration, the OpenSearch plugin suppresses error events in its error handler:
+
+```aconf
+emit_error_label_event false
+```
+
 ### Client/host certificate options
 
 Need to verify OpenSearch's certificate? You can use the following parameter to specify a CA instead of using an environment variable.
data/README.md CHANGED

@@ -76,6 +76,8 @@ Send your logs to OpenSearch (and search them with OpenSearch Dashboards maybe?)
 + [reload_after](#reload-after)
 + [validate_client_version](#validate-client-version)
 + [unrecoverable_error_types](#unrecoverable-error-types)
++ [unrecoverable_record_types](#unrecoverable-record-types)
++ [emit_error_label_event](#emit-error-label-event)
 + [verify os version at startup](#verify_os_version_at_startup)
 + [default_opensearch_version](#default_opensearch_version)
 + [custom_headers](#custom_headers)

@@ -356,6 +358,15 @@ utc_index true
 
 By default, records are inserted into the index `logstash-YYMMDD` in UTC (Coordinated Universal Time). This option lets you use local time by setting `utc_index` to false.
 
+
+### suppress_type_name
+
+If the OpenSearch cluster complains with type removal warnings, they can be suppressed with:
+
+```
+suppress_type_name true
+```
+
 ### target_index_key
 
 Tell this plugin to find the index name to write to in the record under this key in preference to other mechanisms. Key can be specified as path to nested record using dot ('.') as a separator.

@@ -1036,11 +1047,11 @@ $ fluentd -r $sniffer [AND YOUR OTHER OPTIONS]
 The default selector used by the `OpenSearch::Transport` class works well for the round-robin and random selection cases. It does not work well when Fluentd should fall back from an exhausted ES cluster to a normal ES cluster.
 The parameter `selector_class_name` gives you the ability to provide your own selector class implementing whatever node-selection logic you require.
 
-The below configuration is using plugin built-in `
+The below configuration uses the plugin's built-in `OpenSearchFallbackSelector`:
 
 ```
 hosts exhausted-host:9201,normal-host:9200
-selector_class_name "Fluent::Plugin::
+selector_class_name "Fluent::Plugin::OpenSearchFallbackSelector"
 ```
 
 #### Tips

@@ -1097,6 +1108,33 @@ Then, remove `rejected_execution_exception` from `unrecoverable_error_types` par
 unrecoverable_error_types ["out_of_memory_error"]
 ```
 
+
+### Unrecoverable Record Types
+
+The default `unrecoverable_record_types` parameter is set up loosely, because `json_parse_exception` is caused by an invalid JSON record.
+Any other error type that makes a record unrecoverable should be included in this parameter.
+
+If you want to handle `security_exception` as an _unrecoverable exception_, consider changing the `unrecoverable_record_types` parameter from its default value:
+
+```
+unrecoverable_record_types ["json_parse_exception", "security_exception"]
+```
+
+If one of these error types is included in the OpenSearch response, the invalid record is sent to the `@ERROR` data pipeline.
+
+### emit_error_label_event
+
+The default value of `emit_error_label_event` is `true`.
+
+Emitting error label events is the default behavior.
+
+With the following configuration, the OpenSearch plugin suppresses error events in its error handler:
+
+```aconf
+emit_error_label_event false
+```
+
 ### verify_os_version_at_startup
 
 Because the OpenSearch plugin may need to change its behavior for each OpenSearch major version.
data/fluent-plugin-opensearch.gemspec CHANGED

@@ -3,7 +3,7 @@ $:.push File.expand_path('../lib', __FILE__)
 
 Gem::Specification.new do |s|
   s.name = 'fluent-plugin-opensearch'
-  s.version = '1.0.1'
+  s.version = '1.0.4'
   s.authors = ['Hiroshi Hatake']
   s.email = ['cosmo0920.wp@gmail.com']
   s.description = %q{Opensearch output plugin for Fluent event collector}
data/lib/fluent/plugin/in_opensearch.rb CHANGED

@@ -73,6 +73,7 @@ module Fluent::Plugin
     config_param :ca_file, :string, :default => nil
     config_param :ssl_version, :enum, list: [:SSLv23, :TLSv1, :TLSv1_1, :TLSv1_2], :default => :TLSv1_2
     config_param :with_transporter_log, :bool, :default => false
+    config_param :emit_error_label_event, :bool, :default => true
     config_param :sniffer_class_name, :string, :default => nil
     config_param :custom_headers, :hash, :default => {}
     config_param :docinfo_fields, :array, :default => ['_index', '_type', '_id']

@@ -180,6 +181,13 @@ module Fluent::Plugin
       }
     end
 
+    def emit_error_label_event(&block)
+      # If `emit_error_label_event` is false, no error events are emitted.
+      if @emit_error_label_event
+        block.call
+      end
+    end
+
     def start
       super
 

@@ -224,7 +232,9 @@ module Fluent::Plugin
     def parse_time(value, event_time, tag)
       @timestamp_parser.call(value)
     rescue => e
-
+      emit_error_label_event do
+        router.emit_error_event(@timestamp_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @timestamp_key_format, 'value' => value}, e)
+      end
       return Time.at(event_time).to_time
     end
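The `emit_error_label_event(&block)` guard added above is easiest to see in isolation. A minimal sketch (hypothetical `GuardDemo` class; the real plugin reads the `emit_error_label_event` config_param and wraps `router.emit_error_event` calls):

```ruby
# Minimal sketch of the emit_error_label_event guard this release adds:
# when the flag is false, the wrapped block is simply skipped.
class GuardDemo
  def initialize(emit_error_label_event)
    @emit_error_label_event = emit_error_label_event
  end

  # Run the block only when error-label emission is enabled.
  def emit_error_label_event(&block)
    block.call if @emit_error_label_event
  end
end

calls = []
GuardDemo.new(true).emit_error_label_event { calls << :emitted }
GuardDemo.new(false).emit_error_label_event { calls << :emitted }
```

With the flag disabled, the `rescue` clauses still fall through to their fallback return values; only the error-event emission is suppressed.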
data/lib/fluent/plugin/opensearch_error_handler.rb CHANGED

@@ -49,8 +49,12 @@ class Fluent::Plugin::OpenSearchErrorHandler
     unrecoverable_error_types.include?(type)
   end
 
+  def unrecoverable_record_types
+    @plugin.unrecoverable_record_types
+  end
+
   def unrecoverable_record_error?(type)
-
+    unrecoverable_record_types.include?(type)
   end
 
   def log_os_400_reason(&block)

@@ -61,6 +65,13 @@ class Fluent::Plugin::OpenSearchErrorHandler
     end
   end
 
+  def emit_error_label_event(&block)
+    # If `emit_error_label_event` is false, no error events are emitted.
+    if @plugin.emit_error_label_event
+      block.call
+    end
+  end
+
   def handle_error(response, tag, chunk, bulk_message_count, extracted_values)
     items = response['items']
     if items.nil? || !items.is_a?(Array)

@@ -127,12 +138,16 @@ class Fluent::Plugin::OpenSearchErrorHandler
             reason += " [reason]: \'#{item[write_operation]['error']['reason']}\'"
           end
         end
-
+        emit_error_label_event do
+          @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("400 - Rejected by OpenSearch#{reason}"))
+        end
       else
         if item[write_operation]['error'].is_a?(String)
           reason = item[write_operation]['error']
           stats[:errors_block_resp] += 1
-
+          emit_error_label_event do
+            @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{reason}"))
+          end
           next
         elsif item[write_operation].has_key?('error') && item[write_operation]['error'].has_key?('type')
           type = item[write_operation]['error']['type']

@@ -141,7 +156,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
             raise OpenSearchRequestAbortError, "Rejected OpenSearch due to #{type}"
           end
           if unrecoverable_record_error?(type)
-
+            emit_error_label_event do
+              @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - #{type}: #{reason}"))
+            end
             next
           else
             retry_stream.add(time, rawrecord) unless unrecoverable_record_error?(type)

@@ -150,7 +167,9 @@ class Fluent::Plugin::OpenSearchErrorHandler
           # When we don't have a type field, something changed in the API
           # expected return values.
           stats[:errors_bad_resp] += 1
-
+          emit_error_label_event do
+            @plugin.router.emit_error_event(tag, time, rawrecord, OpenSearchError.new("#{status} - No error type provided in the response"))
+          end
           next
         end
         stats[type] += 1
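The new `unrecoverable_record_types` lookup drives a simple routing decision in `handle_error`: a record whose error type matches is emitted to the error pipeline, everything else is added to the retry stream. A standalone sketch (hypothetical `RecordClassifier`; the real handler delegates to `@plugin.unrecoverable_record_types` and emits via the router):

```ruby
# Sketch of the per-record classification this release makes configurable.
class RecordClassifier
  def initialize(unrecoverable_record_types)
    @unrecoverable_record_types = unrecoverable_record_types
  end

  def unrecoverable_record_error?(type)
    @unrecoverable_record_types.include?(type)
  end

  # Unrecoverable records go to the error pipeline; others are retried.
  def route(type)
    unrecoverable_record_error?(type) ? :error_pipeline : :retry_stream
  end
end

classifier = RecordClassifier.new(["json_parse_exception", "security_exception"])
```

Extending the array (as the README section above suggests for `security_exception`) moves those records out of the retry loop.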
data/lib/fluent/plugin/opensearch_index_template.rb CHANGED

@@ -62,10 +62,10 @@ module Fluent::OpenSearchIndexTemplate
     client.transport.transport.host_unreachable_exceptions
   end
 
-  def retry_operate(max_retries, fail_on_retry_exceed = true,
+  def retry_operate(max_retries, fail_on_retry_exceed = true, catch_transport_exceptions = true)
     return unless block_given?
     retries = 0
-    transport_errors = OpenSearch::Transport::Transport::Errors.constants.map{ |c| OpenSearch::Transport::Transport::Errors.const_get c } if
+    transport_errors = OpenSearch::Transport::Transport::Errors.constants.map{ |c| OpenSearch::Transport::Transport::Errors.const_get c } if catch_transport_exceptions
     begin
       yield
     rescue *host_unreachable_exceptions, *transport_errors, Timeout::Error => e
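The new `catch_transport_exceptions` argument controls whether the transport error classes are added to the rescue list at all; when the list is `nil`, the splat in the `rescue` clause expands to nothing and those errors propagate to the caller. A runnable sketch of the same switch (hypothetical `FakeTransportError` and `retry_operate_sketch` standing in for the `OpenSearch::Transport` error classes and the real `retry_operate`):

```ruby
require 'timeout'

class FakeTransportError < StandardError; end

# Sketch: when catch_transport_exceptions is false, transport_errors stays
# nil, `*transport_errors` splats to an empty list, and only Timeout::Error
# is rescued, so transport errors propagate.
def retry_operate_sketch(max_retries, catch_transport_exceptions = true)
  transport_errors = [FakeTransportError] if catch_transport_exceptions
  retries = 0
  begin
    yield
  rescue *transport_errors, Timeout::Error => e
    if retries < max_retries
      retries += 1
      retry
    end
    raise e
  end
end
```

This mirrors the existing `*host_unreachable_exceptions` splat pattern already used by the method.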
data/lib/fluent/plugin/out_opensearch.rb CHANGED

@@ -110,6 +110,7 @@ module Fluent::Plugin
     config_param :logstash_prefix_separator, :string, :default => '-'
     config_param :logstash_dateformat, :string, :default => "%Y.%m.%d"
     config_param :utc_index, :bool, :default => true
+    config_param :suppress_type_name, :bool, :default => false
     config_param :index_name, :string, :default => "fluentd"
     config_param :id_key, :string, :default => nil
     config_param :write_operation, :string, :default => "index"

@@ -160,6 +161,8 @@ module Fluent::Plugin
     config_param :validate_client_version, :bool, :default => false
     config_param :prefer_oj_serializer, :bool, :default => false
     config_param :unrecoverable_error_types, :array, :default => ["out_of_memory_error", "rejected_execution_exception"]
+    config_param :unrecoverable_record_types, :array, :default => ["json_parse_exception"]
+    config_param :emit_error_label_event, :bool, :default => true
     config_param :verify_os_version_at_startup, :bool, :default => true
     config_param :default_opensearch_version, :integer, :default => DEFAULT_OPENSEARCH_VERSION
     config_param :log_os_400_reason, :bool, :default => false

@@ -219,8 +222,8 @@ module Fluent::Plugin
       if conf[:assume_role_arn].nil?
         aws_container_credentials_relative_uri = conf[:ecs_container_credentials_relative_uri] || ENV["AWS_CONTAINER_CREDENTIALS_RELATIVE_URI"]
         if aws_container_credentials_relative_uri.nil?
-          credentials = Aws::SharedCredentials.new({retries: 2}).credentials
-          credentials ||= Aws::InstanceProfileCredentials.new.credentials
+          credentials = Aws::SharedCredentials.new({retries: 2}).credentials rescue nil
+          credentials ||= Aws::InstanceProfileCredentials.new.credentials rescue nil
           credentials ||= Aws::ECSCredentials.new.credentials
         else
           credentials = Aws::ECSCredentials.new({

@@ -462,6 +465,13 @@ module Fluent::Plugin
       placeholder_validities.include?(true)
     end
 
+    def emit_error_label_event(&block)
+      # If `emit_error_label_event` is false, no error events are emitted.
+      if @emit_error_label_event
+        block.call
+      end
+    end
+
     def compression
       !(@compression_level == :no_compression)
     end

@@ -578,7 +588,9 @@ module Fluent::Plugin
     def parse_time(value, event_time, tag)
       @time_parser.call(value)
     rescue => e
-
+      emit_error_label_event do
+        router.emit_error_event(@time_parse_error_tag, Fluent::Engine.now, {'tag' => tag, 'time' => event_time, 'format' => @time_key_format, 'value' => value}, e)
+      end
       return Time.at(event_time).to_datetime
     end

@@ -870,7 +882,9 @@ module Fluent::Plugin
         end
       end
     rescue => e
-
+      emit_error_label_event do
+        router.emit_error_event(tag, time, record, e)
+      end
     end
   end

@@ -978,8 +992,12 @@ module Fluent::Plugin
       end
     end
 
-
-
+    if @suppress_type_name
+      target_type = nil
+    else
+      # OpenSearch only supports "_doc".
+      target_type = DEFAULT_TYPE_NAME
+    end
 
     meta.clear
     meta["_index".freeze] = target_index
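The credential change above relies on Ruby's inline `rescue nil` modifier: a provider that raises is treated as "no credentials here" and the `||=` chain falls through to the next provider. A sketch with lambdas standing in for the Aws credential classes (hypothetical `first_available_credentials` helper):

```ruby
# Sketch of the credential-lookup hardening: try each provider in order,
# turning a raising provider into nil so the chain continues.
def first_available_credentials(providers)
  credentials = nil
  providers.each do |provider|
    credentials ||= (provider.call rescue nil)
  end
  credentials
end

shared           = -> { raise "no shared credentials file" }
instance_profile = -> { nil }            # provider reachable but empty
ecs              = -> { :ecs_credentials }

first_available_credentials([shared, instance_profile, ecs])
```

Note that the inline `rescue` only catches `StandardError` descendants, which matches what the plugin needs here.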
data/lib/fluent/plugin/out_opensearch_data_stream.rb CHANGED

@@ -29,16 +29,13 @@ module Fluent::Plugin
       @data_stream_names = []
     end
 
-    host = data_stream_connection
-
     unless @use_placeholder
       begin
         @data_stream_names = [@data_stream_name]
         retry_operate(@max_retry_putting_template,
                       @fail_on_putting_template_retry_exceed,
                       @catch_transport_exception_on_retry) do
-          create_index_template(@data_stream_name, @data_stream_template_name
-          create_data_stream(@data_stream_name, host)
+          create_index_template(@data_stream_name, @data_stream_template_name)
         end
       rescue => e
         raise Fluent::ConfigError, "Failed to create data stream: <#{@data_stream_name}> #{e.message}"

@@ -46,18 +43,9 @@ module Fluent::Plugin
       end
     end
 
-    # FIXME: Currently, the first element from hosts is only used and extracted.
-    def data_stream_connection
-      if host = get_connection_options[:hosts].first
-        "#{host[:scheme]}://#{host[:host]}:#{host[:port]}#{host[:path]}"
-      else
-        @host
-      end
-    end
-
     def validate_data_stream_parameters
       {"data_stream_name" => @data_stream_name,
-       "data_stream_template_name"=> @data_stream_template_name}.each do |parameter, value|
+       "data_stream_template_name" => @data_stream_template_name}.each do |parameter, value|
        unless valid_data_stream_parameters?(value)
          unless start_with_valid_characters?(value)
            if not_dots?(value)

@@ -80,19 +68,25 @@ module Fluent::Plugin
     end
 
     def create_index_template(datastream_name, template_name, host = nil)
-
-
-
-
-
-
-
-
-
-
-
-
-
+      # Create index template from file
+      if @template_file
+        template_installation_actual(template_name, @customize_template, @application_name, datastream_name, host)
+      else # Create default index template
+        return if data_stream_exist?(datastream_name, host) or template_exists?(template_name, host)
+        body = {
+          "index_patterns" => ["#{datastream_name}*"],
+          "data_stream" => {},
+        }
+
+        params = {
+          name: template_name,
+          body: body
+        }
+        retry_operate(@max_retry_putting_template,
+                      @fail_on_putting_template_retry_exceed,
+                      @catch_transport_exception_on_retry) do
+          client(host).indices.put_index_template(params)
+        end
       end
     end
 

@@ -110,19 +104,6 @@ module Fluent::Plugin
       end
     end
 
-    def create_data_stream(datastream_name, host = nil)
-      return if data_stream_exist?(datastream_name, host)
-      params = {
-        name: datastream_name
-      }
-      retry_operate(@max_retry_putting_template,
-                    @fail_on_putting_template_retry_exceed,
-                    @catch_transport_exception_on_retry) do
-        # TODO: Use X-Pack equivalent performing DataStream operation method on the following line
-        client(host).perform_request('PUT', "/_data_stream/#{datastream_name}", {}, params)
-      end
-    end
-
     def template_exists?(name, host = nil)
       if @use_legacy_template
         client(host).indices.get_template(:name => name)

@@ -169,14 +150,18 @@ module Fluent::Plugin
     def write(chunk)
       data_stream_name = @data_stream_name
       data_stream_template_name = @data_stream_template_name
-      host =
+      host = nil
       if @use_placeholder
+        host = if @hosts
+                 extract_placeholders(@hosts, chunk)
+               else
+                 extract_placeholders(@host, chunk)
+               end
         data_stream_name = extract_placeholders(@data_stream_name, chunk)
         data_stream_template_name = extract_placeholders(@data_stream_template_name, chunk)
         unless @data_stream_names.include?(data_stream_name)
           begin
             create_index_template(data_stream_name, data_stream_template_name, host)
-            create_data_stream(data_stream_name, host)
             @data_stream_names << data_stream_name
           rescue => e
             raise Fluent::ConfigError, "Failed to create data stream: <#{data_stream_name}> #{e.message}"

@@ -191,12 +176,22 @@ module Fluent::Plugin
       tag = chunk.metadata.tag
       chunk.msgpack_each do |time, record|
         next unless record.is_a? Hash
-
         begin
-          record.
+          if record.has_key?(TIMESTAMP_FIELD)
+            rts = record[TIMESTAMP_FIELD]
+            dt = parse_time(rts, time, tag)
+          elsif record.has_key?(@time_key)
+            rts = record[@time_key]
+            dt = parse_time(rts, time, tag)
+          else
+            dt = Time.at(time).to_datetime
+          end
+          record.merge!({"@timestamp" => dt.iso8601(@time_precision)})
           bulk_message = append_record_to_messages(CREATE_OP, {}, headers, record, bulk_message)
         rescue => e
-
+          emit_error_label_event do
+            router.emit_error_event(tag, time, record, e)
+          end
         end
       end
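The new `write` logic resolves the `@timestamp` value in a fixed order: an existing `@timestamp` field in the record wins, then the configured `time_key`, and otherwise the event time is used. A sketch of that resolution order (hypothetical `resolve_timestamp` and `DS_TIMESTAMP_FIELD`; the real code goes through the plugin's `parse_time` and formats with `@time_precision`):

```ruby
require 'date'

# Stand-in for the plugin's TIMESTAMP_FIELD constant ("@timestamp").
DS_TIMESTAMP_FIELD = "@timestamp"

# Sketch of the @timestamp resolution order honored for data streams:
# record's own @timestamp, then the configured time_key, else event time.
def resolve_timestamp(record, event_time, time_key)
  if record.key?(DS_TIMESTAMP_FIELD)
    DateTime.parse(record[DS_TIMESTAMP_FIELD])
  elsif record.key?(time_key)
    DateTime.parse(record[time_key])
  else
    Time.at(event_time).to_datetime
  end
end
```

This is what the new `test_uses_custom_time_key` tests below exercise through the full plugin.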
data/test/plugin/test_opensearch_error_handler.rb CHANGED

@@ -35,14 +35,18 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
     attr_reader :log
     attr_reader :error_events
     attr_accessor :unrecoverable_error_types
+    attr_accessor :unrecoverable_record_types
     attr_accessor :log_os_400_reason
     attr_accessor :write_operation
+    attr_accessor :emit_error_label_event
     def initialize(log, log_os_400_reason = false)
       @log = log
       @write_operation = 'index'
       @error_events = []
       @unrecoverable_error_types = ["out_of_memory_error", "rejected_execution_exception"]
+      @unrecoverable_record_types = ["json_parse_exception"]
       @log_os_400_reason = log_os_400_reason
+      @emit_error_label_event = true
     end
 
     def router

@@ -135,6 +139,44 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
     end
   end
 
+  class TEST400ResponseReasonWithoutErrorLog < self
+    def setup
+      Fluent::Test.setup
+      @log_device = Fluent::Test::DummyLogDevice.new
+      dl_opts = {:log_level => ServerEngine::DaemonLogger::DEBUG}
+      logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
+      @log = Fluent::Log.new(logger)
+      @plugin = TestPlugin.new(@log)
+      @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
+      @plugin.emit_error_label_event = false
+    end
+
+    def test_400_responses_reason_log
+      records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
+      response = parse_response(%({
+        "took" : 0,
+        "errors" : true,
+        "items" : [
+          {
+            "create" : {
+              "_index" : "foo",
+              "status" : 400,
+              "error" : {
+                "type" : "mapper_parsing_exception",
+                "reason" : "failed to parse"
+              }
+            }
+          }
+        ]
+      }))
+      chunk = MockChunk.new(records)
+      dummy_extracted_values = []
+      @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
+      assert_equal(0, @plugin.error_events.size)
+      assert_true(@plugin.error_events.empty?)
+    end
+  end
+
   class TEST400ResponseReasonNoDebug < self
     def setup
       Fluent::Test.setup

@@ -177,6 +219,45 @@ class TestOpenSearchErrorHandler < Test::Unit::TestCase
     end
   end
 
+  class TEST400ResponseReasonNoDebugAndNoErrorLog < self
+    def setup
+      Fluent::Test.setup
+      @log_device = Fluent::Test::DummyLogDevice.new
+      dl_opts = {:log_level => ServerEngine::DaemonLogger::INFO}
+      logger = ServerEngine::DaemonLogger.new(@log_device, dl_opts)
+      @log = Fluent::Log.new(logger)
+      @plugin = TestPlugin.new(@log)
+      @handler = Fluent::Plugin::OpenSearchErrorHandler.new(@plugin)
+      @plugin.log_os_400_reason = true
+      @plugin.emit_error_label_event = false
+    end
+
+    def test_400_responses_reason_log
+      records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
+      response = parse_response(%({
+        "took" : 0,
+        "errors" : true,
+        "items" : [
+          {
+            "create" : {
+              "_index" : "foo",
+              "status" : 400,
+              "error" : {
+                "type" : "mapper_parsing_exception",
+                "reason" : "failed to parse"
+              }
+            }
+          }
+        ]
+      }))
+      chunk = MockChunk.new(records)
+      dummy_extracted_values = []
+      @handler.handle_error(response, 'atag', chunk, records.length, dummy_extracted_values)
+      assert_equal(0, @plugin.error_events.size)
+      assert_true(@plugin.error_events.empty?)
+    end
+  end
+
   def test_nil_items_responses
     records = [{time: 123, record: {"foo" => "bar", '_id' => 'abc'}}]
     response = parse_response(%({
data/test/plugin/test_out_opensearch.rb CHANGED

@@ -2320,23 +2320,11 @@ class OpenSearchOutputTest < Test::Unit::TestCase
   end
 
   data(
-    "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc"},
+    "OpenSearch default"=> {"os_version" => 1, "_type" => "_doc", "suppress_type" => false},
+    "Suppressed type" => {"os_version" => 1, "_type" => nil, "suppress_type" => true},
   )
   def test_writes_to_speficied_type(data)
-    driver('', data["os_version"]).configure("")
-    stub_opensearch
-    stub_opensearch_info
-    driver.run(default_tag: 'test') do
-      driver.feed(sample_record)
-    end
-    assert_equal(data['_type'], index_cmds.first['index']['_type'])
-  end
-
-  data(
-    "OpenSearch 1"=> {"os_version" => 1, "_type" => "_doc"},
-  )
-  def test_writes_to_speficied_type_with_placeholders(data)
-    driver('', data["os_version"]).configure("")
+    driver('', data["os_version"]).configure("suppress_type_name #{data['suppress_type']}")
     stub_opensearch
     stub_opensearch_info
     driver.run(default_tag: 'test') do
@@ -9,7 +9,7 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
|
|
9
9
|
include FlexMock::TestCase
|
10
10
|
include Fluent::Test::Helpers
|
11
11
|
|
12
|
-
attr_accessor :bulk_records
|
12
|
+
attr_accessor :bulk_records, :index_cmds
|
13
13
|
|
14
14
|
OPENSEARCH_DATA_STREAM_TYPE = "opensearch_data_stream"
|
15
15
|
|
@@ -96,12 +96,14 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
|
|
96
96
|
# {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
|
97
97
|
# {"@timestamp": ...}
|
98
98
|
@bulk_records += req.body.split("\n").size / 2
|
99
|
+
@index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
|
99
100
|
end
|
100
101
|
stub_request(:post, "http://#{url}#{template_name}/_bulk").with do |req|
|
101
102
|
# bulk data must be pair of OP and records
|
102
103
|
# {"create": {}}\nhttp://localhost:9200/_data_stream/foo_bar
|
103
104
|
# {"@timestamp": ...}
|
104
105
|
@bulk_records += req.body.split("\n").size / 2
|
106
|
+
@index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
|
105
107
|
end
|
106
108
|
end
|
107
109
|
|
@@ -118,6 +120,18 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
|
|
118
120
|
stub_data_stream(datastream_name)
|
119
121
|
end
|
120
122
|
|
123
|
+
def stub_opensearch_with_store_index_command_counts(url="http://localhost:9200/_bulk")
|
124
|
+
if @index_command_counts == nil
|
125
|
+
@index_command_counts = {}
|
126
|
+
@index_command_counts.default = 0
|
127
|
+
end
|
128
|
+
|
129
|
+
stub_request(:post, url).with do |req|
|
130
|
+
index_cmds = req.body.split("\n").map {|r| JSON.parse(r) }
|
131
|
+
@index_command_counts[url] += index_cmds.size
|
132
|
+
end
|
133
|
+
end
|
134
|
+
|
121
135
|
# ref. https://opensearch.org/docs/latest/opensearch/data-streams/
|
122
136
|
class DataStreamNameTest < self
|
123
137
|
|
@@ -329,9 +343,9 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
|
|
329
343
|
data_stream_name default
|
330
344
|
}
|
331
345
|
stub_opensearch_info("https://host1:443/elastic//", "1.2.2",
|
332
|
-
{'Authorization'=>
|
346
|
+
{'Authorization'=>"Basic #{Base64.encode64('john:password').split.first}"})
|
333
347
|
stub_opensearch_info("http://host2/default_path/_data_stream/default", "1.2.2",
|
334
|
-
{'Authorization'=>
|
348
|
+
{'Authorization'=>"Basic #{Base64.encode64('john:password').split.first}"})
|
335
349
|
stub_existent_data_stream?("default", "https://host1/elastic/")
|
336
350
|
instance = driver(config).instance
|
337
351
|
|
@@ -367,6 +381,19 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
|
|
367
381
|
assert_equal "foo", driver(conf).instance.data_stream_name
|
368
382
|
end
|
369
383
|
|
384
|
+
def test_template_file
|
385
|
+
stub_default
|
386
|
+
cwd = File.dirname(__FILE__)
|
387
|
+
conf = config_element(
|
388
|
+
'ROOT', '', {
|
389
|
+
'@type' => OPENSEARCH_DATA_STREAM_TYPE,
|
390
|
+
'data_stream_name' => 'foo',
|
391
|
+
'data_stream_template_name' => "foo_tpl",
|
392
|
+
'template_file' => File.join(cwd, 'datastream_template.json')
|
393
|
+
})
|
394
|
+
assert_equal "foo", driver(conf).instance.data_stream_name
|
395
|
+
end
|
396
|
+
|
370
397
|
def test_existent_data_stream
|
371
398
|
stub_index_template
|
372
399
|
stub_existent_data_stream?
|
@@ -484,13 +511,56 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
       '@type' => OPENSEARCH_DATA_STREAM_TYPE,
       'data_stream_name' => 'foo',
       'data_stream_template_name' => 'foo_tpl'
-
+      })
     driver(conf).run(default_tag: 'test') do
       driver.feed(sample_record)
     end
     assert_equal 1, @bulk_records
   end
 
+  def test_placeholder_writes_to_multi_hosts
+    stub_default("foo_bar", "foo_tpl_bar")
+    hosts = [['192.168.33.50', 9201], ['192.168.33.51', 9201], ['192.168.33.52', 9201]]
+    hosts_string = hosts.map {|x| "#{x[0]}:#{x[1]}"}.compact.join(',')
+    hosts.each do |host_info|
+      host, port = host_info
+      stub_opensearch_with_store_index_command_counts("http://#{host}:#{port}/foo_bar/_bulk")
+      stub_opensearch_info("http://#{host}:#{port}/")
+      stub_request(:get, "http://#{host}:#{port}/_data_stream/foo_bar").
+        to_return(status: 200, body: "", headers: {})
+    end
+
+    conf = config_element(
+      'ROOT', '', {
+        '@type' => OPENSEARCH_DATA_STREAM_TYPE,
+        'data_stream_name' => 'foo_${key1}',
+        'data_stream_template_name' => 'foo_tpl_${key1}',
+        'hosts' => "#{hosts_string}"
+      }, [config_element('buffer', 'tag,key1', {
+        'timekey' => '1d'
+      }, [])])
+    driver(conf).run(default_tag: 'test') do
+      hashes = {
+        'age' => rand(100),
+        'key1' => 'bar'
+      }
+      1000.times do
+        driver.feed(sample_record.merge(hashes))
+      end
+    end
+
+    # @note: we cannot make multi chunks with options (flush_interval, buffer_chunk_limit);
+    # it's the Fluentd test driver's constraint,
+    # so @index_command_counts.size is always 1
+    assert(@index_command_counts.size > 0, "not working with hosts options")
+
+    total = 0
+    @index_command_counts.each do |_, count|
+      total += count
+    end
+    assert_equal(2000, total)
+  end
+
   def test_template_retry_install_fails
     cwd = File.dirname(__FILE__)
     template_file = File.join(cwd, 'test_index_template.json')
@@ -523,4 +593,72 @@ class OpenSearchOutputDataStreamTest < Test::Unit::TestCase
 
     assert_equal(4, connection_resets)
   end
+
+  def test_uses_custom_time_key
+    stub_default
+    stub_bulk_feed
+    conf = config_element(
+      'ROOT', '', {
+        '@type' => OPENSEARCH_DATA_STREAM_TYPE,
+        'data_stream_name' => 'foo',
+        'data_stream_template_name' => 'foo_tpl',
+        'time_key' => 'vtm'
+      })
+
+    ts = DateTime.new(2021,2,3).iso8601(9)
+    record = {
+      'vtm' => ts,
+      'message' => 'Sample Record'
+    }
+
+    driver(conf).run(default_tag: 'test') do
+      driver.feed(record)
+    end
+    assert(index_cmds[1].has_key? '@timestamp')
+    assert_equal(ts, index_cmds[1]['@timestamp'])
+  end
+
+  def test_uses_custom_time_key_with_format
+    stub_default
+    stub_bulk_feed
+    conf = config_element(
+      'ROOT', '', {
+        '@type' => OPENSEARCH_DATA_STREAM_TYPE,
+        'data_stream_name' => 'foo',
+        'data_stream_template_name' => 'foo_tpl',
+        'time_key' => 'vtm',
+        'time_key_format' => '%Y-%m-%d %H:%M:%S.%N%z'
+      })
+    ts = "2021-02-03 13:14:01.673+02:00"
+    record = {
+      'vtm' => ts,
+      'message' => 'Sample Record'
+    }
+    driver(conf).run(default_tag: 'test') do
+      driver.feed(record)
+    end
+    assert(index_cmds[1].has_key? '@timestamp')
+    assert_equal(DateTime.parse(ts).iso8601(9), index_cmds[1]['@timestamp'])
+  end
+
+  def test_record_no_timestamp
+    stub_default
+    stub_bulk_feed
+    stub_default
+    stub_bulk_feed
+    conf = config_element(
+      'ROOT', '', {
+        '@type' => OPENSEARCH_DATA_STREAM_TYPE,
+        'data_stream_name' => 'foo',
+        'data_stream_template_name' => 'foo_tpl'
+      })
+    record = {
+      'message' => 'Sample Record'
+    }
+    driver(conf).run(default_tag: 'test') do
+      driver.feed(record)
+    end
+    assert(index_cmds[1].has_key? '@timestamp')
+  end
+
 end
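The new `time_key` tests above verify that when a record carries its own timestamp field, that field (optionally parsed via `time_key_format`) becomes the event's `@timestamp`, and that records without one still get a timestamp. A minimal sketch of that mapping, assuming a hypothetical `timestamp_for` helper rather than the plugin's actual internals:

```ruby
require 'date'

# Hypothetical helper illustrating the @timestamp behavior the time_key
# tests exercise (not the plugin's real implementation):
# - if `time_key` names a field present in the record, use its value;
# - with `time_key_format`, parse that value strictly and re-emit it as
#   ISO 8601 with nanosecond precision (iso8601(9));
# - otherwise fall back to the event time.
def timestamp_for(record, time_key: nil, time_key_format: nil, event_time: DateTime.now)
  value = time_key && record[time_key]
  return event_time.iso8601(9) unless value

  if time_key_format
    DateTime.strptime(value, time_key_format).iso8601(9)
  else
    value  # already a timestamp string; pass through unchanged
  end
end

record = { 'vtm' => '2021-02-03 13:14:01.673+02:00', 'message' => 'Sample Record' }
puts timestamp_for(record, time_key: 'vtm', time_key_format: '%Y-%m-%d %H:%M:%S.%N%z')
# => 2021-02-03T13:14:01.673000000+02:00
```

This mirrors the assertions in `test_uses_custom_time_key_with_format`, where the parsed value is compared against `DateTime.parse(ts).iso8601(9)`.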
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: fluent-plugin-opensearch
 version: !ruby/object:Gem::Version
-  version: 1.0.
+  version: 1.0.4
 platform: ruby
 authors:
 - Hiroshi Hatake
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2022-
+date: 2022-04-12 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: fluentd
@@ -208,6 +208,7 @@ files:
 - lib/fluent/plugin/out_opensearch.rb
 - lib/fluent/plugin/out_opensearch_data_stream.rb
 - test/helper.rb
+- test/plugin/datastream_template.json
 - test/plugin/test_alias_template.json
 - test/plugin/test_filter_opensearch_genid.rb
 - test/plugin/test_in_opensearch.rb
@@ -241,12 +242,13 @@ required_rubygems_version: !ruby/object:Gem::Requirement
   - !ruby/object:Gem::Version
     version: '0'
 requirements: []
-rubygems_version: 3.2.
+rubygems_version: 3.2.32
 signing_key:
 specification_version: 4
 summary: Opensearch output plugin for Fluent event collector
 test_files:
 - test/helper.rb
+- test/plugin/datastream_template.json
 - test/plugin/test_alias_template.json
 - test/plugin/test_filter_opensearch_genid.rb
 - test/plugin/test_in_opensearch.rb