logstash-output-elasticsearch 11.7.0-java → 11.9.0-java

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 0002c4b7258cc2493f3ab5e1ddadf6dc3f3e6d51c3585af9c93f9d9a578cf01f
- data.tar.gz: 1f4b7e4587ee5831ac32f499dba2e73324f5b00578db20369b46e2b53b0fa874
+ metadata.gz: ad30f3f3af7faaebf3409ac23375e99531def7d30940330f0f617e007f341b1d
+ data.tar.gz: d06a9954211995b266d08cfbc6586164275d5d269ef5716ba18d4f7e3ca4cf5c
  SHA512:
- metadata.gz: a45147a1fc1f7d73bae911f05417d36150796edae9bd7622c11dbeb35bebbc938c3eddc7d7f65652743dfe94425ba4b1cc6905eaae4390116e2fa19d5c1f4be3
- data.tar.gz: 2ad8fec97436ac571436407711793d8a97055f7a2fbe986ce9f75417d0ce414a01467c900c47c08e1caa7eb6ffd7884edcabedb12b3684c5b40ed543c040fc21
+ metadata.gz: ad84072b2dccc3d3a567b18cc89bbd3390e93303e952188053a8f61cbefbddb0a8bf46453fb8df0b94cecaf29304452b85fbe7a822bb9ece0763c4278b514ff91
+ data.tar.gz: 348d53c674b7b2730ce918403e705ddbda5f9745ca2564e922d3762bff9de7a455ff871c269669dc1e74d5e0c5f89c7997221eca4c851a073b1540b2a8bbdc15
data/CHANGELOG.md CHANGED
@@ -1,3 +1,9 @@
+ ## 11.9.0
+ - Feature: force unresolved dynamic index names to be sent to the DLQ. This feature can be explicitly disabled with the `dlq_on_failed_indexname_interpolation` setting [#1084](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1084)
+
+ ## 11.8.0
+ - Feature: Adds a new `dlq_custom_codes` option to customize DLQ codes [#1067](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1067)
+
  ## 11.7.0
  - Feature: deprecates the `failure_type_logging_whitelist` configuration option, renaming it `silence_errors_in_log` [#1068](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1068)
data/docs/index.asciidoc CHANGED
@@ -196,6 +196,8 @@ processed at a later time. Often times, the offending field can be removed and
  re-indexed to Elasticsearch. If the DLQ is not enabled, and a mapping error
  happens, the problem is logged as a warning, and the event is dropped. See
  <<dead-letter-queues>> for more information about processing events in the DLQ.
+ The list of error codes accepted for the DLQ can be customized with <<plugins-{type}s-{plugin}-dlq_custom_codes>>,
+ but this should be done only in well-justified cases.

  [id="plugins-{type}s-{plugin}-ilm"]
  ==== Index Lifecycle Management
@@ -317,6 +319,8 @@ This plugin supports the following configuration options plus the
  | <<plugins-{type}s-{plugin}-data_stream_namespace>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-data_stream_sync_fields>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-data_stream_type>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-dlq_custom_codes>> |<<number,number>>|No
+ | <<plugins-{type}s-{plugin}-dlq_on_failed_indexname_interpolation>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-doc_as_upsert>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-document_id>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-document_type>> |<<string,string>>|No
@@ -519,6 +523,25 @@ overwritten with a warning.
  The data stream type used to construct the data stream at index time.
  Currently, only `logs`, `metrics`, `synthetics` and `traces` are supported.

+ [id="plugins-{type}s-{plugin}-dlq_custom_codes"]
+ ===== `dlq_custom_codes`
+
+ * Value type is <<number,number>>
+ * Default value is `[]`.
+
+ A list of single-action error codes from Elasticsearch's Bulk API that are considered valid reasons to move an event into the dead letter queue.
+ This list is in addition to the error codes already handled by this feature, 400 and 404.
+ It is a configuration error to re-use codes that are already defined for success, DLQ, or conflict handling.
+ The option accepts a list of natural numbers corresponding to HTTP error codes.
+
+ [id="plugins-{type}s-{plugin}-dlq_on_failed_indexname_interpolation"]
+ ===== `dlq_on_failed_indexname_interpolation`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `true`.
+
+ If enabled, events whose index name interpolation fails are sent to the dead letter queue.
+
  [id="plugins-{type}s-{plugin}-doc_as_upsert"]
  ===== `doc_as_upsert`
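As a rough sketch of how the two new options combine, here is an instantiation in the style of the plugin's own specs; the host, index pattern, and the extra 403 code are placeholder values chosen for illustration, not values taken from this release:

    require "logstash/outputs/elasticsearch"

    options = {
      "hosts" => [ "localhost:9200" ],                 # placeholder host
      "index" => "%{[service]}-logs",                  # dynamic index name resolved per event
      "dlq_custom_codes" => [ 403 ],                   # send 403 bulk errors to the DLQ in addition to 400/404
      "dlq_on_failed_indexname_interpolation" => true  # default: unresolved index names also go to the DLQ
    }
    output = LogStash::Outputs::ElasticSearch.new(options)

    # register raises LogStash::ConfigurationError if a custom code clashes with the
    # predefined success, DLQ, or conflict codes, or if custom codes are set while the DLQ is disabled.
    output.register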
data/lib/logstash/outputs/elasticsearch.rb CHANGED
@@ -9,6 +9,7 @@ require "socket" # for Socket.gethostname
  require "thread" # for safe queueing
  require "uri" # for escaping user input
  require "forwardable"
+ require "set"

  # .Compatibility Note
  # [NOTE]
@@ -255,6 +256,14 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base
  # ILM policy to use, if undefined the default policy will be used.
  config :ilm_policy, :validate => :string, :default => DEFAULT_POLICY

+ # Extra HTTP error codes that are considered valid reasons to move events into the dead letter queue.
+ # It is a configuration error to re-use codes already defined for success, DLQ, or conflict handling.
+ # The option accepts a list of natural numbers corresponding to HTTP error codes.
+ config :dlq_custom_codes, :validate => :number, :list => true, :default => []
+
+ # If enabled, events whose index name interpolation fails are sent to the dead letter queue.
+ config :dlq_on_failed_indexname_interpolation, :validate => :boolean, :default => true
+
  attr_reader :client
  attr_reader :default_index
  attr_reader :default_ilm_rollover_alias
@@ -301,6 +310,15 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base
  # To support BWC, we check if DLQ exists in core (< 5.4). If it doesn't, we use nil to resort to previous behavior.
  @dlq_writer = dlq_enabled? ? execution_context.dlq_writer : nil

+ @dlq_codes = DOC_DLQ_CODES.to_set
+
+ if dlq_enabled?
+   check_dlq_custom_codes
+   @dlq_codes.merge(dlq_custom_codes)
+ else
+   raise LogStash::ConfigurationError, "DLQ feature (dlq_custom_codes) is configured while DLQ is not enabled" unless dlq_custom_codes.empty?
+ end
+
  if data_stream_config?
    @event_mapper = -> (e) { data_stream_event_action_tuple(e) }
    @event_target = -> (e) { data_stream_name(e) }
@@ -347,11 +365,43 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base
  # Receive an array of events and immediately attempt to index them (no buffering)
  def multi_receive(events)
    wait_for_successful_connection if @after_successful_connection_done
-   retrying_submit map_events(events)
+   events_mapped = safe_interpolation_map_events(events)
+   retrying_submit(events_mapped.successful_events)
+   unless events_mapped.failed_events.empty?
+     @logger.error("Can't map some events, needs to be handled by DLQ #{events_mapped.failed_events}")
+     send_failed_resolutions_to_dlq(events_mapped.failed_events)
+   end
  end

+ # @param failed_action_tuples [Array<EventActionTuple>] tuples whose index name could not be resolved
+ private
+ def send_failed_resolutions_to_dlq(failed_action_tuples)
+   failed_action_tuples.each do |action|
+     handle_dlq_status(action, "warn", "Could not resolve dynamic index")
+   end
+ end
+
+ MapEventsResult = Struct.new(:successful_events, :failed_events)
+
+ private
+ def safe_interpolation_map_events(events)
+   successful_events = [] # list of LogStash::Outputs::ElasticSearch::EventActionTuple
+   failed_events = []     # list of EventActionTuple whose index name failed to interpolate
+   events.each do |event|
+     begin
+       successful_events << @event_mapper.call(event)
+     rescue IndexInterpolationError => e
+       action = event.sprintf(@action || 'index')
+       event_action_tuple = EventActionTuple.new(action, [], event)
+       failed_events << event_action_tuple
+     end
+   end
+   MapEventsResult.new(successful_events, failed_events)
+ end
+
+ public
  def map_events(events)
-   events.map(&@event_mapper)
+   safe_interpolation_map_events(events).successful_events
  end

  def wait_for_successful_connection
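A minimal sketch of this partitioning, assuming the hypothetical `output` above but registered with `"index" => "%{[index_name]}_dynamic"` (event contents are made up for illustration):

    resolved   = LogStash::Event.new("message" => "ok", "index_name" => "test")
    unresolved = LogStash::Event.new("message" => "oops")   # no [index_name] field to interpolate

    result = output.send(:safe_interpolation_map_events, [resolved, unresolved])
    result.successful_events.size  # => 1, an EventActionTuple targeting "test_dynamic", submitted via retrying_submit
    result.failed_events.size      # => 1, an EventActionTuple kept aside and written to the DLQ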
@@ -426,12 +476,24 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base

  end

+ class IndexInterpolationError < ArgumentError
+   attr_reader :bad_formatted_index
+
+   def initialize(bad_formatted_index)
+     super("Badly formatted index, after interpolation still contains placeholder: [#{bad_formatted_index}]")
+
+     @bad_formatted_index = bad_formatted_index
+   end
+ end
+
  # @return Hash (initial) parameters for given event
  # @private shared event params factory between index and data_stream mode
  def common_event_params(event)
+   sprintf_index = @event_target.call(event)
+   raise IndexInterpolationError, sprintf_index if sprintf_index.match(/%{.*?}/) && dlq_on_failed_indexname_interpolation
    params = {
      :_id => @document_id ? event.sprintf(@document_id) : nil,
-     :_index => @event_target.call(event),
+     :_index => sprintf_index,
      routing_field_name => @routing ? event.sprintf(@routing) : nil
    }
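The guard added above is a plain pattern match on the interpolated index name; the same check in isolation:

    UNRESOLVED_PLACEHOLDER = /%{.*?}/

    "test_dynamic".match(UNRESOLVED_PLACEHOLDER)             # => nil, the name resolved, the event is indexed normally
    "%{[index_name]}_dynamic".match(UNRESOLVED_PLACEHOLDER)  # => MatchData, so IndexInterpolationError is raised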
@@ -539,4 +601,15 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base

  raise LogStash::ConfigurationError, "Action '#{@action}' is invalid! Pick one of #{valid_actions} or use a sprintf style statement"
  end
+
+ def check_dlq_custom_codes
+   intersection = dlq_custom_codes & DOC_DLQ_CODES
+   raise LogStash::ConfigurationError, "#{intersection} are already defined as standard DLQ error codes" unless intersection.empty?
+
+   intersection = dlq_custom_codes & DOC_SUCCESS_CODES
+   raise LogStash::ConfigurationError, "#{intersection} are success codes which cannot be redefined in dlq_custom_codes" unless intersection.empty?
+
+   intersection = dlq_custom_codes & [DOC_CONFLICT_CODE]
+   raise LogStash::ConfigurationError, "#{intersection} are error codes already defined as conflict which cannot be redefined in dlq_custom_codes" unless intersection.empty?
+ end
  end
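The validation above reduces to array intersections against the plugin's predefined code sets; the constant values shown here (400/404 for the DLQ, 200/201 for success, 409 for conflict) are assumptions based on the docs and common HTTP semantics, not values taken from this diff:

    DOC_DLQ_CODES     = [400, 404]   # assumed
    DOC_SUCCESS_CODES = [200, 201]   # assumed
    DOC_CONFLICT_CODE = 409          # assumed

    [403]      & DOC_DLQ_CODES        # => []     -> 403 is accepted as a custom DLQ code
    [404, 403] & DOC_DLQ_CODES        # => [404]  -> ConfigurationError: already a standard DLQ code
    [201]      & DOC_SUCCESS_CODES    # => [201]  -> ConfigurationError: success codes cannot be redefined
    [409]      & [DOC_CONFLICT_CODE]  # => [409]  -> ConfigurationError: the conflict code cannot be redefined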
data/lib/logstash/plugin_mixins/elasticsearch/common.rb CHANGED
@@ -206,19 +206,23 @@ module LogStash; module PluginMixins; module ElasticSearch
  doubled > @retry_max_interval ? @retry_max_interval : doubled
  end

- def handle_dlq_status(message, action, status, response)
+ def handle_dlq_response(message, action, status, response)
+   _, action_params = action.event, [action[0], action[1], action[2]]
+
+   # TODO: Change this to send a map with { :status => status, :action => action } in the future
+   detailed_message = "#{message} status: #{status}, action: #{action_params}, response: #{response}"
+
+   log_level = dig_value(response, 'index', 'error', 'type') == 'invalid_index_name_exception' ? :error : :warn
+
+   handle_dlq_status(action, log_level, detailed_message)
+ end
+
+ def handle_dlq_status(action, log_level, message)
    # To support bwc, we check if DLQ exists. otherwise we log and drop event (previous behavior)
    if @dlq_writer
-     event, action = action.event, [action[0], action[1], action[2]]
-     # TODO: Change this to send a map with { :status => status, :action => action } in the future
-     @dlq_writer.write(event, "#{message} status: #{status}, action: #{action}, response: #{response}")
+     @dlq_writer.write(action.event, "#{message}")
    else
-     if dig_value(response, 'index', 'error', 'type') == 'invalid_index_name_exception'
-       level = :error
-     else
-       level = :warn
-     end
-     @logger.send level, message, status: status, action: action, response: response
+     @logger.send log_level, message
    end
  end
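For clarity, the level selection in handle_dlq_response reduces to a check on the bulk item's error type; a standalone equivalent using Hash#dig (the mixin itself uses its dig_value helper):

    response = { 'index' => { 'error' => { 'type' => 'invalid_index_name_exception' } } }
    log_level = response.dig('index', 'error', 'type') == 'invalid_index_name_exception' ? :error : :warn
    # => :error here; any other (or missing) error type falls back to :warn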
@@ -268,8 +272,8 @@ module LogStash; module PluginMixins; module ElasticSearch
      @document_level_metrics.increment(:non_retryable_failures)
      @logger.warn "Failed action", status: status, action: action, response: response if log_failure_type?(error)
      next
-   elsif DOC_DLQ_CODES.include?(status)
-     handle_dlq_status("Could not index event to Elasticsearch.", action, status, response)
+   elsif @dlq_codes.include?(status)
+     handle_dlq_response("Could not index event to Elasticsearch.", action, status, response)
      @document_level_metrics.increment(:non_retryable_failures)
      next
    else
data/logstash-output-elasticsearch.gemspec CHANGED
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
    s.name = 'logstash-output-elasticsearch'
-   s.version = '11.7.0'
+   s.version = '11.9.0'
    s.licenses = ['apache-2.0']
    s.summary = "Stores logs in Elasticsearch"
    s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
data/spec/integration/outputs/index_spec.rb CHANGED
@@ -1,5 +1,6 @@
  require_relative "../../../spec/es_spec_helper"
  require "logstash/outputs/elasticsearch"
+ require 'cgi'

  describe "TARGET_BULK_BYTES", :integration => true do
    let(:target_bulk_bytes) { LogStash::Outputs::ElasticSearch::TARGET_BULK_BYTES }
@@ -45,16 +46,57 @@ describe "TARGET_BULK_BYTES", :integration => true do
    end
  end

- describe "indexing" do
+ def curl_and_get_json_response(url, method: :get, retrieve_err_payload: false); require 'open3'
+   cmd = "curl -s -v --show-error #{curl_opts} -X #{method.to_s.upcase} -k #{url}"
+   begin
+     out, err, status = Open3.capture3(cmd)
+   rescue Errno::ENOENT
+     fail "curl not available, make sure curl binary is installed and available on $PATH"
+   end
+
+   if status.success?
+     http_status = err.match(/< HTTP\/1.1 (\d+)/)[1] || '0' # < HTTP/1.1 200 OK\r\n
+
+     if http_status.strip[0].to_i > 2
+       error = (LogStash::Json.load(out)['error']) rescue nil
+       if error
+         if retrieve_err_payload
+           return error
+         else
+           fail "#{cmd.inspect} received an error: #{http_status}\n\n#{error.inspect}"
+         end
+       else
+         warn out
+         fail "#{cmd.inspect} unexpected response: #{http_status}\n\n#{err}"
+       end
+     end
+
+     LogStash::Json.load(out)
+   else
+     warn out
+     fail "#{cmd.inspect} process failed: #{status}\n\n#{err}"
+   end
+ end
+
+ describe "indexing with sprintf resolution", :integration => true do
    let(:message) { "Hello from #{__FILE__}" }
    let(:event) { LogStash::Event.new("message" => message, "type" => type) }
-   let(:index) { 10.times.collect { rand(10).to_s }.join("") }
+   let(:index) { "%{[index_name]}_dynamic" }
    let(:type) { ESHelper.es_version_satisfies?("< 7") ? "doc" : "_doc" }
-   let(:event_count) { 1 + rand(2) }
-   let(:config) { "not implemented" }
+   let(:event_count) { 1 }
+   let(:user) { "simpleuser" }
+   let(:password) { "abc123" }
+   let(:config) do
+     {
+       "hosts" => [ get_host_port ],
+       "user" => user,
+       "password" => password,
+       "index" => index
+     }
+   end
    let(:events) { event_count.times.map { event }.to_a }
    subject { LogStash::Outputs::ElasticSearch.new(config) }
-
+
    let(:es_url) { "http://#{get_host_port}" }
    let(:index_url) { "#{es_url}/#{index}" }
@@ -63,33 +105,65 @@ describe "indexing" do
    let(:es_admin) { 'admin' } # default user added in ES -> 8.x requires auth credentials for /_refresh etc
    let(:es_admin_pass) { 'elastic' }

-   def curl_and_get_json_response(url, method: :get); require 'open3'
-     cmd = "curl -s -v --show-error #{curl_opts} -X #{method.to_s.upcase} -k #{url}"
-     begin
-       out, err, status = Open3.capture3(cmd)
-     rescue Errno::ENOENT
-       fail "curl not available, make sure curl binary is installed and available on $PATH"
-     end
+   let(:initial_events) { [] }

-     if status.success?
-       http_status = err.match(/< HTTP\/1.1 (\d+)/)[1] || '0' # < HTTP/1.1 200 OK\r\n
+   let(:do_register) { true }

-       if http_status.strip[0].to_i > 2
-         error = (LogStash::Json.load(out)['error']) rescue nil
-         if error
-           fail "#{cmd.inspect} received an error: #{http_status}\n\n#{error.inspect}"
-         else
-           warn out
-           fail "#{cmd.inspect} unexpected response: #{http_status}\n\n#{err}"
-         end
-       end
+   before do
+     subject.register if do_register
+     subject.multi_receive(initial_events) if initial_events
+   end

-       LogStash::Json.load(out)
-     else
-       warn out
-       fail "#{cmd.inspect} process failed: #{status}\n\n#{err}"
+   after do
+     subject.do_close
+   end
+
+   let(:event) { LogStash::Event.new("message" => message, "type" => type, "index_name" => "test") }
+
+   it "should index successfully when field is resolved" do
+     expected_index_name = "test_dynamic"
+     subject.multi_receive(events)
+
+     # curl_and_get_json_response "#{es_url}/_refresh", method: :post
+
+     result = curl_and_get_json_response "#{es_url}/#{expected_index_name}"
+
+     expect(result[expected_index_name]).not_to be(nil)
+   end
+
+   context "when dynamic field doesn't resolve the index_name" do
+     let(:event) { LogStash::Event.new("message" => message, "type" => type) }
+     let(:dlq_writer) { double('DLQ writer') }
+     before { subject.instance_variable_set('@dlq_writer', dlq_writer) }
+
+     it "doesn't create an index name with unresolved placeholders" do
+       expect(dlq_writer).to receive(:write).once.with(event, /Could not resolve dynamic index/)
+       subject.multi_receive(events)
+
+       escaped_index_name = CGI.escape("%{[index_name]}_dynamic")
+       result = curl_and_get_json_response "#{es_url}/#{escaped_index_name}", retrieve_err_payload: true
+       expect(result["root_cause"].first()["type"]).to eq("index_not_found_exception")
    end
  end
+ end
+
+ describe "indexing" do
+   let(:message) { "Hello from #{__FILE__}" }
+   let(:event) { LogStash::Event.new("message" => message, "type" => type) }
+   let(:index) { 10.times.collect { rand(10).to_s }.join("") }
+   let(:type) { ESHelper.es_version_satisfies?("< 7") ? "doc" : "_doc" }
+   let(:event_count) { 1 + rand(2) }
+   let(:config) { "not implemented" }
+   let(:events) { event_count.times.map { event }.to_a }
+   subject { LogStash::Outputs::ElasticSearch.new(config) }
+
+   let(:es_url) { "http://#{get_host_port}" }
+   let(:index_url) { "#{es_url}/#{index}" }
+
+   let(:curl_opts) { nil }
+
+   let(:es_admin) { 'admin' } # default user added in ES -> 8.x requires auth credentials for /_refresh etc
+   let(:es_admin_pass) { 'elastic' }

    let(:initial_events) { [] }
data/spec/unit/outputs/elasticsearch_spec.rb CHANGED
@@ -260,6 +260,24 @@ describe LogStash::Outputs::ElasticSearch do
    end
  end

+ describe "when 'dlq_custom_codes'" do
+   let(:options) { super().merge('dlq_custom_codes' => [404]) }
+   let(:do_register) { false }
+
+   context "contains already defined codes" do
+     it "should raise a configuration error" do
+       expect{ subject.register }.to raise_error(LogStash::ConfigurationError, /are already defined as standard DLQ error codes/)
+     end
+   end
+
+   context "is configured but DLQ is not enabled" do
+     it "raises a configuration error" do
+       allow(subject).to receive(:dlq_enabled?).and_return(false)
+       expect{ subject.register }.to raise_error(LogStash::ConfigurationError, /configured while DLQ is not enabled/)
+     end
+   end
+ end if LOGSTASH_VERSION > '7.0'
+
  describe "#multi_receive" do
    let(:events) { [double("one"), double("two"), double("three")] }
    let(:events_tuples) { [double("one t"), double("two t"), double("three t")] }
@@ -750,6 +768,7 @@ describe LogStash::Outputs::ElasticSearch do

  context 'handling elasticsearch document-level status meant for the DLQ' do
    let(:options) { { "manage_template" => false } }
+   let(:action) { LogStash::Outputs::ElasticSearch::EventActionTuple.new(:action, :params, LogStash::Event.new("foo" => "bar")) }

    context 'when @dlq_writer is nil' do
      before { subject.instance_variable_set '@dlq_writer', nil }
@@ -759,8 +778,7 @@ describe LogStash::Outputs::ElasticSearch do
      it 'should log at ERROR level' do
        subject.instance_variable_set(:@logger, double("logger").as_null_object)
        mock_response = { 'index' => { 'error' => { 'type' => 'invalid_index_name_exception' } } }
-       subject.handle_dlq_status("Could not index event to Elasticsearch.",
-         [:action, :params, :event], :some_status, mock_response)
+       subject.handle_dlq_response("Could not index event to Elasticsearch.", action, :some_status, mock_response)
      end
    end
@@ -768,10 +786,9 @@ describe LogStash::Outputs::ElasticSearch do
      it 'should log at WARN level' do
        logger = double("logger").as_null_object
        subject.instance_variable_set(:@logger, logger)
-       expect(logger).to receive(:warn).with(/Could not index/, hash_including(:status, :action, :response))
+       expect(logger).to receive(:warn).with(a_string_including "Could not index event to Elasticsearch. status: some_status, action: [:action, :params, {")
        mock_response = { 'index' => { 'error' => { 'type' => 'illegal_argument_exception' } } }
-       subject.handle_dlq_status("Could not index event to Elasticsearch.",
-         [:action, :params, :event], :some_status, mock_response)
+       subject.handle_dlq_response("Could not index event to Elasticsearch.", action, :some_status, mock_response)
      end
    end
@@ -779,11 +796,10 @@ describe LogStash::Outputs::ElasticSearch do
      it 'should not fail, but just log a warning' do
        logger = double("logger").as_null_object
        subject.instance_variable_set(:@logger, logger)
-       expect(logger).to receive(:warn).with(/Could not index/, hash_including(:status, :action, :response))
+       expect(logger).to receive(:warn).with(a_string_including "Could not index event to Elasticsearch. status: some_status, action: [:action, :params, {")
        mock_response = { 'index' => {} }
        expect do
-         subject.handle_dlq_status("Could not index event to Elasticsearch.",
-           [:action, :params, :event], :some_status, mock_response)
+         subject.handle_dlq_response("Could not index event to Elasticsearch.", action, :some_status, mock_response)
        end.to_not raise_error
      end
    end
@@ -803,11 +819,11 @@ describe LogStash::Outputs::ElasticSearch do
      expect(dlq_writer).to receive(:write).once.with(event, /Could not index/)
      mock_response = { 'index' => { 'error' => { 'type' => 'illegal_argument_exception' } } }
      action = LogStash::Outputs::ElasticSearch::EventActionTuple.new(:action, :params, event)
-     subject.handle_dlq_status("Could not index event to Elasticsearch.", action, 404, mock_response)
+     subject.handle_dlq_response("Could not index event to Elasticsearch.", action, 404, mock_response)
    end
  end

- context 'with response status 400' do
+ context 'with error response status' do

    let(:options) { super().merge 'document_id' => '%{foo}' }
@@ -815,11 +831,13 @@ describe LogStash::Outputs::ElasticSearch do

  let(:dlq_writer) { subject.instance_variable_get(:@dlq_writer) }

+ let(:error_code) { 400 }
+
  let(:bulk_response) do
    {
      "took"=>1, "ingest_took"=>11, "errors"=>true, "items"=>
      [{
-       "index"=>{"_index"=>"bar", "_type"=>"_doc", "_id"=>'bar', "status"=>400,
+       "index"=>{"_index"=>"bar", "_type"=>"_doc", "_id"=>'bar', "status" => error_code,
          "error"=>{"type" => "illegal_argument_exception", "reason" => "TEST" }
        }
      }]
@@ -830,21 +848,34 @@ describe LogStash::Outputs::ElasticSearch do
    allow(subject.client).to receive(:bulk_send).and_return(bulk_response)
  end

- it "should write event to DLQ" do
-   expect(dlq_writer).to receive(:write).and_wrap_original do |method, *args|
-     expect( args.size ).to eql 2
+ shared_examples "should write event to DLQ" do
+   it "should write event to DLQ" do
+     expect(dlq_writer).to receive(:write).and_wrap_original do |method, *args|
+       expect( args.size ).to eql 2
+
+       event, reason = *args
+       expect( event ).to be_a LogStash::Event
+       expect( event ).to be events.first
+       expect( reason ).to start_with "Could not index event to Elasticsearch. status: #{error_code}, action: [\"index\""
+       expect( reason ).to match /_id=>"bar".*"foo"=>"bar".*response:.*"reason"=>"TEST"/

-     event, reason = *args
-     expect( event ).to be_a LogStash::Event
-     expect( event ).to be events.first
-     expect( reason ).to start_with 'Could not index event to Elasticsearch. status: 400, action: ["index"'
-     expect( reason ).to match /_id=>"bar".*"foo"=>"bar".*response:.*"reason"=>"TEST"/
+       method.call(*args) # won't hurt to call LogStash::Util::DummyDeadLetterQueueWriter
+     end.once
+
+     event_action_tuples = subject.map_events(events)
+     subject.send(:submit, event_action_tuples)
+   end
+ end
+
+ context "is one of the predefined codes" do
+   include_examples "should write event to DLQ"
+ end

-   method.call(*args) # won't hurt to call LogStash::Util::DummyDeadLetterQueueWriter
-   end.once
+ context "when user customized dlq_custom_codes option" do
+   let(:error_code) { 403 }
+   let(:options) { super().merge 'dlq_custom_codes' => [error_code] }

-   event_action_tuples = subject.map_events(events)
-   subject.send(:submit, event_action_tuples)
+   include_examples "should write event to DLQ"
  end

  end if LOGSTASH_VERSION > '7.0'
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-elasticsearch
  version: !ruby/object:Gem::Version
-   version: 11.7.0
+   version: 11.9.0
  platform: java
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2022-08-30 00:00:00.000000000 Z
+ date: 2022-09-20 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement