logstash-output-elasticsearch 11.16.0-java → 11.18.0-java

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c03b46351b4d27e176925ab6aeac172bd2aa8ba7060e73240059489349190af6
- data.tar.gz: 5e80945b9fd0410587071545e368b0819da30bade44b487db6cbf516410237a1
+ metadata.gz: b42642e174e8a6f0bf30c67cbdf25e27cc8197f2aeb56cb2268fb11f65426a03
+ data.tar.gz: 26ca32e908ef3d42ec4281259ea9a6bf31746031b12c14ad08fbfd189c069c3b
  SHA512:
- metadata.gz: 7b396ead52eefce52b2f4331f92682654ba4532df52a76020897c9a33b043c72e840e2d9c5c6980b7913360101ad64110f9494808fdae39600658d2541d4c699
- data.tar.gz: d038c717aa4a7588545f14379fbb76ad1218f9c213030f0dc2d9853a8074e24816dc664e09a0eadcfc6fd42d9daa173d5f90032ae15f10808a187579f7d9ee93
+ metadata.gz: dd0b5731beb34c4e331a8ea52252c4b53af5d8c62ffb97c106eab961709d0d884a02a228d48a236eee635629d23a6c2243528a20e3f1f7368f609e232f26d7f7
+ data.tar.gz: 79e980b61c3bdd1b3339f1f03eccff4c52fe596fc34e6069fac1239a5c67e17357d118a95d9536bc1937f930f1c60f935858c236447f5b87f8517ec9b79ef53e
data/CHANGELOG.md CHANGED
@@ -1,3 +1,9 @@
+ ## 11.18.0
+ - Added request header `Elastic-Api-Version` for serverless [#1147](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1147)
+
+ ## 11.17.0
+ - Added support for setting the HTTP compression level. Deprecated `http_compression` in favour of `compression_level` and enabled compression level 1 by default. [#1148](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1148)
+
  ## 11.16.0
  - Added support for Serverless Elasticsearch [#1145](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1145)
 
data/docs/index.asciidoc CHANGED
@@ -277,9 +277,9 @@ not reevaluate its DNS value while the keepalive is in effect.
  ==== HTTP Compression
 
  This plugin always reads compressed responses from {es}.
- It _can be configured_ to send compressed bulk requests to {es}.
+ By default, it sends compressed bulk requests to {es}.
 
- If you are concerned about bandwidth, you can enable <<plugins-{type}s-{plugin}-http_compression>> to trade a small amount of CPU capacity for a significant reduction in network IO.
+ If you are concerned about bandwidth, you can set a higher <<plugins-{type}s-{plugin}-compression_level>> to trade CPU capacity for a reduction in network IO.
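For illustration, the sketch below builds the output with a higher compression level the same way this plugin's own specs construct it; the host, index name, and level `9` are placeholder values rather than recommendations.

```ruby
# Hypothetical values throughout; the construction mirrors this plugin's specs.
require "logstash/outputs/elasticsearch"

output = LogStash::Outputs::ElasticSearch.new(
  "hosts"             => ["localhost:9200"],  # placeholder host
  "index"             => "my-logs",           # placeholder index
  "compression_level" => 9                    # 0 = off, 1 = default (fast), 9 = smallest payloads / most CPU
)
output.register   # the configured gzip level is applied to every bulk request body
```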
 
  ==== Authentication
 
@@ -310,6 +310,7 @@ This plugin supports the following configuration options plus the
  | <<plugins-{type}s-{plugin}-ca_trusted_fingerprint>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-cloud_auth>> |<<password,password>>|No
  | <<plugins-{type}s-{plugin}-cloud_id>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-compression_level>> |<<number,number>>, one of `[0 ~ 9]`|No
  | <<plugins-{type}s-{plugin}-custom_headers>> |<<hash,hash>>|No
  | <<plugins-{type}s-{plugin}-data_stream>> |<<string,string>>, one of `["true", "false", "auto"]`|No
  | <<plugins-{type}s-{plugin}-data_stream_auto_routing>> |<<boolean,boolean>>|No
@@ -459,6 +460,17 @@ Cloud ID, from the Elastic Cloud web console. If set `hosts` should not be used.
  For more details, check out the
  {logstash-ref}/connecting-to-cloud.html[Logstash-to-Cloud documentation].
 
+ [id="plugins-{type}s-{plugin}-compression_level"]
+ ===== `compression_level`
+
+ * Value can be any of: `0`, `1`, `2`, `3`, `4`, `5`, `6`, `7`, `8`, `9`
+ * Default value is `1`
+
+ The gzip compression level. Setting this value to `0` disables compression.
+ The compression level must be in the range of `1` (best speed) to `9` (best compression).
+
+ Increasing the compression level will reduce the network usage but will increase the CPU usage.
+
  [id="plugins-{type}s-{plugin}-data_stream"]
  ===== `data_stream`
 
@@ -618,7 +630,7 @@ NOTE: Deprecated, refer to <<plugins-{type}s-{plugin}-silence_errors_in_log>>.
  Pass a set of key value pairs as the headers sent in each request to
  an elasticsearch node. The headers will be used for any kind of request
  (_bulk request, template installation, health checks and sniffing).
- These custom headers will be overidden by settings like `http_compression`.
+ These custom headers will be overridden by settings like `compression_level`.
 
  [id="plugins-{type}s-{plugin}-healthcheck_path"]
  ===== `healthcheck_path`
@@ -659,11 +671,12 @@ Any special characters present in the URLs here MUST be URL escaped! This means
 
  [id="plugins-{type}s-{plugin}-http_compression"]
  ===== `http_compression`
+ deprecated[11.17.0, Replaced by <<plugins-{type}s-{plugin}-compression_level>>]
 
  * Value type is <<boolean,boolean>>
  * Default value is `false`
 
- Enable gzip compression on requests.
+ Setting `true` enables gzip compression level 1 on requests.
 
  This setting allows you to reduce this plugin's outbound network traffic by
  compressing each bulk _request_ to {es}.
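As a rough migration guide for the deprecation above (hosts are placeholders; the mapping is the one introduced in 11.17.0), the legacy boolean corresponds to the new numeric option like this:

```ruby
# Equivalent settings under the 11.17.0 mapping: true => level 1, false => level 0.
legacy_on   = { "hosts" => ["localhost:9200"], "http_compression" => true }
current_on  = { "hosts" => ["localhost:9200"], "compression_level" => 1 }

legacy_off  = { "hosts" => ["localhost:9200"], "http_compression" => false }
current_off = { "hosts" => ["localhost:9200"], "compression_level" => 0 }
```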
@@ -16,6 +16,18 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
16
16
  @response_body = response_body
17
17
  end
18
18
 
19
+ def invalid_eav_header?
20
+ @response_code == 400 && @response_body&.include?(ELASTIC_API_VERSION)
21
+ end
22
+
23
+ def invalid_credentials?
24
+ @response_code == 401
25
+ end
26
+
27
+ def forbidden?
28
+ @response_code == 403
29
+ end
30
+
19
31
  end
20
32
  class HostUnreachableError < Error;
21
33
  attr_reader :original_error, :url
@@ -48,7 +60,9 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
48
60
  :sniffer_delay => 10,
49
61
  }.freeze
50
62
 
51
- BUILD_FLAVOUR_SERVERLESS = 'serverless'.freeze
63
+ BUILD_FLAVOR_SERVERLESS = 'serverless'.freeze
64
+ ELASTIC_API_VERSION = "Elastic-Api-Version".freeze
65
+ DEFAULT_EAV_HEADER = { ELASTIC_API_VERSION => "2023-10-31" }.freeze
52
66
 
53
67
  def initialize(logger, adapter, initial_urls=[], options={})
54
68
  @logger = logger
@@ -77,7 +91,7 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
77
91
  @license_checker = options[:license_checker] || LogStash::PluginMixins::ElasticSearch::NoopLicenseChecker::INSTANCE
78
92
 
79
93
  @last_es_version = Concurrent::AtomicReference.new
80
- @build_flavour = Concurrent::AtomicReference.new
94
+ @build_flavor = Concurrent::AtomicReference.new
81
95
  end
82
96
 
83
97
  def start
@@ -232,39 +246,56 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
232
246
  end
233
247
 
234
248
  def health_check_request(url)
235
- response = perform_request_to_url(url, :head, @healthcheck_path)
236
- raise BadResponseCodeError.new(response.code, url, nil, response.body) unless (200..299).cover?(response.code)
249
+ logger.debug("Running health check to see if an Elasticsearch connection is working",
250
+ :healthcheck_url => url.sanitized.to_s, :path => @healthcheck_path)
251
+ begin
252
+ response = perform_request_to_url(url, :head, @healthcheck_path)
253
+ return response, nil
254
+ rescue ::LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError => e
255
+ logger.warn("Health check failed", code: e.response_code, url: e.url, message: e.message)
256
+ return nil, e
257
+ end
237
258
  end
238
259
 
239
260
  def healthcheck!(register_phase = true)
240
261
  # Try to keep locking granularity low such that we don't affect IO...
241
262
  @state_mutex.synchronize { @url_info.select {|url,meta| meta[:state] != :alive } }.each do |url,meta|
242
263
  begin
243
- logger.debug("Running health check to see if an Elasticsearch connection is working",
244
- :healthcheck_url => url.sanitized.to_s, :path => @healthcheck_path)
245
- health_check_request(url)
264
+ _, health_bad_code_err = health_check_request(url)
265
+ root_response, root_bad_code_err = get_root_path(url) if health_bad_code_err.nil? || register_phase
246
266
 
247
267
  # when called from resurrectionist skip the product check done during register phase
248
268
  if register_phase
249
- if !elasticsearch?(url)
250
- raise LogStash::ConfigurationError, "Could not connect to a compatible version of Elasticsearch"
251
- end
269
+ raise LogStash::ConfigurationError,
270
+ "Could not read Elasticsearch. Please check the credentials" if root_bad_code_err&.invalid_credentials?
271
+ raise LogStash::ConfigurationError,
272
+ "Could not read Elasticsearch. Please check the privileges" if root_bad_code_err&.forbidden?
273
+ # when custom_headers is invalid
274
+ raise LogStash::ConfigurationError,
275
+ "The Elastic-Api-Version header is not valid" if root_bad_code_err&.invalid_eav_header?
276
+ # when it is not Elasticsearch
277
+ raise LogStash::ConfigurationError,
278
+ "Could not connect to a compatible version of Elasticsearch" if root_bad_code_err.nil? && !elasticsearch?(root_response)
279
+
280
+ test_serverless_connection(url, root_response)
252
281
  end
282
+
283
+ raise health_bad_code_err if health_bad_code_err
284
+ raise root_bad_code_err if root_bad_code_err
285
+
253
286
  # If no exception was raised it must have succeeded!
254
287
  logger.warn("Restored connection to ES instance", url: url.sanitized.to_s)
255
- # We reconnected to this node, check its ES version
256
- version_info = get_es_version(url)
257
- es_version = version_info.fetch('number', nil)
258
- build_flavour = version_info.fetch('build_flavor', nil)
259
-
260
- if es_version.nil?
261
- logger.warn("Failed to retrieve Elasticsearch version data from connected endpoint, connection aborted", :url => url.sanitized.to_s)
262
- next
263
- end
288
+
289
+ # We check its ES version
290
+ es_version, build_flavor = parse_es_version(root_response)
291
+ logger.warn("Failed to retrieve Elasticsearch build flavor") if build_flavor.nil?
292
+ logger.warn("Failed to retrieve Elasticsearch version data from connected endpoint, connection aborted", :url => url.sanitized.to_s) if es_version.nil?
293
+ next if es_version.nil?
294
+
264
295
  @state_mutex.synchronize do
265
296
  meta[:version] = es_version
266
297
  set_last_es_version(es_version, url)
267
- set_build_flavour(build_flavour)
298
+ set_build_flavor(build_flavor)
268
299
 
269
300
  alive = @license_checker.appropriate_license?(self, url)
270
301
  meta[:state] = alive ? :alive : :dead
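Condensing the register-phase branches above into a standalone sketch (this is not the plugin's API, only the status-code-to-error mapping the hunk implements):

```ruby
# Simplified restatement of the register-phase checks above; not the plugin's own method.
def register_phase_error_message(response_code, response_body = "")
  case response_code
  when 401 then "Could not read Elasticsearch. Please check the credentials"
  when 403 then "Could not read Elasticsearch. Please check the privileges"
  when 400
    # Only a 400 that mentions the Elastic-Api-Version header counts as an invalid EAV header.
    "The Elastic-Api-Version header is not valid" if response_body.include?("Elastic-Api-Version")
  end
end

register_phase_error_message(401) # => "Could not read Elasticsearch. Please check the credentials"
register_phase_error_message(400, "[Elastic-Api-Version] header value is not valid")
# => "The Elastic-Api-Version header is not valid"
```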
@@ -275,40 +306,21 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
275
306
  end
276
307
  end
277
308
 
278
- def elasticsearch?(url)
309
+ def get_root_path(url, params={})
279
310
  begin
280
- response = perform_request_to_url(url, :get, ROOT_URI_PATH)
311
+ resp = perform_request_to_url(url, :get, ROOT_URI_PATH, params)
312
+ return resp, nil
281
313
  rescue ::LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError => e
282
- return false if response.code == 401 || response.code == 403
283
- raise e
314
+ logger.warn("Elasticsearch main endpoint returns #{e.response_code}", message: e.message, body: e.response_body)
315
+ return nil, e
284
316
  end
285
-
286
- version_info = LogStash::Json.load(response.body)
287
- return false if version_info['version'].nil?
288
-
289
- version = ::Gem::Version.new(version_info["version"]['number'])
290
- return false if version < ::Gem::Version.new('6.0.0')
291
-
292
- if VERSION_6_TO_7.satisfied_by?(version)
293
- return valid_tagline?(version_info)
294
- elsif VERSION_7_TO_7_14.satisfied_by?(version)
295
- build_flavor = version_info["version"]['build_flavor']
296
- return false if build_flavor.nil? || build_flavor != 'default' || !valid_tagline?(version_info)
297
- else
298
- # case >= 7.14
299
- lower_headers = response.headers.transform_keys {|key| key.to_s.downcase }
300
- product_header = lower_headers['x-elastic-product']
301
- return false if product_header != 'Elasticsearch'
302
- end
303
- return true
304
- rescue => e
305
- logger.error("Unable to retrieve Elasticsearch version", url: url.sanitized.to_s, exception: e.class, message: e.message)
306
- false
307
317
  end
308
318
 
309
- def valid_tagline?(version_info)
310
- tagline = version_info['tagline']
311
- tagline == "You Know, for Search"
319
+ def test_serverless_connection(url, root_response)
320
+ _, build_flavor = parse_es_version(root_response)
321
+ params = { :headers => DEFAULT_EAV_HEADER }
322
+ _, bad_code_err = get_root_path(url, params) if build_flavor == BUILD_FLAVOR_SERVERLESS
323
+ raise LogStash::ConfigurationError, "The Elastic-Api-Version header is not valid" if bad_code_err&.invalid_eav_header?
312
324
  end
313
325
 
314
326
  def stop_resurrectionist
@@ -334,6 +346,7 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
334
346
  end
335
347
 
336
348
  def perform_request_to_url(url, method, path, params={}, body=nil)
349
+ params[:headers] = DEFAULT_EAV_HEADER.merge(params[:headers] || {}) if serverless?
337
350
  @adapter.perform_request(url, method, path, params, body)
338
351
  end
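The merge order on the added line above makes the serverless default the base, with caller-supplied headers layered on top. A quick standalone illustration of that `Hash#merge` behavior (header values taken from this diff):

```ruby
# Plain Hash#merge, as used above: caller-supplied keys win on conflict.
DEFAULT_EAV_HEADER = { "Elastic-Api-Version" => "2023-10-31" }.freeze

DEFAULT_EAV_HEADER.merge("User-Agent" => "chromium")
# => {"Elastic-Api-Version"=>"2023-10-31", "User-Agent"=>"chromium"}

DEFAULT_EAV_HEADER.merge("Elastic-Api-Version" => "2024-10-31")
# => {"Elastic-Api-Version"=>"2024-10-31"}   (an explicit caller value replaces the default)
```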
339
352
 
@@ -476,15 +489,6 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
476
489
  end
477
490
  end
478
491
 
479
- def get_es_version(url)
480
- response = perform_request_to_url(url, :get, ROOT_URI_PATH)
481
- return nil unless (200..299).cover?(response.code)
482
-
483
- response = LogStash::Json.load(response.body)
484
-
485
- response.fetch('version', {})
486
- end
487
-
488
492
  def last_es_version
489
493
  @last_es_version.get
490
494
  end
@@ -494,7 +498,7 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
494
498
  end
495
499
 
496
500
  def serverless?
497
- @build_flavour.get == BUILD_FLAVOUR_SERVERLESS
501
+ @build_flavor.get == BUILD_FLAVOR_SERVERLESS
498
502
  end
499
503
 
500
504
  private
@@ -526,9 +530,50 @@ module LogStash; module Outputs; class ElasticSearch; class HttpClient;
526
530
  previous_major: @maximum_seen_major_version, new_major: major, node_url: url.sanitized.to_s)
527
531
  end
528
532
 
529
- def set_build_flavour(flavour)
530
- @build_flavour.set(flavour)
533
+ def set_build_flavor(flavor)
534
+ @build_flavor.set(flavor)
535
+ end
536
+
537
+ def parse_es_version(response)
538
+ return nil, nil unless (200..299).cover?(response&.code)
539
+
540
+ response = LogStash::Json.load(response&.body)
541
+ version_info = response.fetch('version', {})
542
+ es_version = version_info.fetch('number', nil)
543
+ build_flavor = version_info.fetch('build_flavor', nil)
544
+
545
+ return es_version, build_flavor
546
+ end
547
+
548
+ def elasticsearch?(response)
549
+ return false if response.nil?
550
+
551
+ version_info = LogStash::Json.load(response.body)
552
+ return false if version_info['version'].nil?
553
+
554
+ version = ::Gem::Version.new(version_info["version"]['number'])
555
+ return false if version < ::Gem::Version.new('6.0.0')
556
+
557
+ if VERSION_6_TO_7.satisfied_by?(version)
558
+ return valid_tagline?(version_info)
559
+ elsif VERSION_7_TO_7_14.satisfied_by?(version)
560
+ build_flavor = version_info["version"]['build_flavor']
561
+ return false if build_flavor.nil? || build_flavor != 'default' || !valid_tagline?(version_info)
562
+ else
563
+ # case >= 7.14
564
+ lower_headers = response.headers.transform_keys {|key| key.to_s.downcase }
565
+ product_header = lower_headers['x-elastic-product']
566
+ return false if product_header != 'Elasticsearch'
567
+ end
568
+ return true
569
+ rescue => e
570
+ logger.error("Unable to retrieve Elasticsearch version", exception: e.class, message: e.message)
571
+ false
531
572
  end
532
573
 
574
+ def valid_tagline?(version_info)
575
+ tagline = version_info['tagline']
576
+ tagline == "You Know, for Search"
577
+ end
533
578
  end
534
579
  end; end; end; end;
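To summarize the product-detection rules that `parse_es_version` and `elasticsearch?` implement above, here is a hedged standalone sketch using a bare Struct in place of the real response object (version cutoffs follow the VERSION_6_TO_7 and VERSION_7_TO_7_14 ranges referenced above):

```ruby
# Simplified restatement, not the plugin's code.
require "json"

FakeResponse = Struct.new(:code, :body, :headers)

def looks_like_elasticsearch?(resp)
  info    = JSON.parse(resp.body)
  version = Gem::Version.new(info.dig("version", "number") || "0.0.0")

  return false if version < Gem::Version.new("6.0.0")

  if version < Gem::Version.new("7.0.0")
    info["tagline"] == "You Know, for Search"                       # 6.x: tagline only
  elsif version < Gem::Version.new("7.14.0")
    info.dig("version", "build_flavor") == "default" &&
      info["tagline"] == "You Know, for Search"                     # 7.0-7.13: flavor + tagline
  else
    resp.headers.transform_keys { |k| k.to_s.downcase }["x-elastic-product"] == "Elasticsearch"
  end
end

resp = FakeResponse.new(200, { "version" => { "number" => "8.9.0" } }.to_json,
                        { "X-Elastic-Product" => "Elasticsearch" })
looks_like_elasticsearch?(resp)  # => true
```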
@@ -118,7 +118,7 @@ module LogStash; module Outputs; class ElasticSearch;
118
118
  end
119
119
 
120
120
  body_stream = StringIO.new
121
- if http_compression
121
+ if compression_level?
122
122
  body_stream.set_encoding "BINARY"
123
123
  stream_writer = gzip_writer(body_stream)
124
124
  else
@@ -141,14 +141,14 @@ module LogStash; module Outputs; class ElasticSearch;
141
141
  :batch_offset => (index + 1 - batch_actions.size))
142
142
  bulk_responses << bulk_send(body_stream, batch_actions)
143
143
  body_stream.truncate(0) && body_stream.seek(0)
144
- stream_writer = gzip_writer(body_stream) if http_compression
144
+ stream_writer = gzip_writer(body_stream) if compression_level?
145
145
  batch_actions.clear
146
146
  end
147
147
  stream_writer.write(as_json)
148
148
  batch_actions << action
149
149
  end
150
150
 
151
- stream_writer.close if http_compression
151
+ stream_writer.close if compression_level?
152
152
 
153
153
  logger.debug("Sending final bulk request for batch.",
154
154
  :action_count => batch_actions.size,
@@ -157,7 +157,7 @@ module LogStash; module Outputs; class ElasticSearch;
157
157
  :batch_offset => (actions.size - batch_actions.size))
158
158
  bulk_responses << bulk_send(body_stream, batch_actions) if body_stream.size > 0
159
159
 
160
- body_stream.close if !http_compression
160
+ body_stream.close unless compression_level?
161
161
  join_bulk_responses(bulk_responses)
162
162
  end
163
163
 
@@ -165,7 +165,7 @@ module LogStash; module Outputs; class ElasticSearch;
165
165
  fail(ArgumentError, "Cannot create gzip writer on IO with unread bytes") unless io.eof?
166
166
  fail(ArgumentError, "Cannot create gzip writer on non-empty IO") unless io.pos == 0
167
167
 
168
- Zlib::GzipWriter.new(io, Zlib::DEFAULT_COMPRESSION, Zlib::DEFAULT_STRATEGY)
168
+ Zlib::GzipWriter.new(io, client_settings.fetch(:compression_level), Zlib::DEFAULT_STRATEGY)
169
169
  end
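The one-line change above is what makes `compression_level` effective: the bulk body is streamed through a `Zlib::GzipWriter` created with the configured level instead of `Zlib::DEFAULT_COMPRESSION`. A standalone sketch of the effect (payload and sizes are illustrative):

```ruby
# Standalone sketch; mirrors how the plugin closes the writer and reads io.string afterwards.
require "zlib"
require "stringio"

def gzip_body(payload, compression_level)
  io = StringIO.new
  io.set_encoding "BINARY"
  writer = Zlib::GzipWriter.new(io, compression_level, Zlib::DEFAULT_STRATEGY)
  writer.write(payload)
  writer.close          # flushes the gzip trailer into io
  io.string
end

body  = %({"index":{}}\n{"message":"hello"}\n) * 1_000
fast  = gzip_body(body, 1)  # level 1: larger output, less CPU
small = gzip_body(body, 9)  # level 9: smaller output, more CPU
[fast.bytesize, small.bytesize]
```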
170
170
 
171
171
  def join_bulk_responses(bulk_responses)
@@ -176,7 +176,7 @@ module LogStash; module Outputs; class ElasticSearch;
176
176
  end
177
177
 
178
178
  def bulk_send(body_stream, batch_actions)
179
- params = http_compression ? {:headers => {"Content-Encoding" => "gzip"}} : {}
179
+ params = compression_level? ? {:headers => {"Content-Encoding" => "gzip"}} : {}
180
180
  response = @pool.post(@bulk_path, params, body_stream.string)
181
181
 
182
182
  @bulk_response_metrics.increment(response.code.to_s)
@@ -209,7 +209,7 @@ module LogStash; module Outputs; class ElasticSearch;
209
209
  end
210
210
 
211
211
  def get(path)
212
- response = @pool.get(path, nil)
212
+ response = @pool.get(path)
213
213
  LogStash::Json.load(response.body)
214
214
  end
215
215
 
@@ -298,8 +298,10 @@ module LogStash; module Outputs; class ElasticSearch;
298
298
  @_ssl_options ||= client_settings.fetch(:ssl, {})
299
299
  end
300
300
 
301
- def http_compression
302
- client_settings.fetch(:http_compression, false)
301
+ # return true if compression_level is [1..9]
302
+ # return false if it is 0
303
+ def compression_level?
304
+ client_settings.fetch(:compression_level) > 0
303
305
  end
304
306
 
305
307
  def build_adapter(options)
@@ -8,7 +8,7 @@ module LogStash; module Outputs; class ElasticSearch;
8
8
  :pool_max => params["pool_max"],
9
9
  :pool_max_per_route => params["pool_max_per_route"],
10
10
  :check_connection_timeout => params["validate_after_inactivity"],
11
- :http_compression => params["http_compression"],
11
+ :compression_level => params["compression_level"],
12
12
  :headers => params["custom_headers"] || {}
13
13
  }
14
14
 
@@ -276,6 +276,7 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base
276
276
  super
277
277
  setup_ecs_compatibility_related_defaults
278
278
  setup_ssl_params!
279
+ setup_compression_level!
279
280
  end
280
281
 
281
282
  def register
@@ -368,6 +369,7 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base
368
369
  params['proxy'] = proxy # do not do resolving again
369
370
  end
370
371
  end
372
+
371
373
  super(params)
372
374
  end
373
375
 
@@ -594,7 +596,9 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base
594
596
  def install_template
595
597
  TemplateManager.install_template(self)
596
598
  rescue => e
597
- @logger.error("Failed to install template", message: e.message, exception: e.class, backtrace: e.backtrace)
599
+ details = { message: e.message, exception: e.class, backtrace: e.backtrace }
600
+ details[:body] = e.response_body if e.respond_to?(:response_body)
601
+ @logger.error("Failed to install template", details)
598
602
  end
599
603
 
600
604
  def setup_ecs_compatibility_related_defaults
@@ -669,6 +673,20 @@ class LogStash::Outputs::ElasticSearch < LogStash::Outputs::Base
669
673
  params['ssl_verification_mode'] = @ssl_verification_mode unless @ssl_verification_mode.nil?
670
674
  end
671
675
 
676
+ def setup_compression_level!
677
+ @compression_level = normalize_config(:compression_level) do |normalize|
678
+ normalize.with_deprecated_mapping(:http_compression) do |http_compression|
679
+ if http_compression == true
680
+ DEFAULT_ZIP_LEVEL
681
+ else
682
+ 0
683
+ end
684
+ end
685
+ end
686
+
687
+ params['compression_level'] = @compression_level unless @compression_level.nil?
688
+ end
689
+
672
690
  # To be overidden by the -java version
673
691
  VALID_HTTP_ACTIONS = ["index", "delete", "create", "update"]
674
692
  def valid_actions
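The `setup_compression_level!` hunk above resolves the deprecated boolean into the new numeric option. A simplified illustration of the intent (this is not the `normalize_config` API, just the resulting values under its deprecated-mapping behavior):

```ruby
# Hypothetical helper; mirrors the mapping above (true => DEFAULT_ZIP_LEVEL, false => 0),
# with an explicit compression_level taking precedence.
DEFAULT_ZIP_LEVEL = 1

def effective_compression_level(params)
  return params["compression_level"] if params.key?("compression_level")
  return DEFAULT_ZIP_LEVEL           if params["http_compression"] == true
  return 0                           if params["http_compression"] == false
  DEFAULT_ZIP_LEVEL # neither option set: compression level 1 is the new default
end

effective_compression_level("http_compression" => true)  # => 1
effective_compression_level("compression_level" => 9)    # => 9
effective_compression_level({})                           # => 1
```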
@@ -7,6 +7,7 @@ module LogStash; module PluginMixins; module ElasticSearch
7
7
  # This module defines common options that can be reused by alternate elasticsearch output plugins such as the elasticsearch_data_streams output.
8
8
 
9
9
  DEFAULT_HOST = ::LogStash::Util::SafeURI.new("//127.0.0.1")
10
+ DEFAULT_ZIP_LEVEL = 1
10
11
 
11
12
  CONFIG_PARAMS = {
12
13
  # Username to authenticate to a secure Elasticsearch cluster
@@ -186,7 +187,14 @@ module LogStash; module PluginMixins; module ElasticSearch
186
187
  :validate_after_inactivity => { :validate => :number, :default => 10000 },
187
188
 
188
189
  # Enable gzip compression on requests. Note that response compression is on by default for Elasticsearch v5.0 and beyond
189
- :http_compression => { :validate => :boolean, :default => false },
190
+ # Set `true` to enable compression with level 1
191
+ # Set `false` to disable compression with level 0
192
+ :http_compression => { :validate => :boolean, :default => true, :deprecated => "Set 'compression_level' instead." },
193
+
194
+ # Number `1` ~ `9` are the gzip compression level
195
+ # Set `0` to disable compression
196
+ # Set `1` (best speed) to `9` (best compression) to use compression
197
+ :compression_level => { :validate => [ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 ], :default => DEFAULT_ZIP_LEVEL },
190
198
 
191
199
  # Custom Headers to send on each request to elasticsearch nodes
192
200
  :custom_headers => { :validate => :hash, :default => {} },
@@ -179,7 +179,9 @@ module LogStash; module PluginMixins; module ElasticSearch
179
179
  cluster_info = client.get('/')
180
180
  plugin_metadata.set(:cluster_uuid, cluster_info['cluster_uuid'])
181
181
  rescue => e
182
- @logger.error("Unable to retrieve Elasticsearch cluster uuid", message: e.message, exception: e.class, backtrace: e.backtrace)
182
+ details = { message: e.message, exception: e.class, backtrace: e.backtrace }
183
+ details[:body] = e.response_body if e.respond_to?(:response_body)
184
+ @logger.error("Unable to retrieve Elasticsearch cluster uuid", details)
183
185
  end
184
186
 
185
187
  def retrying_submit(actions)
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-output-elasticsearch'
- s.version = '11.16.0'
+ s.version = '11.18.0'
  s.licenses = ['apache-2.0']
  s.summary = "Stores logs in Elasticsearch"
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -8,63 +8,64 @@ RSpec::Matchers.define :a_valid_gzip_encoded_string do
8
8
  }
9
9
  end
10
10
 
11
- describe "indexing with http_compression turned on", :integration => true do
12
- let(:event) { LogStash::Event.new("message" => "Hello World!", "type" => type) }
13
- let(:index) { 10.times.collect { rand(10).to_s }.join("") }
14
- let(:type) { ESHelper.es_version_satisfies?("< 7") ? "doc" : "_doc" }
15
- let(:event_count) { 10000 + rand(500) }
16
- let(:events) { event_count.times.map { event }.to_a }
17
- let(:config) {
18
- {
19
- "hosts" => get_host_port,
20
- "index" => index,
21
- "http_compression" => true
11
+ [ {"http_compression" => true}, {"compression_level" => 1} ].each do |compression_config|
12
+ describe "indexing with http_compression turned on", :integration => true do
13
+ let(:event) { LogStash::Event.new("message" => "Hello World!", "type" => type) }
14
+ let(:index) { 10.times.collect { rand(10).to_s }.join("") }
15
+ let(:type) { ESHelper.es_version_satisfies?("< 7") ? "doc" : "_doc" }
16
+ let(:event_count) { 10000 + rand(500) }
17
+ let(:events) { event_count.times.map { event }.to_a }
18
+ let(:config) {
19
+ {
20
+ "hosts" => get_host_port,
21
+ "index" => index
22
+ }
22
23
  }
23
- }
24
- subject { LogStash::Outputs::ElasticSearch.new(config) }
24
+ subject { LogStash::Outputs::ElasticSearch.new(config.merge(compression_config)) }
25
25
 
26
- let(:es_url) { "http://#{get_host_port}" }
27
- let(:index_url) {"#{es_url}/#{index}"}
28
- let(:http_client_options) { {} }
29
- let(:http_client) do
30
- Manticore::Client.new(http_client_options)
31
- end
26
+ let(:es_url) { "http://#{get_host_port}" }
27
+ let(:index_url) {"#{es_url}/#{index}"}
28
+ let(:http_client_options) { {} }
29
+ let(:http_client) do
30
+ Manticore::Client.new(http_client_options)
31
+ end
32
32
 
33
- before do
34
- subject.register
35
- subject.multi_receive([])
36
- end
33
+ before do
34
+ subject.register
35
+ subject.multi_receive([])
36
+ end
37
37
 
38
- shared_examples "an indexer" do
39
- it "ships events" do
40
- subject.multi_receive(events)
38
+ shared_examples "an indexer" do
39
+ it "ships events" do
40
+ subject.multi_receive(events)
41
41
 
42
- http_client.post("#{es_url}/_refresh").call
42
+ http_client.post("#{es_url}/_refresh").call
43
43
 
44
- response = http_client.get("#{index_url}/_count?q=*")
45
- result = LogStash::Json.load(response.body)
46
- cur_count = result["count"]
47
- expect(cur_count).to eq(event_count)
44
+ response = http_client.get("#{index_url}/_count?q=*")
45
+ result = LogStash::Json.load(response.body)
46
+ cur_count = result["count"]
47
+ expect(cur_count).to eq(event_count)
48
48
 
49
- response = http_client.get("#{index_url}/_search?q=*&size=1000")
50
- result = LogStash::Json.load(response.body)
51
- result["hits"]["hits"].each do |doc|
52
- if ESHelper.es_version_satisfies?("< 8")
53
- expect(doc["_type"]).to eq(type)
54
- else
55
- expect(doc).not_to include("_type")
49
+ response = http_client.get("#{index_url}/_search?q=*&size=1000")
50
+ result = LogStash::Json.load(response.body)
51
+ result["hits"]["hits"].each do |doc|
52
+ if ESHelper.es_version_satisfies?("< 8")
53
+ expect(doc["_type"]).to eq(type)
54
+ else
55
+ expect(doc).not_to include("_type")
56
+ end
57
+ expect(doc["_index"]).to eq(index)
56
58
  end
57
- expect(doc["_index"]).to eq(index)
58
59
  end
59
60
  end
60
- end
61
61
 
62
- it "sets the correct content-encoding header and body is compressed" do
63
- expect(subject.client.pool.adapter.client).to receive(:send).
64
- with(anything, anything, {:headers=>{"Content-Encoding"=>"gzip", "Content-Type"=>"application/json"}, :body => a_valid_gzip_encoded_string}).
65
- and_call_original
66
- subject.multi_receive(events)
67
- end
62
+ it "sets the correct content-encoding header and body is compressed" do
63
+ expect(subject.client.pool.adapter.client).to receive(:send).
64
+ with(anything, anything, {:headers=>{"Content-Encoding"=>"gzip", "Content-Type"=>"application/json"}, :body => a_valid_gzip_encoded_string}).
65
+ and_call_original
66
+ subject.multi_receive(events)
67
+ end
68
68
 
69
- it_behaves_like("an indexer")
70
- end
69
+ it_behaves_like("an indexer")
70
+ end
71
+ end
@@ -262,7 +262,8 @@ describe "indexing" do
262
262
  let(:config) {
263
263
  {
264
264
  "hosts" => get_host_port,
265
- "index" => index
265
+ "index" => index,
266
+ "http_compression" => false
266
267
  }
267
268
  }
268
269
  it_behaves_like("an indexer")
@@ -273,7 +274,8 @@ describe "indexing" do
273
274
  let(:config) {
274
275
  {
275
276
  "hosts" => get_host_port,
276
- "index" => index
277
+ "index" => index,
278
+ "http_compression" => false
277
279
  }
278
280
  }
279
281
  it_behaves_like("an indexer")
@@ -291,7 +293,8 @@ describe "indexing" do
291
293
  "password" => password,
292
294
  "ssl_enabled" => true,
293
295
  "ssl_certificate_authorities" => cacert,
294
- "index" => index
296
+ "index" => index,
297
+ "http_compression" => false
295
298
  }
296
299
  end
297
300
 
@@ -351,7 +354,8 @@ describe "indexing" do
351
354
  "hosts" => ["https://#{CGI.escape(user)}:#{CGI.escape(password)}@elasticsearch:9200"],
352
355
  "ssl_enabled" => true,
353
356
  "ssl_certificate_authorities" => "spec/fixtures/test_certs/test.crt",
354
- "index" => index
357
+ "index" => index,
358
+ "http_compression" => false
355
359
  }
356
360
  end
357
361
 
@@ -7,8 +7,14 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
7
7
  let(:adapter) { LogStash::Outputs::ElasticSearch::HttpClient::ManticoreAdapter.new(logger, {}) }
8
8
  let(:initial_urls) { [::LogStash::Util::SafeURI.new("http://localhost:9200")] }
9
9
  let(:options) { {:resurrect_delay => 3, :url_normalizer => proc {|u| u}} } # Shorten the delay a bit to speed up tests
10
- let(:es_version_info) { [ { "number" => '0.0.0', "build_flavor" => 'default'} ] }
11
10
  let(:license_status) { 'active' }
11
+ let(:root_response) { MockResponse.new(200,
12
+ {"tagline" => "You Know, for Search",
13
+ "version" => {
14
+ "number" => '8.9.0',
15
+ "build_flavor" => 'default'} },
16
+ { "X-Elastic-Product" => "Elasticsearch" }
17
+ ) }
12
18
 
13
19
  subject { described_class.new(logger, adapter, initial_urls, options) }
14
20
 
@@ -22,7 +28,6 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
22
28
 
23
29
  allow(::Manticore::Client).to receive(:new).and_return(manticore_double)
24
30
 
25
- allow(subject).to receive(:get_es_version).with(any_args).and_return(*es_version_info)
26
31
  allow(subject.license_checker).to receive(:license_status).and_return(license_status)
27
32
  end
28
33
 
@@ -37,35 +42,42 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
37
42
  end
38
43
  end
39
44
 
40
- describe "the resurrectionist" do
41
- before(:each) { subject.start }
42
- it "should start the resurrectionist when created" do
43
- expect(subject.resurrectionist_alive?).to eql(true)
44
- end
45
+ describe "healthcheck" do
45
46
 
46
- it "should attempt to resurrect connections after the ressurrect delay" do
47
- expect(subject).to receive(:healthcheck!).once
48
- sleep(subject.resurrect_delay + 1)
47
+ describe "the resurrectionist" do
48
+ before(:each) { subject.start }
49
+ it "should start the resurrectionist when created" do
50
+ expect(subject.resurrectionist_alive?).to eql(true)
51
+ end
52
+
53
+ it "should attempt to resurrect connections after the ressurrect delay" do
54
+ expect(subject).to receive(:healthcheck!).once
55
+ sleep(subject.resurrect_delay + 1)
56
+ end
49
57
  end
50
58
 
51
- describe "healthcheck url handling" do
59
+ describe "healthcheck path handling" do
52
60
  let(:initial_urls) { [::LogStash::Util::SafeURI.new("http://localhost:9200")] }
53
- let(:success_response) { double("Response", :code => 200) }
61
+ let(:healthcheck_response) { double("Response", :code => 200) }
54
62
 
55
63
  before(:example) do
64
+ subject.start
65
+
66
+ expect(adapter).to receive(:perform_request).with(anything, :head, eq(healthcheck_path), anything, anything) do |url, _, _, _, _|
67
+ expect(url.path).to be_empty
68
+ healthcheck_response
69
+ end
70
+
56
71
  expect(adapter).to receive(:perform_request).with(anything, :get, "/", anything, anything) do |url, _, _, _, _|
57
72
  expect(url.path).to be_empty
73
+ root_response
58
74
  end
59
75
  end
60
76
 
61
77
  context "and not setting healthcheck_path" do
78
+ let(:healthcheck_path) { "/" }
62
79
  it "performs the healthcheck to the root" do
63
- expect(adapter).to receive(:perform_request).with(anything, :head, "/", anything, anything) do |url, _, _, _, _|
64
- expect(url.path).to be_empty
65
-
66
- success_response
67
- end
68
- expect { subject.healthcheck! }.to raise_error(LogStash::ConfigurationError, "Could not connect to a compatible version of Elasticsearch")
80
+ subject.healthcheck!
69
81
  end
70
82
  end
71
83
 
@@ -73,14 +85,116 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
73
85
  let(:healthcheck_path) { "/my/health" }
74
86
  let(:options) { super().merge(:healthcheck_path => healthcheck_path) }
75
87
  it "performs the healthcheck to the healthcheck_path" do
76
- expect(adapter).to receive(:perform_request).with(anything, :head, eq(healthcheck_path), anything, anything) do |url, _, _, _, _|
77
- expect(url.path).to be_empty
88
+ subject.healthcheck!
89
+ end
90
+ end
91
+ end
92
+
93
+ describe "register phase" do
94
+ shared_examples_for "root path returns bad code error" do |err_msg|
95
+ before :each do
96
+ subject.update_initial_urls
97
+ expect(subject).to receive(:elasticsearch?).never
98
+ end
99
+
100
+ it "raises ConfigurationError" do
101
+ expect(subject).to receive(:health_check_request).with(anything).and_return(["", nil])
102
+ expect(subject).to receive(:get_root_path).with(anything).and_return([nil,
103
+ ::LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError.new(mock_resp.code, nil, nil, mock_resp.body)])
104
+ expect { subject.healthcheck! }.to raise_error(LogStash::ConfigurationError, err_msg)
105
+ end
106
+ end
107
+
108
+ context "with 200 without version" do
109
+ let(:mock_resp) { MockResponse.new(200, {"tagline" => "You Know, for Search"}) }
78
110
 
79
- success_response
80
- end
111
+ it "raises ConfigurationError" do
112
+ subject.update_initial_urls
113
+
114
+ expect(subject).to receive(:health_check_request).with(anything).and_return(["", nil])
115
+ expect(subject).to receive(:get_root_path).with(anything).and_return([mock_resp, nil])
81
116
  expect { subject.healthcheck! }.to raise_error(LogStash::ConfigurationError, "Could not connect to a compatible version of Elasticsearch")
82
117
  end
83
118
  end
119
+
120
+ context "with 200 serverless" do
121
+ let(:good_resp) { MockResponse.new(200,
122
+ { "tagline" => "You Know, for Search",
123
+ "version" => { "number" => '8.10.0', "build_flavor" => 'serverless'}
124
+ },
125
+ { "X-Elastic-Product" => "Elasticsearch" }
126
+ ) }
127
+ let(:bad_400_err) do
128
+ ::LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError.new(400,
129
+ nil, nil,
130
+ "The requested [Elastic-Api-Version] header value of [2024-10-31] is not valid. Only [2023-10-31] is supported")
131
+ end
132
+
133
+ it "raises ConfigurationError when the serverless connection test fails" do
134
+ subject.update_initial_urls
135
+
136
+ expect(subject).to receive(:health_check_request).with(anything).and_return(["", nil])
137
+ expect(subject).to receive(:get_root_path).with(anything).and_return([good_resp, nil])
138
+ expect(subject).to receive(:get_root_path).with(anything, hash_including(:headers => LogStash::Outputs::ElasticSearch::HttpClient::Pool::DEFAULT_EAV_HEADER)).and_return([nil, bad_400_err])
139
+ expect { subject.healthcheck! }.to raise_error(LogStash::ConfigurationError, "The Elastic-Api-Version header is not valid")
140
+ end
141
+
142
+ it "passes when the serverless connection test succeeds" do
143
+ subject.update_initial_urls
144
+
145
+ expect(subject).to receive(:health_check_request).with(anything).and_return(["", nil])
146
+ expect(subject).to receive(:get_root_path).with(anything).and_return([good_resp, nil])
147
+ expect(subject).to receive(:get_root_path).with(anything, hash_including(:headers => LogStash::Outputs::ElasticSearch::HttpClient::Pool::DEFAULT_EAV_HEADER)).and_return([good_resp, nil])
148
+ expect { subject.healthcheck! }.not_to raise_error
149
+ end
150
+ end
151
+
152
+ context "with 200 default" do
153
+ let(:good_resp) { MockResponse.new(200,
154
+ { "tagline" => "You Know, for Search",
155
+ "version" => { "number" => '8.10.0', "build_flavor" => 'default'}
156
+ },
157
+ { "X-Elastic-Product" => "Elasticsearch" }
158
+ ) }
159
+
160
+ it "passes without checking serverless connection" do
161
+ subject.update_initial_urls
162
+
163
+ expect(subject).to receive(:health_check_request).with(anything).and_return(["", nil])
164
+ expect(subject).to receive(:get_root_path).with(anything).and_return([good_resp, nil])
165
+ expect(subject).not_to receive(:get_root_path).with(anything, hash_including(:headers => LogStash::Outputs::ElasticSearch::HttpClient::Pool::DEFAULT_EAV_HEADER))
166
+ expect { subject.healthcheck! }.not_to raise_error
167
+ end
168
+ end
169
+
170
+ context "with 400" do
171
+ let(:mock_resp) { MockResponse.new(400, "The requested [Elastic-Api-Version] header value of [2024-10-31] is not valid. Only [2023-10-31] is supported") }
172
+ it_behaves_like "root path returns bad code error", "The Elastic-Api-Version header is not valid"
173
+ end
174
+
175
+ context "with 401" do
176
+ let(:mock_resp) { MockResponse.new(401, "missing authentication") }
177
+ it_behaves_like "root path returns bad code error", "Could not read Elasticsearch. Please check the credentials"
178
+ end
179
+
180
+ context "with 403" do
181
+ let(:mock_resp) { MockResponse.new(403, "Forbidden") }
182
+ it_behaves_like "root path returns bad code error", "Could not read Elasticsearch. Please check the privileges"
183
+ end
184
+ end
185
+
186
+ describe "non register phase" do
187
+ let(:health_bad_code_err) { ::LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError.new(400, nil, nil, nil) }
188
+
189
+ before :each do
190
+ subject.update_initial_urls
191
+ end
192
+
193
+ it "does not call root path when health check request fails" do
194
+ expect(subject).to receive(:health_check_request).with(anything).and_return(["", health_bad_code_err])
195
+ expect(subject).to receive(:get_root_path).never
196
+ subject.healthcheck!(false)
197
+ end
84
198
  end
85
199
  end
86
200
 
@@ -251,23 +365,23 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
251
365
  ::LogStash::Util::SafeURI.new("http://otherhost:9201")
252
366
  ] }
253
367
 
254
- let(:valid_response) { MockResponse.new(200, {"tagline" => "You Know, for Search",
255
- "version" => {
256
- "number" => '7.13.0',
257
- "build_flavor" => 'default'}
258
- }) }
259
-
260
- before(:each) do
261
- allow(subject).to receive(:perform_request_to_url).and_return(valid_response)
262
- subject.start
263
- end
264
-
265
- it "picks the largest major version" do
266
- expect(subject.maximum_seen_major_version).to eq(0)
267
- end
368
+ let(:root_response) { MockResponse.new(200, {"tagline" => "You Know, for Search",
369
+ "version" => {
370
+ "number" => '0.0.0',
371
+ "build_flavor" => 'default'}
372
+ }) }
373
+ let(:root_response2) { MockResponse.new(200, {"tagline" => "You Know, for Search",
374
+ "version" => {
375
+ "number" => '6.0.0',
376
+ "build_flavor" => 'default'}
377
+ }) }
268
378
 
269
379
  context "if there are nodes with multiple major versions" do
270
- let(:es_version_info) { [ { "number" => '0.0.0', "build_flavor" => 'default'}, { "number" => '6.0.0', "build_flavor" => 'default'} ] }
380
+ before(:each) do
381
+ allow(subject).to receive(:perform_request_to_url).and_return(root_response, root_response2)
382
+ subject.start
383
+ end
384
+
271
385
  it "picks the largest major version" do
272
386
  expect(subject.maximum_seen_major_version).to eq(6)
273
387
  end
@@ -275,32 +389,31 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
275
389
  end
276
390
 
277
391
 
278
- describe "build flavour tracking" do
392
+ describe "build flavor tracking" do
279
393
  let(:initial_urls) { [::LogStash::Util::SafeURI.new("http://somehost:9200")] }
280
394
 
281
- let(:es_version_info) { [ { "number" => '8.9.0', "build_flavor" => "serverless" } ] }
282
-
283
- let(:valid_response) { MockResponse.new(200,
395
+ let(:root_response) { MockResponse.new(200,
284
396
  {"tagline" => "You Know, for Search",
285
397
  "version" => {
286
398
  "number" => '8.9.0',
287
- "build_flavor" => LogStash::Outputs::ElasticSearch::HttpClient::Pool::BUILD_FLAVOUR_SERVERLESS} },
399
+ "build_flavor" => LogStash::Outputs::ElasticSearch::HttpClient::Pool::BUILD_FLAVOR_SERVERLESS} },
288
400
  { "X-Elastic-Product" => "Elasticsearch" }
289
401
  ) }
290
402
 
291
403
  before(:each) do
292
- allow(subject).to receive(:perform_request_to_url).and_return(valid_response)
404
+ allow(subject).to receive(:perform_request_to_url).and_return(root_response)
293
405
  subject.start
294
406
  end
295
407
 
296
- it "picks the build flavour" do
408
+ it "picks the build flavor" do
297
409
  expect(subject.serverless?).to be_truthy
298
410
  end
299
411
  end
300
412
 
301
413
  describe "license checking" do
302
414
  before(:each) do
303
- allow(subject).to receive(:health_check_request)
415
+ allow(subject).to receive(:health_check_request).and_return(["", nil])
416
+ allow(subject).to receive(:perform_request_to_url).and_return(root_response)
304
417
  allow(subject).to receive(:elasticsearch?).and_return(true)
305
418
  end
306
419
 
@@ -327,6 +440,36 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
327
440
  end
328
441
  end
329
442
 
443
+ describe "elastic api version header" do
444
+ let(:eav) { "Elastic-Api-Version" }
445
+
446
+ context "when it is serverless" do
447
+ before(:each) do
448
+ expect(subject).to receive(:serverless?).and_return(true)
449
+ end
450
+
451
+ it "add the default header" do
452
+ expect(adapter).to receive(:perform_request).with(anything, :get, "/", anything, anything) do |_, _, _, params, _|
453
+ expect(params[:headers]).to eq({ "User-Agent" => "chromium", "Elastic-Api-Version" => "2023-10-31"})
454
+ end
455
+ subject.perform_request_to_url(initial_urls, :get, "/", { :headers => { "User-Agent" => "chromium" }} )
456
+ end
457
+ end
458
+
459
+ context "when it is stateful" do
460
+ before(:each) do
461
+ expect(subject).to receive(:serverless?).and_return(false)
462
+ end
463
+
464
+ it "add the default header" do
465
+ expect(adapter).to receive(:perform_request).with(anything, :get, "/", anything, anything) do |_, _, _, params, _|
466
+ expect(params[:headers]).to be_nil
467
+ end
468
+ subject.perform_request_to_url(initial_urls, :get, "/" )
469
+ end
470
+ end
471
+ end
472
+
330
473
  # TODO: extract to ElasticSearchOutputLicenseChecker unit spec
331
474
  describe "license checking with ElasticSearchOutputLicenseChecker" do
332
475
  let(:options) do
@@ -334,7 +477,8 @@ describe LogStash::Outputs::ElasticSearch::HttpClient::Pool do
334
477
  end
335
478
 
336
479
  before(:each) do
337
- allow(subject).to receive(:health_check_request)
480
+ allow(subject).to receive(:health_check_request).and_return(["", nil])
481
+ allow(subject).to receive(:perform_request_to_url).and_return(root_response)
338
482
  allow(subject).to receive(:elasticsearch?).and_return(true)
339
483
  end
340
484
 
@@ -388,114 +532,71 @@ describe "#elasticsearch?" do
388
532
  let(:adapter) { double("Manticore Adapter") }
389
533
  let(:initial_urls) { [::LogStash::Util::SafeURI.new("http://localhost:9200")] }
390
534
  let(:options) { {:resurrect_delay => 2, :url_normalizer => proc {|u| u}} } # Shorten the delay a bit to speed up tests
391
- let(:es_version_info) { [{ "number" => '0.0.0', "build_flavor" => 'default'}] }
392
- let(:license_status) { 'active' }
393
535
 
394
536
  subject { LogStash::Outputs::ElasticSearch::HttpClient::Pool.new(logger, adapter, initial_urls, options) }
395
537
 
396
- let(:url) { ::LogStash::Util::SafeURI.new("http://localhost:9200") }
397
-
398
- context "in case HTTP error code" do
399
- it "should fail for 401" do
400
- allow(adapter).to receive(:perform_request)
401
- .with(anything, :get, "/", anything, anything)
402
- .and_return(MockResponse.new(401))
403
-
404
- expect(subject.elasticsearch?(url)).to be false
405
- end
406
-
407
- it "should fail for 403" do
408
- allow(adapter).to receive(:perform_request)
409
- .with(anything, :get, "/", anything, anything)
410
- .and_return(status: 403)
411
- expect(subject.elasticsearch?(url)).to be false
412
- end
413
- end
414
-
415
538
  context "when connecting to a cluster which reply without 'version' field" do
416
539
  it "should fail" do
417
- allow(adapter).to receive(:perform_request)
418
- .with(anything, :get, "/", anything, anything)
419
- .and_return(body: {"field" => "funky.com"}.to_json)
420
- expect(subject.elasticsearch?(url)).to be false
540
+ resp = MockResponse.new(200, {"field" => "funky.com"} )
541
+ expect(subject.send(:elasticsearch?, resp)).to be false
421
542
  end
422
543
  end
423
544
 
424
545
  context "when connecting to a cluster with version < 6.0.0" do
425
546
  it "should fail" do
426
- allow(adapter).to receive(:perform_request)
427
- .with(anything, :get, "/", anything, anything)
428
- .and_return(200, {"version" => { "number" => "5.0.0"}}.to_json)
429
- expect(subject.elasticsearch?(url)).to be false
547
+ resp = MockResponse.new(200, {"version" => { "number" => "5.0.0" }})
548
+ expect(subject.send(:elasticsearch?, resp)).to be false
430
549
  end
431
550
  end
432
551
 
433
552
  context "when connecting to a cluster with version in [6.0.0..7.0.0)" do
434
553
  it "must be successful with valid 'tagline'" do
435
- allow(adapter).to receive(:perform_request)
436
- .with(anything, :get, "/", anything, anything)
437
- .and_return(MockResponse.new(200, {"version" => {"number" => "6.5.0"}, "tagline" => "You Know, for Search"}))
438
- expect(subject.elasticsearch?(url)).to be true
554
+ resp = MockResponse.new(200, {"version" => {"number" => "6.5.0"}, "tagline" => "You Know, for Search"} )
555
+ expect(subject.send(:elasticsearch?, resp)).to be true
439
556
  end
440
557
 
441
558
  it "should fail if invalid 'tagline'" do
442
- allow(adapter).to receive(:perform_request)
443
- .with(anything, :get, "/", anything, anything)
444
- .and_return(MockResponse.new(200, {"version" => {"number" => "6.5.0"}, "tagline" => "You don't know"}))
445
- expect(subject.elasticsearch?(url)).to be false
559
+ resp = MockResponse.new(200, {"version" => {"number" => "6.5.0"}, "tagline" => "You don't know"} )
560
+ expect(subject.send(:elasticsearch?, resp)).to be false
446
561
  end
447
562
 
448
563
  it "should fail if 'tagline' is not present" do
449
- allow(adapter).to receive(:perform_request)
450
- .with(anything, :get, "/", anything, anything)
451
- .and_return(MockResponse.new(200, {"version" => {"number" => "6.5.0"}}))
452
- expect(subject.elasticsearch?(url)).to be false
564
+ resp = MockResponse.new(200, {"version" => {"number" => "6.5.0"}} )
565
+ expect(subject.send(:elasticsearch?, resp)).to be false
453
566
  end
454
567
  end
455
568
 
456
569
  context "when connecting to a cluster with version in [7.0.0..7.14.0)" do
457
570
  it "must be successful is 'build_flavor' is 'default' and tagline is correct" do
458
- allow(adapter).to receive(:perform_request)
459
- .with(anything, :get, "/", anything, anything)
460
- .and_return(MockResponse.new(200, {"version": {"number": "7.5.0", "build_flavor": "default"}, "tagline": "You Know, for Search"}))
461
- expect(subject.elasticsearch?(url)).to be true
571
+ resp = MockResponse.new(200, {"version": {"number": "7.5.0", "build_flavor": "default"}, "tagline": "You Know, for Search"} )
572
+ expect(subject.send(:elasticsearch?, resp)).to be true
462
573
  end
463
574
 
464
575
  it "should fail if 'build_flavor' is not 'default' and tagline is correct" do
465
- allow(adapter).to receive(:perform_request)
466
- .with(anything, :get, "/", anything, anything)
467
- .and_return(MockResponse.new(200, {"version": {"number": "7.5.0", "build_flavor": "oss"}, "tagline": "You Know, for Search"}))
468
- expect(subject.elasticsearch?(url)).to be false
576
+ resp = MockResponse.new(200, {"version": {"number": "7.5.0", "build_flavor": "oss"}, "tagline": "You Know, for Search"} )
577
+ expect(subject.send(:elasticsearch?, resp)).to be false
469
578
  end
470
579
 
471
580
  it "should fail if 'build_flavor' is not present and tagline is correct" do
472
- allow(adapter).to receive(:perform_request)
473
- .with(anything, :get, "/", anything, anything)
474
- .and_return(MockResponse.new(200, {"version": {"number": "7.5.0"}, "tagline": "You Know, for Search"}))
475
- expect(subject.elasticsearch?(url)).to be false
581
+ resp = MockResponse.new(200, {"version": {"number": "7.5.0"}, "tagline": "You Know, for Search"} )
582
+ expect(subject.send(:elasticsearch?, resp)).to be false
476
583
  end
477
584
  end
478
585
 
479
586
  context "when connecting to a cluster with version >= 7.14.0" do
480
587
  it "should fail if 'X-elastic-product' header is not present" do
481
- allow(adapter).to receive(:perform_request)
482
- .with(anything, :get, "/", anything, anything)
483
- .and_return(MockResponse.new(200, {"version": {"number": "7.14.0"}}))
484
- expect(subject.elasticsearch?(url)).to be false
588
+ resp = MockResponse.new(200, {"version": {"number": "7.14.0"}} )
589
+ expect(subject.send(:elasticsearch?, resp)).to be false
485
590
  end
486
591
 
487
592
  it "should fail if 'X-elastic-product' header is present but with bad value" do
488
- allow(adapter).to receive(:perform_request)
489
- .with(anything, :get, "/", anything, anything)
490
- .and_return(MockResponse.new(200, {"version": {"number": "7.14.0"}}, {'X-elastic-product' => 'not good'}))
491
- expect(subject.elasticsearch?(url)).to be false
593
+ resp = MockResponse.new(200, {"version": {"number": "7.14.0"}}, {'X-elastic-product' => 'not good'} )
594
+ expect(subject.send(:elasticsearch?, resp)).to be false
492
595
  end
493
596
 
494
597
  it "must be successful when 'X-elastic-product' header is present with 'Elasticsearch' value" do
495
- allow(adapter).to receive(:perform_request)
496
- .with(anything, :get, "/", anything, anything)
497
- .and_return(MockResponse.new(200, {"version": {"number": "7.14.0"}}, {'X-elastic-product' => 'Elasticsearch'}))
498
- expect(subject.elasticsearch?(url)).to be true
598
+ resp = MockResponse.new(200, {"version": {"number": "7.14.0"}}, {'X-elastic-product' => 'Elasticsearch'} )
599
+ expect(subject.send(:elasticsearch?, resp)).to be true
499
600
  end
500
601
  end
501
602
  end
@@ -135,7 +135,7 @@ describe LogStash::Outputs::ElasticSearch::HttpClient do
135
135
  }
136
136
 
137
137
  it "returns the hash response" do
138
- expect(subject.pool).to receive(:get).with(path, nil).and_return(get_response)
138
+ expect(subject.pool).to receive(:get).with(path).and_return(get_response)
139
139
  expect(subject.get(path)["body"]).to eq(body)
140
140
  end
141
141
  end
@@ -183,6 +183,25 @@ describe LogStash::Outputs::ElasticSearch::HttpClient do
183
183
  end
184
184
  end
185
185
 
186
+ describe "compression_level?" do
187
+ subject { described_class.new(base_options) }
188
+ let(:base_options) { super().merge(:client_settings => {:compression_level => compression_level}) }
189
+
190
+ context "with client_settings `compression_level => 1`" do
191
+ let(:compression_level) { 1 }
192
+ it "gives true" do
193
+ expect(subject.compression_level?).to be_truthy
194
+ end
195
+ end
196
+
197
+ context "with client_settings `compression_level => 0`" do
198
+ let(:compression_level) { 0 }
199
+ it "gives false" do
200
+ expect(subject.compression_level?).to be_falsey
201
+ end
202
+ end
203
+ end
204
+
186
205
  describe "#bulk" do
187
206
  subject(:http_client) { described_class.new(base_options) }
188
207
 
@@ -192,13 +211,14 @@ describe LogStash::Outputs::ElasticSearch::HttpClient do
192
211
  ["index", {:_id=>nil, :_index=>"logstash"}, {"message"=> message}],
193
212
  ]}
194
213
 
195
- [true,false].each do |http_compression_enabled|
196
- context "with `http_compression => #{http_compression_enabled}`" do
214
+ [0, 9].each do |compression_level|
215
+ context "with `compression_level => #{compression_level}`" do
197
216
 
198
- let(:base_options) { super().merge(:client_settings => {:http_compression => http_compression_enabled}) }
217
+ let(:base_options) { super().merge(:client_settings => {:compression_level => compression_level}) }
218
+ let(:compression_level_enabled) { compression_level > 0 }
199
219
 
200
220
  before(:each) do
201
- if http_compression_enabled
221
+ if compression_level_enabled
202
222
  expect(http_client).to receive(:gzip_writer).at_least(:once).and_call_original
203
223
  else
204
224
  expect(http_client).to_not receive(:gzip_writer)
@@ -212,7 +232,7 @@ describe LogStash::Outputs::ElasticSearch::HttpClient do
212
232
  it "should be handled properly" do
213
233
  allow(subject).to receive(:join_bulk_responses)
214
234
  expect(subject).to receive(:bulk_send).once do |data|
215
- if !http_compression_enabled
235
+ if !compression_level_enabled
216
236
  expect(data.size).to be > target_bulk_bytes
217
237
  else
218
238
  expect(Zlib::gunzip(data.string).size).to be > target_bulk_bytes
@@ -474,7 +474,7 @@ describe LogStash::Outputs::ElasticSearch do
474
474
 
475
475
  context "unexpected bulk response" do
476
476
  let(:options) do
477
- { "hosts" => "127.0.0.1:9999", "index" => "%{foo}", "manage_template" => false }
477
+ { "hosts" => "127.0.0.1:9999", "index" => "%{foo}", "manage_template" => false, "http_compression" => false }
478
478
  end
479
479
 
480
480
  let(:events) { [ ::LogStash::Event.new("foo" => "bar1"), ::LogStash::Event.new("foo" => "bar2") ] }
@@ -624,6 +624,7 @@ describe LogStash::Outputs::ElasticSearch do
624
624
  end
625
625
 
626
626
  context '413 errors' do
627
+ let(:options) { super().merge("http_compression" => "false") }
627
628
  let(:payload_size) { LogStash::Outputs::ElasticSearch::TARGET_BULK_BYTES + 1024 }
628
629
  let(:event) { ::LogStash::Event.new("message" => ("a" * payload_size ) ) }
629
630
 
@@ -1557,6 +1558,37 @@ describe LogStash::Outputs::ElasticSearch do
1557
1558
  end
1558
1559
  end
1559
1560
 
1561
+ describe "http compression" do
1562
+ describe "initialize setting" do
1563
+ context "with `http_compression` => true" do
1564
+ let(:options) { super().merge('http_compression' => true) }
1565
+ it "set compression level to 1" do
1566
+ subject.register
1567
+ expect(subject.instance_variable_get(:@compression_level)).to eq(1)
1568
+ end
1569
+ end
1570
+
1571
+ context "with `http_compression` => false" do
1572
+ let(:options) { super().merge('http_compression' => false) }
1573
+ it "set compression level to 0" do
1574
+ subject.register
1575
+ expect(subject.instance_variable_get(:@compression_level)).to eq(0)
1576
+ end
1577
+ end
1578
+
1579
+ [0, 9].each do |config|
1580
+ context "with `compression_level` => #{config}" do
1581
+ let(:options) { super().merge('compression_level' => config) }
1582
+ it "keeps the setting" do
1583
+ subject.register
1584
+ expect(subject.instance_variable_get(:@compression_level)).to eq(config)
1585
+ end
1586
+ end
1587
+ end
1588
+ end
1589
+
1590
+ end
1591
+
1560
1592
  @private
1561
1593
 
1562
1594
  def stub_manticore_client!(manticore_double = nil)
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-elasticsearch
  version: !ruby/object:Gem::Version
- version: 11.16.0
+ version: 11.18.0
  platform: java
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-08-09 00:00:00.000000000 Z
+ date: 2023-09-25 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement