logstash-input-elasticsearch 4.12.3 → 4.15.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 4a4ed685c9f90446a47fc84cb4e52e585f164c5451c9dca3f7359b756077bbd4
- data.tar.gz: 050b54e5c660a4ab436f57ab16ea20d6a64a6b8a8c063eaaca69cc2c77235bd4
+ metadata.gz: 9379a31460bf649615c038b1b35d207f6a6db869bb1a27e963da70e278ae6bcd
+ data.tar.gz: f62823b8ddfb587ce9614a1d131be1bce2186dbd859287b1107757917a754c7e
  SHA512:
- metadata.gz: 902070db657622bd65bcba8c17470456366134aae23cd38fb70ec89c97686b74da591e6f47d34957f3e5753f0e029ad78b845dd6355d6d559b6c135e0c94d28e
- data.tar.gz: 22391299a87f5ce0a3085b4be7886a7f7251c9e33bd7051996871ea7cc34cbd7fbc53a3d54d841722b319949c4422e0ab7e6ff7d04899db6255543363fc32f66
+ metadata.gz: c675ed6a5d1a4a104313611e4895171c6461149d0415c5ad4c276e42b5b99672584ae38ddc37f3e32521074727051c68f0d10a37b77064d0cea6ea03f041a3b9
+ data.tar.gz: beba1b98be77880e6e3712cebdd0474d68c33556a86236e828a3332d2e812b187e920e78f6fb006b27b82269faa0e0654568d050f7057ae7db48da7fd9a20e31
data/CHANGELOG.md CHANGED
@@ -1,5 +1,14 @@
+ ## 4.15.0
+ - Feat: add `retries` option, allowing a failing query to be retried [#179](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/179)
+
+ ## 4.14.0
+ - Refactor: switch to using scheduler mixin [#177](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/177)
+
+ ## 4.13.0
+ - Added support for `ca_trusted_fingerprint` when run on Logstash 8.3+ [#178](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/178)
+
  ## 4.12.3
- - Fix: update Elasticsearch Ruby client to correctly customize 'user-agent' header[#171](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/171)
+ - Fix: update Elasticsearch Ruby client to correctly customize 'user-agent' header [#171](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/171)
 
  ## 4.12.2
  - Fix: hosts => "es_host:port" regression [#168](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/168)
@@ -24,7 +33,6 @@
    header isn't passed, this leads to the plugin not being able to leverage `user`/`password` credentials set by the user.
    [#153](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/153)
 
-
  ## 4.9.1
  - [DOC] Replaced hard-coded links with shared attributes [#143](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/143)
  - [DOC] Added missing quote to docinfo_fields example [#145](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/145)
data/docs/index.asciidoc CHANGED
@@ -103,6 +103,7 @@ This plugin supports the following configuration options plus the <<plugins-{typ
  |Setting |Input type|Required
  | <<plugins-{type}s-{plugin}-api_key>> |<<password,password>>|No
  | <<plugins-{type}s-{plugin}-ca_file>> |a valid filesystem path|No
+ | <<plugins-{type}s-{plugin}-ca_trusted_fingerprint>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-cloud_auth>> |<<password,password>>|No
  | <<plugins-{type}s-{plugin}-cloud_id>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-connect_timeout_seconds>> | <<number,number>>|No
@@ -123,6 +124,7 @@ This plugin supports the following configuration options plus the <<plugins-{typ
  | <<plugins-{type}s-{plugin}-ssl>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-socket_timeout_seconds>> | <<number,number>>|No
  | <<plugins-{type}s-{plugin}-target>> | {logstash-ref}/field-references-deepdive.html[field reference] | No
+ | <<plugins-{type}s-{plugin}-retries>> | <<number,number>>|No
  | <<plugins-{type}s-{plugin}-user>> |<<string,string>>|No
  |=======================================================================
 
@@ -152,6 +154,15 @@ API key API].
 
  SSL Certificate Authority file in PEM encoded format, must also include any chain certificates as necessary.
 
+ [id="plugins-{type}s-{plugin}-ca_trusted_fingerprint"]
+ ===== `ca_trusted_fingerprint`
+
+ * Value type is <<string,string>>, and must contain exactly 64 hexadecimal characters.
+ * There is no default value for this setting.
+ * Use of this option _requires_ Logstash 8.3+
+
+ The SHA-256 fingerprint of an SSL Certificate Authority to trust, such as the autogenerated self-signed CA for an Elasticsearch cluster.
+
  [id="plugins-{type}s-{plugin}-cloud_auth"]
  ===== `cloud_auth`
 
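
As an aside: such a fingerprint is the hex-encoded SHA-256 digest of the certificate's DER encoding. A minimal Ruby sketch, not part of this diff, assuming a PEM-encoded CA at the hypothetical path ca.crt:

    require 'openssl'

    # Read the PEM certificate and fingerprint its DER encoding; the result is
    # the 64-hexadecimal-character value `ca_trusted_fingerprint` expects.
    ca_cert = OpenSSL::X509::Certificate.new(File.read('ca.crt'))
    puts OpenSSL::Digest::SHA256.hexdigest(ca_cert.to_der)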
@@ -329,6 +340,17 @@ The maximum amount of time, in seconds, for a single request to Elasticsearch.
  Request timeouts tend to occur when an individual page of data is very large, such as when it contains large-payload
  documents and/or the <<plugins-{type}s-{plugin}-size>> has been specified as a large value.
 
+
+ [id="plugins-{type}s-{plugin}-retries"]
+ ===== `retries`
+
+ * Value type is <<number,number>>
+ * Default value is `0`
+
+ The number of times to re-run the query after the first failure. If the query fails after all retries, it logs an error message.
+ The default is 0 (no retry). This value should be equal to or greater than zero.
+
+
  [id="plugins-{type}s-{plugin}-schedule"]
  ===== `schedule`
 
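
The `retries` count is on top of the initial attempt, mirroring the `(retries + 1)` arithmetic the plugin feeds to Stud::Try further down in this diff. A runnable sketch of the semantics, assuming only the stud gem:

    require 'stud/try'

    retries  = 2          # as in `retries => 2`
    attempts = 0
    Stud.try((retries + 1).times) do
      attempts += 1
      raise 'transient failure' if attempts <= retries  # first two attempts fail
      puts "query succeeded on attempt #{attempts}"     # third attempt succeeds
    end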
@@ -414,6 +436,7 @@ When the `target` is set to a field reference, the `_source` of the hit is place
  This option can be useful to avoid populating unknown fields when a downstream schema such as ECS is enforced.
  It is also possible to target an entry in the event's metadata, which will be available during event processing but not exported to your outputs (e.g., `target \=> "[@metadata][_source]"`).
 
+
  [id="plugins-{type}s-{plugin}-user"]
  ===== `user`
 
data/lib/logstash/helpers/loggable_try.rb ADDED
@@ -0,0 +1,18 @@
+ # Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ # or more contributor license agreements. Licensed under the Elastic License;
+ # you may not use this file except in compliance with the Elastic License.
+
+ require 'stud/try'
+
+ module LogStash module Helpers
+   class LoggableTry < Stud::Try
+     def initialize(logger, name)
+       @logger = logger
+       @name = name
+     end
+
+     def log_failure(exception, fail_count, message)
+       @logger.warn("Attempt to #{@name} but failed. #{message}", fail_count: fail_count, exception: exception.message)
+     end
+   end
+ end end
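
A hypothetical standalone usage sketch for this helper (the stub logger stands in for Logstash's logger, which accepts a message plus a data hash; the require assumes the file above is on the load path):

    require 'stud/try'
    require 'logstash/helpers/loggable_try'  # the file added above

    # Stand-in for Logstash's logger: responds to warn(message, data).
    logger = Object.new
    def logger.warn(message, data = {})
      puts "WARN #{message} #{data.inspect}"
    end

    attempts = 0
    try = LogStash::Helpers::LoggableTry.new(logger, 'run query')
    try.try(3.times) do                         # up to 3 attempts
      attempts += 1
      raise 'connection reset' if attempts < 3  # failures route through log_failure
      puts "succeeded on attempt #{attempts}"
    end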
data/lib/logstash/inputs/elasticsearch.rb CHANGED
@@ -7,7 +7,10 @@ require 'logstash/plugin_mixins/validator_support/field_reference_validation_ada
  require 'logstash/plugin_mixins/event_support/event_factory_adapter'
  require 'logstash/plugin_mixins/ecs_compatibility_support'
  require 'logstash/plugin_mixins/ecs_compatibility_support/target_check'
+ require 'logstash/plugin_mixins/ca_trusted_fingerprint_support'
+ require "logstash/plugin_mixins/scheduler"
  require "base64"
+ require 'logstash/helpers/loggable_try'
 
  require "elasticsearch"
  require "elasticsearch/transport/transport/http/manticore"
@@ -77,6 +80,8 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
 
  extend LogStash::PluginMixins::ValidatorSupport::FieldReferenceValidationAdapter
 
+ include LogStash::PluginMixins::Scheduler
+
  config_name "elasticsearch"
 
  # List of elasticsearch hosts to use for querying.
@@ -95,6 +100,9 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
  # This allows you to set the maximum number of hits returned per scroll.
  config :size, :validate => :number, :default => 1000
 
+ # The number of retries to run the query. If the query fails after all retries, it logs an error message.
+ config :retries, :validate => :number, :default => 0
+
  # This parameter controls the keepalive time in seconds of the scrolling
  # request and initiates the scrolling process. The timeout applies per
  # round trip (i.e. between the previous scroll request, to the next).
@@ -192,6 +200,9 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
  # If set, the _source of each hit will be added nested under the target instead of at the top-level
  config :target, :validate => :field_reference
 
+ # config :ca_trusted_fingerprint, :validate => :sha_256_hex
+ include LogStash::PluginMixins::CATrustedFingerprintSupport
+
  def initialize(params={})
    super(params)
 
@@ -214,6 +225,8 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
      @slices < 1 && fail(LogStash::ConfigurationError, "Elasticsearch Input Plugin's `slices` option must be greater than zero, got `#{@slices}`")
    end
 
+   @retries < 0 && fail(LogStash::ConfigurationError, "Elasticsearch Input Plugin's `retries` option must be equal to or greater than zero, got `#{@retries}`")
+
    validate_authentication
    fill_user_password_from_cloud_auth
    fill_hosts_from_cloud_id
@@ -247,37 +260,75 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
 
  def run(output_queue)
    if @schedule
-     @scheduler = Rufus::Scheduler.new(:max_work_threads => 1)
-     @scheduler.cron @schedule do
-       do_run(output_queue)
-     end
-
-     @scheduler.join
+     scheduler.cron(@schedule) { do_run(output_queue) }
+     scheduler.join
    else
      do_run(output_queue)
    end
  end
 
- def stop
-   @scheduler.stop if @scheduler
- end
-
  private
-
+ JOB_NAME = "run query"
  def do_run(output_queue)
    # if configured to run a single slice, don't bother spinning up threads
-   return do_run_slice(output_queue) if @slices.nil? || @slices <= 1
+   if @slices.nil? || @slices <= 1
+     success, events = retryable_slice
+     success && events.each { |event| output_queue << event }
+     return
+   end
 
    logger.warn("managed slices for query is very large (#{@slices}); consider reducing") if @slices > 8
 
+   slice_results = parallel_slice # array of tuple(ok, events)
+
+   # insert events into the queue only if all slices succeed
+   if slice_results.all?(&:first)
+     slice_results.flat_map { |success, events| events }
+                  .each { |event| output_queue << event }
+   end
+
+   logger.trace("#{@slices} slices completed")
+ end
+
+ def retryable(job_name, &block)
+   begin
+     stud_try = ::LogStash::Helpers::LoggableTry.new(logger, job_name)
+     output = stud_try.try((@retries + 1).times) { yield }
+     [true, output]
+   rescue => e
+     error_details = {:message => e.message, :cause => e.cause}
+     error_details[:backtrace] = e.backtrace if logger.debug?
+     logger.error("Tried #{job_name} unsuccessfully", error_details)
+     [false, nil]
+   end
+ end
+
+
+ # @return [(ok, events)] : Array of tuple(Boolean, [Logstash::Event])
+ def parallel_slice
+   pipeline_id = execution_context&.pipeline_id || 'main'
    @slices.times.map do |slice_id|
      Thread.new do
-       LogStash::Util::set_thread_name("#{@id}_slice_#{slice_id}")
-       do_run_slice(output_queue, slice_id)
+       LogStash::Util::set_thread_name("[#{pipeline_id}]|input|elasticsearch|slice_#{slice_id}")
+       retryable_slice(slice_id)
      end
-   end.map(&:join)
+   end.map do |t|
+     t.join
+     t.value
+   end
  end
 
+ # @param slice_id [Integer]
+ # @return (ok, events) [Boolean, Array(Logstash::Event)]
+ def retryable_slice(slice_id=nil)
+   retryable(JOB_NAME) do
+     output = []
+     do_run_slice(output, slice_id)
+     output
+   end
+ end
+
+
  def do_run_slice(output_queue, slice_id=nil)
    slice_query = @base_query
    slice_query = slice_query.merge('slice' => { 'id' => slice_id, 'max' => @slices}) unless slice_id.nil?
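
The slice fan-out above now follows an all-or-nothing protocol: each thread returns an (ok, events) tuple via Thread#value, and events are enqueued only when every slice succeeded. A plain-Ruby sketch of that shape (names illustrative, not the plugin's API):

    results = 3.times.map do |slice_id|
      Thread.new do
        events = ["event-#{slice_id}-a", "event-#{slice_id}-b"]
        [true, events]                    # (ok, events), as retryable_slice returns
      end
    end.map { |t| t.join; t.value }       # collect each thread's tuple

    queue = Queue.new
    if results.all?(&:first)              # enqueue only if every slice reported ok
      results.flat_map { |_ok, events| events }.each { |event| queue << event }
    end
    puts queue.size                       # => 6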
@@ -314,11 +365,6 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
    r = scroll_request(scroll_id)
    r['hits']['hits'].each { |hit| push_hit(hit, output_queue) }
    [r['hits']['hits'].any?, r['_scroll_id']]
- rescue => e
-   # this will typically be triggered by a scroll timeout
-   logger.error("Scroll request error, aborting scroll", message: e.message, exception: e.class)
-   # return no hits and original scroll_id so we can try to clear it
-   [false, scroll_id]
  end
 
  def push_hit(hit, output_queue)
@@ -353,7 +399,7 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
    logger.warn("Ignoring clear_scroll exception", message: e.message, exception: e.class)
  end
 
- def scroll_request scroll_id
+ def scroll_request(scroll_id)
    @client.scroll(:body => { :scroll_id => scroll_id }, :scroll => @scroll)
  end
 
@@ -381,7 +427,13 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
  end
 
  def setup_ssl
-   @ssl && @ca_file ? { :ssl => true, :ca_file => @ca_file } : {}
+   ssl_options = {}
+
+   ssl_options[:ssl] = true if @ssl
+   ssl_options[:ca_file] = @ca_file if @ssl && @ca_file
+   ssl_options[:trust_strategy] = trust_strategy_for_ca_trusted_fingerprint
+
+   ssl_options
  end
 
  def setup_hosts
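
Besides attaching the trust strategy from the `ca_trusted_fingerprint` mixin, this rewrite also changes one edge case: with `ssl => true` but no `ca_file`, the old one-liner returned `{}`, whereas the new code still emits `:ssl => true`. A standalone sketch of the new logic, with the mixin-provided strategy stubbed as `nil`:

    def build_ssl_options(ssl, ca_file = nil, trust_strategy = nil)
      ssl_options = {}
      ssl_options[:ssl] = true if ssl
      ssl_options[:ca_file] = ca_file if ssl && ca_file
      ssl_options[:trust_strategy] = trust_strategy  # nil stands in for the mixin's object
      ssl_options
    end

    p build_ssl_options(true, 'ca.crt')  # => {:ssl=>true, :ca_file=>"ca.crt", :trust_strategy=>nil}
    p build_ssl_options(true)            # => {:ssl=>true, :trust_strategy=>nil} (old code returned {})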
data/logstash-input-elasticsearch.gemspec CHANGED
@@ -1,7 +1,7 @@
  Gem::Specification.new do |s|
 
    s.name = 'logstash-input-elasticsearch'
-   s.version = '4.12.3'
+   s.version = '4.15.0'
    s.licenses = ['Apache License (2.0)']
    s.summary = "Reads query results from an Elasticsearch cluster"
    s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -24,12 +24,13 @@ Gem::Specification.new do |s|
    s.add_runtime_dependency 'logstash-mixin-ecs_compatibility_support', '~> 1.3'
    s.add_runtime_dependency 'logstash-mixin-event_support', '~> 1.0'
    s.add_runtime_dependency "logstash-mixin-validator_support", '~> 1.0'
+   s.add_runtime_dependency "logstash-mixin-scheduler", '~> 1.0'
 
    s.add_runtime_dependency 'elasticsearch', '>= 7.17.1'
+   s.add_runtime_dependency 'logstash-mixin-ca_trusted_fingerprint_support', '~> 1.0'
 
    s.add_runtime_dependency 'tzinfo'
    s.add_runtime_dependency 'tzinfo-data'
-   s.add_runtime_dependency 'rufus-scheduler'
    s.add_runtime_dependency 'manticore', ">= 0.7.1"
 
    s.add_development_dependency 'logstash-codec-plain'
@@ -37,4 +38,7 @@ Gem::Specification.new do |s|
    s.add_development_dependency 'timecop'
    s.add_development_dependency 'cabin', ['~> 0.6']
    s.add_development_dependency 'webrick'
+
+   # 3.8.0 has breaking changes WRT joining, which break our specs
+   s.add_development_dependency 'rufus-scheduler', '~> 3.0.9'
  end
data/spec/fixtures/test_certs/ca.der.sha256 ADDED
@@ -0,0 +1 @@
+ 195a7e7b1bc29f3d7913a918a44721704d27fa56facea0cd72a8093c7107c283
data/spec/inputs/elasticsearch_spec.rb CHANGED
@@ -19,7 +19,14 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
  let(:queue) { Queue.new }
 
  before(:each) do
-   Elasticsearch::Client.send(:define_method, :ping) { } # define no-action ping method
+   Elasticsearch::Client.send(:define_method, :ping) { } # define no-action ping method
+ end
+
+ let(:base_config) do
+   {
+     'hosts' => ["localhost"],
+     'query' => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+   }
  end
 
  context "register" do
@@ -44,6 +51,17 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
      expect { plugin.register }.to raise_error(LogStash::ConfigurationError)
    end
  end
+
+ context "retry" do
+   let(:config) do
+     {
+       "retries" => -1
+     }
+   end
+   it "should raise an exception with a negative number" do
+     expect { plugin.register }.to raise_error(LogStash::ConfigurationError)
+   end
+ end
  end
 
  it_behaves_like "an interruptible input plugin" do
@@ -78,14 +96,10 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
  end
 
  let(:config) do
-   %q[
-     input {
-       elasticsearch {
-         hosts => ["localhost"]
-         query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-       }
-     }
-   ]
+   {
+     'hosts' => ["localhost"],
+     'query' => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+   }
  end
 
  let(:mock_response) do
@@ -128,10 +142,11 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
    expect(client).to receive(:ping)
  end
 
+ before { plugin.register }
+
  it 'creates the events from the hits' do
-   event = input(config) do |pipeline, queue|
-     queue.pop
-   end
+   plugin.run queue
+   event = queue.pop
 
    expect(event).to be_a(LogStash::Event)
    expect(event.get("message")).to eql [ "ohayo" ]
@@ -139,21 +154,16 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
 
  context 'when a target is set' do
    let(:config) do
-     %q[
-       input {
-         elasticsearch {
-           hosts => ["localhost"]
-           query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-           target => "[@metadata][_source]"
-         }
-       }
-     ]
+     {
+       'hosts' => ["localhost"],
+       'query' => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }',
+       'target' => "[@metadata][_source]"
+     }
    end
 
    it 'creates the event using the target' do
-     event = input(config) do |pipeline, queue|
-       queue.pop
-     end
+     plugin.run queue
+     event = queue.pop
 
      expect(event).to be_a(LogStash::Event)
      expect(event.get("[@metadata][_source][message]")).to eql [ "ohayo" ]
@@ -194,7 +204,7 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
  context 'with `slices => 1`' do
    let(:slices) { 1 }
    it 'runs just one slice' do
-     expect(plugin).to receive(:do_run_slice).with(duck_type(:<<))
+     expect(plugin).to receive(:do_run_slice).with(duck_type(:<<), nil)
      expect(Thread).to_not receive(:new)
 
      plugin.register
@@ -205,7 +215,7 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
  context 'without slices directive' do
    let(:config) { super().tap { |h| h.delete('slices') } }
    it 'runs just one slice' do
-     expect(plugin).to receive(:do_run_slice).with(duck_type(:<<))
+     expect(plugin).to receive(:do_run_slice).with(duck_type(:<<), nil)
      expect(Thread).to_not receive(:new)
 
      plugin.register
@@ -316,7 +326,6 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
    end
    # END SLICE 1
 
-   let(:client) { Elasticsearch::Client.new }
 
    # RSpec mocks validations are not threadsafe.
    # Allow caller to synchronize.
@@ -330,69 +339,112 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
      end
    end
 
-   before(:each) do
-     expect(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
-     plugin.register
+   describe "with normal response" do
+     before(:each) do
+       expect(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
+       plugin.register
 
-     expect(client).to receive(:clear_scroll).and_return(nil)
+       expect(client).to receive(:clear_scroll).and_return(nil)
 
-     # SLICE0 is a three-page scroll in which the last page is empty
-     slice0_query = LogStash::Json.dump(query.merge('slice' => { 'id' => 0, 'max' => 2}))
-     expect(client).to receive(:search).with(hash_including(:body => slice0_query)).and_return(slice0_response0)
-     expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice0_scroll1 })).and_return(slice0_response1)
-     expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice0_scroll2 })).and_return(slice0_response2)
-     allow(client).to receive(:ping)
+       # SLICE0 is a three-page scroll in which the last page is empty
+       slice0_query = LogStash::Json.dump(query.merge('slice' => { 'id' => 0, 'max' => 2}))
+       expect(client).to receive(:search).with(hash_including(:body => slice0_query)).and_return(slice0_response0)
+       expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice0_scroll1 })).and_return(slice0_response1)
+       expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice0_scroll2 })).and_return(slice0_response2)
+       allow(client).to receive(:ping)
 
-     # SLICE1 is a two-page scroll in which the last page has no next scroll id
-     slice1_query = LogStash::Json.dump(query.merge('slice' => { 'id' => 1, 'max' => 2}))
-     expect(client).to receive(:search).with(hash_including(:body => slice1_query)).and_return(slice1_response0)
-     expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice1_scroll1 })).and_return(slice1_response1)
+       # SLICE1 is a two-page scroll in which the last page has no next scroll id
+       slice1_query = LogStash::Json.dump(query.merge('slice' => { 'id' => 1, 'max' => 2}))
+       expect(client).to receive(:search).with(hash_including(:body => slice1_query)).and_return(slice1_response0)
+       expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice1_scroll1 })).and_return(slice1_response1)
 
-     synchronize_method!(plugin, :scroll_request)
-     synchronize_method!(plugin, :search_request)
-   end
+       synchronize_method!(plugin, :scroll_request)
+       synchronize_method!(plugin, :search_request)
+     end
 
-   let(:emitted_events) do
-     queue = Queue.new # since we are running slices in threads, we need a thread-safe queue.
-     plugin.run(queue)
-     events = []
-     events << queue.pop until queue.empty?
-     events
-   end
+     let(:client) { Elasticsearch::Client.new }
 
-   let(:emitted_event_ids) do
-     emitted_events.map { |event| event.get('[@metadata][_id]') }
-   end
+     let(:emitted_events) do
+       queue = Queue.new # since we are running slices in threads, we need a thread-safe queue.
+       plugin.run(queue)
+       events = []
+       events << queue.pop until queue.empty?
+       events
+     end
 
-   it 'emits the hits on the first page of the first slice' do
-     expect(emitted_event_ids).to include('slice0-response0-item0')
-     expect(emitted_event_ids).to include('slice0-response0-item1')
-   end
-   it 'emits the hits on the second page of the first slice' do
-     expect(emitted_event_ids).to include('slice0-response1-item0')
-   end
+     let(:emitted_event_ids) do
+       emitted_events.map { |event| event.get('[@metadata][_id]') }
+     end
 
-   it 'emits the hits on the first page of the second slice' do
-     expect(emitted_event_ids).to include('slice1-response0-item0')
-     expect(emitted_event_ids).to include('slice1-response0-item1')
-   end
+     it 'emits the hits on the first page of the first slice' do
+       expect(emitted_event_ids).to include('slice0-response0-item0')
+       expect(emitted_event_ids).to include('slice0-response0-item1')
+     end
+     it 'emits the hits on the second page of the first slice' do
+       expect(emitted_event_ids).to include('slice0-response1-item0')
+     end
 
-   it 'emits the hitson the second page of the second slice' do
-     expect(emitted_event_ids).to include('slice1-response1-item0')
-     expect(emitted_event_ids).to include('slice1-response1-item1')
-   end
+     it 'emits the hits on the first page of the second slice' do
+       expect(emitted_event_ids).to include('slice1-response0-item0')
+       expect(emitted_event_ids).to include('slice1-response0-item1')
+     end
+
+     it 'emits the hits on the second page of the second slice' do
+       expect(emitted_event_ids).to include('slice1-response1-item0')
+       expect(emitted_event_ids).to include('slice1-response1-item1')
+     end
 
-   it 'does not double-emit' do
-     expect(emitted_event_ids.uniq).to eq(emitted_event_ids)
+     it 'does not double-emit' do
+       expect(emitted_event_ids.uniq).to eq(emitted_event_ids)
+     end
+
+     it 'emits events with appropriate fields' do
+       emitted_events.each do |event|
+         expect(event).to be_a(LogStash::Event)
+         expect(event.get('message')).to eq(['hello, world'])
+         expect(event.get('[@metadata][_id]')).to_not be_nil
+         expect(event.get('[@metadata][_id]')).to_not be_empty
+         expect(event.get('[@metadata][_index]')).to start_with('logstash-')
+       end
+     end
    end
 
-   it 'emits events with appropriate fields' do
-     emitted_events.each do |event|
-       expect(event).to be_a(LogStash::Event)
-       expect(event.get('message')).to eq(['hello, world'])
-       expect(event.get('[@metadata][_id]')).to_not be_nil
-       expect(event.get('[@metadata][_id]')).to_not be_empty
-       expect(event.get('[@metadata][_index]')).to start_with('logstash-')
+   describe "with scroll request fail" do
+     before(:each) do
+       expect(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
+       plugin.register
+
+       expect(client).to receive(:clear_scroll).and_return(nil)
+
+       # SLICE0 is a three-page scroll in which the second page throws an exception
+       slice0_query = LogStash::Json.dump(query.merge('slice' => { 'id' => 0, 'max' => 2}))
+       expect(client).to receive(:search).with(hash_including(:body => slice0_query)).and_return(slice0_response0)
+       expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice0_scroll1 })).and_raise("boom")
+       allow(client).to receive(:ping)
+
+       # SLICE1 is a two-page scroll in which the last page has no next scroll id
+       slice1_query = LogStash::Json.dump(query.merge('slice' => { 'id' => 1, 'max' => 2}))
+       expect(client).to receive(:search).with(hash_including(:body => slice1_query)).and_return(slice1_response0)
+       expect(client).to receive(:scroll).with(hash_including(:body => { :scroll_id => slice1_scroll1 })).and_return(slice1_response1)
+
+       synchronize_method!(plugin, :scroll_request)
+       synchronize_method!(plugin, :search_request)
+     end
+
+     let(:client) { Elasticsearch::Client.new }
+
+     it 'does not insert events into the queue' do
+       expect(plugin).to receive(:parallel_slice).and_wrap_original do |m, *args|
+         slice0, slice1 = m.call
+         expect(slice0[0]).to be_falsey
+         expect(slice1[0]).to be_truthy
+         expect(slice1[1].size).to eq(4) # four items from SLICE1
+         [slice0, slice1]
+       end
+
+       queue = Queue.new
+       plugin.run(queue)
+       expect(queue.size).to eq(0)
      end
    end
  end
@@ -450,24 +502,21 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
    allow_any_instance_of(described_class).to receive(:ecs_compatibility).and_return(ecs_compatibility)
  end
 
- context 'with docinfo enabled' do
-   let(:config_metadata) do
-     %q[
-       input {
-         elasticsearch {
-           hosts => ["localhost"]
-           query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-           docinfo => true
-         }
-       }
-     ]
+ before do
+   if do_register
+     plugin.register
+     plugin.run queue
    end
+ end
 
- it "provides document info under metadata" do
-   event = input(config_metadata) do |pipeline, queue|
-     queue.pop
-   end
+ let(:do_register) { true }
+
+ let(:event) { queue.pop }
 
+ context 'with docinfo enabled' do
+   let(:config) { base_config.merge 'docinfo' => true }
+
+   it "provides document info under metadata" do
      if ecs_select.active_mode == :disabled
        expect(event.get("[@metadata][_index]")).to eq('logstash-2014.10.12')
        expect(event.get("[@metadata][_type]")).to eq('logs')
@@ -479,123 +528,72 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
      end
    end
 
-   it 'merges values if the `docinfo_target` already exist in the `_source` document' do
-     config_metadata_with_hash = %Q[
-       input {
-         elasticsearch {
-           hosts => ["localhost"]
-           query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-           docinfo => true
-           docinfo_target => 'metadata_with_hash'
-         }
-       }
-     ]
+   context 'with docinfo_target' do
+     let(:config) { base_config.merge 'docinfo' => true, 'docinfo_target' => docinfo_target }
+     let(:docinfo_target) { 'metadata_with_hash' }
+
+     it 'merges values if the `docinfo_target` already exists in the `_source` document' do
+       expect(event.get("[metadata_with_hash][_index]")).to eq('logstash-2014.10.12')
+       expect(event.get("[metadata_with_hash][_type]")).to eq('logs')
+       expect(event.get("[metadata_with_hash][_id]")).to eq('C5b2xLQwTZa76jBmHIbwHQ')
+       expect(event.get("[metadata_with_hash][awesome]")).to eq("logstash")
+     end
+
+     context 'non-existent' do
+       let(:docinfo_target) { 'meta' }
+
+       it 'should move the document information to the specified field' do
+         expect(event.get("[meta][_index]")).to eq('logstash-2014.10.12')
+         expect(event.get("[meta][_type]")).to eq('logs')
+         expect(event.get("[meta][_id]")).to eq('C5b2xLQwTZa76jBmHIbwHQ')
+       end
 
-     event = input(config_metadata_with_hash) do |pipeline, queue|
-       queue.pop
      end
 
-     expect(event.get("[metadata_with_hash][_index]")).to eq('logstash-2014.10.12')
-     expect(event.get("[metadata_with_hash][_type]")).to eq('logs')
-     expect(event.get("[metadata_with_hash][_id]")).to eq('C5b2xLQwTZa76jBmHIbwHQ')
-     expect(event.get("[metadata_with_hash][awesome]")).to eq("logstash")
    end
 
    context 'if the `docinfo_target` exist but is not of type hash' do
-     let (:config) { {
-       "hosts" => ["localhost"],
-       "query" => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }',
-       "docinfo" => true,
-       "docinfo_target" => 'metadata_with_string'
-     } }
-     it 'thows an exception if the `docinfo_target` exist but is not of type hash' do
+     let(:config) { base_config.merge 'docinfo' => true, "docinfo_target" => 'metadata_with_string' }
+     let(:do_register) { false }
+
+     it 'raises an exception if the `docinfo_target` exists but is not of type hash' do
        expect(client).not_to receive(:clear_scroll)
       plugin.register
       expect { plugin.run([]) }.to raise_error(Exception, /incompatible event/)
     end
-   end
 
-   it 'should move the document information to the specified field' do
-     config = %q[
-       input {
-         elasticsearch {
-           hosts => ["localhost"]
-           query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-           docinfo => true
-           docinfo_target => 'meta'
-         }
-       }
-     ]
-     event = input(config) do |pipeline, queue|
-       queue.pop
-     end
-
-     expect(event.get("[meta][_index]")).to eq('logstash-2014.10.12')
-     expect(event.get("[meta][_type]")).to eq('logs')
-     expect(event.get("[meta][_id]")).to eq('C5b2xLQwTZa76jBmHIbwHQ')
    end
 
-   it "allows to specify which fields from the document info to save to metadata" do
-     fields = ["_index"]
-     config = %Q[
-       input {
-         elasticsearch {
-           hosts => ["localhost"]
-           query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-           docinfo => true
-           docinfo_fields => #{fields}
-         }
-       }]
+   context 'with docinfo_fields' do
+     let(:config) { base_config.merge 'docinfo' => true, "docinfo_fields" => ["_index"] }
 
-     event = input(config) do |pipeline, queue|
-       queue.pop
+     it "allows to specify which fields from the document info to save to metadata" do
+       meta_base = event.get(ecs_select.active_mode == :disabled ? "@metadata" : "[@metadata][input][elasticsearch]")
+       expect(meta_base.keys).to eql ["_index"]
     end
 
-     meta_base = event.get(ecs_select.active_mode == :disabled ? "@metadata" : "[@metadata][input][elasticsearch]")
-     expect(meta_base.keys).to eq(fields)
    end
 
-   it 'should be able to reference metadata fields in `add_field` decorations' do
-     config = %q[
-       input {
-         elasticsearch {
-           hosts => ["localhost"]
-           query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-           docinfo => true
-           add_field => {
-             'identifier' => "foo:%{[@metadata][_type]}:%{[@metadata][_id]}"
-           }
-         }
-       }
-     ]
+   context 'add_field' do
+     let(:config) { base_config.merge 'docinfo' => true,
+                    'add_field' => { 'identifier' => "foo:%{[@metadata][_type]}:%{[@metadata][_id]}" } }
 
-     event = input(config) do |pipeline, queue|
-       queue.pop
-     end
+     it 'should be able to reference metadata fields in `add_field` decorations' do
+       expect(event.get('identifier')).to eq('foo:logs:C5b2xLQwTZa76jBmHIbwHQ')
+     end if ecs_select.active_mode == :disabled
 
-     expect(event.get('identifier')).to eq('foo:logs:C5b2xLQwTZa76jBmHIbwHQ')
-   end if ecs_select.active_mode == :disabled
+   end
 
  end
 
- end
+ context "when not defining the docinfo" do
+   let(:config) { base_config }
 
- context "when not defining the docinfo" do
-   it 'should keep the document information in the root of the event' do
-     config = %q[
-       input {
-         elasticsearch {
-           hosts => ["localhost"]
-           query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
-         }
-       }
-     ]
-     event = input(config) do |pipeline, queue|
-       queue.pop
+   it 'should keep the document information in the root of the event' do
+     expect(event.get("[@metadata]")).to be_empty
    end
-
-   expect(event.get("[@metadata]")).to be_empty
  end
+
  end
  end
 
@@ -740,9 +738,7 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
    begin
      @server = WEBrick::HTTPServer.new :Port => 0, :DocumentRoot => ".",
                :Logger => Cabin::Channel.get, # silence WEBrick logging
-               :StartCallback => Proc.new {
-                 queue.push("started")
-               }
+               :StartCallback => Proc.new { queue.push("started") }
      @port = @server.config[:Port]
      @server.mount_proc '/' do |req, res|
        res.body = '''
@@ -811,11 +807,9 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
        @first_req_waiter.countDown()
      end
 
-
-
      @server.start
    rescue => e
-     puts "Error in webserver thread #{e}"
+     warn "ERROR in webserver thread #{e.inspect}\n #{e.backtrace.join("\n ")}"
      # ignore
    end
  end
@@ -914,6 +908,8 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
 
      plugin.register
    end
+
+   after { plugin.do_stop }
  end
  end
 
@@ -942,18 +938,98 @@ describe LogStash::Inputs::Elasticsearch, :ecs_compatibility_support do
  end
 
  it "should properly schedule" do
-   expect(plugin).to receive(:do_run) {
-     queue << LogStash::Event.new({})
-   }.at_least(:twice)
-   runner = Thread.start { plugin.run(queue) }
-   sleep 3.0
-   plugin.stop
-   runner.join
-   expect(queue.size).to be >= 2
+   begin
+     expect(plugin).to receive(:do_run) {
+       queue << LogStash::Event.new({})
+     }.at_least(:twice)
+     runner = Thread.start { plugin.run(queue) }
+     expect(queue.pop).not_to be_nil
+     cron_jobs = plugin.instance_variable_get(:@_scheduler).instance_variable_get(:@impl).jobs
+     expect(cron_jobs[0].next_time - cron_jobs[0].last_time).to be <= 5.0
+     expect(queue.pop).not_to be_nil
+   ensure
+     plugin.do_stop
+     runner.join if runner
+   end
  end
 
  end
 
+ context "retries" do
+   let(:mock_response) do
+     {
+       "_scroll_id" => "cXVlcnlUaGVuRmV0Y2g",
+       "took" => 27,
+       "timed_out" => false,
+       "_shards" => {
+         "total" => 169,
+         "successful" => 169,
+         "failed" => 0
+       },
+       "hits" => {
+         "total" => 1,
+         "max_score" => 1.0,
+         "hits" => [ {
+           "_index" => "logstash-2014.10.12",
+           "_type" => "logs",
+           "_id" => "C5b2xLQwTZa76jBmHIbwHQ",
+           "_score" => 1.0,
+           "_source" => { "message" => ["ohayo"] }
+         } ]
+       }
+     }
+   end
+
+   let(:mock_scroll_response) do
+     {
+       "_scroll_id" => "r453Wc1jh0caLJhSDg",
+       "hits" => { "hits" => [] }
+     }
+   end
+
+   before(:each) do
+     client = Elasticsearch::Client.new
+     allow(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
+     allow(client).to receive(:search).with(any_args).and_return(mock_response)
+     allow(client).to receive(:scroll).with({ :body => { :scroll_id => "cXVlcnlUaGVuRmV0Y2g" }, :scroll=> "1m" }).and_return(mock_scroll_response)
+     allow(client).to receive(:clear_scroll).and_return(nil)
+     allow(client).to receive(:ping)
+   end
+
+   let(:config) do
+     {
+       "hosts" => ["localhost"],
+       "query" => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }',
+       "retries" => 1
+     }
+   end
+
+   it "retries and logs an error when all search requests fail" do
+     expect(plugin.logger).to receive(:error).with(/Tried .* unsuccessfully/,
+                                                   hash_including(:message => 'Manticore::UnknownException'))
+     expect(plugin.logger).to receive(:warn).twice.with(/Attempt to .* but failed/,
+                                                        hash_including(:exception => "Manticore::UnknownException"))
+     expect(plugin).to receive(:search_request).with(instance_of(Hash)).and_raise(Manticore::UnknownException).at_least(:twice)
+
+     plugin.register
+
+     expect{ plugin.run(queue) }.not_to raise_error
+     expect(queue.size).to eq(0)
+   end
+
+   it "retries successfully when the search request fails once" do
+     expect(plugin.logger).to receive(:warn).once.with(/Attempt to .* but failed/,
+                                                       hash_including(:exception => "Manticore::UnknownException"))
+     expect(plugin).to receive(:search_request).with(instance_of(Hash)).once.and_raise(Manticore::UnknownException)
+     expect(plugin).to receive(:search_request).with(instance_of(Hash)).once.and_call_original
+
+     plugin.register
+
+     expect{ plugin.run(queue) }.not_to raise_error
+     expect(queue.size).to eq(1)
+   end
+ end
+
  # @note can be removed once we depends on elasticsearch gem >= 6.x
  def extract_transport(client) # on 7.x client.transport is a ES::Transport::Client
    client.transport.respond_to?(:transport) ? client.transport.transport : client.transport
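
Note the attempt arithmetic these specs encode: with `retries => 1` the query is attempted at most twice (one initial run plus one retry), so the all-fail case expects two "Attempt to ... but failed" warnings followed by a single "Tried ... unsuccessfully" error and an empty queue, while the recover case expects one warning and one event in the queue.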
data/spec/inputs/integration/elasticsearch_spec.rb CHANGED
@@ -72,25 +72,45 @@ describe LogStash::Inputs::Elasticsearch do
 
  describe 'against a secured elasticsearch', secure_integration: true do
 
+   # client_options is for an out-of-band helper
    let(:client_options) { { :ca_file => ca_file, :user => user, :password => password } }
 
-   let(:config) { super().merge('user' => user, 'password' => password, 'ssl' => true, 'ca_file' => ca_file) }
+   let(:config) { super().merge('user' => user, 'password' => password) }
 
-   it_behaves_like 'an elasticsearch index plugin'
+   shared_examples 'secured_elasticsearch' do
+     it_behaves_like 'an elasticsearch index plugin'
 
-   context "incorrect auth credentials" do
+     context "incorrect auth credentials" do
 
-     let(:config) do
-       super().merge('user' => 'archer', 'password' => 'b0gus!')
-     end
+       let(:config) do
+         super().merge('user' => 'archer', 'password' => 'b0gus!')
+       end
 
-     let(:queue) { [] }
+       let(:queue) { [] }
 
-     it "fails to run the plugin" do
-       expect { plugin.register }.to raise_error Elasticsearch::Transport::Transport::Errors::Unauthorized
+       it "fails to run the plugin" do
+         expect { plugin.register }.to raise_error Elasticsearch::Transport::Transport::Errors::Unauthorized
+       end
      end
    end
 
+   context 'with ca_file' do
+     let(:config) { super().merge('ssl' => true, 'ca_file' => ca_file) }
+     it_behaves_like 'secured_elasticsearch'
+   end
+
+   context 'with `ca_trusted_fingerprint`' do
+     let(:ca_trusted_fingerprint) { File.read("spec/fixtures/test_certs/ca.der.sha256").chomp }
+     let(:config) { super().merge('ssl' => true, 'ca_trusted_fingerprint' => ca_trusted_fingerprint) }
+
+     if Gem::Version.create(LOGSTASH_VERSION) >= Gem::Version.create("8.3.0")
+       it_behaves_like 'secured_elasticsearch'
+     else
+       it 'raises a configuration error' do
+         expect { plugin }.to raise_exception(LogStash::ConfigurationError, a_string_including("ca_trusted_fingerprint"))
+       end
+     end
+   end
  end
 
  context 'setting host:port', integration: true do
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-input-elasticsearch
  version: !ruby/object:Gem::Version
-   version: 4.12.3
+   version: 4.15.0
  platform: ruby
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2022-03-29 00:00:00.000000000 Z
+ date: 2022-08-08 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement
@@ -72,6 +72,20 @@ dependencies:
      - - "~>"
        - !ruby/object:Gem::Version
          version: '1.0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
+   name: logstash-mixin-scheduler
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement
      requirements:
@@ -89,24 +103,24 @@ dependencies:
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement
      requirements:
-     - - ">="
+     - - "~>"
        - !ruby/object:Gem::Version
-         version: '0'
-   name: tzinfo
+         version: '1.0'
+   name: logstash-mixin-ca_trusted_fingerprint_support
    prerelease: false
    type: :runtime
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
-     - - ">="
+     - - "~>"
        - !ruby/object:Gem::Version
-         version: '0'
+         version: '1.0'
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - ">="
        - !ruby/object:Gem::Version
          version: '0'
-   name: tzinfo-data
+   name: tzinfo
    prerelease: false
    type: :runtime
    version_requirements: !ruby/object:Gem::Requirement
@@ -120,7 +134,7 @@ dependencies:
      - - ">="
        - !ruby/object:Gem::Version
          version: '0'
-   name: rufus-scheduler
+   name: tzinfo-data
    prerelease: false
    type: :runtime
    version_requirements: !ruby/object:Gem::Requirement
@@ -212,6 +226,20 @@ dependencies:
      - - ">="
        - !ruby/object:Gem::Version
          version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: 3.0.9
+   name: rufus-scheduler
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: 3.0.9
  description: This gem is a Logstash plugin required to be installed on top of the
    Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This
    gem is not a stand-alone program
@@ -227,12 +255,14 @@ files:
  - NOTICE.TXT
  - README.md
  - docs/index.asciidoc
+ - lib/logstash/helpers/loggable_try.rb
  - lib/logstash/inputs/elasticsearch.rb
  - lib/logstash/inputs/elasticsearch/patches/_elasticsearch_transport_connections_selector.rb
  - lib/logstash/inputs/elasticsearch/patches/_elasticsearch_transport_http_manticore.rb
  - logstash-input-elasticsearch.gemspec
  - spec/es_helper.rb
  - spec/fixtures/test_certs/ca.crt
+ - spec/fixtures/test_certs/ca.der.sha256
  - spec/fixtures/test_certs/ca.key
  - spec/fixtures/test_certs/es.crt
  - spec/fixtures/test_certs/es.key
@@ -266,6 +296,7 @@ summary: Reads query results from an Elasticsearch cluster
  test_files:
  - spec/es_helper.rb
  - spec/fixtures/test_certs/ca.crt
+ - spec/fixtures/test_certs/ca.der.sha256
  - spec/fixtures/test_certs/ca.key
  - spec/fixtures/test_certs/es.crt
  - spec/fixtures/test_certs/es.key