logstash-output-opensearch 1.1.0-java → 1.2.0-java

Files changed (30)
  1. checksums.yaml +4 -4
  2. checksums.yaml.gz.sig +0 -0
  3. data/CONTRIBUTING.md +26 -0
  4. data/README.md +2 -1
  5. data/lib/logstash/outputs/opensearch/http_client/manticore_adapter.rb +100 -3
  6. data/lib/logstash/outputs/opensearch/http_client.rb +11 -21
  7. data/lib/logstash/outputs/opensearch/http_client_builder.rb +5 -1
  8. data/lib/logstash/outputs/opensearch.rb +1 -1
  9. data/lib/logstash/plugin_mixins/opensearch/api_configs.rb +26 -0
  10. data/logstash-output-opensearch.gemspec +18 -3
  11. data/spec/integration/outputs/compressed_indexing_spec.rb +3 -3
  12. data/spec/integration/outputs/create_spec.rb +7 -7
  13. data/spec/integration/outputs/delete_spec.rb +8 -8
  14. data/spec/integration/outputs/index_spec.rb +45 -9
  15. data/spec/integration/outputs/index_version_spec.rb +11 -11
  16. data/spec/integration/outputs/ingest_pipeline_spec.rb +0 -2
  17. data/spec/integration/outputs/metrics_spec.rb +0 -2
  18. data/spec/integration/outputs/no_opensearch_on_startup_spec.rb +0 -1
  19. data/spec/integration/outputs/painless_update_spec.rb +10 -10
  20. data/spec/integration/outputs/retry_spec.rb +0 -1
  21. data/spec/integration/outputs/sniffer_spec.rb +2 -2
  22. data/spec/integration/outputs/templates_spec.rb +0 -2
  23. data/spec/integration/outputs/update_spec.rb +14 -14
  24. data/spec/opensearch_spec_helper.rb +2 -2
  25. data/spec/unit/outputs/opensearch/http_client/manticore_adapter_spec.rb +73 -3
  26. data/spec/unit/outputs/opensearch/http_client_spec.rb +5 -4
  27. data/spec/unit/outputs/opensearch_spec.rb +2 -2
  28. data.tar.gz.sig +0 -0
  29. metadata +36 -2
  30. metadata.gz.sig +0 -0
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: de9f143d61a40ce521ac3500f2af934aa8a8d0da3cea50ac74b48f410ec54107
- data.tar.gz: 46195137ee6c66e9fcc2b9a24f1ff474bf1444c4e5507560373dcf695014be64
+ metadata.gz: 2d7793b7f8cb45bd6d1d6a0893357497b1080fa46733e7806bd7bd6d0ec63ef3
+ data.tar.gz: d577bb202ad1e25d6ef34358dfad0d26cb22a0a579af4ba45343702c90768450
  SHA512:
- metadata.gz: e96a1a3e78e07b6b74134483eb5ad6353c356a1f322a8b82222489241ecf3bcd59550885759217231973ac450fc3b284e38719c0e23bf80cfb54b8ee2dafdfad
- data.tar.gz: 1ed119eefce3e7666f58e3cc57850c3809ec9dbbc7d6603f67808bd8d07a598c498580f5195c2aa361b4ee09cd6d49fb05170aa98d3faaf249a9c633876193e0
+ metadata.gz: a14b81ed9f97bd3c518227b5fc2e53cfbabab713dedd174aa128ac93674bb00e467e41eff9d11c60da3edd1e563f0da7e31fcdfc0d05f97f591adb93524fb82e
+ data.tar.gz: fb586673825bcc572f1e6b47fefa43540d6515e47542e2415838b56e7a9091586b16ebc07a2fb000955aab70c0459b7496564b88483064807c199711be42f5bf
checksums.yaml.gz.sig CHANGED
Binary file
data/CONTRIBUTING.md CHANGED
@@ -6,6 +6,7 @@
  - [Documentation Changes](#documentation-changes)
  - [Contributing Code](#contributing-code)
  - [Developer Certificate of Origin](#developer-certificate-of-origin)
+ - [License Headers](#license-headers)
  - [Review Process](#review-process)

  ## Contributing to logstash-output-opensearch
@@ -84,6 +85,31 @@ Signed-off-by: Jane Smith <jane.smith@email.com>
  ```
  You may type this line on your own when writing your commit messages. However, if your user.name and user.email are set in your git configs, you can use `-s` or `--signoff` to add the `Signed-off-by` line to the end of the commit message.

+ ## License Headers
+
+ New files in your code contributions should contain the following license header. If you are modifying existing files with license headers, or including new files that already have license headers, do not remove or modify them without guidance.
+
+ ### Java
+
+ ```
+ /*
+  * Copyright OpenSearch Contributors
+  * SPDX-License-Identifier: Apache-2.0
+  */
+ ```
+
+ ### Python
+ ```
+ # Copyright OpenSearch Contributors
+ # SPDX-License-Identifier: Apache-2.0
+ ```
+
+ ### Shell
+ ```
+ # Copyright OpenSearch Contributors
+ # SPDX-License-Identifier: Apache-2.0
+ ```
+
  ## Review Process

  We deeply appreciate everyone who takes the time to make a contribution. We will review all contributions as quickly as possible. As a reminder, [opening an issue](https://github.com/opensearch-project/logstash-output-opensearch/issues/new/choose) discussing your change before you make it is the best way to smooth the PR process. This will prevent a rejection because someone else is already working on the problem, or because the solution is incompatible with the architectural direction.
data/README.md CHANGED
@@ -15,7 +15,8 @@
  ## Project Resources

  * [Project Website](https://opensearch.org/)
- * [Documentation](https://opensearch.org/)
+ * [Documentation](https://opensearch.org/docs/clients/logstash/index/)
+ * [Developer Guide](DEVELOPER_GUIDE.md)
  * Need help? Try [Forums](https://discuss.opendistrocommunity.dev/)
  * [Project Principles](https://opensearch.org/#principles)
  * [Contributing to OpenSearch](CONTRIBUTING.md)
data/lib/logstash/outputs/opensearch/http_client/manticore_adapter.rb CHANGED
@@ -7,12 +7,31 @@
  # Modifications Copyright OpenSearch Contributors. See
  # GitHub history for details.

- require 'manticore'
+ require 'aws-sdk-core'
  require 'cgi'
+ require 'manticore'
+ require 'uri'

  module LogStash; module Outputs; class OpenSearch; class HttpClient;
+ AWS_DEFAULT_PORT = 443
+ AWS_DEFAULT_PROFILE = 'default'
+ AWS_DEFAULT_PROFILE_CREDENTIAL_RETRY = 0
+ AWS_DEFAULT_PROFILE_CREDENTIAL_TIMEOUT = 1
+ AWS_DEFAULT_REGION = 'us-east-1'
+ AWS_IAM_AUTH_TYPE = "aws_iam"
+ AWS_SERVICE = 'es'
+ BASIC_AUTH_TYPE = 'basic'
  DEFAULT_HEADERS = { "content-type" => "application/json" }
-
+
+ AWSIAMCredential = Struct.new(
+ :access_key_id,
+ :secret_access_key,
+ :session_token,
+ :profile,
+ :instance_profile_credentials_retries,
+ :instance_profile_credentials_timeout,
+ :region)
+
  class ManticoreAdapter
  attr_reader :manticore, :logger

@@ -27,13 +46,65 @@ module LogStash; module Outputs; class OpenSearch; class HttpClient;
  options[:cookies] = false

  @client_params = {:headers => DEFAULT_HEADERS.merge(options[:headers] || {})}
-
+ @type = get_auth_type(options) || nil
+
+ if @type == AWS_IAM_AUTH_TYPE
+ aws_iam_auth_initialization(options)
+ elsif @type == BASIC_AUTH_TYPE
+ basic_auth_initialization(options)
+ end
+
  if options[:proxy]
  options[:proxy] = manticore_proxy_hash(options[:proxy])
  end

  @manticore = ::Manticore::Client.new(options)
  end
+
+ def get_auth_type(options)
+ if options[:auth_type] != nil
+ options[:auth_type]["type"]
+ end
+ end
+
+ def aws_iam_auth_initialization(options)
+ aws_access_key_id = options[:auth_type]["aws_access_key_id"] || nil
+ aws_secret_access_key = options[:auth_type]["aws_secret_access_key"] || nil
+ session_token = options[:auth_type]["session_token"] || nil
+ profile = options[:auth_type]["profile"] || AWS_DEFAULT_PROFILE
+ instance_cred_retries = options[:auth_type]["instance_profile_credentials_retries"] || AWS_DEFAULT_PROFILE_CREDENTIAL_RETRY
+ instance_cred_timeout = options[:auth_type]["instance_profile_credentials_timeout"] || AWS_DEFAULT_PROFILE_CREDENTIAL_TIMEOUT
+ region = options[:auth_type]["region"] || AWS_DEFAULT_REGION
+ set_aws_region(region)
+
+ credential_config = AWSIAMCredential.new(aws_access_key_id, aws_secret_access_key, session_token, profile, instance_cred_retries, instance_cred_timeout, region)
+ @credentials = Aws::CredentialProviderChain.new(credential_config).resolve
+ end
+
+ def basic_auth_initialization(options)
+ set_user_password(options)
+ end
+
+ def set_aws_region(region)
+ @region = region
+ end
+
+ def get_aws_region()
+ @region
+ end
+
+ def set_user_password(options)
+ @user = options[:auth_type]["user"]
+ @password = options[:auth_type]["password"]
+ end
+
+ def get_user()
+ @user
+ end
+
+ def get_password()
+ @password
+ end

  # Transform the proxy option to a hash. Manticore's support for non-hash
  # proxy options is broken. This was fixed in https://github.com/cheald/manticore/commit/34a00cee57a56148629ed0a47c329181e7319af5
@@ -61,6 +132,8 @@ module LogStash; module Outputs; class OpenSearch; class HttpClient;
  params = (params || {}).merge(@client_params) { |key, oldval, newval|
  (oldval.is_a?(Hash) && newval.is_a?(Hash)) ? oldval.merge(newval) : newval
  }
+
+ params[:headers] = params[:headers].clone
  params[:body] = body if body

  if url.user
@@ -71,9 +144,16 @@ module LogStash; module Outputs; class OpenSearch; class HttpClient;
  :password => CGI.unescape(url.password),
  :eager => true
  }
+ elsif @type == BASIC_AUTH_TYPE
+ add_basic_auth_to_params(params)
  end

  request_uri = format_url(url, path)
+
+ if @type == AWS_IAM_AUTH_TYPE
+ sign_aws_request(request_uri, path, method, params)
+ end
+
  request_uri_as_string = remove_double_escaping(request_uri.to_s)
  resp = @manticore.send(method.downcase, request_uri_as_string, params)

@@ -92,6 +172,23 @@ module LogStash; module Outputs; class OpenSearch; class HttpClient;
  resp
  end

+ def sign_aws_request(request_uri, path, method, params)
+ url = URI::HTTPS.build({:host=>URI(request_uri.to_s).host, :port=>AWS_DEFAULT_PORT.to_s, :path=>path})
+ key = Seahorse::Client::Http::Request.new(options={:endpoint=>url, :http_method => method.to_s.upcase,
+ :headers => params[:headers],:body => params[:body]})
+ aws_signer = Aws::Signers::V4.new(@credentials, AWS_SERVICE, get_aws_region )
+ signed_key = aws_signer.sign(key)
+ params[:headers] = params[:headers].merge(signed_key.headers)
+ end
+
+ def add_basic_auth_to_params(params)
+ params[:auth] = {
+ :user => get_user(),
+ :password => get_password(),
+ :eager => true
+ }
+ end
+
  # Returned urls from this method should be checked for double escaping.
  def format_url(url, path_and_query=nil)
  request_uri = url.clone
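The constructor changes above branch on `options[:auth_type]["type"]`. As a quick reference, here is a minimal, standalone Ruby sketch of that dispatch; the method name and returned symbols are illustrative placeholders, not the plugin's actual API (the real adapter initializes SigV4 signing or basic-auth credentials rather than returning a symbol):

```ruby
# Illustrative sketch of the auth-type dispatch added in 1.2.0.
# The returned symbols stand in for the adapter's real initialization paths.
AWS_IAM_AUTH_TYPE = "aws_iam"
BASIC_AUTH_TYPE = "basic"

def resolve_auth_strategy(options)
  # auth_type is an optional hash like { "type" => "aws_iam", ... }
  type = options[:auth_type] && options[:auth_type]["type"]
  case type
  when AWS_IAM_AUTH_TYPE then :sign_each_request_with_sigv4
  when BASIC_AUTH_TYPE   then :attach_basic_auth_credentials
  else :no_auth
  end
end
```

Keeping the dispatch in one place means requests made later (see `sign_aws_request` and `add_basic_auth_to_params` below the constructor) only consult the already-resolved `@type`.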
data/lib/logstash/outputs/opensearch/http_client.rb CHANGED
@@ -15,23 +15,9 @@ require 'zlib'
  require 'stringio'

  module LogStash; module Outputs; class OpenSearch;
- # This is a constant instead of a config option because
- # there really isn't a good reason to configure it.
- #
- # The criteria used are:
- # 1. We need a number that's less than 100MiB because OpenSearch
- # won't accept bulks larger than that.
- # 2. It must be large enough to amortize the connection constant
- # across multiple requests.
- # 3. It must be small enough that even if multiple threads hit this size
- # we won't use a lot of heap.
- #
- # We wound up agreeing that a number greater than 10 MiB and less than 100MiB
- # made sense. We picked one on the lowish side to not use too much heap.
- TARGET_BULK_BYTES = 20 * 1024 * 1024 # 20MiB
-
  class HttpClient
- attr_reader :client, :options, :logger, :pool, :action_count, :recv_count
+ attr_reader :client, :options, :logger, :pool, :action_count, :recv_count, :target_bulk_bytes
+
  # This is here in case we use DEFAULT_OPTIONS in the future
  # DEFAULT_OPTIONS = {
  # :setting => value
@@ -41,13 +27,14 @@ module LogStash; module Outputs; class OpenSearch;
  # The `options` is a hash where the following symbol keys have meaning:
  #
  # * `:hosts` - array of String. Set a list of hosts to use for communication.
- # * `:port` - number. set the port to use to communicate with OpenSearch
  # * `:user` - String. The user to use for authentication.
  # * `:password` - String. The password to use for authentication.
  # * `:timeout` - Float. A duration value, in seconds, after which a socket
  # operation or request will be aborted if not yet successfull
+ # * `:auth_type` - hash of String. It contains the type of authentication
+ # and its respective credentials
  # * `:client_settings` - a hash; see below for keys.
- #
+
  # The `client_settings` key is a has that can contain other settings:
  #
  # * `:ssl` - Boolean. Enable or disable SSL/TLS.
@@ -72,6 +59,8 @@ module LogStash; module Outputs; class OpenSearch;
  # mutex to prevent requests and sniffing to access the
  # connection pool at the same time
  @bulk_path = @options[:bulk_path]
+
+ @target_bulk_bytes = @options[:target_bulk_bytes]
  end

  def build_url_template
@@ -104,7 +93,6 @@ module LogStash; module Outputs; class OpenSearch;
  def bulk(actions)
  @action_count ||= 0
  @action_count += actions.size
-
  return if actions.empty?

  bulk_actions = actions.collect do |action, args, source|
@@ -131,7 +119,7 @@ module LogStash; module Outputs; class OpenSearch;
  action.map {|line| LogStash::Json.dump(line)}.join("\n") :
  LogStash::Json.dump(action)
  as_json << "\n"
- if (stream_writer.pos + as_json.bytesize) > TARGET_BULK_BYTES && stream_writer.pos > 0
+ if (stream_writer.pos + as_json.bytesize) > @target_bulk_bytes && stream_writer.pos > 0
  stream_writer.flush # ensure writer has sync'd buffers before reporting sizes
  logger.debug("Sending partial bulk request for batch with one or more actions remaining.",
  :action_count => batch_actions.size,
@@ -324,6 +312,8 @@ module LogStash; module Outputs; class OpenSearch;

  adapter_options[:headers] = client_settings[:headers] if client_settings[:headers]

+ adapter_options[:auth_type] = options[:auth_type]
+
  adapter_class = ::LogStash::Outputs::OpenSearch::HttpClient::ManticoreAdapter
  adapter = adapter_class.new(@logger, adapter_options)
  end
@@ -400,7 +390,7 @@ module LogStash; module Outputs; class OpenSearch;

  def template_endpoint
  # TODO: Check Version < 7.8 and use index template for >= 7.8 & OpenSearch
- # https://docs-beta.opensearch.org/opensearch/index-templates/
+ # https://opensearch.org/docs/opensearch/index-templates/
  '_template'
  end

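With the hard-coded `TARGET_BULK_BYTES` constant replaced by the configurable `@target_bulk_bytes` reader shown above, the bulk path flushes whenever the next serialized action would push a request past the configured byte budget. A simplified, self-contained sketch of that batching rule (the real client streams NDJSON through a buffered, optionally gzipped writer rather than collecting arrays):

```ruby
require 'json'

# Split actions into bulk batches so each batch's serialized size stays at or
# under target_bulk_bytes, mirroring the flush condition in HttpClient#bulk.
def split_bulk(actions, target_bulk_bytes)
  batches = [[]]
  bytes = 0
  actions.each do |action|
    line = JSON.dump(action) + "\n" # one NDJSON line per action
    # Flush before exceeding the budget, but never emit an empty batch.
    if bytes + line.bytesize > target_bulk_bytes && bytes > 0
      batches << []
      bytes = 0
    end
    batches.last << action
    bytes += line.bytesize
  end
  batches
end
```

Under the new 9MiB default, a burst of events serializing to roughly 20MiB would go out as three bulk requests instead of one oversized request that the server might reject.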
data/lib/logstash/outputs/opensearch/http_client_builder.rb CHANGED
@@ -35,6 +35,7 @@ module LogStash; module Outputs; class OpenSearch;
  end

  common_options[:timeout] = params["timeout"] if params["timeout"]
+ common_options[:target_bulk_bytes] = params["target_bulk_bytes"]

  if params["path"]
  client_settings[:path] = dedup_slashes("/#{params["path"]}/")
@@ -106,7 +107,10 @@ module LogStash; module Outputs; class OpenSearch;
  }
  common_options.merge! update_options if params["action"] == 'update'

- create_http_client(common_options.merge(:hosts => hosts, :logger => logger))
+ create_http_client(common_options.merge(:hosts => hosts,
+ :logger => logger,
+ :auth_type => params["auth_type"]
+ ))
  end

  def self.create_http_client(options)
data/lib/logstash/outputs/opensearch.rb CHANGED
@@ -95,7 +95,7 @@ class LogStash::Outputs::OpenSearch < LogStash::Outputs::Base
  # - A sprintf style string to change the action based on the content of the event. The value `%{[foo]}`
  # would use the foo field for the action
  #
- # For more details on actions, check out the https://docs-beta.opensearch.org/opensearch/rest-api/bulk/[OpenSearch bulk API documentation]
+ # For more details on actions, check out the https://opensearch.org/docs/opensearch/rest-api/document-apis/bulk/[OpenSearch bulk API documentation]
  config :action, :validate => :string, :default => "index"

  # The index to write events to. This can be dynamic using the `%{foo}` syntax.
data/lib/logstash/plugin_mixins/opensearch/api_configs.rb CHANGED
@@ -20,6 +20,18 @@ module LogStash; module PluginMixins; module OpenSearch
  # Password to authenticate to a secure OpenSearch cluster
  :password => { :validate => :password },

+ # if auth_type is "aws_iam" then
+ # Credential resolution logic works as follows:
+ #
+ # - User passed aws_access_key_id and aws_secret_access_key in opensearch configuration
+ # - Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
+ # (RECOMMENDED since they are recognized by all the AWS SDKs and CLI except for .NET),
+ # or AWS_ACCESS_KEY and AWS_SECRET_KEY (only recognized by Java SDK)
+ # - Credential profiles file at the default location (~/.aws/credentials) shared by all AWS SDKs and the AWS CLI
+ # - Instance profile credentials delivered through the Amazon EC2 metadata service
+ # - type in auth_type specifies the type of authentication
+ :auth_type => { },
+
  # The document ID for the index. Useful for overwriting existing entries in
  # OpenSearch with the same ID.
  :document_id => { :validate => :string },
@@ -34,6 +46,20 @@ module LogStash; module PluginMixins; module OpenSearch
  # this defaults to a concatenation of the path parameter and "_bulk"
  :bulk_path => { :validate => :string },

+ # Maximum number of bytes in bulk requests
+ # The criteria for deciding the default value of target_bulk_bytes is:
+ # 1. We need a number that's less than 10MiB because OpenSearch is commonly
+ # configured (particularly in AWS OpenSearch Service) to not accept
+ # bulks larger than that.
+ # 2. It must be large enough to amortize the connection constant
+ # across multiple requests.
+ # 3. It must be small enough that even if multiple threads hit this size
+ # we won't use a lot of heap.
+ :target_bulk_bytes => {
+ :validate => :number,
+ :default => 9 * 1024 * 1024 # 9MiB
+ },
+
  # Pass a set of key value pairs as the URL query string. This query string is added
  # to every host listed in the 'hosts' configuration. If the 'hosts' list contains
  # urls that already have query strings, the one specified here will be appended.
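For reference, the new `auth_type` and `target_bulk_bytes` settings documented above could be combined in an output block like the following sketch (endpoint, region, and index are placeholder values; with the access keys omitted under `aws_iam`, credentials resolve through the chain described in the comments above):

```
output {
  opensearch {
    hosts => ["https://search-mydomain.us-east-1.es.amazonaws.com:443"]
    auth_type => {
      type => "aws_iam"
      region => "us-east-1"
    }
    index => "logstash-%{+YYYY.MM.dd}"
    target_bulk_bytes => 9437184   # 9MiB, the new default
  }
}
```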
data/logstash-output-opensearch.gemspec CHANGED
@@ -1,6 +1,17 @@
+ # SPDX-License-Identifier: Apache-2.0
+ #
+ # The OpenSearch Contributors require contributions made to
+ # this file be licensed under the Apache-2.0 license or a
+ # compatible open source license.
+ #
+ # Modifications Copyright OpenSearch Contributors. See
+ # GitHub history for details.
+
+ signing_key_path = "gem-private_key.pem"
+
  Gem::Specification.new do |s|
  s.name = 'logstash-output-opensearch'
- s.version = '1.1.0'
+ s.version = '1.2.0'

  s.licenses = ['Apache-2.0']
  s.summary = "Stores logs in OpenSearch"
@@ -18,8 +29,10 @@ Gem::Specification.new do |s|
  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

- s.cert_chain = ['certs/opensearch-rubygems.pem']
- s.signing_key = File.expand_path("private_key.pem") if $0 =~ /gem\z/
+ if $PROGRAM_NAME.end_with?("gem") && ARGV == ["build", __FILE__] && File.exist?(signing_key_path)
+ s.signing_key = signing_key_path
+ s.cert_chain = ['certs/opensearch-rubygems.pem']
+ end

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = {
@@ -32,9 +45,11 @@ Gem::Specification.new do |s|
  s.add_runtime_dependency 'stud', ['>= 0.0.17', '~> 0.0']
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
  s.add_runtime_dependency 'logstash-mixin-ecs_compatibility_support', '~>1.0'
+ s.add_runtime_dependency 'aws-sdk', '>= 2.11.632', '~> 2'

  s.add_development_dependency 'logstash-codec-plain'
  s.add_development_dependency 'logstash-devutils'
  s.add_development_dependency 'flores'
  s.add_development_dependency 'cabin', ['~> 0.6']
+ s.add_development_dependency 'opensearch-ruby'
  end
data/spec/integration/outputs/compressed_indexing_spec.rb CHANGED
@@ -33,8 +33,8 @@ describe "indexing with http_compression turned on", :integration => true do
  }
  subject { LogStash::Outputs::OpenSearch.new(config) }

- let(:es_url) { "http://#{get_host_port}" }
- let(:index_url) {"#{es_url}/#{index}"}
+ let(:opensearch_url) { "http://#{get_host_port}" }
+ let(:index_url) {"#{opensearch_url}/#{index}"}
  let(:http_client_options) { {} }
  let(:http_client) do
  Manticore::Client.new(http_client_options)
@@ -49,7 +49,7 @@ describe "indexing with http_compression turned on", :integration => true do
  it "ships events" do
  subject.multi_receive(events)

- http_client.post("#{es_url}/_refresh").call
+ http_client.post("#{opensearch_url}/_refresh").call

  response = http_client.get("#{index_url}/_count?q=*")
  result = LogStash::Json.load(response.body)
data/spec/integration/outputs/create_spec.rb CHANGED
@@ -12,7 +12,7 @@ require_relative "../../../spec/opensearch_spec_helper"
  describe "client create actions", :integration => true do
  require "logstash/outputs/opensearch"

- def get_es_output(action, id, version=nil, version_type=nil)
+ def get_output(action, id, version=nil, version_type=nil)
  settings = {
  "manage_template" => true,
  "index" => "logstash-create",
@@ -37,7 +37,7 @@ describe "client create actions", :integration => true do

  context "when action => create" do
  it "should create new documents with or without id" do
- subject = get_es_output("create", "id123")
+ subject = get_output("create", "id123")
  subject.register
  subject.multi_receive([LogStash::Event.new("message" => "sample message here")])
  @client.indices.refresh
@@ -49,27 +49,27 @@ describe "client create actions", :integration => true do
  end

  it "should allow default (internal) version" do
- subject = get_es_output("create", "id123", 43)
+ subject = get_output("create", "id123", 43)
  subject.register
  end

  it "should allow internal version" do
- subject = get_es_output("create", "id123", 43, "internal")
+ subject = get_output("create", "id123", 43, "internal")
  subject.register
  end

  it "should not allow external version" do
- subject = get_es_output("create", "id123", 43, "external")
+ subject = get_output("create", "id123", 43, "external")
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end

  it "should not allow external_gt version" do
- subject = get_es_output("create", "id123", 43, "external_gt")
+ subject = get_output("create", "id123", 43, "external_gt")
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end

  it "should not allow external_gte version" do
- subject = get_es_output("create", "id123", 43, "external_gte")
+ subject = get_output("create", "id123", 43, "external_gte")
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end
  end
data/spec/integration/outputs/delete_spec.rb CHANGED
@@ -14,15 +14,15 @@ require "logstash/outputs/opensearch"
  describe "Versioned delete", :integration => true do
  require "logstash/outputs/opensearch"

- let(:es) { get_client }
+ let(:client) { get_client }

  before :each do
  # Delete all templates first.
  # Clean ES of data before we start.
- es.indices.delete_template(:name => "*")
+ client.indices.delete_template(:name => "*")
  # This can fail if there are no indexes, ignore failure.
- es.indices.delete(:index => "*") rescue nil
- es.indices.refresh
+ client.indices.delete(:index => "*") rescue nil
+ client.indices.refresh
  end

  context "when delete only" do
@@ -48,12 +48,12 @@ describe "Versioned delete", :integration => true do
  it "should ignore non-monotonic external version updates" do
  id = "ev2"
  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_action" => "index", "message" => "foo", "my_version" => 99)])
- r = es.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true)
+ r = client.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true)
  expect(r['_version']).to eq(99)
  expect(r['_source']['message']).to eq('foo')

  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_action" => "delete", "message" => "foo", "my_version" => 98)])
- r2 = es.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true)
+ r2 = client.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true)
  expect(r2['_version']).to eq(99)
  expect(r2['_source']['message']).to eq('foo')
  end
@@ -61,12 +61,12 @@ describe "Versioned delete", :integration => true do
  it "should commit monotonic external version updates" do
  id = "ev3"
  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_action" => "index", "message" => "foo", "my_version" => 99)])
- r = es.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true)
+ r = client.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true)
  expect(r['_version']).to eq(99)
  expect(r['_source']['message']).to eq('foo')

  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_action" => "delete", "message" => "foo", "my_version" => 100)])
- expect { es.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true) }.to raise_error(Elasticsearch::Transport::Transport::Errors::NotFound)
+ expect { client.get(:index => 'logstash-delete', :type => doc_type, :id => id, :refresh => true) }.to raise_error(OpenSearch::Transport::Transport::Errors::NotFound)
  end
  end
  end
data/spec/integration/outputs/index_spec.rb CHANGED
@@ -10,8 +10,7 @@
  require_relative "../../../spec/opensearch_spec_helper"
  require "logstash/outputs/opensearch"

- describe "TARGET_BULK_BYTES", :integration => true do
- let(:target_bulk_bytes) { LogStash::Outputs::OpenSearch::TARGET_BULK_BYTES }
+ describe "target_bulk_bytes", :integration => true do
  let(:event_count) { 1000 }
  let(:events) { event_count.times.map { event }.to_a }
  let(:config) {
@@ -32,11 +31,11 @@ describe "TARGET_BULK_BYTES", :integration => true do
  end

  describe "batches that are too large for one" do
- let(:event) { LogStash::Event.new("message" => "a " * (((target_bulk_bytes/2) / event_count)+1)) }
+ let(:event) { LogStash::Event.new("message" => "a " * (((subject.client.target_bulk_bytes/2) / event_count)+1)) }

  it "should send in two batches" do
  expect(subject.client).to have_received(:bulk_send).twice do |payload|
- expect(payload.size).to be <= target_bulk_bytes
+ expect(payload.size).to be <= subject.client.target_bulk_bytes
  end
  end

@@ -47,7 +46,7 @@ describe "TARGET_BULK_BYTES", :integration => true do

  it "should send in one batch" do
  expect(subject.client).to have_received(:bulk_send).once do |payload|
- expect(payload.size).to be <= target_bulk_bytes
+ expect(payload.size).to be <= subject.client.target_bulk_bytes
  end
  end
  end
@@ -63,8 +62,8 @@ describe "indexing" do
  let(:events) { event_count.times.map { event }.to_a }
  subject { LogStash::Outputs::OpenSearch.new(config) }

- let(:es_url) { "http://#{get_host_port}" }
- let(:index_url) {"#{es_url}/#{index}"}
+ let(:opensearch_url) { "http://#{get_host_port}" }
+ let(:index_url) {"#{opensearch_url}/#{index}"}
  let(:http_client_options) { {} }
  let(:http_client) do
  Manticore::Client.new(http_client_options)
@@ -79,7 +78,7 @@ describe "indexing" do
  it "ships events" do
  subject.multi_receive(events)

- http_client.post("#{es_url}/_refresh").call
+ http_client.post("#{opensearch_url}/_refresh").call

  response = http_client.get("#{index_url}/_count?q=*")
  result = LogStash::Json.load(response.body)
@@ -136,7 +135,7 @@ describe "indexing" do
  describe "a secured indexer", :secure_integration => true do
  let(:user) { "admin" }
  let(:password) { "admin" }
- let(:es_url) {"https://integration:9200"}
+ let(:opensearch_url) {"https://integration:9200"}
  let(:config) do
  {
  "hosts" => ["integration:9200"],
@@ -161,4 +160,41 @@ describe "indexing" do
  end
  it_behaves_like("an indexer", true)
  end
+
+ describe "a basic auth secured indexer", :secure_integration => true do
+ let(:options) { {
+ :auth_type => {
+ "type"=>"basic",
+ "user" => "admin",
+ "password" => "admin"}
+ } }
+ let(:user) {options[:auth_type]["user"]}
+ let(:password) {options[:auth_type]["password"]}
+ let(:opensearch_url) {"https://integration:9200"}
+ let(:config) do
+ {
+ "hosts" => ["integration:9200"],
+ "auth_type" => {
+ "type"=>"basic",
+ "user" => user,
+ "password" => password},
+ "ssl" => true,
+ "ssl_certificate_verification" => false,
+ "index" => index
+ }
+ end
+ let(:http_client_options) do
+ {
+ :auth => {
+ :user => user,
+ :password => password
+ },
+ :ssl => {
+ :enabled => true,
+ :verify => false
+ }
+ }
+ end
+ it_behaves_like("an indexer", true)
+ end
  end
data/spec/integration/outputs/index_version_spec.rb CHANGED
@@ -13,15 +13,15 @@ require "logstash/outputs/opensearch"
  describe "Versioned indexing", :integration => true do
  require "logstash/outputs/opensearch"

- let(:es) { get_client }
+ let(:client) { get_client }

  before :each do
  # Delete all templates first.
  # Clean OpenSearch of data before we start.
- es.indices.delete_template(:name => "*")
+ client.indices.delete_template(:name => "*")
  # This can fail if there are no indexes, ignore failure.
- es.indices.delete(:index => "*") rescue nil
- es.indices.refresh
+ client.indices.delete(:index => "*") rescue nil
+ client.indices.refresh
  end

  context "when index only" do
@@ -46,11 +46,11 @@ describe "Versioned indexing", :integration => true do

  it "should default to OpenSearch version" do
  subject.multi_receive([LogStash::Event.new("my_id" => "123", "message" => "foo")])
- r = es.get(:index => 'logstash-index', :type => doc_type, :id => "123", :refresh => true)
+ r = client.get(:index => 'logstash-index', :type => doc_type, :id => "123", :refresh => true)
  expect(r["_version"]).to eq(1)
  expect(r["_source"]["message"]).to eq('foo')
  subject.multi_receive([LogStash::Event.new("my_id" => "123", "message" => "foobar")])
- r2 = es.get(:index => 'logstash-index', :type => doc_type, :id => "123", :refresh => true)
+ r2 = client.get(:index => 'logstash-index', :type => doc_type, :id => "123", :refresh => true)
  expect(r2["_version"]).to eq(2)
  expect(r2["_source"]["message"]).to eq('foobar')
  end
@@ -74,7 +74,7 @@ describe "Versioned indexing", :integration => true do
  it "should respect the external version" do
  id = "ev1"
  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_version" => "99", "message" => "foo")])
- r = es.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
+ r = client.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
  expect(r["_version"]).to eq(99)
  expect(r["_source"]["message"]).to eq('foo')
  end
@@ -82,12 +82,12 @@ describe "Versioned indexing", :integration => true do
  it "should ignore non-monotonic external version updates" do
  id = "ev2"
  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_version" => "99", "message" => "foo")])
- r = es.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
+ r = client.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
  expect(r["_version"]).to eq(99)
  expect(r["_source"]["message"]).to eq('foo')

  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_version" => "98", "message" => "foo")])
- r2 = es.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
+ r2 = client.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
  expect(r2["_version"]).to eq(99)
  expect(r2["_source"]["message"]).to eq('foo')
  end
@@ -95,12 +95,12 @@ describe "Versioned indexing", :integration => true do
  it "should commit monotonic external version updates" do
  id = "ev3"
  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_version" => "99", "message" => "foo")])
- r = es.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
+ r = client.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
  expect(r["_version"]).to eq(99)
  expect(r["_source"]["message"]).to eq('foo')

  subject.multi_receive([LogStash::Event.new("my_id" => id, "my_version" => "100", "message" => "foo")])
- r2 = es.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
+ r2 = client.get(:index => 'logstash-index', :type => doc_type, :id => id, :refresh => true)
  expect(r2["_version"]).to eq(100)
  expect(r2["_source"]["message"]).to eq('foo')
106
106
  end
@@ -37,8 +37,6 @@ describe "Ingest pipeline execution behavior", :integration => true do

  before :each do
  # Delete all templates first.
- require "elasticsearch"
-
  # Clean OpenSearch of data before we start.
  @client = get_client
  @client.indices.delete_template(:name => "*")
@@ -24,8 +24,6 @@ describe "metrics", :integration => true do
  let(:document_level_metrics) { subject.instance_variable_get(:@document_level_metrics) }

  before :each do
- require "elasticsearch"
-
  # Clean OpenSearch of data before we start.
  @client = get_client
  clean(@client)
@@ -27,7 +27,6 @@ describe "opensearch is down on startup", :integration => true do

  before :each do
  # Delete all templates first.
- require "elasticsearch"
  allow(Stud).to receive(:stoppable_sleep)

  # Clean OpenSearch of data before we start.
@@ -12,7 +12,7 @@ require_relative "../../../spec/opensearch_spec_helper"
  describe "Update actions using painless scripts", :integration => true, :update_tests => 'painless' do
  require "logstash/outputs/opensearch"

- def get_es_output( options={} )
+ def get_output( options={} )
  settings = {
  "manage_template" => true,
  "index" => "logstash-update",
@@ -42,7 +42,7 @@ describe "Update actions using painless scripts", :integration => true, :update_
  context "scripted updates" do

  it "should increment a counter with event/doc 'count' variable with inline script" do
- subject = get_es_output({
+ subject = get_output({
  'document_id' => "123",
  'script' => 'ctx._source.counter += params.event.counter',
  'script_type' => 'inline'
@@ -54,7 +54,7 @@ describe "Update actions using painless scripts", :integration => true, :update_
  end

  it "should increment a counter with event/doc 'count' variable with event/doc as upsert and inline script" do
- subject = get_es_output({
+ subject = get_output({
  'document_id' => "123",
  'doc_as_upsert' => true,
  'script' => 'if( ctx._source.containsKey("counter") ){ ctx._source.counter += params.event.counter; } else { ctx._source.counter = params.event.counter; }',
@@ -67,7 +67,7 @@ describe "Update actions using painless scripts", :integration => true, :update_
  end

  it "should, with new doc, set a counter with event/doc 'count' variable with event/doc as upsert and inline script" do
- subject = get_es_output({
+ subject = get_output({
  'document_id' => "456",
  'doc_as_upsert' => true,
  'script' => 'if( ctx._source.containsKey("counter") ){ ctx._source.counter += params.event.counter; } else { ctx._source.counter = params.event.counter; }',
@@ -90,7 +90,7 @@ describe "Update actions using painless scripts", :integration => true, :update_

  plugin_parameters.merge!('script_lang' => '')

- subject = get_es_output(plugin_parameters)
+ subject = get_output(plugin_parameters)
  subject.register
  subject.multi_receive([LogStash::Event.new("count" => 4 )])
  r = @client.get(:index => 'logstash-update', :type => doc_type, :id => "123", :refresh => true)
@@ -101,7 +101,7 @@ describe "Update actions using painless scripts", :integration => true, :update_

  context "when update with upsert" do
  it "should create new documents with provided upsert" do
- subject = get_es_output({ 'document_id' => "456", 'upsert' => '{"message": "upsert message"}' })
+ subject = get_output({ 'document_id' => "456", 'upsert' => '{"message": "upsert message"}' })
  subject.register
  subject.multi_receive([LogStash::Event.new("message" => "sample message here")])
  r = @client.get(:index => 'logstash-update', :type => doc_type, :id => "456", :refresh => true)
@@ -109,7 +109,7 @@ describe "Update actions using painless scripts", :integration => true, :update_
  end

  it "should create new documents with event/doc as upsert" do
- subject = get_es_output({ 'document_id' => "456", 'doc_as_upsert' => true })
+ subject = get_output({ 'document_id' => "456", 'doc_as_upsert' => true })
  subject.register
  subject.multi_receive([LogStash::Event.new("message" => "sample message here")])
  r = @client.get(:index => 'logstash-update', :type => doc_type, :id => "456", :refresh => true)
@@ -117,7 +117,7 @@ describe "Update actions using painless scripts", :integration => true, :update_
  end

  it "should fail on documents with event/doc as upsert at external version" do
- subject = get_es_output({ 'document_id' => "456", 'doc_as_upsert' => true, 'version' => 999, "version_type" => "external" })
+ subject = get_output({ 'document_id' => "456", 'doc_as_upsert' => true, 'version' => 999, "version_type" => "external" })
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end
  end
@@ -126,7 +126,7 @@ describe "Update actions using painless scripts", :integration => true, :update_

  context 'with an inline script' do
  it "should create new documents with upsert content" do
- subject = get_es_output({ 'document_id' => "456", 'script' => 'ctx._source.counter = params.event.counter', 'upsert' => '{"message": "upsert message"}', 'script_type' => 'inline' })
+ subject = get_output({ 'document_id' => "456", 'script' => 'ctx._source.counter = params.event.counter', 'upsert' => '{"message": "upsert message"}', 'script_type' => 'inline' })
  subject.register

  subject.multi_receive([LogStash::Event.new("message" => "sample message here")])
@@ -135,7 +135,7 @@ describe "Update actions using painless scripts", :integration => true, :update_
  end

  it "should create new documents with event/doc as script params" do
- subject = get_es_output({ 'document_id' => "456", 'script' => 'ctx._source.counter = params.event.counter', 'scripted_upsert' => true, 'script_type' => 'inline' })
+ subject = get_output({ 'document_id' => "456", 'script' => 'ctx._source.counter = params.event.counter', 'scripted_upsert' => true, 'script_type' => 'inline' })
  subject.register
  subject.multi_receive([LogStash::Event.new("counter" => 1)])
  @client.indices.refresh
@@ -53,7 +53,6 @@ describe "failures in bulk class expected behavior", :integration => true do

  before :each do
  # Delete all templates first.
- require "elasticsearch"
  allow(Stud).to receive(:stoppable_sleep)

  # Clean OpenSearch of data before we start.
@@ -15,9 +15,9 @@ require "socket"
  describe "pool sniffer", :integration => true do
  let(:logger) { Cabin::Channel.get }
  let(:adapter) { LogStash::Outputs::OpenSearch::HttpClient::ManticoreAdapter.new(logger) }
- let(:es_host) { get_host_port.split(":").first }
+ let(:opensearch_host) { get_host_port.split(":").first }
  let(:es_port) { get_host_port.split(":").last }
- let(:es_ip) { IPSocket.getaddress(es_host) }
+ let(:opensearch_ip) { IPSocket.getaddress(opensearch_host) }
  let(:initial_urls) { [::LogStash::Util::SafeURI.new("http://#{get_host_port}")] }
  let(:options) do
  {
@@ -21,8 +21,6 @@ describe "index template expected behavior", :integration => true do
  end

  before :each do
- # Delete all templates first.
- require "elasticsearch"

  # Clean OpenSearch of data before we start.
  @client = get_client
@@ -12,7 +12,7 @@ require_relative "../../../spec/opensearch_spec_helper"
  describe "Update actions without scripts", :integration => true do
  require "logstash/outputs/opensearch"

- def get_es_output( options={} )
+ def get_output( options={} )
  settings = {
  "manage_template" => true,
  "index" => "logstash-update",
@@ -40,20 +40,20 @@ describe "Update actions without scripts", :integration => true do
  end

  it "should fail without a document_id" do
- subject = get_es_output
+ subject = get_output
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end

  context "when update only" do
  it "should not create new document" do
- subject = get_es_output({ 'document_id' => "456" } )
+ subject = get_output({ 'document_id' => "456" } )
  subject.register
  subject.multi_receive([LogStash::Event.new("message" => "sample message here")])
- expect {@client.get(:index => 'logstash-update', :type => doc_type, :id => "456", :refresh => true)}.to raise_error(Elasticsearch::Transport::Transport::Errors::NotFound)
+ expect {@client.get(:index => 'logstash-update', :type => doc_type, :id => "456", :refresh => true)}.to raise_error(OpenSearch::Transport::Transport::Errors::NotFound)
  end

  it "should update existing document" do
- subject = get_es_output({ 'document_id' => "123" })
+ subject = get_output({ 'document_id' => "123" })
  subject.register
  subject.multi_receive([LogStash::Event.new("message" => "updated message here")])
  r = @client.get(:index => 'logstash-update', :type => doc_type, :id => "123", :refresh => true)
@@ -63,7 +63,7 @@ describe "Update actions without scripts", :integration => true do
  # The es ruby client treats the data field differently. Make sure this doesn't
  # raise an exception
  it "should update an existing document that has a 'data' field" do
- subject = get_es_output({ 'document_id' => "123" })
+ subject = get_output({ 'document_id' => "123" })
  subject.register
  subject.multi_receive([LogStash::Event.new("data" => "updated message here", "message" => "foo")])
  r = @client.get(:index => 'logstash-update', :type => doc_type, :id => "123", :refresh => true)
@@ -72,27 +72,27 @@ describe "Update actions without scripts", :integration => true do
  end

  it "should allow default (internal) version" do
- subject = get_es_output({ 'document_id' => "123", "version" => "99" })
+ subject = get_output({ 'document_id' => "123", "version" => "99" })
  subject.register
  end

  it "should allow internal version" do
- subject = get_es_output({ 'document_id' => "123", "version" => "99", "version_type" => "internal" })
+ subject = get_output({ 'document_id' => "123", "version" => "99", "version_type" => "internal" })
  subject.register
  end

  it "should not allow external version" do
- subject = get_es_output({ 'document_id' => "123", "version" => "99", "version_type" => "external" })
+ subject = get_output({ 'document_id' => "123", "version" => "99", "version_type" => "external" })
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end

  it "should not allow external_gt version" do
- subject = get_es_output({ 'document_id' => "123", "version" => "99", "version_type" => "external_gt" })
+ subject = get_output({ 'document_id' => "123", "version" => "99", "version_type" => "external_gt" })
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end

  it "should not allow external_gte version" do
- subject = get_es_output({ 'document_id' => "123", "version" => "99", "version_type" => "external_gte" })
+ subject = get_output({ 'document_id' => "123", "version" => "99", "version_type" => "external_gte" })
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end

@@ -100,7 +100,7 @@ describe "Update actions without scripts", :integration => true do

  context "when update with upsert" do
  it "should create new documents with provided upsert" do
- subject = get_es_output({ 'document_id' => "456", 'upsert' => '{"message": "upsert message"}' })
+ subject = get_output({ 'document_id' => "456", 'upsert' => '{"message": "upsert message"}' })
  subject.register
  subject.multi_receive([LogStash::Event.new("message" => "sample message here")])
  r = @client.get(:index => 'logstash-update', :type => doc_type, :id => "456", :refresh => true)
@@ -108,7 +108,7 @@ describe "Update actions without scripts", :integration => true do
  end

  it "should create new documents with event/doc as upsert" do
- subject = get_es_output({ 'document_id' => "456", 'doc_as_upsert' => true })
+ subject = get_output({ 'document_id' => "456", 'doc_as_upsert' => true })
  subject.register
  subject.multi_receive([LogStash::Event.new("message" => "sample message here")])
  r = @client.get(:index => 'logstash-update', :type => doc_type, :id => "456", :refresh => true)
@@ -116,7 +116,7 @@ describe "Update actions without scripts", :integration => true do
  end

  it "should fail on documents with event/doc as upsert at external version" do
- subject = get_es_output({ 'document_id' => "456", 'doc_as_upsert' => true, 'version' => 999, "version_type" => "external" })
+ subject = get_output({ 'document_id' => "456", 'doc_as_upsert' => true, 'version' => 999, "version_type" => "external" })
  expect { subject.register }.to raise_error(LogStash::ConfigurationError)
  end
  end
@@ -9,7 +9,7 @@

  require_relative './spec_helper'

- require 'elasticsearch'
+ require 'opensearch'

  require 'json'
  require 'cabin'
@@ -24,7 +24,7 @@ module OpenSearchHelper
  end

  def get_client
- Elasticsearch::Client.new(:hosts => [get_host_port])
+ OpenSearch::Client.new(:hosts => [get_host_port])
  end

  def doc_type
@@ -25,13 +25,13 @@ describe LogStash::Outputs::OpenSearch::HttpClient::ManticoreAdapter do
  it "should implement host unreachable exceptions" do
  expect(subject.host_unreachable_exceptions).to be_a(Array)
  end
-
+
  describe "auth" do
  let(:user) { "myuser" }
  let(:password) { "mypassword" }
  let(:noauth_uri) { clone = uri.clone; clone.user=nil; clone.password=nil; clone }
  let(:uri) { ::LogStash::Util::SafeURI.new("http://#{user}:#{password}@localhost:9200") }
-
+
  it "should convert the auth to params" do
  resp = double("response")
  allow(resp).to receive(:call)
@@ -39,7 +39,7 @@ describe LogStash::Outputs::OpenSearch::HttpClient::ManticoreAdapter do

  expected_uri = noauth_uri.clone
  expected_uri.path = "/"
-
+
  expect(subject.manticore).to receive(:get).
  with(expected_uri.to_s, {
  :headers => {"content-type" => "application/json"},
@@ -54,6 +54,76 @@ describe LogStash::Outputs::OpenSearch::HttpClient::ManticoreAdapter do
  end
  end

+ describe "aws_iam" do
+ let(:options) { {
+ :auth_type => {
+ "type"=>"aws_iam",
+ "aws_access_key_id"=>"AAAAAAAAAAAAAAAAAAAA",
+ "aws_secret_access_key"=>"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"}
+ } }
+ subject { described_class.new(logger, options) }
+ let(:uri) { ::LogStash::Util::SafeURI.new("http://localhost:9200") }
+ let(:sign_aws_request) { }
+
+ it "should validate AWS IAM credentials initialization" do
+ expect(subject.aws_iam_auth_initialization(options)).not_to be_nil
+ end
+
+ it "should validate signing aws request" do
+ resp = double("response")
+ allow(resp).to receive(:call)
+ allow(resp).to receive(:code).and_return(200)
+ allow(subject).to receive(:sign_aws_request).with(any_args).and_return(sign_aws_request)
+
+ expected_uri = uri.clone
+ expected_uri.path = "/"
+
+ expect(subject.manticore).to receive(:get).
+ with(expected_uri.to_s, {
+ :headers => {"content-type"=> "application/json"}
+ }
+ ).and_return resp
+
+ expect(subject).to receive(:sign_aws_request)
+ subject.perform_request(uri, :get, "/")
+ end
+ end
+
+ describe "basic_auth" do
+ let(:options) { {
+ :auth_type => {
+ "type"=>"basic",
+ "user" => "myuser",
+ "password" => "mypassword"}
+ } }
+ subject { described_class.new(logger, options) }
+ let(:user) {options[:auth_type]["user"]}
+ let(:password) {options[:auth_type]["password"]}
+ let(:noauth_uri) { clone = uri.clone; clone.user=nil; clone.password=nil; clone }
+ let(:uri) { ::LogStash::Util::SafeURI.new("http://localhost:9200") }
+
+ it "should validate master credentials with type as 'basic_auth'" do
+ resp = double("response")
+ allow(resp).to receive(:call)
+ allow(resp).to receive(:code).and_return(200)
+
+ expected_uri = noauth_uri.clone
+ expected_uri.path = "/"
+
+ expect(subject.manticore).to receive(:get).
+ with(expected_uri.to_s, {
+ :headers => {"content-type" => "application/json"},
+ :auth => {
+ :user => user,
+ :password => password,
+ :eager => true
+ }
+ }).and_return resp
+
+ subject.perform_request(uri, :get, "/")
+ end
+ end
+
  describe "bad response codes" do
  let(:uri) { ::LogStash::Util::SafeURI.new("http://localhost:9200") }

@@ -17,6 +17,7 @@ describe LogStash::Outputs::OpenSearch::HttpClient do
  opts = {
  :hosts => [::LogStash::Util::SafeURI.new("127.0.0.1")],
  :logger => Cabin::Channel.get,
+ :target_bulk_bytes => 9_000_000,
  :metric => ::LogStash::Instrument::NullMetric.new(:dummy).namespace(:alsodummy)
  }

@@ -226,8 +227,8 @@ describe LogStash::Outputs::OpenSearch::HttpClient do
  end
  end

- context "if a message is over TARGET_BULK_BYTES" do
- let(:target_bulk_bytes) { LogStash::Outputs::OpenSearch::TARGET_BULK_BYTES }
+ context "if a message is over target_bulk_bytes" do
+ let(:target_bulk_bytes) { subject.target_bulk_bytes }
  let(:message) { "a" * (target_bulk_bytes + 1) }

  it "should be handled properly" do
@@ -256,8 +257,8 @@ describe LogStash::Outputs::OpenSearch::HttpClient do
  s = subject.send(:bulk, actions)
  end

- context "if one exceeds TARGET_BULK_BYTES" do
- let(:target_bulk_bytes) { LogStash::Outputs::OpenSearch::TARGET_BULK_BYTES }
+ context "if one exceeds target_bulk_bytes" do
+ let(:target_bulk_bytes) { subject.target_bulk_bytes }
  let(:message1) { "a" * (target_bulk_bytes + 1) }
  it "executes two bulk_send operations" do
  allow(subject).to receive(:join_bulk_responses)
@@ -325,7 +325,7 @@ describe LogStash::Outputs::OpenSearch do
  end

  context '413 errors' do
- let(:payload_size) { LogStash::Outputs::OpenSearch::TARGET_BULK_BYTES + 1024 }
+ let(:payload_size) { subject.client.target_bulk_bytes + 1024 }
  let(:event) { ::LogStash::Event.new("message" => ("a" * payload_size ) ) }

  let(:logger_stub) { double("logger").as_null_object }
@@ -357,7 +357,7 @@ describe LogStash::Outputs::OpenSearch do

  expect(logger_stub).to have_received(:warn)
  .with(a_string_matching(/413 Payload Too Large/),
- hash_including(:action_count => 1, :content_length => a_value > 20_000_000))
+ hash_including(:action_count => 1, :content_length => a_value > subject.client.target_bulk_bytes))
  end
  end

data.tar.gz.sig CHANGED
Binary file
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-opensearch
  version: !ruby/object:Gem::Version
- version: 1.1.0
+ version: 1.2.0
  platform: java
  authors:
  - Elastic
@@ -31,7 +31,7 @@ cert_chain:
  yXikuH6LEVykA8pgOcB9gKsB2/zMd2ZlSj2monM8Qw9EfB14ZSDTYS8VYuwWCeF0
  eFmXXk0ufQFKl1Yll7quHkmQ0PzKkvXTpONBT6qPkXE=
  -----END CERTIFICATE-----
- date: 2021-09-21 00:00:00.000000000 Z
+ date: 2021-12-16 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -107,6 +107,26 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '1.0'
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '2'
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 2.11.632
+ name: aws-sdk
+ prerelease: false
+ type: :runtime
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '2'
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 2.11.632
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
  requirements:
@@ -163,6 +183,20 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '0.6'
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ name: opensearch-ruby
+ prerelease: false
+ type: :development
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
  description: This gem is a Logstash plugin required to be installed on top of the
  Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gem. This gem
  is not a stand-alone program
metadata.gz.sig CHANGED
Binary file