microsoft-sentinel-log-analytics-logstash-output-plugin 1.1.0 → 1.1.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 38074bb5b8fb9211f87c27ccbd46051d6fff20ae2c24c0c398150c1e1087fa13
- data.tar.gz: 9b14d8ba18d9eba8f0464e07e2351aa7eee9ea3356d410eeef56fdace03bfe42
+ metadata.gz: bf8c57d14129f064f4d2256d5578d7aacd8ab97e3b263748bc703da757cfeb58
+ data.tar.gz: bcbf850de17a394e10702c613354b1638d963df818fbbb8e680ac54642571297
  SHA512:
- metadata.gz: 19fd167e1dd6ab8874ca2fa4d11971f01e070238ef08afffe4e851eaf7cc7c27b9dc559257deb118eb1c4d82295e3f1a5a3530cdc77cbef5f98ff0926fe4b44b
- data.tar.gz: b41a633f1f6a0af1f07bdd8fb12f784febbb219c165de1c0e92557644033b30b5b73d66f9d0e4b361825d1f8a7320609d38b25181496e3dfdd12ec718169ed90
+ metadata.gz: edf7141d94c4da2a3518197e74cf976c60b664c1f8118ba817aef9bf26d4c7b04b0812dc21be0549ebbc8246fa2f3bb6f828281e7db11ed661556720538e792b
+ data.tar.gz: b256cb4d7d78684c3626dcfe6723094f61ea522b008f38c802278fbf0cb19dea18d3061842108f832f12af25757936f4f450726b2fdb9abf8aa81c58b1d11cce
data/CHANGELOG.md CHANGED
@@ -1,13 +1,13 @@
- ## 1.0.0
- * Initial release for output plugin for logstash to Microsoft Sentinel. This is done with the Log Analytics DCR based API.
-
- ## 1.0.2
- * Upgrade the rest-client dependency minimum version to 2.1.0
- * Allow setting different proxy values for api connections.
-
- ## 1.0.6
- * Increase timeout for read/open connections to 120 seconds.
- * Add error handling for when connection timeout occurs.
-
+ ## 1.1.3
+ - Replaces the `rest-client` library used for connecting to Azure with the `excon` library.
+
+ ## 1.1.1
+ - Adds support for Azure US Government cloud and Microsoft Azure operated by 21Vianet in China.
+
  ## 1.1.0
- * Rename the plugin to microsoft-sentinel-log-analytics-logstash-output-plugin
+ - Allows setting different proxy values for API connections.
+ - Upgrades the Logs Ingestion API version to 2023-01-01.
+ - Renames the plugin to microsoft-sentinel-log-analytics-logstash-output-plugin.
+
+ ## 1.0.0
+ - The initial release for the Logstash output plugin for Microsoft Sentinel. This plugin uses Data Collection Rules (DCRs) with Azure Monitor's Logs Ingestion API.
data/README.md CHANGED
@@ -3,13 +3,12 @@
  Microsoft Sentinel provides a new output plugin for Logstash. Use this output plugin to send any log via Logstash to the Microsoft Sentinel/Log Analytics workspace. This is done with the Log Analytics DCR-based API.
  You may send logs to custom or standard tables.
 
- Plugin version: v1.0.2
- Released on: 2023-04-27
+ Plugin version: v1.1.3
+ Released on: 2024-10-10
 
  This plugin is currently in development and is free to use. We welcome contributions from the open source community on this project, and we request and appreciate feedback from users.
 
-
- ## Steps to implement the output plugin
+ ## Installation Instructions
  1) Install the plugin
  2) Create a sample file
  3) Create the required DCR-related resources
@@ -19,12 +18,16 @@ This plugin is currently in development and is free to use. We welcome contribut
 
  ## 1. Install the plugin
 
- Microsoft Sentinel provides Logstash output plugin to Log analytics workspace using DCR based logs API.
- Install the microsoft-sentinel-log-analytics-logstash-output-plugin, use [Logstash Offline Plugin Management instruction](<https://www.elastic.co/guide/en/logstash/current/offline-plugins.html>).
+ Microsoft Sentinel provides a Logstash output plugin that sends logs to a Log Analytics workspace using the DCR-based Logs Ingestion API.
+
+ The plugin is published on [RubyGems](https://rubygems.org/gems/microsoft-sentinel-log-analytics-logstash-output-plugin). To install it into an existing Logstash installation, run `logstash-plugin install microsoft-sentinel-log-analytics-logstash-output-plugin`.
+
+ If you do not have a direct internet connection, install the plugin on another Logstash installation, export a plugin bundle, and import it on the offline host, as shown in the sketch below. For more information, see the [Logstash Offline Plugin Management instructions](<https://www.elastic.co/guide/en/logstash/current/offline-plugins.html>).
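
For the offline scenario above, a typical flow (an illustrative sketch; adjust the paths and run the commands from your Logstash home directory) is to build an offline pack on a connected machine and install it on the offline host:

```bash
# On a machine with internet access (with the plugin already installed as shown above):
# bundle the plugin into an offline pack
bin/logstash-plugin prepare-offline-pack --output /tmp/microsoft-sentinel-output.zip microsoft-sentinel-log-analytics-logstash-output-plugin

# Copy the archive to the offline host, then install it there
bin/logstash-plugin install file:///tmp/microsoft-sentinel-output.zip
```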
 
  Microsoft Sentinel's Logstash output plugin supports the following versions
- - Logstash 7 Between 7.0 and 7.17.6
- - Logstash 8 Between 8.0 and 8.4.2
+ - 7.0 - 7.17.13
+ - 8.0 - 8.9
+ - 8.11 - 8.15
 
  Please note that when using Logstash 8, it is recommended to disable ECS in the pipeline. For more information refer to [Logstash documentation.](<https://www.elastic.co/guide/en/logstash/8.4/ecs-ls.html>)
 
@@ -41,8 +44,8 @@ output {
  }
  ```
  Note: make sure that the path exists before creating the sample file.
- 2) Start Logstash. The plugin will write up to 10 records to a sample file named "sampleFile<epoch seconds>.json" in the configured path
- (for example: "c:\temp\sampleFile1648453501.json")
+ 2) Start Logstash. The plugin will collect up to 10 records for a sample.
+ 3) A file named "sampleFile<epoch seconds>.json" will be created in the configured path once 10 events have been sampled or when the Logstash process exits gracefully (for example: "c:\temp\sampleFile1648453501.json").
 
 
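
For reference, the output block for this step might look like the following sketch; `sample_file_path` matches the option defined in the plugin code, while the boolean toggle name `create_sample_file` is assumed from the plugin's documented sample-file mode, and the path is a placeholder:

```
output {
    microsoft-sentinel-log-analytics-logstash-output-plugin {
      create_sample_file => true
      sample_file_path => "c:\\temp"   # use a path that already exists, e.g. "/tmp" on Linux
    }
}
```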
  ### Configurations:
@@ -124,6 +127,7 @@ output {
  - **proxy** - String, Empty by default. Specify which proxy URL to use for API calls for all of the communications with Azure.
  - **proxy_aad** - String, Empty by default. Specify which proxy URL to use for API calls for the Azure Active Directory service. Overrides the proxy setting.
  - **proxy_endpoint** - String, Empty by default. Specify which proxy URL to use when sending log data to the endpoint. Overrides the proxy setting.
+ - **azure_cloud** - String, Empty by default. Specifies the name of the Azure cloud in use; when unset, "AzureCloud" is assumed. Available values are: AzureCloud, AzureChinaCloud, and AzureUSGovernment. See the example after this list.
 
  #### Note: When setting an empty string as a value for a proxy setting, it will unset any system wide proxy setting.
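
The sketch below shows how these optional settings can be combined in an output block. All identifier values are placeholders, and the block name is assumed to match the plugin/gem name used throughout this README:

```
output {
    microsoft-sentinel-log-analytics-logstash-output-plugin {
      client_app_Id => "<CLIENT_APP_ID>"
      client_app_secret => "<CLIENT_APP_SECRET>"
      tenant_id => "<TENANT_ID>"
      data_collection_endpoint => "<DCE_INGESTION_URI>"
      dcr_immutable_id => "<DCR_IMMUTABLE_ID>"
      dcr_stream_name => "<DCR_STREAM_NAME>"
      proxy => "http://proxy.example.com:8080"
      azure_cloud => "AzureUSGovernment"
    }
}
```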
 
@@ -232,3 +236,23 @@ Which will produce this content in the sample file:
  }
  ]
  ```
+
+
+ ## Known issues
+
+ When running Logstash in a lightweight ("lite") Ubuntu-based Docker image, the following warning may appear:
+
+ ```
+ java.lang.RuntimeException: getprotobyname_r failed
+ ```
+
+ To resolve it, add the following commands to your Dockerfile to install the *netbase* package:
+ ```bash
+ USER root
+ RUN apt install netbase -y
+ ```
+ For more information, see [JNR regression in Logstash 7.17.0 (Docker)](https://github.com/elastic/logstash/issues/13703).
+
+ If your environment's event rate is low relative to the number of allocated Logstash workers, we recommend increasing the value of *plugin_flush_interval* to 60 or more, as shown in the example below. This change allows each worker to batch more events before uploading to the Data Collection Endpoint (DCE). You can monitor the ingestion payload using [DCR metrics](https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-monitor#dcr-metrics).
+ For more information on *plugin_flush_interval*, see the [Optional Configuration table](https://learn.microsoft.com/azure/sentinel/connect-logstash-data-connection-rules#optional-configuration) mentioned earlier.
+
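
For example, a low-throughput environment could raise the flush interval in the output block as follows (a sketch; the other required settings are omitted):

```
output {
    microsoft-sentinel-log-analytics-logstash-output-plugin {
      # ...client, DCR, and endpoint settings as configured above...
      plugin_flush_interval => 60
    }
}
```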
@@ -68,6 +68,11 @@ class LogStash::Outputs::MicrosoftSentinelOutput < LogStash::Outputs::Base
  # Path where to place the sample file created
  config :sample_file_path, :validate => :string
 
+ # Used to specify the name of the Azure cloud that is being used. By default, the value is set to "AzureCloud", which
+ # is the public Azure cloud. However, you can specify a different Azure cloud if you are
+ # using a different environment, such as Azure Government or Azure China.
+ config :azure_cloud, :validate => :string
+
  public
  def register
  @logstash_configuration= build_logstash_configuration()
@@ -103,6 +108,7 @@ class LogStash::Outputs::MicrosoftSentinelOutput < LogStash::Outputs::Base
  logstash_configuration.proxy_aad = @proxy_aad || @proxy || ENV['http_proxy']
  logstash_configuration.proxy_endpoint = @proxy_endpoint || @proxy || ENV['http_proxy']
  logstash_configuration.retransmission_time = @retransmission_time
+ logstash_configuration.azure_cloud = @azure_cloud || "AzureCloud"
 
  return logstash_configuration
  end # def build_logstash_configuration
@@ -1,16 +1,16 @@
  # encoding: utf-8
  require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
- require 'rest-client'
  require 'json'
  require 'openssl'
  require 'base64'
  require 'time'
+ require 'excon'
 
  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
  class LogAnalyticsAadTokenProvider
  def initialize (logstashLoganalyticsConfiguration)
- scope = CGI.escape("https://monitor.azure.com//.default")
- @aad_uri = "https://login.microsoftonline.com"
+ scope = CGI.escape("#{logstashLoganalyticsConfiguration.get_monitor_endpoint}//.default")
+ @aad_uri = logstashLoganalyticsConfiguration.get_aad_endpoint
  @token_request_body = sprintf("client_id=%s&scope=%s&client_secret=%s&grant_type=client_credentials", logstashLoganalyticsConfiguration.client_app_Id, scope, logstashLoganalyticsConfiguration.client_app_secret)
  @token_request_uri = sprintf("%s/%s/oauth2/v2.0/token",@aad_uri, logstashLoganalyticsConfiguration.tenant_id)
  @token_state = {
@@ -64,14 +64,13 @@ class LogAnalyticsAadTokenProvider
  while true
  begin
  # Post REST request
- response = RestClient::Request.execute(method: :post, url: @token_request_uri, payload: @token_request_body, headers: headers,
- proxy: @logstashLoganalyticsConfiguration.proxy_aad)
-
- if (response.code == 200 || response.code == 201)
+ response = Excon.post(@token_request_uri, :body => @token_request_body, :headers => headers, :proxy => @logstashLoganalyticsConfiguration.proxy_aad, expects: [200, 201])
+
+ if (response.status == 200 || response.status == 201)
  return JSON.parse(response.body)
  end
- rescue RestClient::ExceptionWithResponse => ewr
- @logger.error("Exception while authenticating with AAD API ['#{ewr.response}']")
+ rescue Excon::Error::HTTPStatus => ex
+ @logger.error("Error while authenticating with AAD [#{ex.class}: '#{ex.response.status}', Response: '#{ex.response.body}']")
  rescue Exception => ex
  @logger.trace("Exception while authenticating with AAD API ['#{ex}']")
  end
@@ -1,11 +1,11 @@
  # encoding: utf-8
  require "logstash/sentinel_la/version"
- require 'rest-client'
  require 'json'
  require 'openssl'
  require 'base64'
  require 'time'
  require 'rbconfig'
+ require 'excon'
 
  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
  class LogAnalyticsClient
@@ -14,7 +14,7 @@ require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
  require "logstash/sentinel_la/logAnalyticsAadTokenProvider"
 
 
- def initialize (logstashLoganalyticsConfiguration)
+ def initialize(logstashLoganalyticsConfiguration)
  @logstashLoganalyticsConfiguration = logstashLoganalyticsConfiguration
  @logger = @logstashLoganalyticsConfiguration.logger
 
@@ -22,28 +22,78 @@ require "logstash/sentinel_la/logAnalyticsAadTokenProvider"
  @uri = sprintf("%s/dataCollectionRules/%s/streams/%s?api-version=%s",@logstashLoganalyticsConfiguration.data_collection_endpoint, @logstashLoganalyticsConfiguration.dcr_immutable_id, logstashLoganalyticsConfiguration.dcr_stream_name, la_api_version)
  @aadTokenProvider=LogAnalyticsAadTokenProvider::new(logstashLoganalyticsConfiguration)
  @userAgent = getUserAgent()
+
+ # Auto close connection after 60 seconds of inactivity
+ @connectionAutoClose = {
+ :last_use => Time.now,
+ :lock => Mutex.new,
+ :max_idel_time => 60,
+ :is_closed => true
+ }
+
+ @timer = Thread.new do
+ loop do
+ sleep @connectionAutoClose[:max_idel_time] / 2
+ if is_connection_stale?
+ @connectionAutoClose[:lock].synchronize do
+ if is_connection_stale?
+ reset_connection
+ end
+ end
+ end
+ end
+ end
+
+
  end # def initialize
 
  # Post the given json to Azure Loganalytics
  def post_data(body)
  raise ConfigError, 'no json_records' if body.empty?
+ response = nil
+
+ @connectionAutoClose[:lock].synchronize do
+ # close the connection if it's stale
+ if is_connection_stale?
+ reset_connection
+ end
+ if @connectionAutoClose[:is_closed]
+ open_connection
+ end
+
+ headers = get_header()
+ # Post REST request
+ response = @connection.request(method: :post, body: body, headers: headers)
+ @connectionAutoClose[:is_closed] = false
+ @connectionAutoClose[:last_use] = Time.now
+ end
+ return response
 
- # Create REST request header
- headers = get_header()
-
- # Post REST request
-
- return RestClient::Request.execute(method: :post, url: @uri, payload: body, headers: headers,
- proxy: @logstashLoganalyticsConfiguration.proxy_endpoint, timeout: 120)
  end # def post_data
 
  # Static function to return if the response is OK or else
  def self.is_successfully_posted(response)
- return (response.code >= 200 && response.code < 300 ) ? true : false
+ return (response.status >= 200 && response.status < 300 ) ? true : false
  end # def self.is_successfully_posted
 
  private
 
+ def open_connection
+ @connection = Excon.new(@uri, :persistent => true, :proxy => @logstashLoganalyticsConfiguration.proxy_endpoint,
+ expects: [200, 201, 202, 204, 206, 207, 208, 226, 300, 301, 302, 303, 304, 305, 306, 307, 308],
+ read_timeout: 240, write_timeout: 240, connect_timeout: 240)
+ @logger.trace("Connection to Azure LogAnalytics was opened.");
+ end
+
+ def reset_connection
+ @connection.reset
+ @connectionAutoClose[:is_closed] = true
+ @logger.trace("Connection to Azure LogAnalytics was closed due to inactivity.");
+ end
+
+ def is_connection_stale?
+ return Time.now - @connectionAutoClose[:last_use] > @connectionAutoClose[:max_idel_time] && !@connectionAutoClose[:is_closed]
+ end
  # Create a header for the given length
  def get_header()
  # Getting an authorization token bearer (if the token is expired, the method will post a request to get a new authorization token)
@@ -2,7 +2,7 @@
 
  require "logstash/sentinel_la/logAnalyticsClient"
  require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
-
+ require "excon"
  # LogStashAutoResizeBuffer class setting a resizable buffer which is flushed periodically
  # The buffer resize itself according to Azure Loganalytics and configuration limitations
  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
@@ -59,34 +59,32 @@ class LogStashEventsBatcher
  return
  else
  @logger.trace("Rest client response ['#{response}']")
- @logger.error("#{api_name} request failed. Error code: #{response.code} #{try_get_info_from_error_response(response)}")
+ @logger.error("#{api_name} request failed. Error code: #{response.status} #{try_get_info_from_error_response(response)}")
  end
- rescue RestClient::Exceptions::Timeout => eto
- @logger.trace("Timeout exception ['#{eto.display}'] when posting data to #{api_name}. Rest client response ['#{eto.response.display}']. [amount_of_documents=#{amount_of_documents}]")
- @logger.error("Timeout exception while posting data to #{api_name}. [Exception: '#{eto}'] [amount of documents=#{amount_of_documents}]'")
- force_retry = true
-
- rescue RestClient::ExceptionWithResponse => ewr
- response = ewr.response
- @logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{ewr.response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
- @logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")
+ rescue Excon::Error::HTTPStatus => ewr
+ response = ewr.response
+ @logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
+ @logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr.class}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")
 
- if ewr.http_code.to_f == 400
- @logger.info("Not trying to resend since exception http code is #{ewr.http_code}")
- return
- elsif ewr.http_code.to_f == 408
- force_retry = true
- elsif ewr.http_code.to_f == 429
- # thrutteling detected, backoff before resending
- parsed_retry_after = response.headers.include?(:retry_after) ? response.headers[:retry_after].to_i : 0
- seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30
+ if ewr.class == Excon::Error::BadRequest
+ @logger.info("Not trying to resend since exception http code is 400")
+ return
+ elsif ewr.class == Excon::Error::RequestTimeout
+ force_retry = true
+ elsif ewr.class == Excon::Error::TooManyRequests
+ # throttling detected, back off before resending
+ parsed_retry_after = response.data[:headers].include?('Retry-After') ? response.data[:headers]['Retry-After'].to_i : 0
+ seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30
 
- #force another retry even if the next iteration of the loop will be after the retransmission_timeout
+ #force another retry even if the next iteration of the loop will be after the retransmission_timeout
+ force_retry = true
+ end
+ rescue Excon::Error::Socket => ex
+ @logger.trace("Exception: '[#{ex.class.name}] #{ex}' in posting data to #{api_name}. [amount_of_documents=#{amount_of_documents}]")
  force_retry = true
- end
- rescue Exception => ex
- @logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
- @logger.error("Exception in posting data to #{api_name}. [Exception: '#{ex}, amount of documents=#{amount_of_documents}]'")
+ rescue Exception => ex
+ @logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
+ @logger.error("Exception in posting data to #{api_name}. [Exception: '[#{ex.class.name}]#{ex}, amount of documents=#{amount_of_documents}]'")
  end
  is_retry = true
  @logger.info("Retrying transmission to #{api_name} in #{seconds_to_sleep} seconds.")
@@ -110,8 +108,8 @@ class LogStashEventsBatcher
  def get_request_id_from_response(response)
  output =""
  begin
- if !response.nil? && response.headers.include?(:x_ms_request_id)
- output += response.headers[:x_ms_request_id]
+ if !response.nil? && response.data[:headers].include?("x-ms-request-id")
+ output += response.data[:headers]["x-ms-request-id"]
  end
  rescue Exception => ex
  @logger.debug("Error while getting request id from success response headers: #{ex.display}")
@@ -124,12 +122,13 @@ class LogStashEventsBatcher
  begin
  output = ""
  if !response.nil?
- if response.headers.include?(:x_ms_error_code)
- output += " [ms-error-code header: #{response.headers[:x_ms_error_code]}]"
+ if response.data[:headers].include?("x-ms-error-code")
+ output += " [ms-error-code header: #{response.data[:headers]["x-ms-error-code"]}]"
  end
- if response.headers.include?(:x_ms_request_id)
- output += " [x-ms-request-id header: #{response.headers[:x_ms_request_id]}]"
+ if response.data[:headers].include?("x-ms-request-id")
+ output += " [x-ms-request-id header: #{response.data[:headers]["x-ms-request-id"]}]"
  end
+ output += " [response body: #{response.data[:body]}]"
  end
  return output
  rescue Exception => ex
@@ -23,6 +23,12 @@ class LogstashLoganalyticsOutputConfiguration
 
  # Taking 4K safety buffer
  @MAX_SIZE_BYTES = @loganalytics_api_data_limit - 10000
+
+ @azure_clouds = {
+ "AzureCloud" => {"aad" => "https://login.microsoftonline.com", "monitor" => "https://monitor.azure.com"},
+ "AzureChinaCloud" => {"aad" => "https://login.chinacloudapi.cn", "monitor" => "https://monitor.azure.cn"},
+ "AzureUSGovernment" => {"aad" => "https://login.microsoftonline.us", "monitor" => "https://monitor.azure.us"}
+ }.freeze
  end
 
  def validate_configuration()
@@ -68,6 +74,9 @@ class LogstashLoganalyticsOutputConfiguration
  if @key_names.length > 500
  raise ArgumentError, 'There are over 500 key names listed to be included in the events sent to Azure Loganalytics, which exceeds the limit of columns that can be defined in each table in log analytics.'
  end
+ if !@azure_clouds.key?(@azure_cloud)
+ raise ArgumentError, "The specified Azure cloud #{@azure_cloud} is not supported. Supported clouds are: #{@azure_clouds.keys.join(", ")}."
+ end
  end
  @logger.info("Azure Loganalytics configuration was found valid.")
  # If all validation pass then configuration is valid
@@ -159,10 +168,6 @@ class LogstashLoganalyticsOutputConfiguration
  @MIN_MESSAGE_AMOUNT
  end
 
- def max_items=(new_max_items)
- @max_items = new_max_items
- end
-
  def key_names=(new_key_names)
  @key_names = new_key_names
  end
@@ -218,5 +223,21 @@ class LogstashLoganalyticsOutputConfiguration
  def sample_file_path=(new_sample_file_path)
  @sample_file_path = new_sample_file_path
  end
+
+ def azure_cloud
+ @azure_cloud
+ end
+
+ def azure_cloud=(new_azure_cloud)
+ @azure_cloud = new_azure_cloud
+ end
+
+ def get_aad_endpoint
+ @azure_clouds[@azure_cloud]["aad"]
+ end
+
+ def get_monitor_endpoint
+ @azure_clouds[@azure_cloud]["monitor"]
+ end
  end
  end ;end ;end
@@ -1,6 +1,6 @@
  module LogStash; module Outputs;
  class MicrosoftSentinelOutputInternal
- VERSION_INFO = [1, 1, 0].freeze
+ VERSION_INFO = [1, 1, 3].freeze
  VERSION = VERSION_INFO.map(&:to_s).join('.').freeze
 
  def self.version
@@ -20,8 +20,8 @@ Gem::Specification.new do |s|
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
 
  # Gem dependencies
- s.add_runtime_dependency "rest-client", ">= 2.1.0"
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
  s.add_runtime_dependency "logstash-codec-plain"
+ s.add_runtime_dependency "excon", ">= 0.88.0"
  s.add_development_dependency "logstash-devutils"
  end
metadata CHANGED
@@ -1,29 +1,15 @@
  --- !ruby/object:Gem::Specification
  name: microsoft-sentinel-log-analytics-logstash-output-plugin
  version: !ruby/object:Gem::Version
- version: 1.1.0
+ version: 1.1.3
  platform: ruby
  authors:
  - Microsoft Sentinel
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-07-23 00:00:00.000000000 Z
+ date: 2024-10-10 00:00:00.000000000 Z
  dependencies:
- - !ruby/object:Gem::Dependency
- name: rest-client
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: 2.1.0
- type: :runtime
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: 2.1.0
  - !ruby/object:Gem::Dependency
  name: logstash-core-plugin-api
  requirement: !ruby/object:Gem::Requirement
@@ -58,6 +44,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
+ - !ruby/object:Gem::Dependency
+ name: excon
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.88.0
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.88.0
  - !ruby/object:Gem::Dependency
  name: logstash-devutils
  requirement: !ruby/object:Gem::Requirement
@@ -118,7 +118,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.1.6
+ rubygems_version: 3.3.26
  signing_key:
  specification_version: 4
  summary: Microsoft Sentinel provides a new output plugin for Logstash. Use this output