microsoft-sentinel-log-analytics-logstash-output-plugin 1.1.1 → 1.1.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: bc9cee055d5aa8f90acde6ed20eacccf0d9e9a36fa9a5ebd1c916d85a410e628
- data.tar.gz: '081cd8f53b716eeed931c417ce6fc7a13e3d2d1fe9acf0050858e5b87918d4ff'
+ metadata.gz: bf8c57d14129f064f4d2256d5578d7aacd8ab97e3b263748bc703da757cfeb58
+ data.tar.gz: bcbf850de17a394e10702c613354b1638d963df818fbbb8e680ac54642571297
  SHA512:
- metadata.gz: 77753b09f6c4631fb2e2eb1e0ba5dd4eb4c33840901226f1caa08043b39caa77497cdfd42adb7ac6a857b22e3bd98512fff8e3e1ba4d6b7916eb16e822f752ce
- data.tar.gz: 9ae33cb6bb9c96c19011b4a21f9ac943f75df18b99761023cb4ba00c84584a6ad0c05d8b0d5c0819993b434e2f6d14da1e62f92ba2f44a11aa6eeed22cbe0b9c
+ metadata.gz: edf7141d94c4da2a3518197e74cf976c60b664c1f8118ba817aef9bf26d4c7b04b0812dc21be0549ebbc8246fa2f3bb6f828281e7db11ed661556720538e792b
+ data.tar.gz: b256cb4d7d78684c3626dcfe6723094f61ea522b008f38c802278fbf0cb19dea18d3061842108f832f12af25757936f4f450726b2fdb9abf8aa81c58b1d11cce
data/CHANGELOG.md CHANGED
@@ -1,14 +1,13 @@
- ## 1.0.0
- * Initial release for output plugin for logstash to Microsoft Sentinel. This is done with the Log Analytics DCR based API.
-
- ## 1.1.0
- * Increase timeout for read/open connections to 120 seconds.
- * Add error handling for when connection timeout occurs.
- * Upgrade the rest-client dependency minimum version to 2.1.0.
- * Allow setting different proxy values for api connections.
- * Upgrade version for ingestion api to 2023-01-01.
- * Rename the plugin to microsoft-sentinel-log-analytics-logstash-output-plugin.
-
+ ## 1.1.3
+ - Replaces the `rest-client` library used for connecting to Azure with the `excon` library.
+
  ## 1.1.1
- * Support China and US Government Azure sovereign clouds.
- * Increase timeout for read/open connections to 240 seconds.
+ - Adds support for Azure US Government cloud and Microsoft Azure operated by 21Vianet in China.
+
+ ## 1.1.0
+ - Allows setting different proxy values for API connections.
+ - Upgrades version for logs ingestion API to 2023-01-01.
+ - Renames the plugin to microsoft-sentinel-log-analytics-logstash-output-plugin.
+
+ ## 1.0.0
+ - The initial release for the Logstash output plugin for Microsoft Sentinel. This plugin uses Data Collection Rules (DCRs) with Azure Monitor's Logs Ingestion API.
data/README.md CHANGED
@@ -3,13 +3,12 @@
  Microsoft Sentinel provides a new output plugin for Logstash. Use this output plugin to send any log via Logstash to the Microsoft Sentinel/Log Analytics workspace. This is done with the Log Analytics DCR-based API.
  You may send logs to custom or standard tables.

- Plugin version: v1.1.0
- Released on: 2023-07-23
+ Plugin version: v1.1.3
+ Released on: 2024-10-10

  This plugin is currently in development and is free to use. We welcome contributions from the open source community on this project, and we request and appreciate feedback from users.

-
- ## Steps to implement the output plugin
+ ## Installation Instructions
  1) Install the plugin
  2) Create a sample file
  3) Create the required DCR-related resources
@@ -19,13 +18,16 @@ This plugin is currently in development and is free to use. We welcome contribut

  ## 1. Install the plugin

- Microsoft Sentinel provides Logstash output plugin to Log analytics workspace using DCR based logs API.
- Install the microsoft-sentinel-log-analytics-logstash-output-plugin, use [Logstash Offline Plugin Management instruction](<https://www.elastic.co/guide/en/logstash/current/offline-plugins.html>).
+ Microsoft Sentinel provides Logstash output plugin to Log analytics workspace using DCR based logs API.
+
+ The plugin is published on [RubyGems](https://rubygems.org/gems/microsoft-sentinel-log-analytics-logstash-output-plugin). To install to an existing logstash installation, run `logstash-plugin install microsoft-sentinel-log-analytics-logstash-output-plugin`.
+
+ If you do not have a direct internet connection, you can install the plugin to another logstash installation, and then export and import a plugin bundle to the offline host. For more information, see [Logstash Offline Plugin Management instruction](<https://www.elastic.co/guide/en/logstash/current/offline-plugins.html>).

  Microsoft Sentinel's Logstash output plugin supports the following versions
  - 7.0 - 7.17.13
  - 8.0 - 8.9
- - 8.11
+ - 8.11 - 8.15

  Please note that when using Logstash 8, it is recommended to disable ECS in the pipeline. For more information refer to [Logstash documentation.](<https://www.elastic.co/guide/en/logstash/8.4/ecs-ls.html>)

@@ -234,3 +236,23 @@ Which will produce this content in the sample file:
  }
  ]
  ```
+
+
+ ## Known issues
+
+ When using Logstash installed on a Docker image of Lite Ubuntu, the following warning may appear:
+
+ ```
+ java.lang.RuntimeException: getprotobyname_r failed
+ ```
+
+ To resolve it, use the following commands to install the *netbase* package within your Dockerfile:
+ ```bash
+ USER root
+ RUN apt install netbase -y
+ ```
+ For more information, see [JNR regression in Logstash 7.17.0 (Docker)](https://github.com/elastic/logstash/issues/13703).
+
+ If your environment's event rate is low considering the number of allocated Logstash workers, we recommend increasing the value of *plugin_flush_interval* to 60 or more. This change will allow each worker to batch more events before uploading to the Data Collection Endpoint (DCE). You can monitor the ingestion payload using [DCR metrics](https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-monitor#dcr-metrics).
+ For more information on *plugin_flush_interval*, see the [Optional Configuration table](https://learn.microsoft.com/azure/sentinel/connect-logstash-data-connection-rules#optional-configuration) mentioned earlier.
+
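The flush-interval advice above can be sketched as a Logstash pipeline output block. This is an illustrative fragment, not taken from this diff: the placeholder values must be replaced with your own, and the setting names are those documented in the configuration table linked above.

```
output {
  microsoft-sentinel-log-analytics-logstash-output-plugin {
    client_app_Id => "<client app id>"
    client_app_secret => "<client app secret>"
    tenant_id => "<tenant id>"
    data_collection_endpoint => "<DCE ingestion URL>"
    dcr_immutable_id => "<DCR immutable id>"
    dcr_stream_name => "<stream name>"
    plugin_flush_interval => 60   # batch up to 60 seconds of events per upload
  }
}
```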
@@ -1,10 +1,10 @@
  # encoding: utf-8
  require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
- require 'rest-client'
  require 'json'
  require 'openssl'
  require 'base64'
  require 'time'
+ require 'excon'

  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
  class LogAnalyticsAadTokenProvider
@@ -64,14 +64,13 @@ class LogAnalyticsAadTokenProvider
  while true
  begin
  # Post REST request
- response = RestClient::Request.execute(method: :post, url: @token_request_uri, payload: @token_request_body, headers: headers,
- proxy: @logstashLoganalyticsConfiguration.proxy_aad)
-
- if (response.code == 200 || response.code == 201)
+ response = Excon.post(@token_request_uri, :body => @token_request_body, :headers => headers, :proxy => @logstashLoganalyticsConfiguration.proxy_aad, expects: [200, 201])
+
+ if (response.status == 200 || response.status == 201)
  return JSON.parse(response.body)
  end
- rescue RestClient::ExceptionWithResponse => ewr
- @logger.error("Exception while authenticating with AAD API ['#{ewr.response}']")
+ rescue Excon::Error::HTTPStatus => ex
+ @logger.error("Error while authenticating with AAD [#{ex.class}: '#{ex.response.status}', Response: '#{ex.response.body}']")
  rescue Exception => ex
  @logger.trace("Exception while authenticating with AAD API ['#{ex}']")
  end
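The hunk above changes the error-handling contract: with `expects: [200, 201]`, Excon raises `Excon::Error::HTTPStatus` on any other status instead of returning the response, so the retry loop catches the exception rather than inspecting a return code. A minimal pure-Ruby sketch of that retry-until-success shape, with a hypothetical `FakeHttpError` standing in for Excon's exception and a canned JSON body standing in for the token endpoint:

```ruby
require 'json'

# Hypothetical stand-in for Excon::Error::HTTPStatus.
class FakeHttpError < StandardError
  attr_reader :status
  def initialize(status)
    @status = status
    super("HTTP #{status}")
  end
end

# Mirrors the `while true / begin / rescue` structure of the token provider:
# keep retrying until the (simulated) token request succeeds.
def fetch_token(succeed_on_attempt)
  tries = 0
  loop do
    begin
      tries += 1
      # Simulate a 503 on every attempt before the successful one.
      raise FakeHttpError, 503 if tries < succeed_on_attempt
      return JSON.parse('{"access_token":"abc","expires_in":3600}')
    rescue FakeHttpError => ex
      # The plugin logs ex.response.status here and sleeps before retrying.
    end
  end
end

puts fetch_token(3)["access_token"]   # => abc
```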
@@ -1,11 +1,11 @@
  # encoding: utf-8
  require "logstash/sentinel_la/version"
- require 'rest-client'
  require 'json'
  require 'openssl'
  require 'base64'
  require 'time'
  require 'rbconfig'
+ require 'excon'

  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
  class LogAnalyticsClient
@@ -22,28 +22,78 @@ require "logstash/sentinel_la/logAnalyticsAadTokenProvider"
  @uri = sprintf("%s/dataCollectionRules/%s/streams/%s?api-version=%s",@logstashLoganalyticsConfiguration.data_collection_endpoint, @logstashLoganalyticsConfiguration.dcr_immutable_id, logstashLoganalyticsConfiguration.dcr_stream_name, la_api_version)
  @aadTokenProvider=LogAnalyticsAadTokenProvider::new(logstashLoganalyticsConfiguration)
  @userAgent = getUserAgent()
+
+ # Auto close connection after 60 seconds of inactivity
+ @connectionAutoClose = {
+ :last_use => Time.now,
+ :lock => Mutex.new,
+ :max_idel_time => 60,
+ :is_closed => true
+ }
+
+ @timer = Thread.new do
+ loop do
+ sleep @connectionAutoClose[:max_idel_time] / 2
+ if is_connection_stale?
+ @connectionAutoClose[:lock].synchronize do
+ if is_connection_stale?
+ reset_connection
+ end
+ end
+ end
+ end
+ end
+
+
  end # def initialize

  # Post the given json to Azure Loganalytics
  def post_data(body)
  raise ConfigError, 'no json_records' if body.empty?
+ response = nil
+
+ @connectionAutoClose[:lock].synchronize do
+ #close connection if its stale
+ if is_connection_stale?
+ reset_connection
+ end
+ if @connectionAutoClose[:is_closed]
+ open_connection
+ end
+
+ headers = get_header()
+ # Post REST request
+ response = @connection.request(method: :post, body: body, headers: headers)
+ @connectionAutoClose[:is_closed] = false
+ @connectionAutoClose[:last_use] = Time.now
+ end
+ return response

- # Create REST request header
- headers = get_header()
-
- # Post REST request
-
- return RestClient::Request.execute(method: :post, url: @uri, payload: body, headers: headers,
- proxy: @logstashLoganalyticsConfiguration.proxy_endpoint, timeout: 240)
  end # def post_data

  # Static function to return if the response is OK or else
  def self.is_successfully_posted(response)
- return (response.code >= 200 && response.code < 300 ) ? true : false
+ return (response.status >= 200 && response.status < 300 ) ? true : false
  end # def self.is_successfully_posted

  private

+ def open_connection
+ @connection = Excon.new(@uri, :persistent => true, :proxy => @logstashLoganalyticsConfiguration.proxy_endpoint,
+ expects: [200, 201, 202, 204, 206, 207, 208, 226, 300, 301, 302, 303, 304, 305, 306, 307, 308],
+ read_timeout: 240, write_timeout: 240, connect_timeout: 240)
+ @logger.trace("Connection to Azure LogAnalytics was opened.");
+ end
+
+ def reset_connection
+ @connection.reset
+ @connectionAutoClose[:is_closed] = true
+ @logger.trace("Connection to Azure LogAnalytics was closed due to inactivity.");
+ end
+
+ def is_connection_stale?
+ return Time.now - @connectionAutoClose[:last_use] > @connectionAutoClose[:max_idel_time] && !@connectionAutoClose[:is_closed]
+ end
  # Create a header for the given length
  def get_header()
  # Getting an authorization token bearer (if the token is expired, the method will post a request to get a new authorization token)
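The new code above adds an idle-connection reaper: a state hash guarded by a Mutex, a background timer thread, and a staleness check performed twice (once outside the lock, once again inside it) so the timer and a concurrent `post_data` cannot both reset the persistent connection. A compact pure-Ruby sketch of the same pattern; `FakeConnection` is a hypothetical stand-in for the persistent Excon connection:

```ruby
# Counts resets so we can observe the reaper's behavior.
class FakeConnection
  attr_reader :resets
  def initialize
    @resets = 0
  end
  def reset
    @resets += 1
  end
end

class IdleCloser
  attr_reader :conn

  def initialize(max_idle:)
    @conn  = FakeConnection.new
    @state = { last_use: Time.now, lock: Mutex.new, max_idle: max_idle, closed: false }
  end

  # Stale only if currently open AND unused for longer than max_idle.
  def stale?
    !@state[:closed] && (Time.now - @state[:last_use] > @state[:max_idle])
  end

  # Called periodically by a timer thread in the plugin; re-checks staleness
  # under the lock so it cannot race with a concurrent post.
  def sweep
    return unless stale?
    @state[:lock].synchronize do
      if stale?
        @conn.reset
        @state[:closed] = true
      end
    end
  end

  # Called on every post: marks the connection open and fresh.
  def use
    @state[:lock].synchronize do
      @state[:closed]   = false
      @state[:last_use] = Time.now
    end
  end
end

c = IdleCloser.new(max_idle: 0.05)
c.use
sleep 0.1
c.sweep               # idle longer than max_idle: connection reset once
puts c.conn.resets    # => 1
c.sweep               # already closed: nothing to do
puts c.conn.resets    # => 1
```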
@@ -2,7 +2,7 @@

  require "logstash/sentinel_la/logAnalyticsClient"
  require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
-
+ require "excon"
  # LogStashAutoResizeBuffer class setting a resizable buffer which is flushed periodically
  # The buffer resize itself according to Azure Loganalytics and configuration limitations
  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
@@ -59,34 +59,32 @@ class LogStashEventsBatcher
  return
  else
  @logger.trace("Rest client response ['#{response}']")
- @logger.error("#{api_name} request failed. Error code: #{response.code} #{try_get_info_from_error_response(response)}")
+ @logger.error("#{api_name} request failed. Error code: #{response.status} #{try_get_info_from_error_response(response)}")
  end
- rescue RestClient::Exceptions::Timeout => eto
- @logger.trace("Timeout exception ['#{eto.display}'] when posting data to #{api_name}. Rest client response ['#{eto.response.display}']. [amount_of_documents=#{amount_of_documents}]")
- @logger.error("Timeout exception while posting data to #{api_name}. [Exception: '#{eto}'] [amount of documents=#{amount_of_documents}]'")
- force_retry = true
-
- rescue RestClient::ExceptionWithResponse => ewr
- response = ewr.response
- @logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{ewr.response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
- @logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")
+ rescue Excon::Error::HTTPStatus => ewr
+ response = ewr.response
+ @logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
+ @logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr.class}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")

- if ewr.http_code.to_f == 400
- @logger.info("Not trying to resend since exception http code is #{ewr.http_code}")
- return
- elsif ewr.http_code.to_f == 408
- force_retry = true
- elsif ewr.http_code.to_f == 429
- # thrutteling detected, backoff before resending
- parsed_retry_after = response.headers.include?(:retry_after) ? response.headers[:retry_after].to_i : 0
- seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30
+ if ewr.class == Excon::Error::BadRequest
+ @logger.info("Not trying to resend since exception http code is 400")
+ return
+ elsif ewr.class == Excon::Error::RequestTimeout
+ force_retry = true
+ elsif ewr.class == Excon::Error::TooManyRequests
+ # throttling detected, backoff before resending
+ parsed_retry_after = response.data[:headers].include?('Retry-After') ? response.data[:headers]['Retry-After'].to_i : 0
+ seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30

- #force another retry even if the next iteration of the loop will be after the retransmission_timeout
+ #force another retry even if the next iteration of the loop will be after the retransmission_timeout
+ force_retry = true
+ end
+ rescue Excon::Error::Socket => ex
+ @logger.trace("Exception: '#{ex.class.name}]#{ex} in posting data to #{api_name}. [amount_of_documents=#{amount_of_documents}]'")
  force_retry = true
- end
- rescue Exception => ex
- @logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
- @logger.error("Exception in posting data to #{api_name}. [Exception: '#{ex}, amount of documents=#{amount_of_documents}]'")
+ rescue Exception => ex
+ @logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
+ @logger.error("Exception in posting data to #{api_name}. [Exception: '[#{ex.class.name}]#{ex}, amount of documents=#{amount_of_documents}]'")
  end
  is_retry = true
  @logger.info("Retrying transmission to #{api_name} in #{seconds_to_sleep} seconds.")
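The 429 branch above computes its backoff from the `Retry-After` header, falling back to 30 seconds when the header is absent or not a positive integer (Ruby's `String#to_i` returns 0 for non-numeric strings). A hedged pure-Ruby sketch of just that computation; the hash argument mimics the string-keyed `response.data[:headers]` that Excon exposes:

```ruby
# Returns the number of seconds to sleep before retrying a throttled request.
def backoff_seconds(headers)
  parsed = headers.include?('Retry-After') ? headers['Retry-After'].to_i : 0
  parsed > 0 ? parsed : 30
end

puts backoff_seconds('Retry-After' => '12')  # => 12
puts backoff_seconds('Retry-After' => 'x')   # => 30  ('x'.to_i is 0)
puts backoff_seconds({})                     # => 30  (header absent)
```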
@@ -110,8 +108,8 @@ class LogStashEventsBatcher
  def get_request_id_from_response(response)
  output =""
  begin
- if !response.nil? && response.headers.include?(:x_ms_request_id)
- output += response.headers[:x_ms_request_id]
+ if !response.nil? && response.data[:headers].include?("x-ms-request-id")
+ output += response.data[:headers]["x-ms-request-id"]
  end
  rescue Exception => ex
  @logger.debug("Error while getting reqeust id from success response headers: #{ex.display}")
@@ -124,12 +122,13 @@ class LogStashEventsBatcher
  begin
  output = ""
  if !response.nil?
- if response.headers.include?(:x_ms_error_code)
- output += " [ms-error-code header: #{response.headers[:x_ms_error_code]}]"
+ if response.data[:headers].include?("x-ms-error-code")
+ output += " [ms-error-code header: #{response.data[:headers]["x-ms-error-code"]}]"
  end
- if response.headers.include?(:x_ms_request_id)
- output += " [x-ms-request-id header: #{response.headers[:x_ms_request_id]}]"
+ if response.data[:headers].include?("x-ms-request-id")
+ output += " [x-ms-request-id header: #{response.data[:headers]["x-ms-request-id"]}]"
  end
+ output += " [response body: #{response.data[:body]}]"
  end
  return output
  rescue Exception => ex
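The two hunks above switch header access from rest-client's symbol keys (`:x_ms_request_id`) to Excon's string-keyed `response.data[:headers]` hash. A self-contained sketch of the new extraction logic, operating on a plain hash shaped like Excon's response data; `error_info` is a hypothetical name, not the plugin's:

```ruby
# Builds the diagnostic suffix for error logs from an Excon-style response
# data hash: { :headers => { "Header-Name" => value }, :body => "..." }.
def error_info(data)
  headers = data[:headers] || {}
  out = ""
  out += " [ms-error-code header: #{headers['x-ms-error-code']}]" if headers.include?('x-ms-error-code')
  out += " [x-ms-request-id header: #{headers['x-ms-request-id']}]" if headers.include?('x-ms-request-id')
  out += " [response body: #{data[:body]}]"
  out
end

puts error_info(headers: { 'x-ms-error-code' => 'InvalidStream' }, body: '{}')
# prints:  [ms-error-code header: InvalidStream] [response body: {}]
```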
@@ -1,6 +1,6 @@
  module LogStash; module Outputs;
  class MicrosoftSentinelOutputInternal
- VERSION_INFO = [1, 1, 1].freeze
+ VERSION_INFO = [1, 1, 3].freeze
  VERSION = VERSION_INFO.map(&:to_s).join('.').freeze

  def self.version
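For reference, the version bump above flows through to the dotted version string via the unchanged `map`/`join` line:

```ruby
# Same derivation as in the hunk: integer components joined with dots.
VERSION_INFO = [1, 1, 3].freeze
VERSION = VERSION_INFO.map(&:to_s).join('.').freeze
puts VERSION   # => 1.1.3
```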
@@ -20,8 +20,8 @@ Gem::Specification.new do |s|
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }

  # Gem dependencies
- s.add_runtime_dependency "rest-client", ">= 2.1.0"
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
  s.add_runtime_dependency "logstash-codec-plain"
+ s.add_runtime_dependency "excon", ">= 0.88.0"
  s.add_development_dependency "logstash-devutils"
  end
metadata CHANGED
@@ -1,29 +1,15 @@
  --- !ruby/object:Gem::Specification
  name: microsoft-sentinel-log-analytics-logstash-output-plugin
  version: !ruby/object:Gem::Version
- version: 1.1.1
+ version: 1.1.3
  platform: ruby
  authors:
  - Microsoft Sentinel
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2024-01-17 00:00:00.000000000 Z
+ date: 2024-10-10 00:00:00.000000000 Z
  dependencies:
- - !ruby/object:Gem::Dependency
- name: rest-client
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: 2.1.0
- type: :runtime
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: 2.1.0
  - !ruby/object:Gem::Dependency
  name: logstash-core-plugin-api
  requirement: !ruby/object:Gem::Requirement
@@ -58,6 +44,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
+ - !ruby/object:Gem::Dependency
+ name: excon
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.88.0
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.88.0
  - !ruby/object:Gem::Dependency
  name: logstash-devutils
  requirement: !ruby/object:Gem::Requirement