microsoft-logstash-output-azure-loganalytics 0.1.0

checksums.yaml.gz ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+ metadata.gz: ce0904868fbfab9bfbc3386a101957a436f9d8c1fde87450ffd2a3a0d9c78574
+ data.tar.gz: 1a69ab08f7bd6a3ea6ee639dfddd5c69333ef53a4dcfd6a08fccd65eb4bd26e5
+ SHA512:
+ metadata.gz: 262dd9c81132c1ad32bd5ccab843722ee42cc3d8ebc5cf417d99540c13c49f42f5eee2f987d29e31d03a471feda3a5dee4318f302ebace577333144f1744dd6b
+ data.tar.gz: 20306ed12e1dfde8f1a63caeca7c7b585ed2b50ab8de437fb7d1c37c66fedb2fce6c8d3080e73f2e3c49b276f27f23d3df17c68dffd0a82ebfe4473ad2ea56be
data/CHANGELOG.md ADDED
@@ -0,0 +1,3 @@
+ ## 1.0.0
+
+ * Initial release of the Logstash output plugin for Azure Sentinel (Log Analytics)
data/Gemfile ADDED
@@ -0,0 +1,2 @@
+ source 'https://rubygems.org'
+ gemspec
data/README.md ADDED
@@ -0,0 +1,69 @@
+ # Azure Log Analytics output plugin for Logstash
+
+ Azure Sentinel provides an output plugin for Logstash. With this plugin, you can send any log you want via Logstash to your Azure Sentinel/Log Analytics workspace.
+ Currently, messages are sent to a custom logs table that you define in the output plugin.
+ Getting started with Logstash
+
+ The Azure Sentinel output plugin uses the Log Analytics REST API to ingest the logs into custom logs tables [What are custom logs tables]
+
+ Plugin version: v1.0.0
+ Released on: 2020-04-30
+
+ ## Installation
+
+ Azure Sentinel provides a Logstash output plugin for Log Analytics workspaces.
+ To install microsoft-logstash-output-azure-loganalytics, use the Logstash Working with plugins document.
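+
+ For example, on a machine with internet access, a typical install (run from the Logstash installation directory) looks like:
+
+ ```
+ bin/logstash-plugin install microsoft-logstash-output-azure-loganalytics
+ ```
+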
+ For offline setup, follow the Logstash Offline Plugin Management instructions.
+
+ ## Configuration
+
+ In your Logstash configuration file, add the Azure Sentinel output plugin with the following values (a minimal example block follows the list):
+ - workspace_id – your workspace ID GUID
+ - workspace_key – your workspace primary key GUID
+ - custom_log_table_name – the name of the table into which the logs will be ingested, limited to one table per plugin instance. The table will be presented in the Logs blade under the Custom Logs label, with a _CL suffix.
+ - endpoint – optional field, set to the Log Analytics endpoint by default.
+ - time_generated_field – optional field, used to override the default TimeGenerated field in Log Analytics. Populate this property with the name of the time field in the sent data.
+ - key_names – list of Log Analytics output schema fields.
+ - plugin_flush_interval – optional field, defines the maximum time difference (in seconds) between sending two messages to Log Analytics.
+ - max_items – optional field, 2000 by default. Controls the maximum batch size; this value may be changed at runtime unless “amount_resizing => false” is set in the configuration.
+
+ Note: View the GitHub page to learn more about the sent message’s configuration, performance settings and mechanism.
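+
+ As a minimal sketch, an output block using these options could look as follows (the workspace values are placeholders, not real credentials):
+
+ ```
+ output {
+     microsoft-logstash-output-azure-loganalytics {
+       workspace_id => "<your workspace ID guid>"
+       workspace_key => "<your workspace primary key>"
+       custom_log_table_name => "exampleTable"
+       endpoint => "ods.opinsights.azure.com"  # optional, this is the default
+       time_generated_field => "eventtime"     # optional, name of a time field in the sent data
+       key_names => ['field1','field2']
+       plugin_flush_interval => 5              # optional, seconds between flushes
+       max_items => 2000                       # optional, default batch size
+     }
+ }
+ ```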
+
+ ## Tests
+
+ Here is an example configuration that parses incoming Syslog data into a custom table named "logstashCustomTableName".
+
+ ### Example Configuration
+
+ ```
+ input {
+     tcp {
+         port => 514
+         type => syslog
+     }
+ }
+
+ filter {
+     grok {
+         match => { "message" => "<%{NUMBER:PRI}>1 (?<TIME_TAG>[0-9]{4}-[0-9]{1,2}-[0-9]{1,2}T[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2})[^ ]* (?<HOSTNAME>[^ ]*) %{GREEDYDATA:MSG}" }
+     }
+ }
+
+ output {
+     microsoft-logstash-output-azure-loganalytics {
+       workspace_id => "<WS_ID>"
+       workspace_key => "${WS_KEY}"
+       custom_log_table_name => "logstashCustomTableName"
+       key_names => ['PRI','TIME_TAG','HOSTNAME','MSG']
+       plugin_flush_interval => 5
+     }
+ }
+ ```
+
+ Now you can run Logstash with the example configuration and send mock data using the 'logger' command.
+
+ For example:
+ ```
+ logger -p local4.warn -t CEF: "0|Microsoft|Device|cef-test|example|data|1|here is some more data for the example" -P 514 -d -n 127.0.0.1
+ ```
data/VERSION ADDED
@@ -0,0 +1 @@
+ 0.1.0
data/lib/logstash/logAnalyticsClient/logAnalyticsClient.rb ADDED
@@ -0,0 +1,72 @@
+ # encoding: utf-8
+ require "logstash/logAnalyticsClient/logstashLoganalyticsConfiguration"
+ require 'rest-client'
+ require 'json'
+ require 'openssl'
+ require 'base64'
+ require 'time'
+
+ class LogAnalyticsClient
+   API_VERSION = '2016-04-01'.freeze
+
+   def initialize(logstashLoganalyticsConfiguration)
+     @logstashLoganalyticsConfiguration = logstashLoganalyticsConfiguration
+     set_proxy(@logstashLoganalyticsConfiguration.proxy)
+     @uri = sprintf("https://%s.%s/api/logs?api-version=%s", @logstashLoganalyticsConfiguration.workspace_id, @logstashLoganalyticsConfiguration.endpoint, API_VERSION)
+   end # def initialize
+
+   # Post the given JSON to Azure Log Analytics
+   def post_data(body)
+     raise ConfigError, 'no json_records' if body.empty?
+     # Create the REST request header
+     header = get_header(body.bytesize)
+     # Post the REST request
+     response = RestClient.post(@uri, body, header)
+
+     return response
+   end # def post_data
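+
+   # Illustrative usage (hypothetical values), assuming a populated
+   # configuration object:
+   #   client = LogAnalyticsClient.new(configuration)
+   #   client.post_data('[{"message":"sample"}]')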
+
+   private
+
+   # Create a header for the given body length
+   def get_header(body_bytesize_length)
+     # We would like each request to be sent with the current time
+     date = rfc1123date()
+
+     return {
+       'Content-Type' => 'application/json',
+       'Authorization' => signature(date, body_bytesize_length),
+       'Log-Type' => @logstashLoganalyticsConfiguration.custom_log_table_name,
+       'x-ms-date' => date,
+       'time-generated-field' => @logstashLoganalyticsConfiguration.time_generated_field,
+       'x-ms-AzureResourceId' => @logstashLoganalyticsConfiguration.azure_resource_id
+     }
+   end # def get_header
+
+   # Set the proxy for the REST client.
+   # Falls back to the http_proxy environment variable when no proxy is configured.
+   def set_proxy(proxy='')
+     RestClient.proxy = proxy.empty? ? ENV['http_proxy'] : proxy
+   end # def set_proxy
+
+   # Return the current date in RFC 1123 format
+   def rfc1123date()
+     current_time = Time.now
+
+     return current_time.httpdate()
+   end # def rfc1123date
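+
+   # Build the Data Collector API authorization value: an HMAC-SHA256 over
+   # "POST\n<content-length>\napplication/json\nx-ms-date:<date>\n/api/logs",
+   # keyed with the Base64-decoded workspace key, then Base64-encoded again
+   # and formatted as "SharedKey <workspace_id>:<hash>".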
+   def signature(date, body_bytesize_length)
+     sigs = sprintf("POST\n%d\napplication/json\nx-ms-date:%s\n/api/logs", body_bytesize_length, date)
+     utf8_sigs = sigs.encode('utf-8')
+     decoded_shared_key = Base64.decode64(@logstashLoganalyticsConfiguration.workspace_key)
+     hmac_sha256_sigs = OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha256'), decoded_shared_key, utf8_sigs)
+     encoded_hash = Base64.encode64(hmac_sha256_sigs)
+     authorization = sprintf("SharedKey %s:%s", @logstashLoganalyticsConfiguration.workspace_id, encoded_hash)
+
+     return authorization
+   end # def signature
+
+ end # end of class
data/lib/logstash/logAnalyticsClient/logStashAutoResizeBuffer.rb ADDED
@@ -0,0 +1,142 @@
+ # encoding: utf-8
+ require "stud/buffer"
+ require "logstash/logAnalyticsClient/logAnalyticsClient"
+ require "logstash/logAnalyticsClient/logstashLoganalyticsConfiguration"
+
+ # LogStashAutoResizeBuffer class sets up a resizable buffer which is flushed periodically.
+ # The buffer resizes itself according to Azure Log Analytics and configuration limitations
+ class LogStashAutoResizeBuffer
+   include Stud::Buffer
+
+   def initialize(logstashLoganalyticsConfiguration)
+     @logstashLoganalyticsConfiguration = logstashLoganalyticsConfiguration
+     @logger = @logstashLoganalyticsConfiguration.logger
+     @client = LogAnalyticsClient::new(logstashLoganalyticsConfiguration)
+     buffer_initialize(
+       :max_items => logstashLoganalyticsConfiguration.max_items,
+       :max_interval => logstashLoganalyticsConfiguration.plugin_flush_interval,
+       :logger => @logstashLoganalyticsConfiguration.logger
+     )
+   end # initialize
+
+   # Public methods
+   public
+
+   # Add an event document to the buffer
+   def add_event_document(event_document)
+     buffer_receive(event_document)
+   end # def add_event_document
+
+   # Flush all buffer content to Azure Log Analytics.
+   # Called from Stud::Buffer#buffer_flush when there are events to flush
+   def flush(documents, close=false)
+     # Skip in case there are no candidate documents to deliver
+     if documents.length < 1
+       @logger.warn("No documents in batch for log type #{@logstashLoganalyticsConfiguration.custom_log_table_name}. Skipping")
+       return
+     end
+
+     # We send JSON in the REST request
+     documents_json = documents.to_json
+     # When resizing is enabled, the max batch size is adjusted dynamically
+     if @logstashLoganalyticsConfiguration.amount_resizing == true
+       # Resize the batch limit according to the size and amount of messages received
+       change_message_limit_size(documents.length, documents_json.bytesize)
+     end
+     send_message_to_loganalytics(documents_json, documents.length)
+   end # def flush
+
+   # Private methods
+   private
+
+   # Send documents_json to Azure Log Analytics
+   def send_message_to_loganalytics(documents_json, amount_of_documents)
+     begin
+       @logger.debug("Posting log batch (log count: #{amount_of_documents}) as log type #{@logstashLoganalyticsConfiguration.custom_log_table_name} to DataCollector API.")
+       response = @client.post_data(documents_json)
+       if is_successfully_posted(response)
+         @logger.info("Successfully posted #{amount_of_documents} logs into custom log analytics table[#{@logstashLoganalyticsConfiguration.custom_log_table_name}].")
+       else
+         @logger.error("DataCollector API request failure: error code: #{response.code}, data=>#{documents_json}")
+         resend_message(documents_json, amount_of_documents, @logstashLoganalyticsConfiguration.retransmission_time)
+       end
+     rescue Exception => ex
+       @logger.error("Exception in posting data to Azure Loganalytics.\n[Exception: '#{ex}']")
+       @logger.trace("Exception in posting data to Azure Loganalytics.[amount_of_documents=#{amount_of_documents} documents=#{documents_json}]")
+       resend_message(documents_json, amount_of_documents, @logstashLoganalyticsConfiguration.retransmission_time)
+     end
+   end # end send_message_to_loganalytics
+
+   # If sending the message to Azure Log Analytics fails, we retry sending it
+   # until the configured retransmission duration is exhausted
+   def resend_message(documents_json, amount_of_documents, remaining_duration)
+     if remaining_duration > 0
+       @logger.info("Resending #{amount_of_documents} documents as log type #{@logstashLoganalyticsConfiguration.custom_log_table_name} to DataCollector API in #{@logstashLoganalyticsConfiguration.RETRANSMISSION_DELAY} seconds.")
+       sleep @logstashLoganalyticsConfiguration.RETRANSMISSION_DELAY
+       begin
+         response = @client.post_data(documents_json)
+         if is_successfully_posted(response)
+           @logger.info("Successfully sent #{amount_of_documents} logs into custom log analytics table[#{@logstashLoganalyticsConfiguration.custom_log_table_name}] after resending.")
+         else
+           @logger.debug("Resending #{amount_of_documents} documents failed, will try to resend for #{(remaining_duration - @logstashLoganalyticsConfiguration.RETRANSMISSION_DELAY)} more seconds.")
+           resend_message(documents_json, amount_of_documents, (remaining_duration - @logstashLoganalyticsConfiguration.RETRANSMISSION_DELAY))
+         end
+       rescue Exception => ex
+         @logger.debug("Resending #{amount_of_documents} documents failed, will try to resend for #{(remaining_duration - @logstashLoganalyticsConfiguration.RETRANSMISSION_DELAY)} more seconds.")
+         resend_message(documents_json, amount_of_documents, (remaining_duration - @logstashLoganalyticsConfiguration.RETRANSMISSION_DELAY))
+       end
+     else
+       @logger.error("Could not resend #{amount_of_documents} documents, message is dropped.")
+       @logger.trace("Documents (#{amount_of_documents}) dropped. [documents_json=#{documents_json}]")
+     end
+   end # def resend_message
+
+   # We would like to change the amount of messages in the buffer (change_max_size)
+   # according to the Azure Log Analytics limitations and the amount of messages inserted into the buffer
+   # in one sending window.
+   # If we reached the max amount, we would like to increase it;
+   # otherwise we would like to decrease it (to reduce latency for messages)
+   def change_message_limit_size(amount_of_documents, documents_byte_size)
+     new_buffer_size = @logstashLoganalyticsConfiguration.max_items
+     average_document_size = documents_byte_size / amount_of_documents
+     # If the window is full we need to increase it.
+     # "amount_of_documents" can be greater since the buffer is not synchronized, meaning
+     # that a flush can occur after the limit was reached.
+     if amount_of_documents >= @logstashLoganalyticsConfiguration.max_items
+       # If doubling the size wouldn't exceed the API limit
+       if ((2 * @logstashLoganalyticsConfiguration.max_items) * average_document_size) < @logstashLoganalyticsConfiguration.MAX_SIZE_BYTES
+         new_buffer_size = 2 * @logstashLoganalyticsConfiguration.max_items
+       else
+         new_buffer_size = (@logstashLoganalyticsConfiguration.MAX_SIZE_BYTES / average_document_size) - 1000
+       end
+
+     # We would like to decrease the window, but not below MIN_MESSAGE_AMOUNT.
+     # We decrease it slowly so we can still send as many messages as possible in one window
+     elsif amount_of_documents < @logstashLoganalyticsConfiguration.max_items and @logstashLoganalyticsConfiguration.max_items != [(@logstashLoganalyticsConfiguration.max_items - @logstashLoganalyticsConfiguration.decrease_factor), @logstashLoganalyticsConfiguration.MIN_MESSAGE_AMOUNT].max
+       new_buffer_size = [(@logstashLoganalyticsConfiguration.max_items - @logstashLoganalyticsConfiguration.decrease_factor), @logstashLoganalyticsConfiguration.MIN_MESSAGE_AMOUNT].max
+     end
+
+     change_buffer_size(new_buffer_size)
+   end # def change_message_limit_size
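+
+   # Example (illustrative numbers): with max_items = 2000 and an average
+   # document size of 500 bytes, a full window doubles to 4000 items, since
+   # 4000 * 500 bytes = 2 MB is well under the ~30 MB MAX_SIZE_BYTES limit.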
+
+   # Receives new_size as the new max buffer size.
+   # Changes both the buffer and the configuration, logging as necessary
+   def change_buffer_size(new_size)
+     # Change buffer size only if it's needed (new size)
+     if @buffer_config[:max_items] != new_size
+       old_buffer_size = @buffer_config[:max_items]
+       @buffer_config[:max_items] = new_size
+       @logstashLoganalyticsConfiguration.max_items = new_size
+       @logger.info("Changing buffer size.[configuration='#{old_buffer_size}' , new_size='#{new_size}']")
+     else
+       @logger.info("Buffer size wasn't changed.[configuration='#{@buffer_config[:max_items]}' , new_size='#{new_size}']")
+     end
+   end # def change_buffer_size
+
+   # Return whether the response indicates a successful post
+   def is_successfully_posted(response)
+     return response.code == 200
+   end # def is_successfully_posted
+
+ end # LogStashAutoResizeBuffer
data/lib/logstash/logAnalyticsClient/logstashLoganalyticsConfiguration.rb ADDED
@@ -0,0 +1,158 @@
+ # encoding: utf-8
+ class LogstashLoganalyticsOutputConfiguration
+   def initialize(workspace_id, workspace_key, custom_log_table_name, logger)
+     @workspace_id = workspace_id
+     @workspace_key = workspace_key
+     @custom_log_table_name = custom_log_table_name
+     @logger = logger
+
+     # Delay between each resending of a message
+     @RETRANSMISSION_DELAY = 2
+     @MIN_MESSAGE_AMOUNT = 100
+     # Maximum of 30 MB per post to the Log Analytics Data Collector API.
+     # This is a size limit for a single post.
+     # If the data from a single post exceeds 30 MB, you should split it.
+     @loganalytics_api_data_limit = 30 * 1000 * 1000
+
+     # Taking a 10 KB safety buffer
+     @MAX_SIZE_BYTES = @loganalytics_api_data_limit - 10000
+   end
+
+   def validate_configuration()
+     if @retransmission_time < 0
+       raise ArgumentError, "retransmission_time, which sets the time spent resending each failed message, must be a positive integer. [retransmission_time=#{@retransmission_time}]."
+
+     elsif @max_items < @MIN_MESSAGE_AMOUNT
+       raise ArgumentError, "max_items must be at least #{@MIN_MESSAGE_AMOUNT}."
+
+     elsif @workspace_id.empty? or @workspace_key.empty? or @custom_log_table_name.empty?
+       raise ArgumentError, "Malformed configuration, the following arguments can not be null or empty: [workspace_id=#{@workspace_id} , workspace_key=#{@workspace_key} , custom_log_table_name=#{@custom_log_table_name}]"
+
+     elsif not @custom_log_table_name.match(/^[[:alpha:]]+$/)
+       raise ArgumentError, 'custom_log_table_name must contain only alpha characters.'
+
+     elsif @key_names.length > 500
+       raise ArgumentError, 'Azure Loganalytics limits the amount of columns to 500 in each table.'
+     end
+
+     @logger.info("Azure Loganalytics configuration was found valid.")
+
+     # If all validations pass then the configuration is valid
+     return true
+   end # def validate_configuration
+
+   def azure_resource_id
+     @azure_resource_id
+   end
+
+   def RETRANSMISSION_DELAY
+     @RETRANSMISSION_DELAY
+   end
+
+   def MAX_SIZE_BYTES
+     @MAX_SIZE_BYTES
+   end
+
+   def amount_resizing
+     @amount_resizing
+   end
+
+   def retransmission_time
+     @retransmission_time
+   end
+
+   def proxy
+     @proxy
+   end
+
+   def logger
+     @logger
+   end
+
+   def decrease_factor
+     @decrease_factor
+   end
+
+   def workspace_id
+     @workspace_id
+   end
+
+   def workspace_key
+     @workspace_key
+   end
+
+   def custom_log_table_name
+     @custom_log_table_name
+   end
+
+   def endpoint
+     @endpoint
+   end
+
+   def time_generated_field
+     @time_generated_field
+   end
+
+   def key_names
+     @key_names
+   end
+
+   def max_items
+     @max_items
+   end
+
+   def plugin_flush_interval
+     @plugin_flush_interval
+   end
+
+   def MIN_MESSAGE_AMOUNT
+     @MIN_MESSAGE_AMOUNT
+   end
+
+   def max_items=(new_max_items)
+     @max_items = new_max_items
+   end
+
+   def endpoint=(new_endpoint)
+     @endpoint = new_endpoint
+   end
+
+   def time_generated_field=(new_time_generated_field)
+     @time_generated_field = new_time_generated_field
+   end
+
+   def key_names=(new_key_names)
+     @key_names = new_key_names
+   end
+
+   def plugin_flush_interval=(new_plugin_flush_interval)
+     @plugin_flush_interval = new_plugin_flush_interval
+   end
+
+   def decrease_factor=(new_decrease_factor)
+     @decrease_factor = new_decrease_factor
+   end
+
+   def amount_resizing=(new_amount_resizing)
+     @amount_resizing = new_amount_resizing
+   end
+
+   def azure_resource_id=(new_azure_resource_id)
+     @azure_resource_id = new_azure_resource_id
+   end
+
+   def proxy=(new_proxy)
+     @proxy = new_proxy
+   end
+
+   def retransmission_time=(new_retransmission_time)
+     @retransmission_time = new_retransmission_time
+   end
+ end
data/lib/logstash/outputs/microsoft-logstash-output-azure-loganalytics.rb ADDED
@@ -0,0 +1,130 @@
+ # encoding: utf-8
+ require "logstash/outputs/base"
+ require "logstash/namespace"
+ require "stud/buffer"
+ require "logstash/logAnalyticsClient/logStashAutoResizeBuffer"
+ require "logstash/logAnalyticsClient/logstashLoganalyticsConfiguration"
+
+ class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
+
+   config_name "microsoft-logstash-output-azure-loganalytics"
+
+   # Stating that the output plugin will run in concurrent mode
+   concurrency :shared
+
+   # Your Operations Management Suite workspace ID
+   config :workspace_id, :validate => :string, :required => true
+
+   # The primary or the secondary key used for authentication, required by the Azure Loganalytics REST API
+   config :workspace_key, :validate => :string, :required => true
+
+   # The name of the event type that is being submitted to Log Analytics.
+   # This must contain only alpha characters.
+   # Table name under custom logs in which the data will be inserted
+   config :custom_log_table_name, :validate => :string, :required => true
+
+   # The service endpoint (Default: ods.opinsights.azure.com)
+   config :endpoint, :validate => :string, :default => 'ods.opinsights.azure.com'
+
+   # The name of the time generated field.
+   # Be careful that the value of this field should strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ)
+   config :time_generated_field, :validate => :string, :default => ''
+
+   # Subset of keys to send to the Azure Loganalytics workspace
+   config :key_names, :validate => :array, :default => []
+
+   # # Max number of items to buffer before flushing. Default 50.
+   # config :flush_items, :validate => :number, :default => 50
+
+   # Max number of seconds to wait between flushes. Default 5
+   config :plugin_flush_interval, :validate => :number, :default => 5
+
+   # Factor by which the amount of messages sent is decreased when the buffer is resized down
+   config :decrease_factor, :validate => :number, :default => 100
+
+   # This will trigger message amount resizing in a REST request to LA
+   config :amount_resizing, :validate => :boolean, :default => true
+
+   # Setting the default amount of messages sent;
+   # if this is set with amount_resizing=false, each batch will hold at most max_items messages
+   config :max_items, :validate => :number, :default => 2000
+
+   # Setting proxy to be used for the Azure Loganalytics REST client
+   config :proxy, :validate => :string, :default => ''
+
+   # This will set the amount of time given for retransmitting messages once sending has failed
+   config :retransmission_time, :validate => :number, :default => 10
+
+   # Optional to override the resource ID field on the workspace table.
+   # The resource ID provided must be a valid resource ID on Azure
+   config :azure_resource_id, :validate => :string, :default => ''
+
+   public
+   def register
+     @logstash_configuration = build_logstash_configuration()
+     # Validate configuration correctness
+     @logstash_configuration.validate_configuration()
+     @logger.info("Logstash Azure Loganalytics output plugin configuration was found valid")
+
+     # Initialize the logstash resizable buffer.
+     # This buffer will increase and decrease in size according to the amount of messages inserted.
+     # If the buffer reaches the max amount of messages, the amount will be increased until the limit
+     @logstash_resizable_event_buffer = LogStashAutoResizeBuffer::new(@logstash_configuration)
+
+   end # def register
+
+   def multi_receive(events)
+     events.each do |event|
+       # Create a document from the event
+       document = create_event_document(event)
+       # Skip if the document doesn't contain any items
+       next if (document.keys).length < 1
+
+       @logger.trace("Adding event document - " + event.to_s)
+       @logstash_resizable_event_buffer.add_event_document(document)
+
+     end
+   end # def multi_receive
+
+   private
+
+   # In case the user has defined key_names, meaning they would like to send only a subset of the data,
+   # we insert only those keys.
+   # If no keys were defined we send all the data
+   def create_event_document(event)
+     document = {}
+     event_hash = event.to_hash()
+     if @key_names.length > 0
+       # Get the intersection of key_names and the keys of event_hash
+       keys_intersection = @key_names & event_hash.keys
+       keys_intersection.each do |key|
+         document[key] = event_hash[key]
+       end
+     else
+       document = event_hash
+     end
+
+     return document
+   end # def create_event_document
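+
+   # For example (illustrative values): with key_names = ['PRI','MSG'] and an
+   # event hash of {'PRI'=>'13', 'MSG'=>'hello', 'host'=>'a'}, the resulting
+   # document is {'PRI'=>'13', 'MSG'=>'hello'}.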
+
+   # Build the logstash configuration object from the output configuration provided by the user.
+   # Returns a LogstashLoganalyticsOutputConfiguration populated with the configuration values
+   def build_logstash_configuration()
+     logstash_configuration = LogstashLoganalyticsOutputConfiguration::new(@workspace_id, @workspace_key, @custom_log_table_name, @logger)
+     logstash_configuration.endpoint = @endpoint
+     logstash_configuration.time_generated_field = @time_generated_field
+     logstash_configuration.key_names = @key_names
+     logstash_configuration.plugin_flush_interval = @plugin_flush_interval
+     logstash_configuration.decrease_factor = @decrease_factor
+     logstash_configuration.amount_resizing = @amount_resizing
+     logstash_configuration.max_items = @max_items
+     logstash_configuration.azure_resource_id = @azure_resource_id
+     logstash_configuration.proxy = @proxy
+     logstash_configuration.retransmission_time = @retransmission_time
+
+     return logstash_configuration
+   end # def build_logstash_configuration
+
+ end # class LogStash::Outputs::AzureLogAnalytics
data/microsoft-logstash-output-azure-loganalytics.gemspec ADDED
@@ -0,0 +1,26 @@
+ Gem::Specification.new do |s|
+   s.name = 'microsoft-logstash-output-azure-loganalytics'
+   s.version = File.read("VERSION").strip
+   s.authors = ["Ron Marsiano"]
+   s.email = "romarsia@outlook.com"
+   s.summary = %q{Azure Sentinel provides a new output plugin for Logstash. Using this output plugin, you will be able to send any log you want using Logstash to the Azure Sentinel/Log Analytics workspace}
+   s.description = s.summary
+   s.homepage = "https://github.com/Azure/Azure-Sentinel"
+   s.licenses = ["Apache License (2.0)"]
+   s.require_paths = ["lib"]
+
+   # Files
+   s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT', 'VERSION']
+   # Tests
+   s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+   # Special flag to let us know this is actually a logstash plugin
+   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
+
+   # Gem dependencies
+   s.add_runtime_dependency "rest-client", ">= 1.8.0"
+   s.add_runtime_dependency "azure-loganalytics-datacollector-api", ">= 0.1.5"
+   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
+   s.add_runtime_dependency "logstash-codec-plain"
+   s.add_development_dependency "logstash-devutils"
+ end
data/spec/outputs/azure_loganalytics_spec.rb ADDED
@@ -0,0 +1,78 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
+ require "logstash/outputs/microsoft-logstash-output-azure-loganalytics"
+ require "logstash/codecs/plain"
+ require "logstash/event"
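+
+ # Note: these examples post real events to a workspace, so replace the
+ # workspace_id and workspace_key placeholders below with valid values before
+ # running, e.g. (assuming dev dependencies are installed):
+ #   bundle exec rspec spec/outputs/azure_loganalytics_spec.rb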
+
+ describe LogStash::Outputs::AzureLogAnalytics do
+
+   let(:workspace_id) { '<Workspace ID identifying your workspace>' }
+   let(:workspace_key) { '<Primary Key for the Azure log analytics workspace>' }
+   let(:custom_log_table_name) { 'ApacheAccessLog' }
+   let(:key_names) { ['logid','date','processing_time','remote','user','method','status','agent','eventtime'] }
+   let(:time_generated_field) { 'eventtime' }
+   let(:amount_resizing) { false }
+
+   # 1 second flush interval
+   let(:plugin_flush_interval) { 1 }
+
+   let(:azure_loganalytics_config) {
+     {
+       "workspace_id" => workspace_id,
+       "workspace_key" => workspace_key,
+       "custom_log_table_name" => custom_log_table_name,
+       "key_names" => key_names,
+       "time_generated_field" => time_generated_field,
+       "plugin_flush_interval" => plugin_flush_interval,
+       "amount_resizing" => amount_resizing
+     }
+   }
+
+   let(:azure_loganalytics) { LogStash::Outputs::AzureLogAnalytics.new(azure_loganalytics_config) }
+
+   before do
+     azure_loganalytics.register
+   end
+
+   describe "#flush" do
+     it "Should successfully send the event to Azure Log Analytics" do
+       events = []
+       log1 = {
+         :logid => "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
+         :date => "2017-04-22 09:44:32 JST",
+         :processing_time => "372",
+         :remote => "101.202.74.59",
+         :user => "-",
+         :method => "GET / HTTP/1.1",
+         :status => "304",
+         :size => "-",
+         :referer => "-",
+         :agent => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:27.0) Gecko/20100101 Firefox/27.0",
+         :eventtime => "2017-04-22T01:44:32Z"
+       }
+
+       log2 = {
+         :logid => "7260iswx-8034-4cc3-uirtx-f068dd4cd659",
+         :date => "2017-04-22 09:45:14 JST",
+         :processing_time => "105",
+         :remote => "201.78.74.59",
+         :user => "-",
+         :method => "GET /manager/html HTTP/1.1",
+         :status => "200",
+         :size => "-",
+         :referer => "-",
+         :agent => "Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0",
+         :eventtime => "2017-04-22T01:45:14Z"
+       }
+
+       event1 = LogStash::Event.new(log1)
+       event2 = LogStash::Event.new(log2)
+       events.push(event1)
+       events.push(event2)
+       expect { azure_loganalytics.multi_receive(events) }.to_not raise_error
+       # Waiting for the data to be sent
+       sleep(plugin_flush_interval + 2)
+     end
+   end
+
+ end
metadata ADDED
@@ -0,0 +1,136 @@
+ --- !ruby/object:Gem::Specification
+ name: microsoft-logstash-output-azure-loganalytics
+ version: !ruby/object:Gem::Version
+   version: 0.1.0
+ platform: ruby
+ authors:
+ - Ron Marsiano
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2020-05-25 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 1.8.0
+   name: rest-client
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 1.8.0
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 0.1.5
+   name: azure-loganalytics-datacollector-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 0.1.5
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - "<="
+       - !ruby/object:Gem::Version
+         version: '2.99'
+   name: logstash-core-plugin-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - "<="
+       - !ruby/object:Gem::Version
+         version: '2.99'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-codec-plain
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-devutils
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ description: Azure Sentinel provides a new output plugin for Logstash. Using this
+   output plugin, you will be able to send any log you want using Logstash to the Azure
+   Sentinel/Log Analytics workspace
+ email: romarsia@outlook.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - Gemfile
+ - README.md
+ - VERSION
+ - lib/logstash/logAnalyticsClient/logAnalyticsClient.rb
+ - lib/logstash/logAnalyticsClient/logStashAutoResizeBuffer.rb
+ - lib/logstash/logAnalyticsClient/logstashLoganalyticsConfiguration.rb
+ - lib/logstash/outputs/microsoft-logstash-output-azure-loganalytics.rb
+ - microsoft-logstash-output-azure-loganalytics.gemspec
+ - spec/outputs/azure_loganalytics_spec.rb
+ homepage: https://github.com/Azure/Azure-Sentinel
+ licenses:
+ - Apache License (2.0)
+ metadata:
+   logstash_plugin: 'true'
+   logstash_group: output
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.7.10
+ signing_key:
+ specification_version: 4
+ summary: Azure Sentinel provides a new output plugin for Logstash. Using this output
+   plugin, you will be able to send any log you want using Logstash to the Azure Sentinel/Log
+   Analytics workspace
+ test_files:
+ - spec/outputs/azure_loganalytics_spec.rb