microsoft-sentinel-log-analytics-logstash-output-plugin 1.1.1 → 1.1.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: bc9cee055d5aa8f90acde6ed20eacccf0d9e9a36fa9a5ebd1c916d85a410e628
- data.tar.gz: '081cd8f53b716eeed931c417ce6fc7a13e3d2d1fe9acf0050858e5b87918d4ff'
+ metadata.gz: a1b93ef7e00ec54fd6aea1712b10a16947a3bf4a226a7992897ae5e6fb2f7708
+ data.tar.gz: 712e6f247d574d5b48bb87693af44098401246c192ab886d0b3960c318861491
  SHA512:
- metadata.gz: 77753b09f6c4631fb2e2eb1e0ba5dd4eb4c33840901226f1caa08043b39caa77497cdfd42adb7ac6a857b22e3bd98512fff8e3e1ba4d6b7916eb16e822f752ce
- data.tar.gz: 9ae33cb6bb9c96c19011b4a21f9ac943f75df18b99761023cb4ba00c84584a6ad0c05d8b0d5c0819993b434e2f6d14da1e62f92ba2f44a11aa6eeed22cbe0b9c
+ metadata.gz: 66758c4c92a88a28a81b19161dc1e55d727675ad5f788d6abfce1a495603de7a601a0f4fd9ee35c8ad746a3f782b019cf7343c3c9c472c3f2ddf03a8a1b348f1
+ data.tar.gz: 234aee955f0a3ce3a1dbb5c84f620d87ef350053491a716e57feb15282fddf2cce69003add06ef73bf017fa682d9c5e165cfe1b7ad3bc158e1f6a9bf21ec55a8
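The values above are hex-encoded SHA-256 and SHA-512 digests of the gem's archive members. As a quick illustration (the input string here is only an example, not taken from this diff), Ruby's standard Digest library produces the same hex format — 64 characters for SHA-256, 128 for SHA-512:

```ruby
require 'digest'

# A SHA-256 digest rendered as hex is always 64 characters long,
# matching the metadata.gz / data.tar.gz entries above.
sha256 = Digest::SHA256.hexdigest('example input')
sha512 = Digest::SHA512.hexdigest('example input')

puts sha256.length  # 64
puts sha512.length  # 128
```

Comparing a locally computed digest against the registry value is the usual way to verify that a downloaded artifact matches the published release.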
data/CHANGELOG.md CHANGED
@@ -1,14 +1,16 @@
- ## 1.0.0
- * Initial release for output plugin for logstash to Microsoft Sentinel. This is done with the Log Analytics DCR based API.
-
- ## 1.1.0
- * Increase timeout for read/open connections to 120 seconds.
- * Add error handling for when connection timeout occurs.
- * Upgrade the rest-client dependency minimum version to 2.1.0.
- * Allow setting different proxy values for api connections.
- * Upgrade version for ingestion api to 2023-01-01.
- * Rename the plugin to microsoft-sentinel-log-analytics-logstash-output-plugin.
-
- ## 1.1.1
- * Support China and US Government Azure sovereign clouds.
- * Increase timeout for read/open connections to 240 seconds.
+ ## 1.1.4
+ - Limit `excon` library version to lower than 1.0.0 to make sure port is always used when using a proxy.
+
+ ## 1.1.3
+ - Replaces the `rest-client` library used for connecting to Azure with the `excon` library.
+
+ ## 1.1.1
+ - Adds support for Azure US Government cloud and Microsoft Azure operated by 21Vianet in China.
+
+ ## 1.1.0
+ - Allows setting different proxy values for API connections.
+ - Upgrades version for logs ingestion API to 2023-01-01.
+ - Renames the plugin to microsoft-sentinel-log-analytics-logstash-output-plugin.
+
+ ## 1.0.0
+ - The initial release for the Logstash output plugin for Microsoft Sentinel. This plugin uses Data Collection Rules (DCRs) with Azure Monitor's Logs Ingestion API.
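The 1.1.4 entry caps `excon` below 1.0.0. A sketch of how such a version bound behaves, using RubyGems' own requirement matching (the lower bound `>= 0.60` here is an illustrative assumption, not read from the gemspec):

```ruby
require 'rubygems'

# Hypothetical dependency bound mirroring the 1.1.4 changelog entry:
# 0.x releases of excon are acceptable, 1.0.0 and later are excluded.
requirement = Gem::Requirement.new('>= 0.60', '< 1.0.0')

puts requirement.satisfied_by?(Gem::Version.new('0.110.0'))  # true
puts requirement.satisfied_by?(Gem::Version.new('1.0.0'))    # false
```

Pinning below a major version like this is a common way to avoid a breaking behavior change (here, proxy ports being dropped) until the plugin is adapted to the new release.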
data/Gemfile CHANGED
@@ -1,2 +1,2 @@
- source 'https://rubygems.org'
- gemspec
+ source 'https://rubygems.org'
+ gemspec
data/LICENSE CHANGED
@@ -1,21 +1,21 @@
- MIT License
-
- Copyright (c) Microsoft Corporation. All rights reserved.
-
- Permission is hereby granted, free of charge, to any person obtaining a copy
- of this software and associated documentation files (the "Software"), to deal
- in the Software without restriction, including without limitation the rights
- to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
- copies of the Software, and to permit persons to whom the Software is
- furnished to do so, subject to the following conditions:
-
- The above copyright notice and this permission notice shall be included in all
- copies or substantial portions of the Software.
-
- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
- IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
- FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
- AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
- LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
- SOFTWARE
+ MIT License
+
+ Copyright (c) Microsoft Corporation. All rights reserved.
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE
data/README.md CHANGED
@@ -1,236 +1,258 @@
- # Microsoft Sentinel output plugin for Logstash
-
- Microsoft Sentinel provides a new output plugin for Logstash. Use this output plugin to send any log via Logstash to the Microsoft Sentinel/Log Analytics workspace. This is done with the Log Analytics DCR-based API.
- You may send logs to custom or standard tables.
-
- Plugin version: v1.1.0
- Released on: 2023-07-23
-
- This plugin is currently in development and is free to use. We welcome contributions from the open source community on this project, and we request and appreciate feedback from users.
-
-
- ## Steps to implement the output plugin
- 1) Install the plugin
- 2) Create a sample file
- 3) Create the required DCR-related resources
- 4) Configure Logstash configuration file
- 5) Basic logs transmission
-
-
- ## 1. Install the plugin
-
- Microsoft Sentinel provides Logstash output plugin to Log analytics workspace using DCR based logs API.
- Install the microsoft-sentinel-log-analytics-logstash-output-plugin, use [Logstash Offline Plugin Management instruction](<https://www.elastic.co/guide/en/logstash/current/offline-plugins.html>).
-
- Microsoft Sentinel's Logstash output plugin supports the following versions
- - 7.0 - 7.17.13
- - 8.0 - 8.9
- - 8.11
-
- Please note that when using Logstash 8, it is recommended to disable ECS in the pipeline. For more information refer to [Logstash documentation.](<https://www.elastic.co/guide/en/logstash/8.4/ecs-ls.html>)
-
-
- ## 2. Create a sample file
- To create a sample file, follow the following steps:
- 1) Copy the output plugin configuration below to your Logstash configuration file:
- ```
- output {
- microsoft-sentinel-log-analytics-logstash-output-plugin {
- create_sample_file => true
- sample_file_path => "<enter the path to the file in which the sample data will be written>" #for example: "c:\\temp" (for windows) or "/var/log" for Linux.
- }
- }
- ```
- Note: make sure that the path exists before creating the sample file.
- 2) Start Logstash. The plugin will collect up to 10 records to a sample.
- 3) The file named "sampleFile<epoch seconds>.json" in the configured path will be created once there are 10 events to sample or when the Logstash process exited gracefully. (for example: "c:\temp\sampleFile1648453501.json").
-
-
- ### Configurations:
- The following parameters are optional and should be used to create a sample file.
- - **create_sample_file** - Boolean, False by default. When enabled, up to 10 events will be written to a sample json file.
- - **sample_file_path** - Number, Empty by default. Required when create_sample_file is enabled. Should include a valid path in which to place the sample file generated.
-
- ### Complete example
- 1. set the pipeline.conf with the following configuration:
- ```
- input {
- generator {
- lines => [ "This is a test log message"]
- count => 10
- }
- }
-
- output {
- microsoft-sentinel-log-analytics-logstash-output-plugin {
- create_sample_file => true
- sample_file_path => "<enter the path to the file in which the sample data will be written>" #for example: "c:\\temp" (for windows) or "/var/log" for Linux.
- }
- }
- ```
-
- 2. the following sample file will be generated:
- ```
- [
- {
- "host": "logstashMachine",
- "sequence": 0,
- "message": "This is a test log message",
- "ls_timestamp": "2022-10-29T13:19:28.116Z",
- "ls_version": "1"
- },
- ...
- ]
- ```
-
- ## 3. Create the required DCR-related resources
- To configure Microsoft Sentinel Logstash plugin you first need to create the DCR-related resources. To create these resources, follow one of the following tutorials:
- 1) To ingest the data to a custom table use [Tutorial - Send custom logs to Azure Monitor Logs (preview) - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs>) tutorial. Note that as part of creating the table and the DCR you will need to provide the sample file that you've created in the previous section.
- 2) To ingest the data to a standard table like Syslog or CommonSecurityLog use [Tutorial - Send custom logs to Azure Monitor Logs using resource manager templates - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs-api>).
-
-
- ## 4. Configure Logstash configuration file
-
- Use the tutorial from the previous section to retrieve the following attributes:
- - **client_app_Id** - String, The 'Application (client) ID' value created in step #3 of the "Configure Application" section of the tutorial you used in the previous step.
- - **client_app_secret** -String, The value of the client secret created in step #5 of the "Configure Application" section of the tutorial you used in the previous step.
- - **tenant_id** - String, Your subscription's tenant id. You can find in the following path: Home -> Azure Active Directory -> Overview Under 'Basic Information'.
- - **data_collection_endpoint** - String, - The value of the logsIngestion URI (see step #3 of the "Create data collection endpoint" section in Tutorial [Tutorial - Send custom logs to Azure Monitor Logs using resource manager templates - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs-api#create-data-collection-endpoint>).
- - **dcr_immutable_id** - String, The value of the DCR immutableId (see the "Collect information from DCR" section in [Tutorial - Send custom logs to Azure Monitor Logs (preview) - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs#collect-information-from-dcr>).
- - **dcr_stream_name** - String, The name of the data stream (Go to the json view of the DCR as explained in the "Collect information from DCR" section in [Tutorial - Send custom logs to Azure Monitor Logs (preview) - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs#collect-information-from-dcr>) and copy the value of the "dataFlows -> streams" property (see circled in red in the below example).
-
- After retrieving the required values replace the output section of the Logstash configuration file created in the previous steps with the example below. Then, replace the strings in the brackets below with the corresponding values. Make sure you change the "create_sample_file" attribute to false.
-
- Here is an example for the output plugin configuration section:
- ```
- output {
- microsoft-sentinel-log-analytics-logstash-output-plugin {
- client_app_Id => "<enter your client_app_id value here>"
- client_app_secret => "<enter your client_app_secret value here>"
- tenant_id => "<enter your tenant id here>"
- data_collection_endpoint => "<enter your DCE logsIngestion URI here>"
- dcr_immutable_id => "<enter your DCR immutableId here>"
- dcr_stream_name => "<enter your stream name here>"
- create_sample_file=> false
- sample_file_path => "c:\\temp"
- }
- }
-
- ```
- ### Optional configuration
- - **key_names** – Array of strings, if you wish to send a subset of the columns to Log Analytics.
- - **plugin_flush_interval** Number, 5 by default. Defines the maximal time difference (in seconds) between sending two messages to Log Analytics.
- - **retransmission_time** - Number, 10 by default. This will set the amount of time in seconds given for retransmitting messages once sending has failed.
- - **compress_data** - Boolean, false by default. When this field is true, the event data is compressed before using the API. Recommended for high throughput pipelines
- - **proxy** - String, Empty by default. Specify which proxy URL to use for API calls for all of the communications with Azure.
- - **proxy_aad** - String, Empty by default. Specify which proxy URL to use for API calls for the Azure Active Directory service. Overrides the proxy setting.
- - **proxy_endpoint** - String, Empty by default. Specify which proxy URL to use when sending log data to the endpoint. Overrides the proxy setting.
- - **azure_cloud** - String, Empty by default. Used to specify the name of the Azure cloud that is being used, AzureCloud is set as default. Available values are: AzureCloud, AzureChinaCloud and AzureUSGovernment.
-
- #### Note: When setting an empty string as a value for a proxy setting, it will unset any system wide proxy setting.
-
- Security notice: We recommend not to implicitly state client_app_Id, client_app_secret, tenant_id, data_collection_endpoint, and dcr_immutable_id in your Logstash configuration for security reasons.
- It is best to store this sensitive information in a Logstash KeyStore as described here- ['Secrets Keystore'](<https://www.elastic.co/guide/en/logstash/current/keystore.html>)
-
-
- ## 5. Basic logs transmission
-
- Here is an example configuration that parses Syslog incoming data into a custom stream named "Custom-MyTableRawData".
-
- ### Example Configuration
-
- - Using filebeat input pipe
-
- ```
- input {
- beats {
- port => "5044"
- }
- }
- filter {
- }
- output {
- microsoft-sentinel-log-analytics-logstash-output-plugin {
- client_app_Id => "619c1731-15ca-4403-9c61-xxxxxxxxxxxx"
- client_app_secret => "xxxxxxxxxxxxxxxx"
- tenant_id => "72f988bf-86f1-41af-91ab-xxxxxxxxxxxx"
- data_collection_endpoint => "https://my-customlogsv2-test-jz2a.eastus2-1.ingest.monitor.azure.com"
- dcr_immutable_id => "dcr-xxxxxxxxxxxxxxxxac23b8978251433a"
- dcr_stream_name => "Custom-MyTableRawData"
- proxy_aad => "http://proxy.example.com"
- }
- }
-
- ```
- - Or using the tcp input pipe
-
- ```
- input {
- tcp {
- port => "514"
- type => syslog #optional, will effect log type in table
- }
- }
- filter {
- }
- output {
- microsoft-sentinel-log-analytics-logstash-output-plugin {
- client_app_Id => "619c1731-15ca-4403-9c61-xxxxxxxxxxxx"
- client_app_secret => "xxxxxxxxxxxxxxxx"
- tenant_id => "72f988bf-86f1-41af-91ab-xxxxxxxxxxxx"
- data_collection_endpoint => "https://my-customlogsv2-test-jz2a.eastus2-1.ingest.monitor.azure.com"
- dcr_immutable_id => "dcr-xxxxxxxxxxxxxxxxac23b8978251433a"
- dcr_stream_name => "Custom-MyTableRawData"
- }
- }
- ```
-
- <u>Advanced Configuration</u>
- ```
- input {
- syslog {
- port => 514
- }
- }
-
- output {
- microsoft-sentinel-log-analytics-logstash-output-plugin {
- client_app_Id => "${CLIENT_APP_ID}"
- client_app_secret => "${CLIENT_APP_SECRET}"
- tenant_id => "${TENANT_ID}"
- data_collection_endpoint => "${DATA_COLLECTION_ENDPOINT}"
- dcr_immutable_id => "${DCR_IMMUTABLE_ID}"
- dcr_stream_name => "Custom-MyTableRawData"
- key_names => ['PRI','TIME_TAG','HOSTNAME','MSG']
- }
- }
-
- ```
-
- Now you are able to run logstash with the example configuration and send mock data using the 'logger' command.
-
- For example:
- ```
- logger -p local4.warn --rfc3164 --tcp -t CEF "0|Microsoft|Device|cef-test|example|data|1|here is some more data for the example" -P 514 -d -n 127.0.0.1
- ```
-
- Which will produce this content in the sample file:
-
- ```
- [
- {
- "logsource": "logstashMachine",
- "facility": 20,
- "severity_label": "Warning",
- "severity": 4,
- "timestamp": "Apr 7 08:26:04",
- "program": "CEF:",
- "host": "127.0.0.1",
- "facility_label": "local4",
- "priority": 164,
- "message": "0|Microsoft|Device|cef-test|example|data|1|here is some more data for the example",
- "ls_timestamp": "2022-04-07T08:26:04.000Z",
- "ls_version": "1"
- }
- ]
- ```
+ # Microsoft Sentinel output plugin for Logstash
+
+ Microsoft Sentinel provides a new output plugin for Logstash. Use this output plugin to send any log via Logstash to the Microsoft Sentinel/Log Analytics workspace. This is done with the Log Analytics DCR-based API.
+ You may send logs to custom or standard tables.
+
+ Plugin version: v1.1.3
+ Released on: 2024-10-10
+
+ This plugin is currently in development and is free to use. We welcome contributions from the open source community on this project, and we request and appreciate feedback from users.
+
+ ## Installation Instructions
+ 1) Install the plugin
+ 2) Create a sample file
+ 3) Create the required DCR-related resources
+ 4) Configure Logstash configuration file
+ 5) Basic logs transmission
+
+
+ ## 1. Install the plugin
+
+ Microsoft Sentinel provides Logstash output plugin to Log analytics workspace using DCR based logs API.
+
+ The plugin is published on [RubyGems](https://rubygems.org/gems/microsoft-sentinel-log-analytics-logstash-output-plugin). To install to an existing logstash installation, run `logstash-plugin install microsoft-sentinel-log-analytics-logstash-output-plugin`.
+
+ If you do not have a direct internet connection, you can install the plugin to another logstash installation, and then export and import a plugin bundle to the offline host. For more information, see [Logstash Offline Plugin Management instruction](<https://www.elastic.co/guide/en/logstash/current/offline-plugins.html>).
+
+ Microsoft Sentinel's Logstash output plugin supports the following versions
+ - 7.0 - 7.17.13
+ - 8.0 - 8.9
+ - 8.11 - 8.15
+
+ Please note that when using Logstash 8, it is recommended to disable ECS in the pipeline. For more information refer to [Logstash documentation.](<https://www.elastic.co/guide/en/logstash/8.4/ecs-ls.html>)
+
+
+ ## 2. Create a sample file
+ To create a sample file, follow the following steps:
+ 1) Copy the output plugin configuration below to your Logstash configuration file:
+ ```
+ output {
+ microsoft-sentinel-log-analytics-logstash-output-plugin {
+ create_sample_file => true
+ sample_file_path => "<enter the path to the file in which the sample data will be written>" #for example: "c:\\temp" (for windows) or "/var/log" for Linux.
+ }
+ }
+ ```
+ Note: make sure that the path exists before creating the sample file.
+ 2) Start Logstash. The plugin will collect up to 10 records to a sample.
+ 3) The file named "sampleFile<epoch seconds>.json" in the configured path will be created once there are 10 events to sample or when the Logstash process exited gracefully. (for example: "c:\temp\sampleFile1648453501.json").
+
+
+ ### Configurations:
+ The following parameters are optional and should be used to create a sample file.
+ - **create_sample_file** - Boolean, False by default. When enabled, up to 10 events will be written to a sample json file.
+ - **sample_file_path** - Number, Empty by default. Required when create_sample_file is enabled. Should include a valid path in which to place the sample file generated.
+
+ ### Complete example
+ 1. set the pipeline.conf with the following configuration:
+ ```
+ input {
+ generator {
+ lines => [ "This is a test log message"]
+ count => 10
+ }
+ }
+
+ output {
+ microsoft-sentinel-log-analytics-logstash-output-plugin {
+ create_sample_file => true
+ sample_file_path => "<enter the path to the file in which the sample data will be written>" #for example: "c:\\temp" (for windows) or "/var/log" for Linux.
+ }
+ }
+ ```
+
+ 2. the following sample file will be generated:
+ ```
+ [
+ {
+ "host": "logstashMachine",
+ "sequence": 0,
+ "message": "This is a test log message",
+ "ls_timestamp": "2022-10-29T13:19:28.116Z",
+ "ls_version": "1"
+ },
+ ...
+ ]
+ ```
+
+ ## 3. Create the required DCR-related resources
+ To configure Microsoft Sentinel Logstash plugin you first need to create the DCR-related resources. To create these resources, follow one of the following tutorials:
+ 1) To ingest the data to a custom table use [Tutorial - Send custom logs to Azure Monitor Logs (preview) - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs>) tutorial. Note that as part of creating the table and the DCR you will need to provide the sample file that you've created in the previous section.
+ 2) To ingest the data to a standard table like Syslog or CommonSecurityLog use [Tutorial - Send custom logs to Azure Monitor Logs using resource manager templates - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs-api>).
+
+
+ ## 4. Configure Logstash configuration file
+
+ Use the tutorial from the previous section to retrieve the following attributes:
+ - **client_app_Id** - String, The 'Application (client) ID' value created in step #3 of the "Configure Application" section of the tutorial you used in the previous step.
+ - **client_app_secret** -String, The value of the client secret created in step #5 of the "Configure Application" section of the tutorial you used in the previous step.
+ - **tenant_id** - String, Your subscription's tenant id. You can find in the following path: Home -> Azure Active Directory -> Overview Under 'Basic Information'.
+ - **data_collection_endpoint** - String, - The value of the logsIngestion URI (see step #3 of the "Create data collection endpoint" section in Tutorial [Tutorial - Send custom logs to Azure Monitor Logs using resource manager templates - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs-api#create-data-collection-endpoint>).
+ - **dcr_immutable_id** - String, The value of the DCR immutableId (see the "Collect information from DCR" section in [Tutorial - Send custom logs to Azure Monitor Logs (preview) - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs#collect-information-from-dcr>).
+ - **dcr_stream_name** - String, The name of the data stream (Go to the json view of the DCR as explained in the "Collect information from DCR" section in [Tutorial - Send custom logs to Azure Monitor Logs (preview) - Azure Monitor | Microsoft Docs](<https://docs.microsoft.com/azure/azure-monitor/logs/tutorial-custom-logs#collect-information-from-dcr>) and copy the value of the "dataFlows -> streams" property (see circled in red in the below example).
+
+ After retrieving the required values replace the output section of the Logstash configuration file created in the previous steps with the example below. Then, replace the strings in the brackets below with the corresponding values. Make sure you change the "create_sample_file" attribute to false.
+
+ Here is an example for the output plugin configuration section:
+ ```
+ output {
+ microsoft-sentinel-log-analytics-logstash-output-plugin {
+ client_app_Id => "<enter your client_app_id value here>"
+ client_app_secret => "<enter your client_app_secret value here>"
+ tenant_id => "<enter your tenant id here>"
+ data_collection_endpoint => "<enter your DCE logsIngestion URI here>"
+ dcr_immutable_id => "<enter your DCR immutableId here>"
+ dcr_stream_name => "<enter your stream name here>"
+ create_sample_file=> false
+ sample_file_path => "c:\\temp"
+ }
+ }
+
+ ```
+ ### Optional configuration
+ - **key_names** Array of strings, if you wish to send a subset of the columns to Log Analytics.
+ - **plugin_flush_interval** Number, 5 by default. Defines the maximal time difference (in seconds) between sending two messages to Log Analytics.
+ - **retransmission_time** - Number, 10 by default. This will set the amount of time in seconds given for retransmitting messages once sending has failed.
+ - **compress_data** - Boolean, false by default. When this field is true, the event data is compressed before using the API. Recommended for high throughput pipelines
+ - **proxy** - String, Empty by default. Specify which proxy URL to use for API calls for all of the communications with Azure.
+ - **proxy_aad** - String, Empty by default. Specify which proxy URL to use for API calls for the Azure Active Directory service. Overrides the proxy setting.
+ - **proxy_endpoint** - String, Empty by default. Specify which proxy URL to use when sending log data to the endpoint. Overrides the proxy setting.
+ - **azure_cloud** - String, Empty by default. Used to specify the name of the Azure cloud that is being used, AzureCloud is set as default. Available values are: AzureCloud, AzureChinaCloud and AzureUSGovernment.
+
+ #### Note: When setting an empty string as a value for a proxy setting, it will unset any system wide proxy setting.
+
+ Security notice: We recommend not to implicitly state client_app_Id, client_app_secret, tenant_id, data_collection_endpoint, and dcr_immutable_id in your Logstash configuration for security reasons.
+ It is best to store this sensitive information in a Logstash KeyStore as described here- ['Secrets Keystore'](<https://www.elastic.co/guide/en/logstash/current/keystore.html>)
+
+
+ ## 5. Basic logs transmission
+
+ Here is an example configuration that parses Syslog incoming data into a custom stream named "Custom-MyTableRawData".
+
+ ### Example Configuration
+
+ - Using filebeat input pipe
+
+ ```
+ input {
+ beats {
+ port => "5044"
+ }
+ }
+ filter {
+ }
+ output {
+ microsoft-sentinel-log-analytics-logstash-output-plugin {
+ client_app_Id => "619c1731-15ca-4403-9c61-xxxxxxxxxxxx"
+ client_app_secret => "xxxxxxxxxxxxxxxx"
+ tenant_id => "72f988bf-86f1-41af-91ab-xxxxxxxxxxxx"
+ data_collection_endpoint => "https://my-customlogsv2-test-jz2a.eastus2-1.ingest.monitor.azure.com"
+ dcr_immutable_id => "dcr-xxxxxxxxxxxxxxxxac23b8978251433a"
+ dcr_stream_name => "Custom-MyTableRawData"
+ proxy_aad => "http://proxy.example.com"
+ }
+ }
+
+ ```
+ - Or using the tcp input pipe
+
+ ```
+ input {
+ tcp {
+ port => "514"
+ type => syslog #optional, will effect log type in table
+ }
+ }
+ filter {
+ }
+ output {
+ microsoft-sentinel-log-analytics-logstash-output-plugin {
+ client_app_Id => "619c1731-15ca-4403-9c61-xxxxxxxxxxxx"
+ client_app_secret => "xxxxxxxxxxxxxxxx"
+ tenant_id => "72f988bf-86f1-41af-91ab-xxxxxxxxxxxx"
+ data_collection_endpoint => "https://my-customlogsv2-test-jz2a.eastus2-1.ingest.monitor.azure.com"
+ dcr_immutable_id => "dcr-xxxxxxxxxxxxxxxxac23b8978251433a"
+ dcr_stream_name => "Custom-MyTableRawData"
+ }
+ }
+ ```
+
+ <u>Advanced Configuration</u>
+ ```
+ input {
+ syslog {
+ port => 514
+ }
+ }
+
+ output {
+ microsoft-sentinel-log-analytics-logstash-output-plugin {
+ client_app_Id => "${CLIENT_APP_ID}"
+ client_app_secret => "${CLIENT_APP_SECRET}"
+ tenant_id => "${TENANT_ID}"
+ data_collection_endpoint => "${DATA_COLLECTION_ENDPOINT}"
+ dcr_immutable_id => "${DCR_IMMUTABLE_ID}"
+ dcr_stream_name => "Custom-MyTableRawData"
+ key_names => ['PRI','TIME_TAG','HOSTNAME','MSG']
+ }
+ }
+
+ ```
+
+ Now you are able to run logstash with the example configuration and send mock data using the 'logger' command.
+
+ For example:
+ ```
+ logger -p local4.warn --rfc3164 --tcp -t CEF "0|Microsoft|Device|cef-test|example|data|1|here is some more data for the example" -P 514 -d -n 127.0.0.1
+ ```
+
+ Which will produce this content in the sample file:
+
+ ```
+ [
+ {
+ "logsource": "logstashMachine",
+ "facility": 20,
+ "severity_label": "Warning",
+ "severity": 4,
+ "timestamp": "Apr 7 08:26:04",
+ "program": "CEF:",
+ "host": "127.0.0.1",
+ "facility_label": "local4",
+ "priority": 164,
+ "message": "0|Microsoft|Device|cef-test|example|data|1|here is some more data for the example",
+ "ls_timestamp": "2022-04-07T08:26:04.000Z",
+ "ls_version": "1"
+ }
+ ]
+ ```
+
+
+ ## Known issues
+
+ When using Logstash installed on a Docker image of Lite Ubuntu, the following warning may appear:
+
+ ```
+ java.lang.RuntimeException: getprotobyname_r failed
+ ```
+
+ To resolve it, use the following commands to install the *netbase* package within your Dockerfile:
+ ```bash
+ USER root
+ RUN apt install netbase -y
+ ```
+ For more information, see [JNR regression in Logstash 7.17.0 (Docker)](https://github.com/elastic/logstash/issues/13703).
+
+ If your environment's event rate is low considering the number of allocated Logstash workers, we recommend increasing the value of *plugin_flush_interval* to 60 or more. This change will allow each worker to batch more events before uploading to the Data Collection Endpoint (DCE). You can monitor the ingestion payload using [DCR metrics](https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-monitor#dcr-metrics).
+ For more information on *plugin_flush_interval*, see the [Optional Configuration table](https://learn.microsoft.com/azure/sentinel/connect-logstash-data-connection-rules#optional-configuration) mentioned earlier.
+
@@ -1,116 +1,116 @@
- # encoding: utf-8
- require "logstash/outputs/base"
- require "logstash/namespace"
- require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
- require "logstash/sentinel_la/sampleFileCreator"
- require "logstash/sentinel_la/logsSender"
-
-
- class LogStash::Outputs::MicrosoftSentinelOutput < LogStash::Outputs::Base
-
- config_name "microsoft-sentinel-log-analytics-logstash-output-plugin"
-
- # Stating that the output plugin will run in concurrent mode
- concurrency :shared
-
- # Your registered app ID
- config :client_app_Id, :validate => :string
-
- # The registered app's secret, required by Azure Loganalytics REST API
- config :client_app_secret, :validate => :string
-
- # Your Operations Management Suite Tenant ID
- config :tenant_id, :validate => :string
-
- # Your data collection rule endpoint
- config :data_collection_endpoint, :validate => :string
-
- # Your data collection rule ID
- config :dcr_immutable_id, :validate => :string
-
- # Your dcr data stream name
- config :dcr_stream_name, :validate => :string
-
- # Subset of keys to send to the Azure Loganalytics workspace
- config :key_names, :validate => :array, :default => []
-
- # Max number of seconds to wait between flushes. Default 5
- config :plugin_flush_interval, :validate => :number, :default => 5
-
- # Factor for adding to the amount of messages sent
- config :decrease_factor, :validate => :number, :default => 100
-
- # This will trigger message amount resizing in a REST request to LA
- config :amount_resizing, :validate => :boolean, :default => true
-
- # Setting the default amount of messages sent
- # it this is set with amount_resizing=false --> each message will have max_items
- config :max_items, :validate => :number, :default => 2000
-
- # Setting default proxy to be used for all communication with azure
- config :proxy, :validate => :string
-
- # Setting proxy_aad to be used for communicating with azure active directory service
- config :proxy_aad, :validate => :string
-
- # Setting proxy to be used for the LogAnalytics endpoint REST client
- config :proxy_endpoint, :validate => :string
-
- # This will set the amount of time given for retransmitting messages once sending is failed
- config :retransmission_time, :validate => :number, :default => 10
-
- # Compress the message body before sending to LA
- config :compress_data, :validate => :boolean, :default => false
-
- # Generate sample file from incoming events
66
- config :create_sample_file, :validate => :boolean, :default => false
67
-
68
- # Path where to place the sample file created
69
- config :sample_file_path, :validate => :string
70
-
71
- # Used to specify the name of the Azure cloud that is being used. By default, the value is set to "AzureCloud", which
72
- # is the public Azure cloud. However, you can specify a different Azure cloud if you are
73
- # using a different environment, such as Azure Government or Azure China.
74
- config :azure_cloud, :validate => :string
75
-
76
- public
77
- def register
78
- @logstash_configuration= build_logstash_configuration()
79
-
80
- # Validate configuration correctness
81
- @logstash_configuration.validate_configuration()
82
-
83
- @events_handler = @logstash_configuration.create_sample_file ?
84
- LogStash::Outputs::MicrosoftSentinelOutputInternal::SampleFileCreator::new(@logstash_configuration) :
85
- LogStash::Outputs::MicrosoftSentinelOutputInternal::LogsSender::new(@logstash_configuration)
86
- end # def register
87
-
88
- def multi_receive(events)
89
- @events_handler.handle_events(events)
90
- end # def multi_receive
91
-
92
- def close
93
- @events_handler.close
94
- end
95
-
96
- #private
97
- private
98
-
99
- # Building the logstash object configuration from the output configuration provided by the user
100
- # Return LogstashLoganalyticsOutputConfiguration populated with the configuration values
101
- def build_logstash_configuration()
102
- logstash_configuration= LogStash::Outputs::MicrosoftSentinelOutputInternal::LogstashLoganalyticsOutputConfiguration::new(@client_app_Id, @client_app_secret, @tenant_id, @data_collection_endpoint, @dcr_immutable_id, @dcr_stream_name, @compress_data, @create_sample_file, @sample_file_path, @logger)
103
- logstash_configuration.key_names = @key_names
104
- logstash_configuration.plugin_flush_interval = @plugin_flush_interval
105
- logstash_configuration.decrease_factor = @decrease_factor
106
- logstash_configuration.amount_resizing = @amount_resizing
107
- logstash_configuration.max_items = @max_items
108
- logstash_configuration.proxy_aad = @proxy_aad || @proxy || ENV['http_proxy']
109
- logstash_configuration.proxy_endpoint = @proxy_endpoint || @proxy || ENV['http_proxy']
110
- logstash_configuration.retransmission_time = @retransmission_time
111
- logstash_configuration.azure_cloud = @azure_cloud || "AzureCloud"
112
-
113
- return logstash_configuration
114
- end # def build_logstash_configuration
115
-
116
- end # class LogStash::Outputs::MicrosoftSentinelOutput
1
+ # encoding: utf-8
2
+ require "logstash/outputs/base"
3
+ require "logstash/namespace"
4
+ require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
5
+ require "logstash/sentinel_la/sampleFileCreator"
6
+ require "logstash/sentinel_la/logsSender"
7
+
8
+
9
+ class LogStash::Outputs::MicrosoftSentinelOutput < LogStash::Outputs::Base
10
+
11
+ config_name "microsoft-sentinel-log-analytics-logstash-output-plugin"
12
+
13
+ # Stating that the output plugin will run in concurrent mode
14
+ concurrency :shared
15
+
16
+ # Your registered app ID
17
+ config :client_app_Id, :validate => :string
18
+
19
+ # The registered app's secret, required by Azure Loganalytics REST API
20
+ config :client_app_secret, :validate => :string
21
+
22
+ # Your Operations Management Suite Tenant ID
23
+ config :tenant_id, :validate => :string
24
+
25
+ # Your data collection rule endpoint
26
+ config :data_collection_endpoint, :validate => :string
27
+
28
+ # Your data collection rule ID
29
+ config :dcr_immutable_id, :validate => :string
30
+
31
+ # Your dcr data stream name
32
+ config :dcr_stream_name, :validate => :string
33
+
34
+ # Subset of keys to send to the Azure Loganalytics workspace
35
+ config :key_names, :validate => :array, :default => []
36
+
37
+ # Max number of seconds to wait between flushes. Default 5
38
+ config :plugin_flush_interval, :validate => :number, :default => 5
39
+
40
+ # Factor for adding to the amount of messages sent
41
+ config :decrease_factor, :validate => :number, :default => 100
42
+
43
+ # This will trigger message amount resizing in a REST request to LA
44
+ config :amount_resizing, :validate => :boolean, :default => true
45
+
46
+ # Setting the default amount of messages sent
47
+ # if this is set with amount_resizing=false, each request will contain at most max_items messages
48
+ config :max_items, :validate => :number, :default => 2000
49
+
50
+ # Setting default proxy to be used for all communication with azure
51
+ config :proxy, :validate => :string
52
+
53
+ # Setting proxy_aad to be used for communicating with azure active directory service
54
+ config :proxy_aad, :validate => :string
55
+
56
+ # Setting proxy to be used for the LogAnalytics endpoint REST client
57
+ config :proxy_endpoint, :validate => :string
58
+
59
+ # Sets the amount of time allotted for retransmitting messages after a failed send
60
+ config :retransmission_time, :validate => :number, :default => 10
61
+
62
+ # Compress the message body before sending to LA
63
+ config :compress_data, :validate => :boolean, :default => false
64
+
65
+ # Generate sample file from incoming events
66
+ config :create_sample_file, :validate => :boolean, :default => false
67
+
68
+ # Path where to place the sample file created
69
+ config :sample_file_path, :validate => :string
70
+
71
+ # Used to specify the name of the Azure cloud that is being used. By default, the value is set to "AzureCloud", which
72
+ # is the public Azure cloud. However, you can specify a different Azure cloud if you are
73
+ # using a different environment, such as Azure Government or Azure China.
74
+ config :azure_cloud, :validate => :string
75
+
76
+ public
77
+ def register
78
+ @logstash_configuration= build_logstash_configuration()
79
+
80
+ # Validate configuration correctness
81
+ @logstash_configuration.validate_configuration()
82
+
83
+ @events_handler = @logstash_configuration.create_sample_file ?
84
+ LogStash::Outputs::MicrosoftSentinelOutputInternal::SampleFileCreator::new(@logstash_configuration) :
85
+ LogStash::Outputs::MicrosoftSentinelOutputInternal::LogsSender::new(@logstash_configuration)
86
+ end # def register
87
+
88
+ def multi_receive(events)
89
+ @events_handler.handle_events(events)
90
+ end # def multi_receive
91
+
92
+ def close
93
+ @events_handler.close
94
+ end
95
+
96
+ #private
97
+ private
98
+
99
+ # Building the logstash object configuration from the output configuration provided by the user
100
+ # Return LogstashLoganalyticsOutputConfiguration populated with the configuration values
101
+ def build_logstash_configuration()
102
+ logstash_configuration= LogStash::Outputs::MicrosoftSentinelOutputInternal::LogstashLoganalyticsOutputConfiguration::new(@client_app_Id, @client_app_secret, @tenant_id, @data_collection_endpoint, @dcr_immutable_id, @dcr_stream_name, @compress_data, @create_sample_file, @sample_file_path, @logger)
103
+ logstash_configuration.key_names = @key_names
104
+ logstash_configuration.plugin_flush_interval = @plugin_flush_interval
105
+ logstash_configuration.decrease_factor = @decrease_factor
106
+ logstash_configuration.amount_resizing = @amount_resizing
107
+ logstash_configuration.max_items = @max_items
108
+ logstash_configuration.proxy_aad = @proxy_aad || @proxy || ENV['http_proxy']
109
+ logstash_configuration.proxy_endpoint = @proxy_endpoint || @proxy || ENV['http_proxy']
110
+ logstash_configuration.retransmission_time = @retransmission_time
111
+ logstash_configuration.azure_cloud = @azure_cloud || "AzureCloud"
112
+
113
+ return logstash_configuration
114
+ end # def build_logstash_configuration
115
+
116
+ end # class LogStash::Outputs::MicrosoftSentinelOutput
@@ -1,10 +1,10 @@
1
1
  # encoding: utf-8
2
2
  require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
3
- require 'rest-client'
4
3
  require 'json'
5
4
  require 'openssl'
6
5
  require 'base64'
7
6
  require 'time'
7
+ require 'excon'
8
8
 
9
9
  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
10
10
  class LogAnalyticsAadTokenProvider
@@ -64,14 +64,13 @@ class LogAnalyticsAadTokenProvider
64
64
  while true
65
65
  begin
66
66
  # Post REST request
67
- response = RestClient::Request.execute(method: :post, url: @token_request_uri, payload: @token_request_body, headers: headers,
68
- proxy: @logstashLoganalyticsConfiguration.proxy_aad)
69
-
70
- if (response.code == 200 || response.code == 201)
67
+ response = Excon.post(@token_request_uri, :body => @token_request_body, :headers => headers, :proxy => @logstashLoganalyticsConfiguration.proxy_aad, expects: [200, 201])
68
+
69
+ if (response.status == 200 || response.status == 201)
71
70
  return JSON.parse(response.body)
72
71
  end
73
- rescue RestClient::ExceptionWithResponse => ewr
74
- @logger.error("Exception while authenticating with AAD API ['#{ewr.response}']")
72
+ rescue Excon::Error::HTTPStatus => ex
73
+ @logger.error("Error while authenticating with AAD [#{ex.class}: '#{ex.response.status}', Response: '#{ex.response.body}']")
75
74
  rescue Exception => ex
76
75
  @logger.trace("Exception while authenticating with AAD API ['#{ex}']")
77
76
  end
@@ -1,11 +1,11 @@
1
1
  # encoding: utf-8
2
2
  require "logstash/sentinel_la/version"
3
- require 'rest-client'
4
3
  require 'json'
5
4
  require 'openssl'
6
5
  require 'base64'
7
6
  require 'time'
8
7
  require 'rbconfig'
8
+ require 'excon'
9
9
 
10
10
  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
11
11
  class LogAnalyticsClient
@@ -22,28 +22,78 @@ require "logstash/sentinel_la/logAnalyticsAadTokenProvider"
22
22
  @uri = sprintf("%s/dataCollectionRules/%s/streams/%s?api-version=%s",@logstashLoganalyticsConfiguration.data_collection_endpoint, @logstashLoganalyticsConfiguration.dcr_immutable_id, logstashLoganalyticsConfiguration.dcr_stream_name, la_api_version)
23
23
  @aadTokenProvider=LogAnalyticsAadTokenProvider::new(logstashLoganalyticsConfiguration)
24
24
  @userAgent = getUserAgent()
25
+
26
+ # Auto close connection after 60 seconds of inactivity
27
+ @connectionAutoClose = {
28
+ :last_use => Time.now,
29
+ :lock => Mutex.new,
30
+ :max_idel_time => 60,
31
+ :is_closed => true
32
+ }
33
+
34
+ @timer = Thread.new do
35
+ loop do
36
+ sleep @connectionAutoClose[:max_idel_time] / 2
37
+ if is_connection_stale?
38
+ @connectionAutoClose[:lock].synchronize do
39
+ if is_connection_stale?
40
+ reset_connection
41
+ end
42
+ end
43
+ end
44
+ end
45
+ end
46
+
47
+
25
48
  end # def initialize
26
49
 
27
50
  # Post the given json to Azure Loganalytics
28
51
  def post_data(body)
29
52
  raise ConfigError, 'no json_records' if body.empty?
53
+ response = nil
54
+
55
+ @connectionAutoClose[:lock].synchronize do
56
+ #close connection if its stale
57
+ if is_connection_stale?
58
+ reset_connection
59
+ end
60
+ if @connectionAutoClose[:is_closed]
61
+ open_connection
62
+ end
63
+
64
+ headers = get_header()
65
+ # Post REST request
66
+ response = @connection.request(method: :post, body: body, headers: headers)
67
+ @connectionAutoClose[:is_closed] = false
68
+ @connectionAutoClose[:last_use] = Time.now
69
+ end
70
+ return response
30
71
 
31
- # Create REST request header
32
- headers = get_header()
33
-
34
- # Post REST request
35
-
36
- return RestClient::Request.execute(method: :post, url: @uri, payload: body, headers: headers,
37
- proxy: @logstashLoganalyticsConfiguration.proxy_endpoint, timeout: 240)
38
72
  end # def post_data
39
73
 
40
74
  # Static function to return if the response is OK or else
41
75
  def self.is_successfully_posted(response)
42
- return (response.code >= 200 && response.code < 300 ) ? true : false
76
+ return (response.status >= 200 && response.status < 300 ) ? true : false
43
77
  end # def self.is_successfully_posted
44
78
 
45
79
  private
46
80
 
81
+ def open_connection
82
+ @connection = Excon.new(@uri, :persistent => true, :proxy => @logstashLoganalyticsConfiguration.proxy_endpoint,
83
+ expects: [200, 201, 202, 204, 206, 207, 208, 226, 300, 301, 302, 303, 304, 305, 306, 307, 308],
84
+ read_timeout: 240, write_timeout: 240, connect_timeout: 240)
85
+ @logger.trace("Connection to Azure LogAnalytics was opened.");
86
+ end
87
+
88
+ def reset_connection
89
+ @connection.reset
90
+ @connectionAutoClose[:is_closed] = true
91
+ @logger.trace("Connection to Azure LogAnalytics was closed due to inactivity.");
92
+ end
93
+
94
+ def is_connection_stale?
95
+ return Time.now - @connectionAutoClose[:last_use] > @connectionAutoClose[:max_idel_time] && !@connectionAutoClose[:is_closed]
96
+ end
47
97
  # Create a header for the given length
48
98
  def get_header()
49
99
  # Getting an authorization token bearer (if the token is expired, the method will post a request to get a new authorization token)
@@ -2,7 +2,7 @@
2
2
 
3
3
  require "logstash/sentinel_la/logAnalyticsClient"
4
4
  require "logstash/sentinel_la/logstashLoganalyticsConfiguration"
5
-
5
+ require "excon"
6
6
  # LogStashAutoResizeBuffer class setting a resizable buffer which is flushed periodically
7
7
  # The buffer resize itself according to Azure Loganalytics and configuration limitations
8
8
  module LogStash; module Outputs; class MicrosoftSentinelOutputInternal
@@ -59,34 +59,35 @@ class LogStashEventsBatcher
59
59
  return
60
60
  else
61
61
  @logger.trace("Rest client response ['#{response}']")
62
- @logger.error("#{api_name} request failed. Error code: #{response.code} #{try_get_info_from_error_response(response)}")
62
+ @logger.error("#{api_name} request failed. Error code: #{response.status} #{try_get_info_from_error_response(response)}")
63
63
  end
64
- rescue RestClient::Exceptions::Timeout => eto
65
- @logger.trace("Timeout exception ['#{eto.display}'] when posting data to #{api_name}. Rest client response ['#{eto.response.display}']. [amount_of_documents=#{amount_of_documents}]")
66
- @logger.error("Timeout exception while posting data to #{api_name}. [Exception: '#{eto}'] [amount of documents=#{amount_of_documents}]'")
67
- force_retry = true
64
+ rescue Excon::Error::HTTPStatus => ewr
65
+ response = ewr.response
66
+ @logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
67
+ @logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr.class}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")
68
68
 
69
- rescue RestClient::ExceptionWithResponse => ewr
70
- response = ewr.response
71
- @logger.trace("Exception in posting data to #{api_name}. Rest client response ['#{ewr.response}']. [amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
72
- @logger.error("Exception when posting data to #{api_name}. [Exception: '#{ewr}'] #{try_get_info_from_error_response(ewr.response)} [amount of documents=#{amount_of_documents}]'")
69
+ if ewr.class == Excon::Error::BadRequest
70
+ @logger.info("Not trying to resend since exception http code is 400")
71
+ return
72
+ elsif ewr.class == Excon::Error::RequestTimeout
73
+ force_retry = true
74
+ elsif ewr.class == Excon::Error::TooManyRequests
75
+ # throttling detected, back off before resending
76
+ parsed_retry_after = response.data[:headers].include?('Retry-After') ? response.data[:headers]['Retry-After'].to_i : 0
77
+ seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30
73
78
 
74
- if ewr.http_code.to_f == 400
75
- @logger.info("Not trying to resend since exception http code is #{ewr.http_code}")
76
- return
77
- elsif ewr.http_code.to_f == 408
79
+ #force another retry even if the next iteration of the loop will be after the retransmission_timeout
80
+ force_retry = true
81
+ end
82
+ rescue Excon::Error::Socket => ex
83
+ @logger.trace("Exception: '[#{ex.class.name}] #{ex}' in posting data to #{api_name}. [amount_of_documents=#{amount_of_documents}]")
78
84
  force_retry = true
79
- elsif ewr.http_code.to_f == 429
80
- # thrutteling detected, backoff before resending
81
- parsed_retry_after = response.headers.include?(:retry_after) ? response.headers[:retry_after].to_i : 0
82
- seconds_to_sleep = parsed_retry_after > 0 ? parsed_retry_after : 30
83
-
84
- #force another retry even if the next iteration of the loop will be after the retransmission_timeout
85
+ rescue Excon::Error::Timeout => ex
86
+ @logger.trace("Exception: '[#{ex.class.name}] #{ex}' in posting data to #{api_name}. [amount_of_documents=#{amount_of_documents}]")
85
87
  force_retry = true
86
- end
87
- rescue Exception => ex
88
- @logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
89
- @logger.error("Exception in posting data to #{api_name}. [Exception: '#{ex}, amount of documents=#{amount_of_documents}]'")
88
+ rescue Exception => ex
89
+ @logger.trace("Exception in posting data to #{api_name}.[amount_of_documents=#{amount_of_documents} request payload=#{call_payload}]")
90
+ @logger.error("Exception in posting data to #{api_name}. [Exception: '[#{ex.class.name}]#{ex}, amount of documents=#{amount_of_documents}]'")
90
91
  end
91
92
  is_retry = true
92
93
  @logger.info("Retrying transmission to #{api_name} in #{seconds_to_sleep} seconds.")
@@ -110,8 +111,8 @@ class LogStashEventsBatcher
110
111
  def get_request_id_from_response(response)
111
112
  output =""
112
113
  begin
113
- if !response.nil? && response.headers.include?(:x_ms_request_id)
114
- output += response.headers[:x_ms_request_id]
114
+ if !response.nil? && response.data[:headers].include?("x-ms-request-id")
115
+ output += response.data[:headers]["x-ms-request-id"]
115
116
  end
116
117
  rescue Exception => ex
117
118
  @logger.debug("Error while getting request id from success response headers: #{ex.display}")
@@ -124,12 +125,13 @@ class LogStashEventsBatcher
124
125
  begin
125
126
  output = ""
126
127
  if !response.nil?
127
- if response.headers.include?(:x_ms_error_code)
128
- output += " [ms-error-code header: #{response.headers[:x_ms_error_code]}]"
128
+ if response.data[:headers].include?("x-ms-error-code")
129
+ output += " [ms-error-code header: #{response.data[:headers]["x-ms-error-code"]}]"
129
130
  end
130
- if response.headers.include?(:x_ms_request_id)
131
- output += " [x-ms-request-id header: #{response.headers[:x_ms_request_id]}]"
131
+ if response.data[:headers].include?("x-ms-request-id")
132
+ output += " [x-ms-request-id header: #{response.data[:headers]["x-ms-request-id"]}]"
132
133
  end
134
+ output += " [response body: #{response.data[:body]}]"
133
135
  end
134
136
  return output
135
137
  rescue Exception => ex
@@ -1,10 +1,10 @@
1
1
  module LogStash; module Outputs;
2
2
  class MicrosoftSentinelOutputInternal
3
- VERSION_INFO = [1, 1, 1].freeze
3
+ VERSION_INFO = [1, 1, 4].freeze
4
4
  VERSION = VERSION_INFO.map(&:to_s).join('.').freeze
5
5
 
6
6
  def self.version
7
7
  VERSION
8
8
  end
9
9
  end
10
- end;end
10
+ end;end
@@ -1,27 +1,27 @@
1
- require File.expand_path('../lib/logstash/sentinel_la/version', __FILE__)
2
-
3
- Gem::Specification.new do |s|
4
- s.name = 'microsoft-sentinel-log-analytics-logstash-output-plugin'
5
- s.version = LogStash::Outputs::MicrosoftSentinelOutputInternal::VERSION
6
- s.authors = ["Microsoft Sentinel"]
7
- s.email = 'AzureSentinel@microsoft.com'
8
- s.summary = %q{Microsoft Sentinel provides a new output plugin for Logstash. Use this output plugin to send any log via Logstash to the Microsoft Sentinel/Log Analytics workspace. This is done with the Log Analytics DCR-based API.}
9
- s.description = s.summary
10
- s.homepage = "https://github.com/Azure/Azure-Sentinel"
11
- s.licenses = ["MIT"]
12
- s.require_paths = ["lib"]
13
-
14
- # Files
15
- s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
16
- # Tests
17
- s.test_files = s.files.grep(%r{^(test|spec|features)/})
18
-
19
- # Special flag to let us know this is actually a logstash plugin
20
- s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
21
-
22
- # Gem dependencies
23
- s.add_runtime_dependency "rest-client", ">= 2.1.0"
24
- s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
25
- s.add_runtime_dependency "logstash-codec-plain"
26
- s.add_development_dependency "logstash-devutils"
27
- end
1
+ require File.expand_path('../lib/logstash/sentinel_la/version', __FILE__)
2
+
3
+ Gem::Specification.new do |s|
4
+ s.name = 'microsoft-sentinel-log-analytics-logstash-output-plugin'
5
+ s.version = LogStash::Outputs::MicrosoftSentinelOutputInternal::VERSION
6
+ s.authors = ["Microsoft Sentinel"]
7
+ s.email = 'AzureSentinel@microsoft.com'
8
+ s.summary = %q{Microsoft Sentinel provides a new output plugin for Logstash. Use this output plugin to send any log via Logstash to the Microsoft Sentinel/Log Analytics workspace. This is done with the Log Analytics DCR-based API.}
9
+ s.description = s.summary
10
+ s.homepage = "https://github.com/Azure/Azure-Sentinel"
11
+ s.licenses = ["MIT"]
12
+ s.require_paths = ["lib"]
13
+
14
+ # Files
15
+ s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
16
+ # Tests
17
+ s.test_files = s.files.grep(%r{^(test|spec|features)/})
18
+
19
+ # Special flag to let us know this is actually a logstash plugin
20
+ s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
21
+
22
+ # Gem dependencies
23
+ s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
24
+ s.add_runtime_dependency "logstash-codec-plain"
25
+ s.add_runtime_dependency "excon", ">= 0.88.0", "< 1.0.0"
26
+ s.add_development_dependency "logstash-devutils"
27
+ end
metadata CHANGED
@@ -1,70 +1,76 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: microsoft-sentinel-log-analytics-logstash-output-plugin
3
3
  version: !ruby/object:Gem::Version
4
- version: 1.1.1
4
+ version: 1.1.4
5
5
  platform: ruby
6
6
  authors:
7
7
  - Microsoft Sentinel
8
8
  autorequire:
9
9
  bindir: bin
10
10
  cert_chain: []
11
- date: 2024-01-17 00:00:00.000000000 Z
11
+ date: 2024-12-10 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
- name: rest-client
15
14
  requirement: !ruby/object:Gem::Requirement
16
15
  requirements:
17
16
  - - ">="
18
17
  - !ruby/object:Gem::Version
19
- version: 2.1.0
18
+ version: '1.60'
19
+ - - "<="
20
+ - !ruby/object:Gem::Version
21
+ version: '2.99'
22
+ name: logstash-core-plugin-api
20
23
  type: :runtime
21
24
  prerelease: false
22
25
  version_requirements: !ruby/object:Gem::Requirement
23
26
  requirements:
24
27
  - - ">="
25
28
  - !ruby/object:Gem::Version
26
- version: 2.1.0
29
+ version: '1.60'
30
+ - - "<="
31
+ - !ruby/object:Gem::Version
32
+ version: '2.99'
27
33
  - !ruby/object:Gem::Dependency
28
- name: logstash-core-plugin-api
29
34
  requirement: !ruby/object:Gem::Requirement
30
35
  requirements:
31
36
  - - ">="
32
37
  - !ruby/object:Gem::Version
33
- version: '1.60'
34
- - - "<="
35
- - !ruby/object:Gem::Version
36
- version: '2.99'
38
+ version: '0'
39
+ name: logstash-codec-plain
37
40
  type: :runtime
38
41
  prerelease: false
39
42
  version_requirements: !ruby/object:Gem::Requirement
40
43
  requirements:
41
44
  - - ">="
42
45
  - !ruby/object:Gem::Version
43
- version: '1.60'
44
- - - "<="
45
- - !ruby/object:Gem::Version
46
- version: '2.99'
46
+ version: '0'
47
47
  - !ruby/object:Gem::Dependency
48
- name: logstash-codec-plain
49
48
  requirement: !ruby/object:Gem::Requirement
50
49
  requirements:
51
50
  - - ">="
52
51
  - !ruby/object:Gem::Version
53
- version: '0'
52
+ version: 0.88.0
53
+ - - "<"
54
+ - !ruby/object:Gem::Version
55
+ version: 1.0.0
56
+ name: excon
54
57
  type: :runtime
55
58
  prerelease: false
56
59
  version_requirements: !ruby/object:Gem::Requirement
57
60
  requirements:
58
61
  - - ">="
59
62
  - !ruby/object:Gem::Version
60
- version: '0'
63
+ version: 0.88.0
64
+ - - "<"
65
+ - !ruby/object:Gem::Version
66
+ version: 1.0.0
61
67
  - !ruby/object:Gem::Dependency
62
- name: logstash-devutils
63
68
  requirement: !ruby/object:Gem::Requirement
64
69
  requirements:
65
70
  - - ">="
66
71
  - !ruby/object:Gem::Version
67
72
  version: '0'
73
+ name: logstash-devutils
68
74
  type: :development
69
75
  prerelease: false
70
76
  version_requirements: !ruby/object:Gem::Requirement