logstash-output-azure_loganalytics 0.2.2 → 0.4.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 629eda80466a6cfbdcce7776297948a4e0de40c6
- data.tar.gz: 042e2dfeb268197f6375f57f6d0a619bf530fa95
+ metadata.gz: f410b4bc1027dc2c2823fdea3b30d0d8228780b0
+ data.tar.gz: 7f903327f440d33f41c9c4fa68357554599f9bb7
  SHA512:
- metadata.gz: 0bfcfd713c74c97d32b077c3e039cd6a6bdc4ce85148a28dd45f852a65ab50606f5233fbff4e8ccc3698748a0e6957b822df6c88d48a6818f377fa84711b244f
- data.tar.gz: af429bb4c0a4e646b3b37886e1762cda43861ff33e437a8a59da02fef89d01d4aaeb5b1716a9bc4273a6e39d016d5035738a6cdc3bedcd5f8c4641e0d51a13c9
+ metadata.gz: cf3fa242b5337af37a589f687fc13d16374eefadf0de7d8114d925eba06837d41b2625b4bb0ae21671390bfae1fb0ce6a7df5fc87586a46cb9b0d85736bdf685
+ data.tar.gz: 622d7474dd3f46629674610dafd5170475004462286732c39372d7ca5c31b14a25eed54152ff6dbeacd497f630abbd4afd12cdf5f29448e01eeeb52212b4345b
@@ -1,5 +1,21 @@
+ ## 0.4.0
+ * Change base [azure-loganalytics-datacollector-api](https://github.com/yokawasa/azure-log-analytics-data-collector) to ">= 0.4.0"
+
+ ## 0.3.2
+ * Improvement: removed unnecessary key check
+
+ ## 0.3.1
+ * Performance optimization for large key_names list scenario - [Issue#10](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/10)
+
+ ## 0.3.0
+ * Support `key_types` param - [Issue#8](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/8)
+ * Support custom log analytics API endpoint (for supporting Azure sovereign cloud) - [Issue#9](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/9)
+
+ ## 0.2.3
+ * Added additional debug logging for successful requests - [PR#7](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/7) by [@daniel-chambers](https://github.com/daniel-chambers)
+
  ## 0.2.2
- * Fix logging failure [PR#6](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/6) (Thanks to [@daniel-chambers](https://github.com/daniel-chambers))
+ * Fix logging failure - [PR#6](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/6) by [@daniel-chambers](https://github.com/daniel-chambers)

  ## 0.2.1

@@ -7,7 +23,7 @@

  ## 0.2.0

- * Support for time-generated-field in output configuration [Issue#4](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/4) (Thanks to [@KiZach](https://github.com/KiZach))
+ * Support for time-generated-field in output configuration - [Issue#4](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/4) (Thanks to [@KiZach](https://github.com/KiZach))

  ## 0.1.1

data/README.md CHANGED
@@ -19,7 +19,8 @@ output {
  customer_id => "<OMS WORKSPACE ID>"
  shared_key => "<CLIENT AUTH KEY>"
  log_type => "<LOG TYPE NAME>"
- key_names => ['key1','key2','key3'..] ## list of Key names (array)
+ key_names => ['key1','key2','key3'..] ## list of Key names
+ key_types => {'key1'=> 'string' 'key2'=>'double' 'key3'=>'boolean' .. }
  flush_items => <FLUSH_ITEMS_NUM>
  flush_interval_time => <FLUSH INTERVAL TIME(sec)>
  }
@@ -30,10 +31,18 @@ output {
  * **shared\_key (required)** - The primary or the secondary Connected Sources client authentication key.
  * **log\_type (required)** - The name of the event type that is being submitted to Log Analytics. This must be only alpha characters.
  * **time\_generated\_field (optional)** - Default:''(empty string) The name of the time generated field. Be carefule that the value of field should strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ). See also [this](https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api#create-a-request) for more details
- * **key\_names (optional)** - Default:[] (empty array). list of Key names in in-coming record to deliver.
+ * **key\_names (optional)** - Default:[] (empty array). The list of key names in in-coming record that you want to submit to Log Analytics.
+ * **key\_types (optional)** - Default:{} (empty hash). The list of data types for each column as which you want to store in Log Analytics (`string`, `boolean`, or `double`)
+ * The key names in `key_types` param must be included in `key_names` param. The column data whose key isn't included in `key_names` is treated as `string` data type.
+ * Multiple key value entries are separated by `spaces` rather than commas (See also [this](https://www.elastic.co/guide/en/logstash/current/configuration-file-structure.html#hash))
+ * If you want to store a column as datetime or guid data format, set `string` for the column ( the value of the column should be `YYYY-MM-DDThh:mm:ssZ format` if it's `datetime`, and `GUID format` if it's `guid`).
+ * In case that `key_types` param are not specified, all columns that you want to submit ( you choose with `key_names` param ) are stored as `string` data type in Log Analytics.
  * **flush_items (optional)** - Default 50. Max number of items to buffer before flushing (1 - 1000).
  * **flush_interval_time (optional)** - Default 5. Max number of seconds to wait between flushes.

+ > [NOTE] There is a special param for changing the Log Analytics API endpoint (mainly for supporting Azure sovereign cloud)
+ > * **endpoint (optional)** - Default: ods.opinsights.azure.com
+
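Putting the options documented above together, here is a minimal sketch of a full `azure_loganalytics` output block. The workspace ID, key, log type, and field names are placeholders (they are not taken from this gem), and `endpoint` is shown with its default value:

```
output {
    azure_loganalytics {
        customer_id => "<OMS WORKSPACE ID>"
        shared_key  => "<CLIENT AUTH KEY>"
        log_type    => "ApacheAccessLog"            # alpha characters only
        key_names   => ['host','bytes','cache_result']
        # hash entries are space-separated; keys omitted here default to string
        key_types   => {'bytes'=>'double' 'cache_result'=>'boolean'}
        endpoint    => "ods.opinsights.azure.com"   # default; change for sovereign clouds
        flush_items => 50
        flush_interval_time => 5
    }
}
```

Because `host` appears in `key_names` but not in `key_types`, it falls back to the default `string` handling described above.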
  ## Tests
 
  Here is an example configuration where Logstash's event source and destination are configured as Apache2 access log and Azure Log Analytics respectively.
@@ -115,6 +124,37 @@ Here is an expected output for sample input (Apache2 access log):
  }
  ```

+ ## Debugging
+ If you need to debug and watch what this plugin is sending to Log Analytics, you can change the logstash log level for this plugin to `DEBUG` to get additional logs in the logstash logs.
+
+ One way of changing the log level is to use the logstash API:
+
+ ```
+ > curl -XPUT 'localhost:9600/_node/logging?pretty' -H "Content-Type: application/json" -d '{ "logger.logstash.outputs.azureloganalytcs" : "DEBUG" }'
+ {
+ "host" : "yoichitest01",
+ "version" : "6.5.4",
+ "http_address" : "127.0.0.1:9600",
+ "id" : "d8038a9e-02c6-411a-9f6b-597f910edc54",
+ "name" : "yoichitest01",
+ "acknowledged" : true
+ }
+ ```
+
+ You should then be able to see logs like this in your logstash logs:
+
+ ```
+ [2019-03-29T01:18:52,652][DEBUG][logstash.outputs.azureloganalytics] Posting log batch (log count: 50) as log type HealthCheckLogs to DataCollector API. First log: {"message":{"Application":"HealthCheck.API","Environments":{},"Name":"SystemMetrics","LogLevel":"Information","Properties":{"CPU":3,"Memory":83}},"beat":{"version":"6.5.4","hostname":"yoichitest01","name":"yoichitest01"},"timestamp":"2019-03-29T01:18:51.901Z"}
+
+ [2019-03-29T01:18:52,819][DEBUG][logstash.outputs.azureloganalytics] Successfully posted logs as log type HealthCheckLogs with result code 200 to DataCollector API
+ ```
+
+ Once you're done, you can use the logstash API to undo your log level changes:
+
+ ```
+ > curl -XPUT 'localhost:9600/_node/logging/reset?pretty'
+ ```
+
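As a hedged aside (not part of this gem's README): recent Logstash releases also let you read back the current logger levels through the same node API, and the log level can be raised globally at startup, which is noisier but sometimes quicker for a one-off investigation:

```
# Inspect the currently configured loggers and their levels
> curl -XGET 'localhost:9600/_node/logging?pretty'

# Or start Logstash with a global debug level (applies to every component)
> bin/logstash --log.level debug -f /path/to/pipeline.conf
```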
  ## Contributing
 
  Bug reports and pull requests are welcome on GitHub at https://github.com/yokawasa/logstash-output-azure_loganalytics.
data/VERSION CHANGED
@@ -1 +1 @@
- 0.2.2
+ 0.4.0
@@ -1,4 +1,5 @@
  # encoding: utf-8
+
  require "logstash/outputs/base"
  require "logstash/namespace"
  require "stud/buffer"
@@ -14,15 +15,31 @@ class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
  # The primary or the secondary Connected Sources client authentication key
  config :shared_key, :validate => :string, :required => true

- # The name of the event type that is being submitted to Log Analytics. This must be only alpha characters.
+ # The name of the event type that is being submitted to Log Analytics.
+ # This must be only alpha characters.
  config :log_type, :validate => :string, :required => true

- # The name of the time generated field. Be carefule that the value of field should strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ)
+ # The service endpoint (Default: ods.opinsights.azure.com)
+ config :endpoint, :validate => :string, :default => 'ods.opinsights.azure.com'
+
+ # The name of the time generated field.
+ # Be carefule that the value of field should strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ)
  config :time_generated_field, :validate => :string, :default => ''

- # list of Key names in in-coming record to deliver.
+ # The list of key names in in-coming record that you want to submit to Log Analytics
  config :key_names, :validate => :array, :default => []
-
+
+ # The list of data types for each column as which you want to store in Log Analytics (`string`, `boolean`, or `double`)
+ # - The key names in `key_types` param must be included in `key_names` param. The column data whose key isn't included in `key_names` is treated as `string` data type.
+ # - Multiple key value entries are separated by `spaces` rather than commas
+ # See also https://www.elastic.co/guide/en/logstash/current/configuration-file-structure.html#hash
+ # - If you want to store a column as datetime or guid data format, set `string` for the column ( the value of the column should be `YYYY-MM-DDThh:mm:ssZ format` if it's `datetime`, and `GUID format` if it's `guid`).
+ # - In case that `key_types` param are not specified, all columns that you want to submit ( you choose with `key_names` param ) are stored as `string` data type in Log Analytics.
+ # Example:
+ # key_names => ['key1','key2','key3','key4',...]
+ # key_types => {'key1'=>'string' 'key2'=>'string' 'key3'=>'boolean' 'key4'=>'double' ...}
+ config :key_types, :validate => :hash, :default => {}
+
  # Max number of items to buffer before flushing. Default 50.
  config :flush_items, :validate => :number, :default => 50

@@ -38,8 +55,15 @@ class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
  raise ArgumentError, 'log_type must be only alpha characters'
  end

+ @key_types.each { |k, v|
+ t = v.downcase
+ if ( !t.eql?('string') && !t.eql?('double') && !t.eql?('boolean') )
+ raise ArgumentError, "Key type(#{v}) for key(#{k}) must be either string, boolean, or double"
+ end
+ }
+
  ## Start
- @client=Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id,@shared_key)
+ @client=Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id,@shared_key,@endpoint)

  buffer_initialize(
  :max_items => @flush_items,
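For reference, `register` above now passes the configured `endpoint` straight to the underlying [azure-loganalytics-datacollector-api](https://github.com/yokawasa/azure-log-analytics-data-collector) client, and `flush` later calls its `post_data` / `is_success` methods (both visible further down in this diff). A minimal standalone Ruby sketch of that flow follows; the require path is assumed from the gem's layout, and the credentials and records are placeholders:

```
# Assumed require path for the azure-loganalytics-datacollector-api gem (>= 0.4.0)
require "azure/loganalytics/datacollectorapi/client"

customer_id = "<OMS WORKSPACE ID>"          # placeholder
shared_key  = "<CLIENT AUTH KEY>"           # placeholder
endpoint    = "ods.opinsights.azure.com"    # default; override for sovereign clouds

client = Azure::Loganalytics::Datacollectorapi::Client.new(customer_id, shared_key, endpoint)

# One small batch, analogous to what the plugin buffers before each flush
records = [ { "host" => "web01", "bytes" => 1024, "cache_result" => true } ]

res = client.post_data("MyCustomLog", records, "")   # "" = no time_generated_field
if Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
  puts "Posted #{records.length} record(s) with result code #{res.code}"
else
  puts "DataCollector API request failure: error code #{res.code}"
end
```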
@@ -64,8 +88,12 @@ class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
  document = {}
  event_hash = event.to_hash()
  if @key_names.length > 0
- @key_names.each do |key|
- if event_hash.include?(key)
+ # Get the intersection of key_names and keys of event_hash
+ keys_intersection = @key_names & event_hash.keys
+ keys_intersection.each do |key|
+ if @key_types.include?(key)
+ document[key] = convert_value(@key_types[key], event_hash[key])
+ else
  document[key] = event_hash[key]
  end
  end
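The rewritten `receive` above first intersects `key_names` with the event's keys and then coerces any key listed in `key_types`. The following is a small illustrative Ruby sketch of that selection and coercion on a plain hash (field names and types are made-up placeholders; the coercion mirrors the `convert_value` helper added later in this diff):

```
# Illustrative settings, standing in for the plugin's @key_names / @key_types
key_names = ["host", "bytes", "cache_result"]
key_types = { "bytes" => "double", "cache_result" => "boolean" }

# A flattened Logstash event (event.to_hash) with an extra field we don't deliver
event_hash = {
  "host" => "web01", "bytes" => "1024", "cache_result" => "True", "verb" => "GET"
}

# Only keys present in both key_names and the event are delivered
keys_intersection = key_names & event_hash.keys   # => ["host", "bytes", "cache_result"]

document = {}
keys_intersection.each do |key|
  value = event_hash[key]
  case key_types[key]&.downcase
  when "double"  then value = Integer(value) rescue Float(value) rescue value
  when "boolean" then value = (value.to_s.downcase == "true")
  end
  document[key] = value
end

document
# => {"host"=>"web01", "bytes"=>1024, "cache_result"=>true}
```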
@@ -80,18 +108,35 @@ class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
 
  # Skip in case there are no candidate documents to deliver
  if documents.length < 1
+ @logger.debug("No documents in batch for log type #{@log_type}. Skipping")
  return
  end

  begin
+ @logger.debug("Posting log batch (log count: #{documents.length}) as log type #{@log_type} to DataCollector API. First log: " + (documents[0].to_json).to_s)
  res = @client.post_data(@log_type, documents, @time_generated_field)
- if not Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
+ if Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
+ @logger.debug("Successfully posted logs as log type #{@log_type} with result code #{res.code} to DataCollector API")
+ else
  @logger.error("DataCollector API request failure: error code: #{res.code}, data=>" + (documents.to_json).to_s)
  end
  rescue Exception => ex
  @logger.error("Exception occured in posting to DataCollector API: '#{ex}', data=>" + (documents.to_json).to_s)
  end
-
  end # def flush

+ private
+ def convert_value(type, val)
+ t = type.downcase
+ case t
+ when "boolean"
+ v = val.downcase
+ return (v.to_s == 'true' ) ? true : false
+ when "double"
+ return Integer(val) rescue Float(val) rescue val
+ else
+ return val
+ end
+ end
+
  end # class LogStash::Outputs::AzureLogAnalytics
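To make the `convert_value` rules above concrete, here is a hedged standalone restatement of the same coercion (the `boolean` branch is written with `to_s` so a non-string value doesn't raise), followed by a few sample results:

```
# Standalone restatement of the coercion performed by convert_value above
def convert_value(type, val)
  case type.downcase
  when "boolean"
    val.to_s.downcase == "true"
  when "double"
    Integer(val) rescue Float(val) rescue val
  else
    val
  end
end

convert_value("boolean", "True")   # => true
convert_value("boolean", "no")     # => false
convert_value("double", "42")      # => 42 (Integer wins when it parses)
convert_value("double", "3.14")    # => 3.14 (falls back to Float)
convert_value("double", "n/a")     # => "n/a" (left as-is when neither parses)
convert_value("string", 123)       # => 123 (passed through untouched)
```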
@@ -19,7 +19,7 @@ Gem::Specification.new do |s|
 
  # Gem dependencies
  s.add_runtime_dependency "rest-client", ">= 1.8.0"
- s.add_runtime_dependency "azure-loganalytics-datacollector-api", ">= 0.1.2"
+ s.add_runtime_dependency "azure-loganalytics-datacollector-api", ">= 0.4.0"
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
  s.add_runtime_dependency "logstash-codec-plain"
  s.add_development_dependency "logstash-devutils"
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-azure_loganalytics
  version: !ruby/object:Gem::Version
- version: 0.2.2
+ version: 0.4.0
  platform: ruby
  authors:
  - Yoichi Kawasaki
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2019-03-20 00:00:00.000000000 Z
+ date: 2020-07-17 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -29,7 +29,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.1.2
+ version: 0.4.0
  name: azure-loganalytics-datacollector-api
  prerelease: false
  type: :runtime
@@ -37,7 +37,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.1.2
+ version: 0.4.0
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
  requirements:
@@ -123,7 +123,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.4.8
+ rubygems_version: 2.6.8
  signing_key:
  specification_version: 4
  summary: logstash output plugin to store events into Azure Log Analytics