logstash-output-azure_loganalytics 0.5.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. It is provided for informational purposes only and reflects the changes between package versions as they appear in their respective public registries.
checksums.yaml.gz ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+ metadata.gz: b09df7b108a440679410974226e7e7599463d9c2
+ data.tar.gz: 30d5a64ab00f80ff9446eaccb8094e5766ff3606
+ SHA512:
+ metadata.gz: 1039c5799973e17dbd72ab8f4918d501967455bc398469da1e0ba7aee7478b4259c3894bd4ae857d5004042861360c506d77ab9263e5b8a245c015d7e1123d7c
+ data.tar.gz: 130337fedcefcf9f07ee34961cfbf0786b53d1129715811b0a8b52a38293913d9998733e1666dfbfc21de51a0d3ca6213644c2b450dc0f3e5e21bee0ed91ec15
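These hashes cover the two payload files inside the .gem archive (a RubyGems package is a tar archive containing metadata.gz, data.tar.gz, and checksums.yaml.gz). A quick way to check them locally, assuming a Unix shell with coreutils, is:

```
gem fetch logstash-output-azure_loganalytics -v 0.5.0
tar xf logstash-output-azure_loganalytics-0.5.0.gem
sha1sum metadata.gz data.tar.gz     # compare with the SHA1 block above
sha512sum metadata.gz data.tar.gz   # compare with the SHA512 block above
```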
data/CHANGELOG.md ADDED
@@ -0,0 +1,45 @@
+ ## 0.5.0
+
+ * Change base [azure-loganalytics-datacollector-api](https://github.com/yokawasa/azure-log-analytics-data-collector) to ">= 0.5.0"
+ * Support sprintf syntax like `%{my_log_type}` for the `log_type` config param - [Issue #13](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/13)
+
+ ## 0.4.0
+
+ * Change base [azure-loganalytics-datacollector-api](https://github.com/yokawasa/azure-log-analytics-data-collector) to ">= 0.4.0"
+
+ ## 0.3.2
+
+ * Improvement: removed an unnecessary key check
+
+ ## 0.3.1
+
+ * Performance optimization for the large key_names list scenario - [Issue#10](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/10)
+
+ ## 0.3.0
+
+ * Support `key_types` param - [Issue#8](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/8)
+ * Support custom Log Analytics API endpoint (for supporting Azure sovereign cloud) - [Issue#9](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/9)
+
+ ## 0.2.3
+
+ * Added additional debug logging for successful requests - [PR#7](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/7) by [@daniel-chambers](https://github.com/daniel-chambers)
+
+ ## 0.2.2
+
+ * Fix logging failure - [PR#6](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/6) by [@daniel-chambers](https://github.com/daniel-chambers)
+
+ ## 0.2.1
+
+ * Updated gem dependencies to allow compatibility with Logstash 5 + 6 (Thanks to [@arthurtoper](https://github.com/arthurtoper))
+
+ ## 0.2.0
+
+ * Support for time-generated-field in output configuration - [Issue#4](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/4) (Thanks to [@KiZach](https://github.com/KiZach))
+
+ ## 0.1.1
+
+ * Fixed [Issue#2](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/2) (Thanks to [@gmousset](https://github.com/gmousset))
+
+ ## 0.1.0
+
+ * Initial Release
data/CONTRIBUTORS ADDED
@@ -0,0 +1,13 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Yoichi Kawasaki (@yokawasa)
+ * Gwendal Mousset (@gmousset)
+ * Arthur Toper (@arthurtoper)
+ * Daniel Chambers (@daniel-chambers)
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/Gemfile ADDED
@@ -0,0 +1,2 @@
+ source 'https://rubygems.org'
+ gemspec
data/LICENSE ADDED
@@ -0,0 +1,13 @@
+ Copyright (c) 2012–2015 Elasticsearch <http://www.elastic.co>
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/README.md ADDED
@@ -0,0 +1,160 @@
+ # Azure Log Analytics output plugin for Logstash
+ logstash-output-azure_loganalytics is a Logstash plugin that outputs events to Azure Log Analytics. [Logstash](https://www.elastic.co/products/logstash) is an open-source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite [destinations](https://www.elastic.co/products/logstash). [Log Analytics](https://azure.microsoft.com/en-us/services/log-analytics/) is a service in Operations Management Suite (OMS) that helps you collect and analyze data generated by resources in your cloud and on-premises environments. It gives you real-time insights using integrated search and custom dashboards to readily analyze millions of records across all of your workloads and servers, regardless of their physical location. The plugin stores incoming events in Azure Log Analytics by leveraging the [Log Analytics HTTP Data Collector API](https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api).
+
+ ## Installation
+
+ You can install this plugin using the Logstash "plugin" or "logstash-plugin" (for newer versions of Logstash) command:
+ ```
+ bin/plugin install logstash-output-azure_loganalytics
+ # or, for newer versions of Logstash:
+ bin/logstash-plugin install logstash-output-azure_loganalytics
+ ```
+ Please see the [Logstash reference](https://www.elastic.co/guide/en/logstash/current/offline-plugins.html) for more information.
+
+ ## Configuration
+
+ ```
+ output {
+     azure_loganalytics {
+         customer_id => "<OMS WORKSPACE ID>"
+         shared_key => "<CLIENT AUTH KEY>"
+         log_type => "<LOG TYPE NAME>"
+         key_names => ['key1','key2','key3'..] ## list of key names
+         key_types => {'key1'=>'string' 'key2'=>'double' 'key3'=>'boolean' .. }
+         flush_items => <FLUSH_ITEMS_NUM>
+         flush_interval_time => <FLUSH INTERVAL TIME(sec)>
+     }
+ }
+ ```
+
+ * **customer\_id (required)** - Your Operations Management Suite workspace ID
+ * **shared\_key (required)** - The primary or the secondary Connected Sources client authentication key
+ * **log\_type (required)** - The name of the event type that is being submitted to Log Analytics. It must contain only alphanumeric characters and `_`, and may not exceed 100 characters. sprintf syntax like `%{my_log_type}` is supported.
+ * **time\_generated\_field (optional)** - Default: '' (empty string). The name of the time-generated field. Be careful: the value of the field must strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ). See also [the Data Collector API documentation](https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api#create-a-request) for more details
+ * **key\_names (optional)** - Default: [] (empty array). The list of keys in the incoming record that you want to submit to Log Analytics
+ * **key\_types (optional)** - Default: {} (empty hash). A hash mapping column names to the data type (`string`, `boolean`, or `double`) in which each column is stored in Log Analytics
+   * The key names in the `key_types` param must also be included in the `key_names` param. Submitted columns without a `key_types` entry are treated as the `string` data type
+   * Multiple key-value entries are separated by `spaces` rather than commas (see also [the Logstash hash syntax](https://www.elastic.co/guide/en/logstash/current/configuration-file-structure.html#hash))
+   * If you want to store a column in datetime or GUID format, set `string` for the column (the value should be in `YYYY-MM-DDThh:mm:ssZ` format for `datetime`, and in GUID format for `guid`)
+   * If the `key_types` param is not specified, all columns that you submit (chosen with the `key_names` param) are stored as the `string` data type in Log Analytics
+ * **flush_items (optional)** - Default: 50. Max number of items to buffer before flushing (1 - 1000)
+ * **flush_interval_time (optional)** - Default: 5. Max number of seconds to wait between flushes
+
+ > [NOTE] There is a special param for changing the Log Analytics API endpoint (mainly for supporting Azure sovereign clouds); a combined example follows this list
+ > * **endpoint (optional)** - Default: ods.opinsights.azure.com
+
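+ Putting several of these options together, here is a minimal sketch (the workspace ID and key are placeholders) that resolves `log_type` per event with sprintf, stores `processing_time` as a `double`, takes the record timestamp from `eventtime`, and overrides the endpoint:
+
+ ```
+ output {
+     azure_loganalytics {
+         customer_id => "<OMS WORKSPACE ID>"
+         shared_key => "<CLIENT AUTH KEY>"
+         log_type => "%{my_log_type}"
+         key_names => ['eventtime','processing_time','status']
+         key_types => {'processing_time'=>'double' 'status'=>'string'}
+         time_generated_field => "eventtime"
+         endpoint => "ods.opinsights.azure.us" # example sovereign-cloud (Azure Government) endpoint
+     }
+ }
+ ```
+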
+ ## Tests
+
+ Here is an example configuration where Logstash's event source and destination are configured as an Apache2 access log and Azure Log Analytics, respectively.
+
+ ### Example Configuration
+ ```
+ input {
+     file {
+         path => "/var/log/apache2/access.log"
+         start_position => "beginning"
+     }
+ }
+
+ filter {
+     if [path] =~ "access" {
+         mutate { replace => { "type" => "apache_access" } }
+         grok {
+             match => { "message" => "%{COMBINEDAPACHELOG}" }
+         }
+     }
+     date {
+         match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
+     }
+ }
+
+ output {
+     azure_loganalytics {
+         customer_id => "818f7bbc-8034-4cc3-b97d-f068dd4cd659"
+         shared_key => "ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksXxcBmQQHw==(dummy)"
+         log_type => "ApacheAccessLog"
+         key_names => ['logid','date','processing_time','remote','user','method','status','agent']
+         flush_items => 10
+         flush_interval_time => 5
+     }
+     # for debug
+     stdout { codec => rubydebug }
+ }
+ ```
+
+ You can find example configuration files in logstash-output-azure_loganalytics/examples.
+
+ ### Run the plugin with the example configuration
+
+ Now run Logstash with the example configuration like this:
+ ```
+ # Test your Logstash configuration before actually running Logstash
+ bin/logstash -f logstash-apache2-to-loganalytics.conf --configtest
+ # run
+ bin/logstash -f logstash-apache2-to-loganalytics.conf
+ ```
+ (On newer Logstash versions, `--configtest` has been replaced by `-t` / `--config.test_and_exit`.)
+
+ Here is the expected output for a sample input (an Apache2 access log record):
+
+ <u>Apache2 access log</u>
+ ```
+ 106.143.121.169 - - [29/Dec/2016:01:38:16 +0000] "GET /test.html HTTP/1.1" 304 179 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36"
+ ```
+
+ <u>Output (rubydebug)</u>
+ ```
+ {
+     "message" => "106.143.121.169 - - [29/Dec/2016:01:38:16 +0000] \"GET /test.html HTTP/1.1\" 304 179 \"-\" \"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\"",
+     "@version" => "1",
+     "@timestamp" => "2016-12-29T01:38:16.000Z",
+     "path" => "/var/log/apache2/access.log",
+     "host" => "yoichitest01",
+     "type" => "apache_access",
+     "clientip" => "106.143.121.169",
+     "ident" => "-",
+     "auth" => "-",
+     "timestamp" => "29/Dec/2016:01:38:16 +0000",
+     "verb" => "GET",
+     "request" => "/test.html",
+     "httpversion" => "1.1",
+     "response" => "304",
+     "bytes" => "179",
+     "referrer" => "\"-\"",
+     "agent" => "\"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\""
+ }
+ ```
+
+ ## Debugging
+ If you need to debug and watch what this plugin is sending to Log Analytics, you can change the Logstash log level for this plugin to `DEBUG` to get additional messages in the Logstash logs.
+
+ One way of changing the log level is to use the Logstash API:
+
+ ```
+ > curl -XPUT 'localhost:9600/_node/logging?pretty' -H "Content-Type: application/json" -d '{ "logger.logstash.outputs.azureloganalytics" : "DEBUG" }'
+ {
+   "host" : "yoichitest01",
+   "version" : "6.5.4",
+   "http_address" : "127.0.0.1:9600",
+   "id" : "d8038a9e-02c6-411a-9f6b-597f910edc54",
+   "name" : "yoichitest01",
+   "acknowledged" : true
+ }
+ ```
+
+ You should then be able to see logs like this in your Logstash logs:
+
+ ```
+ [2019-03-29T01:18:52,652][DEBUG][logstash.outputs.azureloganalytics] Posting log batch (log count: 50) as log type HealthCheckLogs to DataCollector API. First log: {"message":{"Application":"HealthCheck.API","Environments":{},"Name":"SystemMetrics","LogLevel":"Information","Properties":{"CPU":3,"Memory":83}},"beat":{"version":"6.5.4","hostname":"yoichitest01","name":"yoichitest01"},"timestamp":"2019-03-29T01:18:51.901Z"}
+
+ [2019-03-29T01:18:52,819][DEBUG][logstash.outputs.azureloganalytics] Successfully posted logs as log type HealthCheckLogs with result code 200 to DataCollector API
+ ```
+
+ Once you're done, you can use the Logstash API to undo your log level changes:
+
+ ```
+ > curl -XPUT 'localhost:9600/_node/logging/reset?pretty'
+ ```
+
+ ## Contributing
+
+ Bug reports and pull requests are welcome on GitHub at https://github.com/yokawasa/logstash-output-azure_loganalytics.
data/VERSION ADDED
@@ -0,0 +1 @@
+ 0.5.0
data/lib/logstash/outputs/azure_loganalytics.rb ADDED
@@ -0,0 +1,143 @@
+ # encoding: utf-8
+
+ require "logstash/outputs/base"
+ require "logstash/namespace"
+ require "stud/buffer"
+
+ class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
+   include Stud::Buffer
+
+   config_name "azure_loganalytics"
+
+   # Your Operations Management Suite workspace ID
+   config :customer_id, :validate => :string, :required => true
+
+   # The primary or the secondary Connected Sources client authentication key
+   config :shared_key, :validate => :string, :required => true
+
+   # The name of the event type that is being submitted to Log Analytics.
+   # This must contain only alphanumeric characters and _, and may not exceed 100 chars.
+   # sprintf syntax like %{my_log_type} is supported.
+   config :log_type, :validate => :string, :required => true
+
+   # The service endpoint (Default: ods.opinsights.azure.com)
+   config :endpoint, :validate => :string, :default => 'ods.opinsights.azure.com'
+
+   # The name of the time-generated field.
+   # Be careful: the value of the field must strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ)
+   config :time_generated_field, :validate => :string, :default => ''
+
+   # The list of keys in the incoming record that you want to submit to Log Analytics
+   config :key_names, :validate => :array, :default => []
+
+   # A hash mapping column names to the data type (`string`, `boolean`, or `double`) to store in Log Analytics
+   # - The key names in the `key_types` param must also be included in the `key_names` param. Submitted columns without a `key_types` entry are treated as `string` data
+   # - Multiple key-value entries are separated by `spaces` rather than commas
+   #   See also https://www.elastic.co/guide/en/logstash/current/configuration-file-structure.html#hash
+   # - If you want to store a column in datetime or GUID format, set `string` for the column (the value should be in YYYY-MM-DDThh:mm:ssZ format for datetime, and in GUID format for guid)
+   # - If the `key_types` param is not specified, all columns that you submit (chosen with the `key_names` param) are stored as the `string` data type in Log Analytics
+   # Example:
+   #   key_names => ['key1','key2','key3','key4',...]
+   #   key_types => {'key1'=>'string' 'key2'=>'string' 'key3'=>'boolean' 'key4'=>'double' ...}
+   config :key_types, :validate => :hash, :default => {}
+
+   # Max number of items to buffer before flushing. Default 50.
+   config :flush_items, :validate => :number, :default => 50
+
+   # Max number of seconds to wait between flushes. Default 5
+   config :flush_interval_time, :validate => :number, :default => 5
+
+   public
+   def register
+     require 'azure/loganalytics/datacollectorapi/client'
+
+     #if not @log_type.match(/^[[:alpha:]]+$/)
+     #  raise ArgumentError, 'log_type must be only alpha characters'
+     #end
+
+     @key_types.each { |k, v|
+       t = v.downcase
+       if ( !t.eql?('string') && !t.eql?('double') && !t.eql?('boolean') )
+         raise ArgumentError, "Key type(#{v}) for key(#{k}) must be either string, boolean, or double"
+       end
+     }
+
+     ## Start
+     @client = Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id, @shared_key, @endpoint)
+
+     buffer_initialize(
+       :max_items => @flush_items,
+       :max_interval => @flush_interval_time,
+       :logger => @logger
+     )
+
+   end # def register
+
+   public
+   def receive(event)
+     @log_type = event.sprintf(@log_type)
+     # Simply save an event for later delivery
+     buffer_receive(event)
+   end # def receive
+
+   # called from Stud::Buffer#buffer_flush when there are events to flush
+   public
+   def flush(events, close=false)
+
+     documents = [] # the array of hashes to send to Azure Log Analytics
+     events.each do |event|
+       document = {}
+       event_hash = event.to_hash()
+       if @key_names.length > 0
+         # Get the intersection of key_names and keys of event_hash
+         keys_intersection = @key_names & event_hash.keys
+         keys_intersection.each do |key|
+           if @key_types.include?(key)
+             document[key] = convert_value(@key_types[key], event_hash[key])
+           else
+             document[key] = event_hash[key]
+           end
+         end
+       else
+         document = event_hash
+       end
+       # Skip if document doesn't contain any items
+       next if (document.keys).length < 1
+
+       documents.push(document)
+     end
+
+     # Skip in case there are no candidate documents to deliver
+     if documents.length < 1
+       @logger.debug("No documents in batch for log type #{@log_type}. Skipping")
+       return
+     end
+
+     begin
+       @logger.debug("Posting log batch (log count: #{documents.length}) as log type #{@log_type} to DataCollector API. First log: " + (documents[0].to_json).to_s)
+       res = @client.post_data(@log_type, documents, @time_generated_field)
+       if Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
+         @logger.debug("Successfully posted logs as log type #{@log_type} with result code #{res.code} to DataCollector API")
+       else
+         @logger.error("DataCollector API request failure: error code: #{res.code}, data=>" + (documents.to_json).to_s)
+       end
+     rescue Exception => ex
+       @logger.error("Exception occurred in posting to DataCollector API: '#{ex}', data=>" + (documents.to_json).to_s)
+     end
+   end # def flush
+
+   private
+   def convert_value(type, val)
+     t = type.downcase
+     case t
+     when "boolean"
+       v = val.downcase
+       return (v.to_s == 'true') ? true : false
+     when "double"
+       return Integer(val) rescue Float(val) rescue val
+     else
+       return val
+     end
+   end
+
+ end # class LogStash::Outputs::AzureLogAnalytics
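For reference, the plugin above is a thin buffering layer over the azure-loganalytics-datacollector-api gem; the constructor, `post_data`, and `is_success` calls seen in `register` and `flush` can be exercised standalone with a sketch like this (placeholder credentials, untested):

```
require 'azure/loganalytics/datacollectorapi/client'

customer_id = '<OMS WORKSPACE ID>'
shared_key  = '<CLIENT AUTH KEY>'

# Same client the plugin builds in register
client = Azure::Loganalytics::Datacollectorapi::Client::new(
  customer_id, shared_key, 'ods.opinsights.azure.com')

# One record, with the time-generated field the API should use as the timestamp
documents = [{ 'status' => '200', 'eventtime' => '2017-04-22T01:45:14Z' }]
res = client.post_data('ApacheAccessLog', documents, 'eventtime')

if Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
  puts "Posted #{documents.length} logs (result code #{res.code})"
else
  puts "DataCollector API request failure: error code #{res.code}"
end
```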
data/logstash-output-azure_loganalytics.gemspec ADDED
@@ -0,0 +1,26 @@
+ Gem::Specification.new do |s|
+   s.name          = 'logstash-output-azure_loganalytics'
+   s.version       = File.read("VERSION").strip
+   s.authors       = ["Yoichi Kawasaki"]
+   s.email         = "yoichi.kawasaki@outlook.com"
+   s.summary       = %q{logstash output plugin to store events into Azure Log Analytics}
+   s.description   = s.summary
+   s.homepage      = "http://github.com/yokawasa/logstash-output-azure_loganalytics"
+   s.licenses      = ["Apache License (2.0)"]
+   s.require_paths = ["lib"]
+
+   # Files
+   s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT', 'VERSION']
+   # Tests
+   s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+   # Special flag to let us know this is actually a logstash plugin
+   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
+
+   # Gem dependencies
+   s.add_runtime_dependency "rest-client", ">= 1.8.0"
+   s.add_runtime_dependency "azure-loganalytics-datacollector-api", ">= 0.4.0"
+   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
+   s.add_runtime_dependency "logstash-codec-plain"
+   s.add_development_dependency "logstash-devutils"
+ end
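To try the plugin from this source tree rather than from RubyGems, the usual local-gem workflow should apply (a sketch; the built filename and the `--no-verify` flag depend on your Logstash setup):

```
gem build logstash-output-azure_loganalytics.gemspec
bin/logstash-plugin install --no-verify logstash-output-azure_loganalytics-0.5.0.gem
```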
data/spec/outputs/azure_loganalytics_spec.rb ADDED
@@ -0,0 +1,72 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
+ require "logstash/outputs/azure_loganalytics"
+ require "logstash/codecs/plain"
+ require "logstash/event"
+
+ describe LogStash::Outputs::AzureLogAnalytics do
+
+   let(:customer_id) { '<Customer ID aka WorkspaceID String>' }
+   let(:shared_key) { '<Primary Key String>' }
+   let(:log_type) { 'ApacheAccessLog' }
+   let(:key_names) { ['logid','date','processing_time','remote','user','method','status','agent','eventtime'] }
+   let(:time_generated_field) { 'eventtime' }
+
+   let(:azure_loganalytics_config) {
+     {
+       "customer_id" => customer_id,
+       "shared_key" => shared_key,
+       "log_type" => log_type,
+       "key_names" => key_names,
+       "time_generated_field" => time_generated_field
+     }
+   }
+
+   let(:azure_loganalytics_output) { LogStash::Outputs::AzureLogAnalytics.new(azure_loganalytics_config) }
+
+   before do
+     azure_loganalytics_output.register
+   end
+
+   describe "#flush" do
+     it "Should successfully send the event to Azure Log Analytics" do
+       events = []
+       log1 = {
+         :logid => "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
+         :date => "2017-04-22 09:44:32 JST",
+         :processing_time => "372",
+         :remote => "101.202.74.59",
+         :user => "-",
+         :method => "GET / HTTP/1.1",
+         :status => "304",
+         :size => "-",
+         :referer => "-",
+         :agent => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:27.0) Gecko/20100101 Firefox/27.0",
+         :eventtime => "2017-04-22T01:44:32Z"
+       }
+
+       log2 = {
+         :logid => "7260iswx-8034-4cc3-uirtx-f068dd4cd659",
+         :date => "2017-04-22 09:45:14 JST",
+         :processing_time => "105",
+         :remote => "201.78.74.59",
+         :user => "-",
+         :method => "GET /manager/html HTTP/1.1",
+         :status => "200",
+         :size => "-",
+         :referer => "-",
+         :agent => "Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0",
+         :eventtime => "2017-04-22T01:45:14Z"
+       }
+
+       event1 = LogStash::Event.new(log1)
+       event2 = LogStash::Event.new(log2)
+       azure_loganalytics_output.receive(event1)
+       azure_loganalytics_output.receive(event2)
+       events.push(event1)
+       events.push(event2)
+       expect {azure_loganalytics_output.flush(events)}.to_not raise_error
+     end
+   end
+
+ end
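Note that this spec posts real events, so the placeholder `customer_id` and `shared_key` above must be replaced with valid workspace credentials before it can pass. With those in place, the standard plugin test flow should apply:

```
bundle install
bundle exec rspec spec/outputs/azure_loganalytics_spec.rb
```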
metadata ADDED
@@ -0,0 +1,131 @@
+ --- !ruby/object:Gem::Specification
+ name: logstash-output-azure_loganalytics
+ version: !ruby/object:Gem::Version
+   version: 0.5.0
+ platform: ruby
+ authors:
+ - Yoichi Kawasaki
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2020-07-21 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 1.8.0
+   name: rest-client
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 1.8.0
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 0.4.0
+   name: azure-loganalytics-datacollector-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 0.4.0
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - "<="
+       - !ruby/object:Gem::Version
+         version: '2.99'
+   name: logstash-core-plugin-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - "<="
+       - !ruby/object:Gem::Version
+         version: '2.99'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-codec-plain
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-devutils
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ description: logstash output plugin to store events into Azure Log Analytics
+ email: yoichi.kawasaki@outlook.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - CONTRIBUTORS
+ - Gemfile
+ - LICENSE
+ - README.md
+ - VERSION
+ - lib/logstash/outputs/azure_loganalytics.rb
+ - logstash-output-azure_loganalytics.gemspec
+ - spec/outputs/azure_loganalytics_spec.rb
+ homepage: http://github.com/yokawasa/logstash-output-azure_loganalytics
+ licenses:
+ - Apache License (2.0)
+ metadata:
+   logstash_plugin: 'true'
+   logstash_group: output
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.6.8
+ signing_key:
+ specification_version: 4
+ summary: logstash output plugin to store events into Azure Log Analytics
+ test_files:
+ - spec/outputs/azure_loganalytics_spec.rb