logstash-output-azure_loganalytics 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +7 -0
- data/CHANGELOG.md +3 -0
- data/CONTRIBUTORS +10 -0
- data/Gemfile +2 -0
- data/LICENSE +13 -0
- data/README.md +119 -0
- data/VERSION +1 -0
- data/lib/logstash/outputs/azure_loganalytics.rb +94 -0
- data/logstash-output-azure_loganalytics.gemspec +26 -0
- data/spec/outputs/azure_loganalytics_spec.rb +68 -0
- metadata +131 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA1:
  metadata.gz: 960c239563a176eb66e1b6c48b6dac8c5edd2df5
  data.tar.gz: a89509bfd7dcf3f3211882de0009f37778b0d997
SHA512:
  metadata.gz: 7fe84eace2a157cb4c00730f666d0679bd923ccccea515967f0b440dc413d59a5ace185fef46865b34741ef619d498b17003349d5414318b1e3e07445da4757d
  data.tar.gz: b53d6da49d70b80877c12d87d6af8adfe98a20ba92d3c1e692e87e898a9699381594b8c5c0c76f06a90b7c9fd023f26d31ed8398a05a846d9346bf98eaeb0415
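These digests can be checked against a downloaded copy of the gem's payload files. A minimal sketch using Ruby's standard Digest library (the file path is hypothetical):

```ruby
require 'digest'

# Compute the two digests recorded in checksums.yaml for a given file
# (e.g. a locally saved metadata.gz or data.tar.gz; path is hypothetical).
def gem_checksums(path)
  data = File.binread(path)
  {
    'SHA1'   => Digest::SHA1.hexdigest(data),
    'SHA512' => Digest::SHA512.hexdigest(data),
  }
end
```

Comparing the returned hex strings against the values above verifies that a downloaded archive was not corrupted in transit.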
data/CHANGELOG.md
ADDED
data/CONTRIBUTORS
ADDED
@@ -0,0 +1,10 @@
The following is a list of people who have contributed ideas, code, bug
reports, or in general have helped logstash along its way.

Contributors:
* Yoichi Kawasaki (yokawasa)

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
and we'll make sure you're here. Contributions from folks like you are what make
open source awesome.
data/Gemfile
ADDED
data/LICENSE
ADDED
@@ -0,0 +1,13 @@
Copyright (c) 2012-2015 Elasticsearch <http://www.elastic.co>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
data/README.md
ADDED
@@ -0,0 +1,119 @@
# Azure Log Analytics output plugin for Logstash

logstash-output-azure_loganalytics is a Logstash plugin that outputs events to Azure Log Analytics. [Logstash](https://www.elastic.co/products/logstash) is an open source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite [destinations](https://www.elastic.co/products/logstash). [Log Analytics](https://azure.microsoft.com/en-us/services/log-analytics/) is a service in Operations Management Suite (OMS) that helps you collect and analyze data generated by resources in your cloud and on-premises environments. It gives you real-time insights using integrated search and custom dashboards to readily analyze millions of records across all of your workloads and servers, regardless of their physical location. The plugin stores incoming events in Azure Log Analytics by leveraging the [Log Analytics HTTP Data Collector API](https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api).

## Installation

You can install this plugin using the Logstash `plugin` or `logstash-plugin` (for newer versions of Logstash) command:
```
bin/plugin install logstash-output-azure_loganalytics
# or, for newer versions of Logstash:
bin/logstash-plugin install logstash-output-azure_loganalytics
```
Please see the [Logstash reference](https://www.elastic.co/guide/en/logstash/current/offline-plugins.html) for more information.
## Configuration

```
output {
    azure_loganalytics {
        customer_id => "<OMS WORKSPACE ID>"
        shared_key => "<CLIENT AUTH KEY>"
        log_type => "<LOG TYPE NAME>"
        key_names => ['key1','key2','key3'..] ## list of key names (array)
        flush_items => <FLUSH_ITEMS_NUM>
        flush_interval_time => <FLUSH INTERVAL TIME(sec)>
    }
}
```

* **customer\_id (required)** - Your Operations Management Suite workspace ID.
* **shared\_key (required)** - The primary or the secondary Connected Sources client authentication key.
* **log\_type (required)** - The name of the event type being submitted to Log Analytics. It must contain only alphabetic characters.
* **key\_names (optional)** - Default: [] (empty array). List of key names in the incoming record to deliver. If empty, the whole record is delivered.
* **flush\_items (optional)** - Default: 50. Max number of items to buffer before flushing (1 - 1000).
* **flush\_interval\_time (optional)** - Default: 5. Max number of seconds to wait between flushes.
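To illustrate the `key_names` setting, here is a minimal plain-Ruby sketch (dummy field names; not the plugin code itself) of the selection the plugin applies when flushing: a non-empty `key_names` delivers only the listed keys, while an empty list delivers the whole record.

```ruby
# Sketch of the key_names selection (dummy data, illustrative only).
def select_keys(event_hash, key_names)
  # An empty key_names list means the whole record is delivered.
  return event_hash if key_names.empty?
  key_names.each_with_object({}) do |key, doc|
    doc[key] = event_hash[key] if event_hash.include?(key)
  end
end

event = { 'status' => '304', 'agent' => 'Mozilla/5.0', 'path' => '/var/log/access.log' }
select_keys(event, ['status', 'agent'])
# => {"status"=>"304", "agent"=>"Mozilla/5.0"}
```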
## Tests

Here is an example configuration where Logstash's event source and destination are configured as an Apache2 access log and Azure Log Analytics, respectively.

### Example Configuration
```
input {
    file {
        path => "/var/log/apache2/access.log"
        start_position => "beginning"
    }
}

filter {
    if [path] =~ "access" {
        mutate { replace => { "type" => "apache_access" } }
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
    }
    date {
        match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
}

output {
    azure_loganalytics {
        customer_id => "818f7bbc-8034-4cc3-b97d-f068dd4cd659"
        shared_key => "ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksXxcBmQQHw==(dummy)"
        log_type => "ApacheAccessLog"
        key_names => ['logid','date','processing_time','remote','user','method','status','agent']
        flush_items => 10
        flush_interval_time => 5
    }
    # for debug
    stdout { codec => rubydebug }
}
```
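The `flush_items` and `flush_interval_time` settings in this configuration are handled by `Stud::Buffer`, which the plugin mixes in. A toy sketch of the size-based trigger only (this is not the stud gem, and the interval-based trigger is omitted): events accumulate until the batch reaches the size limit, then the whole batch is flushed at once.

```ruby
# Toy illustration of size-based buffering (flush_items => 10 above).
# Not the stud gem; time-based flushing is omitted for brevity.
class ToyBuffer
  def initialize(max_items, &flusher)
    @max_items = max_items
    @flusher = flusher
    @items = []
  end

  def receive(event)
    @items << event
    if @items.size >= @max_items
      @flusher.call(@items)   # deliver the whole batch
      @items = []
    end
  end
end

batches = []
buf = ToyBuffer.new(10) { |events| batches << events.size }
25.times { |i| buf.receive("event#{i}") }
batches  # => [10, 10]; five events remain buffered until the next trigger
```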

You can find example configuration files in logstash-output-azure_loganalytics/examples.

### Run the plugin with the example configuration

Now run Logstash with the example configuration:
```
# Test your Logstash configuration before actually running Logstash
bin/logstash -f logstash-apache2-to-loganalytics.conf --configtest
# run
bin/logstash -f logstash-apache2-to-loganalytics.conf
```

Here is the expected output for a sample input (an Apache2 access log line):

<u>Apache2 access log</u>
```
106.143.121.169 - - [29/Dec/2016:01:38:16 +0000] "GET /test.html HTTP/1.1" 304 179 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36"
```

<u>Output (rubydebug)</u>
```
{
        "message" => "106.143.121.169 - - [29/Dec/2016:01:38:16 +0000] \"GET /test.html HTTP/1.1\" 304 179 \"-\" \"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\"",
       "@version" => "1",
     "@timestamp" => "2016-12-29T01:38:16.000Z",
           "path" => "/var/log/apache2/access.log",
           "host" => "yoichitest01",
           "type" => "apache_access",
       "clientip" => "106.143.121.169",
          "ident" => "-",
           "auth" => "-",
      "timestamp" => "29/Dec/2016:01:38:16 +0000",
           "verb" => "GET",
        "request" => "/test.html",
    "httpversion" => "1.1",
       "response" => "304",
          "bytes" => "179",
       "referrer" => "\"-\"",
          "agent" => "\"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\""
}
```

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/yokawasa/logstash-output-azure_loganalytics.
data/VERSION
ADDED
@@ -0,0 +1 @@
0.1.0
data/lib/logstash/outputs/azure_loganalytics.rb
ADDED
@@ -0,0 +1,94 @@
# encoding: utf-8
require "logstash/outputs/base"
require "logstash/namespace"
require "stud/buffer"

class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
  include Stud::Buffer

  config_name "azure_loganalytics"

  # Your Operations Management Suite workspace ID
  config :customer_id, :validate => :string, :required => true

  # The primary or the secondary Connected Sources client authentication key
  config :shared_key, :validate => :string, :required => true

  # The name of the event type that is being submitted to Log Analytics.
  # It must contain only alphabetic characters.
  config :log_type, :validate => :string, :required => true

  # List of key names in the incoming record to deliver.
  config :key_names, :validate => :array, :default => []

  # Max number of items to buffer before flushing. Default 50.
  config :flush_items, :validate => :number, :default => 50

  # Max number of seconds to wait between flushes. Default 5.
  config :flush_interval_time, :validate => :number, :default => 5

  public
  def register
    require 'azure/loganalytics/datacollectorapi/client'

    ## Configure
    if not @log_type.match(/^[[:alpha:]]+$/)
      raise ArgumentError, 'log_type must be only alpha characters'
    end

    ## Start
    @client = Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id, @shared_key)

    buffer_initialize(
      :max_items => @flush_items,
      :max_interval => @flush_interval_time,
      :logger => @logger
    )
  end # def register

  public
  def receive(event)
    # Simply save an event for later delivery
    buffer_receive(event)
  end # def receive

  # called from Stud::Buffer#buffer_flush when there are events to flush
  public
  def flush(events, close=false)
    documents = [] # the array of hashes to send to Azure Log Analytics
    events.each do |event|
      document = {}
      event_hash = event.to_hash()
      if @key_names.length > 0
        @key_names.each do |key|
          if event_hash.include?(key)
            document[key] = event_hash[key]
          end
        end
      else
        document = event_hash
      end
      # Skip if the document doesn't contain any items
      next if (document.keys).length < 1

      documents.push(document)
    end

    # Skip in case there are no candidate documents to deliver
    if documents.length < 1
      return
    end

    begin
      res = @client.post_data(@log_type, documents)
      if not Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
        @logger.error("DataCollector API request failure: error code: #{res.code}, data=>" + (documents.to_json).to_s)
      end
    rescue Exception => ex
      @logger.error("Exception occurred in posting to DataCollector API: '#{ex}', data=>" + (documents.to_json).to_s)
    end

  end # def flush

end # class LogStash::Outputs::AzureLogAnalytics
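For reference, the `azure-loganalytics-datacollector-api` client that `post_data` delegates to signs each request with HMAC-SHA256, per the HTTP Data Collector API documentation linked in the README. A hedged sketch of that signing scheme with dummy credentials (illustrative only; the gem constructs this header internally):

```ruby
require 'base64'
require 'openssl'
require 'time'

# Build the Authorization header for the Log Analytics HTTP Data Collector
# API as described in its documentation (dummy inputs; not real credentials).
def build_authorization(customer_id, shared_key, body, date = Time.now.httpdate)
  string_to_sign = "POST\n#{body.bytesize}\napplication/json\nx-ms-date:#{date}\n/api/logs"
  decoded_key = Base64.decode64(shared_key)  # the portal shows the key base64-encoded
  signature = Base64.strict_encode64(
    OpenSSL::HMAC.digest('sha256', decoded_key, string_to_sign)
  )
  "SharedKey #{customer_id}:#{signature}"
end

header = build_authorization(
  '818f7bbc-8034-4cc3-b97d-f068dd4cd659',  # workspace ID (dummy)
  Base64.strict_encode64('dummy-secret'),  # shared key (dummy)
  '[{"status":"304"}]',
  'Thu, 29 Dec 2016 01:38:16 GMT'
)
# header => "SharedKey <workspace id>:<44-char base64 HMAC>"
```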
data/logstash-output-azure_loganalytics.gemspec
ADDED
@@ -0,0 +1,26 @@
Gem::Specification.new do |s|
  s.name = 'logstash-output-azure_loganalytics'
  s.version = File.read("VERSION").strip
  s.authors = ["Yoichi Kawasaki"]
  s.email = "yoichi.kawasaki@outlook.com"
  s.summary = %q{logstash output plugin to store events into Azure Log Analytics}
  s.description = s.summary
  s.homepage = "http://github.com/yokawasa/logstash-output-azure_loganalytics"
  s.licenses = ["Apache License (2.0)"]
  s.require_paths = ["lib"]

  # Files
  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT', 'VERSION']
  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }

  # Gem dependencies
  s.add_runtime_dependency "rest-client"
  s.add_runtime_dependency "azure-loganalytics-datacollector-api", ">= 0.1.2"
  s.add_runtime_dependency "logstash-core", ">= 2.0.0", "< 3.0.0"
  s.add_runtime_dependency "logstash-codec-plain"
  s.add_development_dependency "logstash-devutils"
end
data/spec/outputs/azure_loganalytics_spec.rb
ADDED
@@ -0,0 +1,68 @@
# encoding: utf-8
require "logstash/devutils/rspec/spec_helper"
require "logstash/outputs/azure_loganalytics"
require "logstash/codecs/plain"
require "logstash/event"

describe LogStash::Outputs::AzureLogAnalytics do

  let(:customer_id) { '<Customer ID aka WorkspaceID String>' }
  let(:shared_key) { '<Primary Key String>' }
  let(:log_type) { 'ApacheAccessLog' }
  let(:key_names) { ['logid','date','processing_time','remote','user','method','status','agent'] }

  let(:azure_loganalytics_config) {
    {
      "customer_id" => customer_id,
      "shared_key" => shared_key,
      "log_type" => log_type,
      "key_names" => key_names
    }
  }

  let(:azure_loganalytics_output) { LogStash::Outputs::AzureLogAnalytics.new(azure_loganalytics_config) }

  before do
    azure_loganalytics_output.register
  end

  describe "#flush" do
    it "Should successfully send the event to Azure Log Analytics" do
      events = []
      log1 = {
        :logid => "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
        :date => "2016-12-10 09:44:32 JST",
        :processing_time => "372",
        :remote => "101.202.74.59",
        :user => "-",
        :method => "GET / HTTP/1.1",
        :status => "304",
        :size => "-",
        :referer => "-",
        :agent => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:27.0) Gecko/20100101 Firefox/27.0"
      }

      log2 = {
        :logid => "7260iswx-8034-4cc3-uirtx-f068dd4cd659",
        :date => "2016-12-10 09:45:14 JST",
        :processing_time => "105",
        :remote => "201.78.74.59",
        :user => "-",
        :method => "GET /manager/html HTTP/1.1",
        :status => "200",
        :size => "-",
        :referer => "-",
        :agent => "Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0"
      }

      event1 = LogStash::Event.new(log1)
      event2 = LogStash::Event.new(log2)
      azure_loganalytics_output.receive(event1)
      azure_loganalytics_output.receive(event2)
      events.push(event1)
      events.push(event2)
      expect {azure_loganalytics_output.flush(events)}.to_not raise_error
    end
  end

end
metadata
ADDED
@@ -0,0 +1,131 @@
--- !ruby/object:Gem::Specification
name: logstash-output-azure_loganalytics
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- Yoichi Kawasaki
autorequire:
bindir: bin
cert_chain: []
date: 2016-12-29 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: rest-client
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 0.1.2
  name: azure-loganalytics-datacollector-api
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 0.1.2
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 2.0.0
    - - "<"
      - !ruby/object:Gem::Version
        version: 3.0.0
  name: logstash-core
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 2.0.0
    - - "<"
      - !ruby/object:Gem::Version
        version: 3.0.0
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-codec-plain
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-devutils
  prerelease: false
  type: :development
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
description: logstash output plugin to store events into Azure Log Analytics
email: yoichi.kawasaki@outlook.com
executables: []
extensions: []
extra_rdoc_files: []
files:
- CHANGELOG.md
- CONTRIBUTORS
- Gemfile
- LICENSE
- README.md
- VERSION
- lib/logstash/outputs/azure_loganalytics.rb
- logstash-output-azure_loganalytics.gemspec
- spec/outputs/azure_loganalytics_spec.rb
homepage: http://github.com/yokawasa/logstash-output-azure_loganalytics
licenses:
- Apache License (2.0)
metadata:
  logstash_plugin: 'true'
  logstash_group: output
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project:
rubygems_version: 2.4.8
signing_key:
specification_version: 4
summary: logstash output plugin to store events into Azure Log Analytics
test_files:
- spec/outputs/azure_loganalytics_spec.rb