logstash-input-azurenlogtable 0.1.1
- checksums.yaml +7 -0
- data/CHANGELOG.md +2 -0
- data/CONTRIBUTORS +10 -0
- data/DEVELOPER.md +2 -0
- data/Gemfile +4 -0
- data/LICENSE +11 -0
- data/README.md +86 -0
- data/lib/logstash/inputs/azurenlogtable.rb +202 -0
- data/logstash-input-azurenlogtable.gemspec +27 -0
- data/spec/inputs/azurenlogtable_spec.rb +11 -0
- metadata +152 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
+---
+SHA256:
+  metadata.gz: 280844fb8f1423b2ea804f4ed6824f757abd57357bef91a51342c148737b0135
+  data.tar.gz: 379ef19a4cfb62ac364741c83878cba1e5f71ef81ed4de9283339e0ae98be6a7
+SHA512:
+  metadata.gz: 209e8beb94aa450072f1cc81d09971a11f769c4c9821bacdb1d8b3d49941c199a33d68263662d56afb90703483545143143ec6a0ce6155265c7cafce25ce1276
+  data.tar.gz: a4149fc9182341eec8f904dc74b25c0e82f189b19fbb1d43d2205b52431935fc5fe891804c522b8e77bd60dcefaadc84fefa2893a5291f692e89703996619d52
data/CHANGELOG.md
ADDED
data/CONTRIBUTORS
ADDED
@@ -0,0 +1,10 @@
+The following is a list of people who have contributed ideas, code, bug
+reports, or in general have helped logstash along its way.
+
+Contributors:
+* he.jianpeng - he.jianpeng@oe.21vianet.com
+
+Note: If you've sent us patches, bug reports, or otherwise contributed to
+Logstash, and you aren't on the list above and want to be, please let us know
+and we'll make sure you're here. Contributions from folks like you are what make
+open source awesome.
data/DEVELOPER.md
ADDED
data/Gemfile
ADDED
data/LICENSE
ADDED
@@ -0,0 +1,11 @@
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
data/README.md
ADDED
@@ -0,0 +1,86 @@
+# Logstash Plugin
+
+This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
+
+## Documentation
+
+Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).
+
+- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
+- For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide
+
+## Need Help?
+
+Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+
+## Developing
+
+### 1. Plugin Development and Testing
+
+#### Code
+- To get started, you'll need JRuby with the Bundler gem installed.
+
+- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
+
+- Install dependencies
+```sh
+bundle install
+```
+
+#### Test
+
+- Update your dependencies
+
+```sh
+bundle install
+```
+
+- Run tests
+
+```sh
+bundle exec rspec
+```
+
+### 2. Running your unpublished Plugin in Logstash
+
+#### 2.1 Run in a local Logstash clone
+
+- Edit Logstash `Gemfile` and add the local plugin path, for example:
+```ruby
+gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+```
+- Install plugin
+```sh
+bin/logstash-plugin install --no-verify
+```
+- Run Logstash with your plugin
+```sh
+bin/logstash -e 'filter {awesome {}}'
+```
+At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+#### 2.2 Run in an installed Logstash
+
+You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+- Build your plugin gem
+```sh
+gem build logstash-filter-awesome.gemspec
+```
+- Install the plugin from the Logstash home
+```sh
+bin/logstash-plugin install /your/local/plugin/logstash-filter-awesome.gem
+```
+- Start Logstash and proceed to test the plugin
+
+## Contributing
+
+All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+It is more important to the community that you are able to contribute.
+
+For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
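The README above uses a generic `logstash-filter-awesome` placeholder. For this gem specifically, a minimal input configuration might look like the following sketch; the account name, key, and table name are placeholder values, and every option shown is one the plugin declares:

```
input {
  azurenlogtable {
    account_name => "mystorageaccount"          # placeholder
    access_key   => "<base64 storage key>"      # or provide sas_token instead
    table_name   => "NLogEntries"               # placeholder
    data_latency_minutes => 1
    idle_delay_seconds   => 15
  }
}
output {
  stdout { codec => rubydebug }
}
```

Either `access_key` or `sas_token` must be supplied; the plugin builds its table endpoint from `account_name` and `endpoint` (default `core.windows.net`).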
data/lib/logstash/inputs/azurenlogtable.rb
ADDED
@@ -0,0 +1,202 @@
+# encoding: utf-8
+require "logstash/inputs/base"
+require "logstash/namespace"
+require "time"
+require "azure/storage"
+
+# Reads NLog entries from an Azure Storage Table and emits one event per entity.
+
+class LogStash::Inputs::Azurenlogtable < LogStash::Inputs::Base
+  config_name "azurenlogtable"
+
+  config :account_name, :validate => :string
+  config :access_key, :validate => :string, :required => false # not required, as the sas_token can alternatively be provided
+  config :sas_token, :validate => :string, :required => false # not required, as the access_key can alternatively be provided
+  config :table_name, :validate => :string
+
+  config :collection_start_time_utc, :validate => :string, :default => nil # the actual value is set in the ctor (now - data_latency_minutes - 1)
+  config :idle_delay_seconds, :validate => :number, :default => 15
+  config :endpoint, :validate => :string, :default => "core.windows.net"
+
+  # Default 1 minute delay to ensure all data is published to the table before querying.
+  # See issue #23 for more: https://github.com/Azure/azure-diagnostics-tools/issues/23
+  config :data_latency_minutes, :validate => :number, :default => 1
+
+  # Number of past queries to be run, so we don't miss late-arriving data
+  config :past_queries_count, :validate => :number, :default => 5
+
+  # Page size (:top) for table queries; referenced below, so declared here (default is illustrative)
+  config :entity_count_to_process, :validate => :number, :default => 100
+
+  TICKS_SINCE_EPOCH = Time.utc(0001, 01, 01).to_i * 10000000
+
+  def initialize(*args)
+    super(*args)
+    if @collection_start_time_utc.nil?
+      @collection_start_time_utc = (Time.now - (60 * @data_latency_minutes) - 60).iso8601
+      @logger.debug("collection_start_time_utc = #{@collection_start_time_utc}")
+    end
+  end # initialize
+
+  public
+  def register
+    user_agent = "logstash-input-azurenlogtable-0.1.0"
+
+    if @sas_token.nil?
+      @client = Azure::Storage::Client.create(
+        :storage_account_name => @account_name,
+        :storage_access_key => @access_key,
+        :storage_table_host => "https://#{@account_name}.table.#{@endpoint}",
+        :user_agent_prefix => user_agent)
+    else
+      @client = Azure::Storage::Client.create(
+        :storage_account_name => @account_name,
+        :storage_sas_token => @sas_token,
+        :storage_table_host => "https://#{@account_name}.table.#{@endpoint}",
+        :user_agent_prefix => user_agent)
+    end
+
+    @azure_table_service = @client.table_client
+    @last_timestamp = partitionkey_from_datetime(@collection_start_time_utc)
+    @idle_delay = @idle_delay_seconds
+  end # def register
+
+  public
+  def run(output_queue)
+    while !stop?
+      @logger.debug("Starting process method @" + Time.now.to_s)
+      process(output_queue)
+      @logger.debug("Starting delay of: " + @idle_delay.to_s + " seconds @" + Time.now.to_s)
+      sleep @idle_delay
+    end # while
+  end # run
+
+  def process(output_queue)
+    @until_timestamp = partitionkey_from_datetime(Time.now.iso8601)
+    last_good_timestamp = nil
+
+    log_count = 0
+
+    query = build_latent_query
+    return if query.nil? # last_timestamp is in the future; nothing to query yet
+    query.reset
+    query.run(->(entity) {
+      last_good_timestamp = on_new_data(entity, output_queue, last_good_timestamp)
+      log_count += 1
+    })
+
+    @logger.debug("log total count => #{log_count}")
+    if !last_good_timestamp.nil?
+      @last_timestamp = last_good_timestamp
+    end
+
+  rescue => e
+    @logger.error("Oh My, An error occurred. Error:#{e}: Trace: #{e.backtrace}", :exception => e)
+    raise
+  end # process
+
+  def build_latent_query
+    @logger.debug("from #{@last_timestamp} to #{@until_timestamp}")
+    if @last_timestamp > @until_timestamp
+      @logger.debug("last_timestamp is in the future. Will not run any query!")
+      return nil
+    end
+    query_filter = "(PartitionKey gt '#{@last_timestamp}' and PartitionKey lt '#{@until_timestamp}')"
+    query_filter = query_filter.gsub('"', '')
+    return AzureQuery.new(@logger, @azure_table_service, @table_name, query_filter, @last_timestamp.to_s + "-" + @until_timestamp.to_s, @entity_count_to_process)
+  end
+
+  def on_new_data(entity, output_queue, last_good_timestamp)
+    #@logger.debug("new event")
+
+    event = LogStash::Event.new(entity.properties)
+    event.set("type", @table_name)
+    @logger.debug("new event:" + event.to_hash.to_s)
+
+    decorate(event)
+    last_good_timestamp = event.get('PartitionKey')
+    output_queue << event
+    return last_good_timestamp
+  end
+
+  # Windows Azure Diagnostics' algorithm for determining the partition key based on time is as follows:
+  # 1. Take the time in UTC without seconds.
+  # 2. Convert it into .NET ticks.
+  # 3. Add a '0' prefix.
+  def partitionkey_from_datetime(time_string)
+    if time_string.nil?
+      @logger.warn("partitionkey_from_datetime called with invalid time_string.")
+      collection_time = (Time.now - (60 * @data_latency_minutes) - 60)
+    else
+      begin
+        collection_time = Time.parse(time_string)
+      rescue => e
+        @logger.error("partitionkey_from_datetime failed with time_string => " + time_string, :exception => e)
+        collection_time = (Time.now - (60 * @data_latency_minutes) - 60)
+      end
+    end
+    if collection_time
+      #@logger.debug("collection time parsed successfully #{collection_time}")
+    else
+      raise(ArgumentError, "Could not parse the time_string => #{time_string}")
+    end # if else block
+
+    collection_time -= collection_time.sec
+    ticks = to_ticks(collection_time)
+    "0#{ticks}"
+  end # partitionkey_from_datetime
+
+  # Convert time to .NET ticks
+  def to_ticks(time_to_convert)
+    #@logger.debug("Converting time to ticks")
+    time_to_convert.to_i * 10000000 - TICKS_SINCE_EPOCH
+  end # to_ticks
+
+  def stop
+    # nothing to do in this case, so it is not necessary to define stop
+    # examples of common "stop" tasks:
+    #  * close sockets (unblocking blocking reads/accepts)
+    #  * cleanup temporary files
+    #  * terminate spawned threads
+  end
+end # class LogStash::Inputs::Azurenlogtable
+
+class AzureQuery
+  def initialize(logger, azure_table_service, table_name, query_str, query_id, entity_count_to_process)
+    @logger = logger
+    @query_str = query_str
+    @query_id = query_id
+    @entity_count_to_process = entity_count_to_process
+    @azure_table_service = azure_table_service
+    @table_name = table_name
+    @continuation_token = nil
+  end
+
+  def reset
+    @continuation_token = nil
+  end
+
+  def id
+    return @query_id
+  end
+
+  def run(on_result_cbk)
+    results_found = false
+    @logger.debug("[#{@query_id}]Query filter: " + @query_str)
+    begin
+      @logger.debug("[#{@query_id}]Running query. continuation_token: #{@continuation_token}")
+      query = { :top => @entity_count_to_process, :filter => @query_str, :continuation_token => @continuation_token }
+      result = @azure_table_service.query_entities(@table_name, query)
+
+      if result and result.length > 0
+        results_found = true
+        @logger.debug("[#{@query_id}] #{result.length} results found.")
+        result.each do |entity|
+          on_result_cbk.call(entity)
+        end
+      end
+
+      @continuation_token = result.continuation_token
+    end until !@continuation_token
+
+    return results_found
+  end
+end # class AzureQuery
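The partition-key scheme used above (UTC time with seconds dropped, converted to .NET ticks, prefixed with `0`) can be checked with a small standalone sketch. The constants and arithmetic mirror the plugin source; the sample timestamp and the helper name `partitionkey_from_time` are just for illustration:

```ruby
require "time"

# Unix-epoch offset expressed in .NET ticks (100 ns units since 0001-01-01),
# computed exactly as the plugin's TICKS_SINCE_EPOCH constant (a negative number).
TICKS_SINCE_EPOCH = Time.utc(1, 1, 1).to_i * 10_000_000

# Seconds since the Unix epoch scaled to ticks, shifted to the .NET epoch.
def to_ticks(time)
  time.to_i * 10_000_000 - TICKS_SINCE_EPOCH
end

# Drop the seconds component, then format as a '0'-prefixed tick string.
def partitionkey_from_time(time)
  time -= time.sec
  "0#{to_ticks(time)}"
end

key = partitionkey_from_time(Time.utc(2019, 9, 27, 0, 0, 30))
puts key  # => "0637051392000000000"
```

Because the keys are fixed-width digit strings, they sort lexicographically in time order, which is what makes the `PartitionKey gt '...' and PartitionKey lt '...'` string comparisons in `build_latent_query` select a time range.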
data/logstash-input-azurenlogtable.gemspec
ADDED
@@ -0,0 +1,27 @@
+Gem::Specification.new do |s|
+  s.name          = 'logstash-input-azurenlogtable'
+  s.version       = '0.1.1'
+  s.licenses      = ['Apache-2.0']
+  s.summary       = 'This plugin collects NLog data from Azure Storage Tables.'
+  s.description   = 'This gem is a Logstash plugin. It reads and parses NLog data from Azure Storage Tables.'
+  s.homepage      = 'https://github.com/zirain/logstash-input-azurenlogtable'
+  s.authors       = ['zirain']
+  s.email         = 'zirain2009@gmail.com'
+  s.require_paths = ['lib']
+
+  # Files
+  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
+  # Tests
+  s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+  # Special flag to let us know this is actually a logstash plugin
+  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }
+
+  # Gem dependencies
+  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
+  s.add_runtime_dependency "logstash-codec-plain"
+  s.add_runtime_dependency 'azure-storage', '~> 0.13.0.preview'
+  s.add_runtime_dependency 'stud', '~> 0.0', '>= 0.0.22'
+  s.add_development_dependency 'logstash-devutils', '>= 1.1.0'
+  s.add_development_dependency 'logging'
+end
data/spec/inputs/azurenlogtable_spec.rb
ADDED
@@ -0,0 +1,11 @@
+# encoding: utf-8
+require "logstash/devutils/rspec/spec_helper"
+require "logstash/inputs/azurenlogtable"
+
+describe LogStash::Inputs::Azurenlogtable do
+
+  it_behaves_like "an interruptible input plugin" do
+    let(:config) { { "interval" => 100 } }
+  end
+
+end
metadata
ADDED
@@ -0,0 +1,152 @@
+--- !ruby/object:Gem::Specification
+name: logstash-input-azurenlogtable
+version: !ruby/object:Gem::Version
+  version: 0.1.1
+platform: ruby
+authors:
+- zirain
+autorequire:
+bindir: bin
+cert_chain: []
+date: 2019-09-27 00:00:00.000000000 Z
+dependencies:
+- !ruby/object:Gem::Dependency
+  name: logstash-core-plugin-api
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '1.60'
+    - - "<="
+      - !ruby/object:Gem::Version
+        version: '2.99'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '1.60'
+    - - "<="
+      - !ruby/object:Gem::Version
+        version: '2.99'
+- !ruby/object:Gem::Dependency
+  name: logstash-codec-plain
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+- !ruby/object:Gem::Dependency
+  name: azure-storage
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: 0.13.0.preview
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: 0.13.0.preview
+- !ruby/object:Gem::Dependency
+  name: stud
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.0'
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 0.0.22
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.0'
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 0.0.22
+- !ruby/object:Gem::Dependency
+  name: logstash-devutils
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 1.1.0
+  type: :development
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 1.1.0
+- !ruby/object:Gem::Dependency
+  name: logging
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :development
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+description: This gem is a Logstash plugin. It reads and parses NLog data from Azure
+  Storage Tables.
+email: zirain2009@gmail.com
+executables: []
+extensions: []
+extra_rdoc_files: []
+files:
+- CHANGELOG.md
+- CONTRIBUTORS
+- DEVELOPER.md
+- Gemfile
+- LICENSE
+- README.md
+- lib/logstash/inputs/azurenlogtable.rb
+- logstash-input-azurenlogtable.gemspec
+- spec/inputs/azurenlogtable_spec.rb
+homepage: https://github.com/zirain/logstash-input-azurenlogtable
+licenses:
+- Apache-2.0
+metadata:
+  logstash_plugin: 'true'
+  logstash_group: input
+post_install_message:
+rdoc_options: []
+require_paths:
+- lib
+required_ruby_version: !ruby/object:Gem::Requirement
+  requirements:
+  - - ">="
+    - !ruby/object:Gem::Version
+      version: '0'
+required_rubygems_version: !ruby/object:Gem::Requirement
+  requirements:
+  - - ">="
+    - !ruby/object:Gem::Version
+      version: '0'
+requirements: []
+rubyforge_project:
+rubygems_version: 2.7.7
+signing_key:
+specification_version: 4
+summary: This plugin collects NLog data from Azure Storage Tables.
+test_files:
+- spec/inputs/azurenlogtable_spec.rb