logstash-filter-jdbc_streaming 1.0.0
- checksums.yaml +7 -0
- data/CHANGELOG.md +3 -0
- data/Gemfile +2 -0
- data/LICENSE +13 -0
- data/README.md +98 -0
- data/lib/logstash/filters/jdbc_streaming.rb +196 -0
- data/lib/logstash/plugin_mixins/jdbc_streaming.rb +60 -0
- data/logstash-filter-jdbc_streaming.gemspec +29 -0
- data/spec/filters/jdbc_streaming_spec.rb +277 -0
- data/spec/integration/jdbcstreaming_spec.rb +154 -0
- metadata +146 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA1:
  metadata.gz: 0a1ec4d5c58f6ce550226691dca0007a13c4b136
  data.tar.gz: a3a51d17ea4a55cf57e74537a783c651cd737286
SHA512:
  metadata.gz: 6ce0c39c40c7f0e93776bd9abdf185cc3780d96d3742e051488db497175fc3929a9bfe04943bde7e26b6f0d1a915363e3e00e28a3c77d6de6242ec2ddadc30e7
  data.tar.gz: adf5c47acdac61a900d4f2288aa466bbcbdc28165ab7d6d28fc666eedb1e294024348127f789f69ec7f4c2b62d11c86491c627149d69d11f083672953f25b2b0
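The checksums.yaml above records SHA1 and SHA512 digests of the two packaged archives, which lets a consumer verify a downloaded gem's contents. A minimal Ruby sketch of such a check (the file path and expected value are illustrative, not taken from this gem):

```ruby
# Compare a file's SHA512 digest against a value recorded in checksums.yaml.
# Sketch only; paths and the expected digest below are placeholders.
require "digest"

def sha512_hexdigest(path)
  Digest::SHA512.file(path).hexdigest
end

# expected = "6ce0c39c40c7..."          # value copied from checksums.yaml
# actual   = sha512_hexdigest("data.tar.gz")
# raise "checksum mismatch" unless actual == expected
```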
data/CHANGELOG.md
ADDED
data/Gemfile
ADDED
data/LICENSE
ADDED
@@ -0,0 +1,13 @@
Copyright (c) 2012–2016 Elasticsearch <http://www.elastic.co>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
data/README.md
ADDED
@@ -0,0 +1,98 @@
# Logstash Plugin

[![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-filter-jdbc_streaming.svg)](https://travis-ci.org/logstash-plugins/logstash-filter-jdbc_streaming)

This is a plugin for [Logstash](https://github.com/elastic/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code are first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference at https://github.com/elastic/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies
```sh
bundle install
```

#### Test

- Update your dependencies

```sh
bundle install
```

- Run unit tests

```sh
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit the Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Install the plugin
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify

```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify

```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/filters/jdbc_streaming.rb
ADDED
@@ -0,0 +1,196 @@
# encoding: utf-8
require "logstash/filters/base"
require "logstash/namespace"
require "logstash/plugin_mixins/jdbc_streaming"
require "lru_redux"

# This filter executes a SQL query and stores the result set in the field
# specified as `target`.
# It will cache the results locally in an LRU cache with expiry.
#
# For example, you can load a row based on an id in the event:
#
# [source,ruby]
# filter {
#   jdbc_streaming {
#     jdbc_driver_library => "/path/to/mysql-connector-java-5.1.34-bin.jar"
#     jdbc_driver_class => "com.mysql.jdbc.Driver"
#     jdbc_connection_string => "jdbc:mysql://localhost:3306/mydatabase"
#     jdbc_user => "me"
#     jdbc_password => "secret"
#     statement => "select * from WORLD.COUNTRY WHERE Code = :code"
#     parameters => { "code" => "country_code"}
#     target => "country_details"
#   }
# }
#
module LogStash module Filters class JdbcStreaming < LogStash::Filters::Base
  class CachePayload
    attr_reader :payload

    def initialize
      @failure = false
      @payload = []
    end

    def push(data)
      @payload << data
    end

    def failed!
      @failure = true
    end

    def failed?
      @failure
    end

    def empty?
      @payload.empty?
    end
  end

  class RowCache
    def initialize(size, ttl)
      @cache = ::LruRedux::TTL::ThreadSafeCache.new(size, ttl)
    end

    def get(parameters)
      @cache.getset(parameters) { yield }
    end
  end

  class NoCache
    def initialize(size, ttl) end

    def get(statement)
      yield
    end
  end

  include LogStash::PluginMixins::JdbcStreaming

  config_name "jdbc_streaming"

  # Statement to execute.
  # To use parameters, use named parameter syntax, for example "SELECT * FROM MYTABLE WHERE ID = :id"
  config :statement, :validate => :string, :required => true

  # Hash of query parameters, for example `{ "id" => "id_field" }`
  config :parameters, :validate => :hash, :default => {}

  # Define the target field to store the extracted result(s).
  # The field is overwritten if it exists.
  config :target, :validate => :string, :required => true

  # Define a default object to use when the lookup fails to return a matching row.
  # Ensure that the key names of this object match the columns from the statement.
  config :default_hash, :validate => :hash, :default => {}

  # Append values to the `tags` field if a SQL error occurred
  config :tag_on_failure, :validate => :array, :default => ["_jdbcstreamingfailure"]

  # Append values to the `tags` field if no record was found and default values were used
  config :tag_on_default_use, :validate => :array, :default => ["_jdbcstreamingdefaultsused"]

  # Enable or disable caching, boolean true or false, defaults to true
  config :use_cache, :validate => :boolean, :default => true

  # The minimum number of seconds any entry should remain in the cache, defaults to 5 seconds.
  # A numeric value; you can use decimals, for example `{ "cache_expiration" => 0.25 }`.
  # If there are transient JDBC errors, the cache will store empty results for a given
  # parameter set and bypass the JDBC lookup (merging the default_hash into the event) until
  # the cache entry expires; then the JDBC lookup will be tried again for the same parameters.
  # Conversely, while the cache contains valid results, any external problem that would cause
  # JDBC errors will not be noticed for the cache_expiration period.
  config :cache_expiration, :validate => :number, :default => 5.0

  # The maximum number of cache entries that are stored, defaults to 500 entries.
  # The least recently used entry will be evicted.
  config :cache_size, :validate => :number, :default => 500

  # ----------------------------------------
  public

  def register
    convert_config_options
    prepare_connected_jdbc_cache
  end

  def filter(event)
    result = cache_lookup(event) # should return a CachePayload

    if result.failed?
      tag_failure(event)
    end

    if result.empty?
      tag_default(event)
      process_event(event, @default_array)
    else
      process_event(event, result.payload)
    end
  end

  # ----------------------------------------
  private

  def cache_lookup(event)
    params = prepare_parameters_from_event(event)
    @cache.get(params) do
      result = CachePayload.new
      begin
        query = @database[@statement, params] # returns a dataset
        @logger.debug? && @logger.debug("Executing JDBC query", :statement => @statement, :parameters => params)
        query.all do |row|
          result.push row.inject({}){|hash,(k,v)| hash[k.to_s] = v; hash} # stringify row keys
        end
      rescue ::Sequel::Error => e
        # all Sequel errors are a subclass of this; let all other standard or runtime errors bubble up
        result.failed!
        @logger.warn? && @logger.warn("Exception when executing JDBC query", :exception => e)
      end
      # if either no records are found or a Sequel exception occurs, the payload is
      # empty and the default can be substituted later.
      result
    end
  end

  def prepare_parameters_from_event(event)
    @symbol_parameters.inject({}) do |hash,(k,v)|
      value = event.get(event.sprintf(v))
      hash[k] = value.is_a?(::LogStash::Timestamp) ? value.time : value
      hash
    end
  end

  def tag_failure(event)
    @tag_on_failure.each do |tag|
      event.tag(tag)
    end
  end

  def tag_default(event)
    @tag_on_default_use.each do |tag|
      event.tag(tag)
    end
  end

  def process_event(event, value)
    # use deep clone here so other filter functions don't taint the cached payload by reference
    event.set(@target, ::LogStash::Util.deep_clone(value))
    filter_matched(event)
  end

  def convert_config_options
    # create these objects once; they will be cloned for every filter call anyway,
    # so let's not create a new object each time
    @symbol_parameters = @parameters.inject({}) {|hash,(k,v)| hash[k.to_sym] = v ; hash }
    @default_array = [@default_hash]
  end

  def prepare_connected_jdbc_cache
    klass = @use_cache ? RowCache : NoCache
    @cache = klass.new(@cache_size, @cache_expiration)
    prepare_jdbc_connection
  end
end end end # class LogStash::Filters::JdbcStreaming
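`RowCache` above delegates to `LruRedux::TTL::ThreadSafeCache#getset`: compute on miss, keyed by the full parameter hash, with entries expiring after `ttl` seconds. A minimal plain-Ruby sketch of that getset-with-TTL behavior (an illustration only, not the lru_redux implementation: no LRU eviction, no thread safety):

```ruby
# Sketch of the getset-with-TTL pattern RowCache relies on.
class TinyTtlCache
  def initialize(ttl)
    @ttl = ttl
    @store = {} # key => [expires_at, value]
  end

  # Return the cached value for key, or compute, store and return it.
  def getset(key)
    entry = @store[key]
    return entry[1] if entry && entry[0] > Time.now
    value = yield
    @store[key] = [Time.now + @ttl, value]
    value
  end
end

cache = TinyTtlCache.new(5.0)
calls = 0
2.times { cache.getset({ :ip => "10.1.1.1" }) { calls += 1; "row" } }
# the block ran once; the second getset was served from the cache
```

Because the key is the whole parameter hash, two events that resolve to the same parameters share one database round trip, which is exactly what the filter's `cache_lookup` exploits.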
data/lib/logstash/plugin_mixins/jdbc_streaming.rb
ADDED
@@ -0,0 +1,60 @@
# encoding: utf-8
require "logstash/config/mixin"

# A tentative abstraction of JDBC logic into a mixin
# for potential reuse in other plugins (input/output)
module LogStash module PluginMixins module JdbcStreaming

  # This method is called when someone includes this module
  def self.included(base)
    # Add these methods to the 'base' given.
    base.extend(self)
    base.setup_jdbc_config
  end

  public
  def setup_jdbc_config
    # JDBC driver library path to a third-party driver library.
    config :jdbc_driver_library, :validate => :path

    # JDBC driver class to load, for example "oracle.jdbc.OracleDriver" or "org.apache.derby.jdbc.ClientDriver"
    config :jdbc_driver_class, :validate => :string, :required => true

    # JDBC connection string
    config :jdbc_connection_string, :validate => :string, :required => true

    # JDBC user
    config :jdbc_user, :validate => :string

    # JDBC password
    config :jdbc_password, :validate => :password

    # Connection pool configuration.
    # Validate connection before use.
    config :jdbc_validate_connection, :validate => :boolean, :default => false

    # Connection pool configuration.
    # How often to validate a connection (in seconds)
    config :jdbc_validation_timeout, :validate => :number, :default => 3600
  end

  public
  def prepare_jdbc_connection
    require "sequel"
    require "sequel/adapters/jdbc"
    require "java"
    require @jdbc_driver_library if @jdbc_driver_library
    Sequel::JDBC.load_driver(@jdbc_driver_class)
    @database = Sequel.connect(@jdbc_connection_string, :user => @jdbc_user, :password => @jdbc_password.nil? ? nil : @jdbc_password.value)
    if @jdbc_validate_connection
      @database.extension(:connection_validator)
      @database.pool.connection_validation_timeout = @jdbc_validation_timeout
    end
    begin
      @database.test_connection
    rescue Sequel::DatabaseConnectionError => e
      # TODO: return false and let the plugin raise a LogStash::ConfigurationError
      raise e
    end
  end # def prepare_jdbc_connection
end end end
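The `@database` object returned by `Sequel.connect` is what the filter later indexes with `@database[@statement, params]`, binding `:name`-style placeholders from the params hash. As a rough intuition for that named-placeholder substitution (Sequel/JDBC do this properly with real parameter binding; this toy string-splicing version is for illustration only and would be unsafe against SQL injection in real use):

```ruby
# Toy named-parameter substitution, illustration only. Real code should let
# Sequel/JDBC bind parameters rather than splice strings into SQL.
def bind_named(statement, params)
  statement.gsub(/:(\w+)/) do
    key = Regexp.last_match(1).to_sym
    raise KeyError, "missing parameter #{key}" unless params.key?(key)
    value = params[key]
    value.is_a?(String) ? "'#{value}'" : value.to_s
  end
end

sql = bind_named("SELECT * FROM WORLD.COUNTRY WHERE Code = :code", { :code => "NL" })
# sql == "SELECT * FROM WORLD.COUNTRY WHERE Code = 'NL'"
```

This also shows why `prepare_parameters_from_event` symbolizes parameter names up front: the placeholder lookup is keyed by symbol.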
data/logstash-filter-jdbc_streaming.gemspec
ADDED
@@ -0,0 +1,29 @@
Gem::Specification.new do |s|
  s.name = 'logstash-filter-jdbc_streaming'
  s.version = '1.0.0'
  s.licenses = ['Apache License (2.0)']
  s.summary = "This filter executes a SQL query and stores the result set in the event."
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program."
  s.authors = ["Elastic"]
  s.email = 'info@elastic.co'
  s.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
  s.require_paths = ["lib"]

  # Files
  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']

  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }

  # Gem dependencies
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
  s.add_runtime_dependency 'sequel'
  s.add_runtime_dependency 'lru_redux' # LRU cache with TTL

  s.add_development_dependency 'logstash-devutils'
  s.add_development_dependency 'jdbc-derby'
  s.add_development_dependency 'jdbc-mysql'
end
data/spec/filters/jdbc_streaming_spec.rb
ADDED
@@ -0,0 +1,277 @@
require "logstash/devutils/rspec/spec_helper"
require "logstash/filters/jdbc_streaming"
require 'jdbc/derby'
require "sequel"
require "sequel/adapters/jdbc"

module LogStash module Filters
  class TestJdbcStreaming < JdbcStreaming
    attr_reader :database
  end

  describe JdbcStreaming do
    # Use embedded Derby for tests
    ::Jdbc::Derby.load_driver

    ENV["TZ"] = "Etc/UTC"
    describe "plugin level execution" do
      let(:mixin_settings) do
        { "jdbc_user" => ENV['USER'], "jdbc_driver_class" => "org.apache.derby.jdbc.EmbeddedDriver",
          "jdbc_connection_string" => "jdbc:derby:memory:testdb;create=true"}
      end
      let(:plugin) { JdbcStreaming.new(mixin_settings.merge(settings)) }
      let(:db) do
        ::Sequel.connect(mixin_settings['jdbc_connection_string'], :user => nil, :password => nil)
      end
      let(:event) { ::LogStash::Event.new("message" => "some text", "ip" => ipaddr) }
      let(:cache_expiration) { 3.0 }
      let(:use_cache) { true }
      let(:cache_size) { 10 }
      let(:statement) { "SELECT name, location FROM reference_table WHERE ip = :ip" }
      let(:settings) do
        {
          "statement" => statement,
          "parameters" => {"ip" => "ip"},
          "target" => "server",
          "use_cache" => use_cache,
          "cache_expiration" => cache_expiration,
          "cache_size" => cache_size,
          "tag_on_failure" => ["lookup_failed"],
          "tag_on_default_use" => ["default_used_instead"],
          "default_hash" => {"name" => "unknown", "location" => "unknown"}
        }
      end

      before :each do
        db.create_table :reference_table do
          String :ip
          String :name
          String :location
        end
        db[:reference_table].insert(:ip => "10.1.1.1", :name => "ldn-server-1", :location => "LDN-2-3-4")
        db[:reference_table].insert(:ip => "10.2.1.1", :name => "nyc-server-1", :location => "NYC-5-2-8")
        db[:reference_table].insert(:ip => "10.3.1.1", :name => "mv-server-1", :location => "MV-9-6-4")
        plugin.register
      end

      after :each do
        db.drop_table(:reference_table)
      end

      describe "found record - uses row" do
        let(:ipaddr) { "10.1.1.1" }

        it "fills in the target" do
          plugin.filter(event)
          expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
          expect(event.get("tags") || []).not_to include("lookup_failed")
          expect(event.get("tags") || []).not_to include("default_used_instead")
        end
      end

      describe "missing record - uses default" do
        let(:ipaddr) { "192.168.1.1" }

        it "fills in the target with the default" do
          plugin.filter(event)
          expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
          expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
        end
      end

      describe "database error - uses default" do
        let(:ipaddr) { "10.1.1.1" }
        let(:statement) { "SELECT name, location FROM reference_table WHERE ip = :address" }
        it "fills in the target with the default" do
          plugin.filter(event)
          expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
          expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["lookup_failed", "default_used_instead"])
        end
      end

      context "when fetching from cache" do
        let(:plugin) { TestJdbcStreaming.new(mixin_settings.merge(settings)) }
        let(:events) do
          5.times.map{|i| ::LogStash::Event.new("message" => "some other text #{i}", "ip" => ipaddr) }
        end
        let(:call_count) { 1 }
        before(:each) do
          expect(plugin.database).to receive(:[]).exactly(call_count).times.and_call_original
          plugin.filter(event)
        end

        describe "found record - caches row" do
          let(:ipaddr) { "10.1.1.1" }
          it "calls the database once then uses the cache" do
            expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
            expect(event.get("tags") || []).not_to include("lookup_failed")
            expect(event.get("tags") || []).not_to include("default_used_instead")
            events.each do |evt|
              plugin.filter(evt)
              expect(evt.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
            end
          end
        end

        describe "missing record - uses default" do
          let(:ipaddr) { "10.10.1.1" }
          it "calls the database once then uses the cache" do
            expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
            expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
            events.each do |evt|
              plugin.filter(evt)
              expect(evt.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
            end
          end
        end

        context "extremely small cache expiration" do
          describe "found record - cache always expires" do
            let(:ipaddr) { "10.1.1.1" }
            let(:call_count) { 6 }
            let(:cache_expiration) { 0.0000001 }
            it "calls the database each time because cache entry expired" do
              expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
              expect(event.get("tags") || []).not_to include("lookup_failed")
              expect(event.get("tags") || []).not_to include("default_used_instead")
              events.each do |evt|
                plugin.filter(evt)
                expect(evt.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
              end
            end
          end
        end

        context "when cache is disabled" do
          let(:call_count) { 6 }
          let(:use_cache) { false }
          describe "database is always called" do
            let(:ipaddr) { "10.1.1.1" }
            it "calls the database each time" do
              expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
              expect(event.get("tags") || []).not_to include("lookup_failed")
              expect(event.get("tags") || []).not_to include("default_used_instead")
              events.each do |evt|
                plugin.filter(evt)
                expect(evt.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
              end
            end
          end

          describe "database is always called but record is missing and default is used" do
            let(:ipaddr) { "10.11.1.1" }
            it "calls the database each time" do
              expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
              expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
              events.each do |evt|
                plugin.filter(evt)
                expect(evt.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
              end
            end
          end
        end
      end
    end

    describe "All default - Retrieve a value from database" do
      let(:config) do <<-CONFIG
        filter {
          jdbc_streaming {
            jdbc_driver_class => "org.apache.derby.jdbc.EmbeddedDriver"
            jdbc_connection_string => "jdbc:derby:memory:testdb;create=true"
            statement => "SELECT 'from_database' FROM SYSIBM.SYSDUMMY1"
            target => "new_field"
          }
        }
      CONFIG
      end

      sample("message" => "some text") do
        expect(subject.get('new_field')).to eq([{"1" => 'from_database'}])
      end
    end

    describe "Named column - Retrieve a value from database" do
      let(:config) do <<-CONFIG
        filter {
          jdbc_streaming {
            jdbc_driver_class => "org.apache.derby.jdbc.EmbeddedDriver"
            jdbc_connection_string => "jdbc:derby:memory:testdb;create=true"
            statement => "SELECT 'from_database' as col_1 FROM SYSIBM.SYSDUMMY1"
            target => "new_field"
          }
        }
      CONFIG
      end

      sample("message" => "some text") do
        expect(subject.get('new_field')).to eq([{"col_1" => 'from_database'}])
      end
    end

    describe "Using string parameters - Retrieve a value from database" do
      let(:config) do <<-CONFIG
        filter {
          jdbc_streaming {
            jdbc_driver_class => "org.apache.derby.jdbc.EmbeddedDriver"
            jdbc_connection_string => "jdbc:derby:memory:testdb;create=true"
            statement => "SELECT 'from_database' FROM SYSIBM.SYSDUMMY1 WHERE '1' = :param"
            parameters => { "param" => "param_field"}
            target => "new_field"
          }
        }
      CONFIG
      end

      sample("message" => "some text", "param_field" => "1") do
        expect(subject.get('new_field')).to eq([{"1" => 'from_database'}])
      end

      sample("message" => "some text", "param_field" => "2") do
        expect(subject.get('new_field').nil?)
      end
    end

    describe "Using integer parameters" do
      let(:config) do <<-CONFIG
        filter {
          jdbc_streaming {
            jdbc_driver_class => "org.apache.derby.jdbc.EmbeddedDriver"
            jdbc_connection_string => "jdbc:derby:memory:testdb;create=true"
            statement => "SELECT 'from_database' FROM SYSIBM.SYSDUMMY1 WHERE 1 = :param"
            parameters => { "param" => "param_field"}
            target => "new_field"
          }
        }
      CONFIG
      end

      sample("message" => "some text", "param_field" => 1) do
        expect(subject.get('new_field')).to eq([{"1" => 'from_database'}])
      end

      sample("message" => "some text", "param_field" => "1") do
        expect(subject.get('new_field').nil?)
      end
    end

    describe "Using timestamp parameter" do
      let(:config) do <<-CONFIG
        filter {
          jdbc_streaming {
            jdbc_driver_class => "org.apache.derby.jdbc.EmbeddedDriver"
            jdbc_connection_string => "jdbc:derby:memory:testdb;create=true"
            statement => "SELECT 'from_database' FROM SYSIBM.SYSDUMMY1 WHERE {fn TIMESTAMPDIFF( SQL_TSI_DAY, {t :param}, current_timestamp)} = 0"
            parameters => { "param" => "@timestamp"}
            target => "new_field"
          }
        }
      CONFIG
      end

      sample("message" => "some text") do
        expect(subject.get('new_field')).to eq([{"1" => 'from_database'}])
      end
    end
  end
end end
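The expectations above depend on two small transformations inside the filter: Sequel rows come back with symbol keys and are stringified before caching (which is why the specs match `"name" => ...` rather than `:name => ...`), and the stored payload is deep-cloned into each event so a later mutation of the event's copy cannot taint the cache. Both in miniature (plain Ruby; `Marshal` round-tripping stands in for `LogStash::Util.deep_clone` here):

```ruby
# Sketch of the two row transformations the specs above rely on.
row = { :name => "ldn-server-1", :location => "LDN-2-3-4" } # Sequel-style symbol keys

# 1. Stringify keys, as cache_lookup does before caching the row.
stringified = row.inject({}) { |hash, (k, v)| hash[k.to_s] = v; hash }
# => {"name" => "ldn-server-1", "location" => "LDN-2-3-4"}

# 2. Deep-clone per event, as process_event does, so mutating the event's
# copy leaves the cached payload untouched.
cached = [stringified]
event_copy = Marshal.load(Marshal.dump(cached)) # stand-in for deep_clone
event_copy[0]["name"] = "mutated"
# cached[0]["name"] is still "ldn-server-1"
```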
data/spec/integration/jdbcstreaming_spec.rb
ADDED
@@ -0,0 +1,154 @@
require "logstash/devutils/rspec/spec_helper"
require "logstash/filters/jdbc_streaming"
require 'jdbc/mysql'
require "sequel"
require "sequel/adapters/jdbc"

module LogStash module Filters
  class TestJdbcStreaming < JdbcStreaming
    attr_reader :database
  end

  describe JdbcStreaming, :integration => true do
    # Use MySQL for integration tests
    ::Jdbc::MySQL.load_driver

    ENV["TZ"] = "Etc/UTC"
    let(:mixin_settings) do
      { "jdbc_user" => "root", "jdbc_driver_class" => "com.mysql.jdbc.Driver",
        "jdbc_connection_string" => "jdbc:mysql://localhost/jdbc_streaming_db?user=root"}
    end
    let(:settings) { {} }
    let(:plugin) { JdbcStreaming.new(mixin_settings.merge(settings)) }
    let(:db) do
      ::Sequel.connect(mixin_settings['jdbc_connection_string'])
    end
    let(:event) { ::LogStash::Event.new("message" => "some text", "ip" => ipaddr) }
    let(:cache_expiration) { 3.0 }
    let(:use_cache) { true }
    let(:cache_size) { 10 }
    let(:statement) { "SELECT name, location FROM reference_table WHERE ip = :ip" }
    let(:settings) do
      {
        "statement" => statement,
        "parameters" => {"ip" => "ip"},
        "target" => "server",
        "use_cache" => use_cache,
        "cache_expiration" => cache_expiration,
        "cache_size" => cache_size,
        "tag_on_failure" => ["lookup_failed"],
        "tag_on_default_use" => ["default_used_instead"],
        "default_hash" => {"name" => "unknown", "location" => "unknown"}
      }
    end
    let(:ipaddr) { "10.#{idx}.1.1" }

    before :each do
      db.create_table :reference_table do
        String :ip
        String :name
        String :location
      end
      1.upto(250) do |i|
        db[:reference_table].insert(:ip => "10.#{i}.1.1", :name => "ldn-server-#{i}", :location => "LDN-#{i}-2-3")
      end
      plugin.register
    end

    after(:each) { db.drop_table(:reference_table) }

    describe "found record - uses row" do
      let(:idx) { 200 }

      it "fills in the target" do
        plugin.filter(event)
        expect(event.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
        expect((event.get("tags") || []) & ["lookup_failed", "default_used_instead"]).to be_empty
      end
    end

    context "when fetching from cache" do
      let(:plugin) { TestJdbcStreaming.new(mixin_settings.merge(settings)) }
      let(:events) do
        5.times.map{|i| ::LogStash::Event.new("message" => "some other text #{i}", "ip" => ipaddr) }
      end
      let(:call_count) { 1 }
      before(:each) do
        expect(plugin.database).to receive(:[]).exactly(call_count).times.and_call_original
        plugin.filter(event)
      end

      describe "found record - caches row" do
        let(:idx) { "42" }
        it "calls the database once then uses the cache" do
          expect(event.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
          expect(event.get("tags") || []).not_to include("lookup_failed")
          expect(event.get("tags") || []).not_to include("default_used_instead")
          events.each do |evt|
            plugin.filter(evt)
            expect(evt.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
          end
        end
      end

      describe "missing record - uses default" do
        let(:idx) { "252" }
        it "calls the database once then uses the cache" do
          expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
          expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
          events.each do |evt|
            plugin.filter(evt)
            expect(evt.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
          end
        end
      end

      context "extremely small cache expiration" do
        describe "found record - cache always expires" do
          let(:idx) { "10" }
          let(:call_count) { 6 }
          let(:cache_expiration) { 0.0000001 }
          it "calls the database each time because cache entry expired" do
            expect(event.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
            expect(event.get("tags") || []).not_to include("lookup_failed")
            expect(event.get("tags") || []).not_to include("default_used_instead")
            events.each do |evt|
              plugin.filter(evt)
              expect(evt.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
            end
          end
        end
      end

      context "when cache is disabled" do
        let(:call_count) { 6 }
        let(:use_cache) { false }
        describe "database is always called" do
          let(:idx) { "1" }
          it "calls the database each time" do
            expect(event.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
expect(event.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
|
130
|
+
expect(event.get("tags") || []).not_to include("lookup_failed")
|
131
|
+
expect(event.get("tags") || []).not_to include("default_used_instead")
|
132
|
+
events.each do |evt|
|
133
|
+
plugin.filter(evt)
|
134
|
+
expect(evt.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
|
135
|
+
end
|
136
|
+
end
|
137
|
+
end
|
138
|
+
|
139
|
+
describe "database is always called but record is missing and default is used" do
|
140
|
+
let(:idx) { "251" }
|
141
|
+
it "calls the database each time" do
|
142
|
+
expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
|
143
|
+
expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
|
144
|
+
events.each do |evt|
|
145
|
+
plugin.filter(evt)
|
146
|
+
expect(evt.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
|
147
|
+
end
|
148
|
+
end
|
149
|
+
end
|
150
|
+
end
|
151
|
+
end
|
152
|
+
end
|
153
|
+
|
154
|
+
end end
|
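The caching behaviour these specs exercise — lookup results (including misses) cached per parameter set, entries expiring after `cache_expiration` seconds, the database hit again once an entry is stale — can be sketched as a standalone snippet. The class and method names below are hypothetical illustrations, not the plugin's actual internals (which use the `lru_redux` gem listed in the metadata):

```ruby
# Illustrative TTL lookup cache: each key maps to a cached result that
# expires after `expiration` seconds; missing or expired entries fall
# back to the block (standing in for the database query).
class TtlLookupCache
  Entry = Struct.new(:value, :stored_at)

  def initialize(expiration)
    @expiration = expiration
    @store = {}
  end

  # Return the cached value for `key`, or compute and store it via the block.
  def get(key)
    entry = @store[key]
    return entry.value if entry && (Time.now - entry.stored_at) < @expiration
    value = yield
    @store[key] = Entry.new(value, Time.now)
    value
  end
end

db_calls = 0
cache = TtlLookupCache.new(60) # generous TTL: the second lookup hits the cache
2.times { cache.get("10.42.1.1") { db_calls += 1; [{"name" => "ldn-server-42"}] } }
# db_calls is 1 here; with an expiration near zero (as in the
# "extremely small cache expiration" context above) every lookup
# would reach the database instead.
```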
metadata
ADDED
@@ -0,0 +1,146 @@
--- !ruby/object:Gem::Specification
name: logstash-filter-jdbc_streaming
version: !ruby/object:Gem::Version
  version: 1.0.0
platform: ruby
authors:
- Elastic
autorequire:
bindir: bin
cert_chain: []
date: 2017-03-27 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '1.60'
    - - "<="
      - !ruby/object:Gem::Version
        version: '2.99'
  name: logstash-core-plugin-api
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '1.60'
    - - "<="
      - !ruby/object:Gem::Version
        version: '2.99'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: sequel
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: lru_redux
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-devutils
  prerelease: false
  type: :development
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: jdbc-derby
  prerelease: false
  type: :development
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: jdbc-mysql
  prerelease: false
  type: :development
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
description: This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program
email: info@elastic.co
executables: []
extensions: []
extra_rdoc_files: []
files:
- CHANGELOG.md
- Gemfile
- LICENSE
- README.md
- lib/logstash/filters/jdbc_streaming.rb
- lib/logstash/plugin_mixins/jdbc_streaming.rb
- logstash-filter-jdbc_streaming.gemspec
- spec/filters/jdbc_streaming_spec.rb
- spec/integration/jdbcstreaming_spec.rb
homepage: http://www.elastic.co/guide/en/logstash/current/index.html
licenses:
- Apache License (2.0)
metadata:
  logstash_plugin: 'true'
  logstash_group: filter
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project:
rubygems_version: 2.4.8
signing_key:
specification_version: 4
summary: This filter executes a SQL query and store the result set in the event.
test_files:
- spec/filters/jdbc_streaming_spec.rb
- spec/integration/jdbcstreaming_spec.rb
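The bounded constraint on `logstash-core-plugin-api` in the metadata above (`>= 1.60`, `<= 2.99`) can be evaluated with RubyGems' own requirement classes. A quick sketch; the version strings tested are examples only:

```ruby
require "rubygems" # usually loaded automatically in modern Ruby

# Reproduce the logstash-core-plugin-api constraint from the gem metadata.
req = Gem::Requirement.new(">= 1.60", "<= 2.99")

in_range  = req.satisfied_by?(Gem::Version.new("2.4")) # within both bounds
too_new   = req.satisfied_by?(Gem::Version.new("3.0")) # above the upper bound

puts in_range # => true
puts too_new  # => false
```

Note that RubyGems compares versions segment by segment, so `2.4` sorts below `2.99` (4 < 99), not by decimal value.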