logstash-filter-jdbc_streaming 1.0.7 → 1.0.9
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/CHANGELOG.md +10 -2
- data/docs/index.asciidoc +108 -20
- data/lib/logstash/filters/jdbc_streaming.rb +73 -79
- data/lib/logstash/plugin_mixins/jdbc_streaming/cache_payload.rb +28 -0
- data/lib/logstash/plugin_mixins/jdbc_streaming/parameter_handler.rb +64 -0
- data/lib/logstash/plugin_mixins/jdbc_streaming/statement_handler.rb +143 -0
- data/lib/logstash/plugin_mixins/jdbc_streaming/wrapped_driver.rb +46 -0
- data/lib/logstash/plugin_mixins/jdbc_streaming.rb +50 -5
- data/logstash-filter-jdbc_streaming.gemspec +1 -1
- data/spec/filters/jdbc_streaming_spec.rb +163 -91
- data/spec/integration/jdbcstreaming_spec.rb +37 -5
- metadata +6 -2
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 677e0e5ec0f5045d139d29bc159e197d7f771391e303a11f0de12b707e54d224
+  data.tar.gz: af5f3e19476f2789a65afd5e5e10faa48a632b667ba0c5ca2ffc030720a23562
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 82b751b0170789cf6b27939030976f3c315d53221e82dcc5862ab1cbca43cdf7df2086ee2f9649aa2da469dbbac5a49d058fe6095c47e4f8228a4a6c0ba50c7a
+  data.tar.gz: a096d52b59659d6aee46c764567a9131bb077da1e13aed1e492884f10ace4050448f88750e433d689d509e3642286865d756df3e0d6e8713370cd9f6c2d3d2e3
data/CHANGELOG.md
CHANGED
@@ -1,5 +1,13 @@
+## 1.0.9
+- Added support for prepared statements [PR 32](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/pull/32)
+- Added support for `sequel_opts` to pass options to the 3rd party Sequel library.
+
+## 1.0.8
+- Added support for driver loading in JDK 9+ [Issue 25](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/issues/25)
+- Added support for multiple driver jars [Issue #21](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/issues/21)
+
 ## 1.0.7
-- Fixed formatting in documentation [#17](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/pull/17) and [#28](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/pull/28)
+- Fixed formatting in documentation [#17](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/pull/17) and [#28](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/pull/28)
 
 ## 1.0.6
 - Fixes connection leak in pipeline reloads by properly disconnecting on plugin close
@@ -8,7 +16,7 @@
 - [#11](https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/pull/11) Swap out mysql for postgresql for testing
 
 ## 1.0.4
-- [JDBC input - #263](https://github.com/logstash-plugins/logstash-input-jdbc/issues/263) Load the driver with the system class loader. Fixes issue loading some JDBC drivers in Logstash 6.2+
+- [JDBC input - #263](https://github.com/logstash-plugins/logstash-input-jdbc/issues/263) Load the driver with the system class loader. Fixes issue loading some JDBC drivers in Logstash 6.2+
 
 ## 1.0.3
 - Update gemspec summary
data/docs/index.asciidoc
CHANGED
@@ -42,6 +42,47 @@ filter {
 }
 -----
 
+[id="plugins-{type}s-{plugin}-prepared_statements"]
+==== Prepared Statements
+
+Using server side prepared statements can speed up execution times as the server optimises the query plan and execution.
+
+NOTE: Not all JDBC accessible technologies will support prepared statements.
+
+With the introduction of Prepared Statement support comes a different code execution path and some new settings. Most of the existing settings are still useful but there are several new settings for Prepared Statements to read up on.
+
+Use the boolean setting `use_prepared_statements` to enable this execution mode.
+
+Use the `prepared_statement_name` setting to specify a name for the Prepared Statement, this identifies the prepared statement locally and remotely and it should be unique in your config and on the database.
+
+Use the `prepared_statement_bind_values` array setting to specify the bind values. Typically, these values are indirectly extracted from your event, i.e. the string in the array refers to a field name in your event. You can also use constant values like numbers or strings but ensure that any string constants (e.g. a locale constant of "en" or "de") is not also an event field name. It is a good idea to use the bracketed field reference syntax for fields and normal strings for constants, e.g. `prepared_statement_bind_values => ["[src_ip]", "tokyo"],`.
+
+There are 3 possible parameter schemes. Interpolated, field references and constants. Use interpolation when you are prefixing, suffixing or concatenating field values to create a value that exists in your database, e.g. "%{username}@%{domain}" -> "alice@example.org", "%{distance}km" -> "42km". Use field references for exact field values e.g. "[srcip]" -> "192.168.1.2". Use constants when a database column holds values that slice or categorise a number of similar records e.g. language translations.
+
+A boolean setting `prepared_statement_warn_on_constant_usage`, defaulting to true, controls whether you will see a WARN message logged that warns when constants could be missing the bracketed field reference syntax. If you have set your field references and constants correctly you should set `prepared_statement_warn_on_constant_usage` to false. This setting and code checks should be deprecated in a future major Logstash release.
+
+The `statement` (or `statement_path`) setting still holds the SQL statement but to use bind variables you must use the `?` character as a placeholder in the exact order found in the `prepared_statement_bind_values` array.
+Some technologies may require connection string properties to be set, see MySQL example below.
+
+Example:
+[source,ruby]
+-----
+filter {
+  jdbc_streaming {
+    jdbc_driver_library => "/path/to/mysql-connector-java-5.1.34-bin.jar"
+    jdbc_driver_class => "com.mysql.jdbc.Driver"
+    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydatabase?cachePrepStmts=true&prepStmtCacheSize=250&prepStmtCacheSqlLimit=2048&useServerPrepStmts=true"
+    jdbc_user => "me"
+    jdbc_password => "secret"
+    statement => "select * from WORLD.COUNTRY WHERE Code = ?"
+    use_prepared_statements => true
+    prepared_statement_name => "lookup_country_info"
+    prepared_statement_bind_values => ["[country_code]"]
+    target => "country_details"
+  }
+}
+-----
+
 [id="plugins-{type}s-{plugin}-options"]
 ==== Jdbc_streaming Filter Configuration Options
 
@@ -61,11 +102,16 @@ This plugin supports the following configuration options plus the <<plugins-{typ
 | <<plugins-{type}s-{plugin}-jdbc_validate_connection>> |<<boolean,boolean>>|No
 | <<plugins-{type}s-{plugin}-jdbc_validation_timeout>> |<<number,number>>|No
 | <<plugins-{type}s-{plugin}-parameters>> |<<hash,hash>>|No
+| <<plugins-{type}s-{plugin}-prepared_statement_bind_values>> |<<array,array>>|No
+| <<plugins-{type}s-{plugin}-prepared_statement_name>> |<<string,string>>|No
+| <<plugins-{type}s-{plugin}-prepared_statement_warn_on_constant_usage>> |<<boolean,boolean>>|No
+| <<plugins-{type}s-{plugin}-sequel_opts>> |<<hash,hash>>|No
 | <<plugins-{type}s-{plugin}-statement>> |<<string,string>>|Yes
 | <<plugins-{type}s-{plugin}-tag_on_default_use>> |<<array,array>>|No
 | <<plugins-{type}s-{plugin}-tag_on_failure>> |<<array,array>>|No
 | <<plugins-{type}s-{plugin}-target>> |<<string,string>>|Yes
 | <<plugins-{type}s-{plugin}-use_cache>> |<<boolean,boolean>>|No
+| <<plugins-{type}s-{plugin}-use_prepared_statements>> |<<boolean,boolean>>|No
 |=======================================================================
 
 Also see <<plugins-{type}s-{plugin}-common-options>> for a list of options supported by all
@@ -74,7 +120,7 @@ filter plugins.
 
 
 [id="plugins-{type}s-{plugin}-cache_expiration"]
-===== `cache_expiration`
+===== `cache_expiration`
 
 * Value type is <<number,number>>
 * Default value is `5.0`
@@ -90,7 +136,7 @@ any external problem that would cause jdbc errors will not be noticed for the
 cache_expiration period.
 
 [id="plugins-{type}s-{plugin}-cache_size"]
-===== `cache_size`
+===== `cache_size`
 
 * Value type is <<number,number>>
 * Default value is `500`
@@ -99,7 +145,7 @@ The maximum number of cache entries that will be stored. Defaults to 500 entries
 The least recently used entry will be evicted.
 
 [id="plugins-{type}s-{plugin}-default_hash"]
-===== `default_hash`
+===== `default_hash`
 
 * Value type is <<hash,hash>>
 * Default value is `{}`
@@ -108,7 +154,7 @@ Define a default object to use when lookup fails to return a matching row.
 Ensure that the key names of this object match the columns from the statement.
 
 [id="plugins-{type}s-{plugin}-jdbc_connection_string"]
-===== `jdbc_connection_string`
+===== `jdbc_connection_string`
 
 * This is a required setting.
 * Value type is <<string,string>>
@@ -117,7 +163,7 @@ Ensure that the key names of this object match the columns from the statement.
 JDBC connection string
 
 [id="plugins-{type}s-{plugin}-jdbc_driver_class"]
-===== `jdbc_driver_class`
+===== `jdbc_driver_class`
 
 * This is a required setting.
 * Value type is <<string,string>>
@@ -126,19 +172,15 @@ JDBC connection string
 JDBC driver class to load, for example "oracle.jdbc.OracleDriver" or "org.apache.derby.jdbc.ClientDriver"
 
 [id="plugins-{type}s-{plugin}-jdbc_driver_library"]
-===== `jdbc_driver_library`
+===== `jdbc_driver_library`
 
 * Value type is <<path,path>>
 * There is no default value for this setting.
 
-Tentative of abstracting JDBC logic to a mixin
-for potential reuse in other plugins (input/output).
-This method is called when someone includes this module.
-Add these methods to the 'base' given.
 JDBC driver library path to third party driver library.
 
 [id="plugins-{type}s-{plugin}-jdbc_password"]
-===== `jdbc_password`
+===== `jdbc_password`
 
 * Value type is <<password,password>>
 * There is no default value for this setting.
@@ -146,7 +188,7 @@ JDBC driver library path to third party driver library.
 JDBC password
 
 [id="plugins-{type}s-{plugin}-jdbc_user"]
-===== `jdbc_user`
+===== `jdbc_user`
 
 * Value type is <<string,string>>
 * There is no default value for this setting.
@@ -154,7 +196,7 @@ JDBC password
 JDBC user
 
 [id="plugins-{type}s-{plugin}-jdbc_validate_connection"]
-===== `jdbc_validate_connection`
+===== `jdbc_validate_connection`
 
 * Value type is <<boolean,boolean>>
 * Default value is `false`
@@ -163,7 +205,7 @@ Connection pool configuration.
 Validate connection before use.
 
 [id="plugins-{type}s-{plugin}-jdbc_validation_timeout"]
-===== `jdbc_validation_timeout`
+===== `jdbc_validation_timeout`
 
 * Value type is <<number,number>>
 * Default value is `3600`
@@ -172,15 +214,55 @@ Connection pool configuration.
 How often to validate a connection (in seconds).
 
 [id="plugins-{type}s-{plugin}-parameters"]
-===== `parameters`
+===== `parameters`
 
 * Value type is <<hash,hash>>
 * Default value is `{}`
 
 Hash of query parameter, for example `{ "id" => "id_field" }`.
 
+[id="plugins-{type}s-{plugin}-prepared_statement_bind_values"]
+===== `prepared_statement_bind_values`
+
+* Value type is <<array,array>>
+* Default value is `[]`
+
+Array of bind values for the prepared statement. Use field references and constants. See the section on <<plugins-{type}s-{plugin}-prepared_statements,prepared_statements>> for more info.
+
+[id="plugins-{type}s-{plugin}-prepared_statement_name"]
+===== `prepared_statement_name`
+
+* Value type is <<string,string>>
+* Default value is `""`
+
+Name given to the prepared statement. It must be unique in your config and in the database.
+You need to supply this if `use_prepared_statements` is true.
+
+[id="plugins-{type}s-{plugin}-prepared_statement_warn_on_constant_usage"]
+===== `prepared_statement_warn_on_constant_usage`
+
+* Value type is <<boolean,boolean>>
+* Default value is `true`
+
+A flag that controls whether a warning is logged if, in `prepared_statement_bind_values`,
+a String constant is detected that might be intended as a field reference.
+
+[id="plugins-{type}s-{plugin}-sequel_opts"]
+===== `sequel_opts`
+
+* Value type is <<hash,hash>>
+* Default value is `{}`
+
+General/Vendor-specific Sequel configuration options
+
+An example of an optional connection pool configuration
+max_connections - The maximum number of connections the connection pool
+
+examples of vendor-specific options can be found in this documentation page:
+https://github.com/jeremyevans/sequel/blob/master/doc/opening_databases.rdoc
+
 [id="plugins-{type}s-{plugin}-statement"]
-===== `statement`
+===== `statement`
 
 * This is a required setting.
 * Value type is <<string,string>>
@@ -190,7 +272,7 @@ Statement to execute.
 To use parameters, use named parameter syntax, for example "SELECT * FROM MYTABLE WHERE ID = :id".
 
 [id="plugins-{type}s-{plugin}-tag_on_default_use"]
-===== `tag_on_default_use`
+===== `tag_on_default_use`
 
 * Value type is <<array,array>>
 * Default value is `["_jdbcstreamingdefaultsused"]`
@@ -198,7 +280,7 @@ To use parameters, use named parameter syntax, for example "SELECT * FROM MYTABL
 Append values to the `tags` field if no record was found and default values were used.
 
 [id="plugins-{type}s-{plugin}-tag_on_failure"]
-===== `tag_on_failure`
+===== `tag_on_failure`
 
 * Value type is <<array,array>>
 * Default value is `["_jdbcstreamingfailure"]`
@@ -206,7 +288,7 @@ Append values to the `tags` field if no record was found and default values were
 Append values to the `tags` field if sql error occurred.
 
 [id="plugins-{type}s-{plugin}-target"]
-===== `target`
+===== `target`
 
 * This is a required setting.
 * Value type is <<string,string>>
@@ -216,14 +298,20 @@ Define the target field to store the extracted result(s).
 Field is overwritten if exists.
 
 [id="plugins-{type}s-{plugin}-use_cache"]
-===== `use_cache`
+===== `use_cache`
 
 * Value type is <<boolean,boolean>>
 * Default value is `true`
 
 Enable or disable caching, boolean true or false. Defaults to true.
 
+[id="plugins-{type}s-{plugin}-use_prepared_statements"]
+===== `use_prepared_statements`
+
+* Value type is <<boolean,boolean>>
+* Default value is `false`
 
+When set to `true`, enables prepare statement usage
 
 [id="plugins-{type}s-{plugin}-common-options"]
 include::{include_path}/{type}.asciidoc[]
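
The `sequel_opts` entry added above mentions connection pool settings such as `max_connections` but stops short of a full configuration example. Below is a minimal sketch of how such options might be passed through the filter; the driver path and the `max_connections` value are illustrative assumptions (the `pool_timeout => 600` value mirrors the one used in the updated specs), not settings required by the plugin.

[source,ruby]
-----
filter {
  jdbc_streaming {
    # hypothetical driver/connection details, for illustration only
    jdbc_driver_library => "/path/to/postgresql-42.2.8.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydatabase"
    statement => "SELECT name, location FROM reference_table WHERE ip = :ip"
    parameters => { "ip" => "ip" }
    target => "server"
    # passed straight through to Sequel.connect; see the Sequel opening_databases page linked above
    sequel_opts => {
      "max_connections" => 4
      "pool_timeout" => 600
    }
  }
}
-----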
data/lib/logstash/filters/jdbc_streaming.rb
CHANGED
@@ -2,6 +2,9 @@
 require "logstash/filters/base"
 require "logstash/namespace"
 require "logstash/plugin_mixins/jdbc_streaming"
+require "logstash/plugin_mixins/jdbc_streaming/cache_payload"
+require "logstash/plugin_mixins/jdbc_streaming/statement_handler"
+require "logstash/plugin_mixins/jdbc_streaming/parameter_handler"
 require "lru_redux"
 
 # This filter executes a SQL query and store the result set in the field
@@ -24,49 +27,25 @@ require "lru_redux"
 # }
 # }
 #
+# Prepared Statement Mode example
+#
+# [source,ruby]
+# filter {
+#   jdbc_streaming {
+#     jdbc_driver_library => "/path/to/mysql-connector-java-5.1.34-bin.jar"
+#     jdbc_driver_class => "com.mysql.jdbc.Driver"
+#     jdbc_connection_string => ""jdbc:mysql://localhost:3306/mydatabase"
+#     jdbc_user => "me"
+#     jdbc_password => "secret"
+#     statement => "select * from WORLD.COUNTRY WHERE Code = ?"
+#     use_prepared_statements => true
+#     prepared_statement_name => "get_country_from_code"
+#     prepared_statement_bind_values => ["[country_code]"]
+#     target => "country_details"
+#   }
+# }
+#
 module LogStash module Filters class JdbcStreaming < LogStash::Filters::Base
-  class CachePayload
-    attr_reader :payload
-    def initialize
-      @failure = false
-      @payload = []
-    end
-
-    def push(data)
-      @payload << data
-    end
-
-    def failed!
-      @failure = true
-    end
-
-    def failed?
-      @failure
-    end
-
-    def empty?
-      @payload.empty?
-    end
-  end
-
-  class RowCache
-    def initialize(size, ttl)
-      @cache = ::LruRedux::TTL::ThreadSafeCache.new(size, ttl)
-    end
-
-    def get(parameters)
-      @cache.getset(parameters) { yield }
-    end
-  end
-
-  class NoCache
-    def initialize(size, ttl) end
-
-    def get(statement)
-      yield
-    end
-  end
-
   include LogStash::PluginMixins::JdbcStreaming
 
   config_name "jdbc_streaming"
@@ -108,16 +87,41 @@ module LogStash module Filters class JdbcStreaming < LogStash::Filters::Base
   # The least recently used entry will be evicted
   config :cache_size, :validate => :number, :default => 500
 
+  config :use_prepared_statements, :validate => :boolean, :default => false
+  config :prepared_statement_name, :validate => :string, :default => ""
+  config :prepared_statement_bind_values, :validate => :array, :default => []
+  config :prepared_statement_warn_on_constant_usage, :validate => :boolean, :default => true # deprecate in a future major LS release
+
+  # Options hash to pass to Sequel
+  config :sequel_opts, :validate => :hash, :default => {}
+
+  attr_reader :prepared_statement_constant_warned # for test verification, remove when warning is deprecated and removed
+
   # ----------------------------------------
   public
 
   def register
     convert_config_options
-
+    if @use_prepared_statements
+      validation_errors = validate_prepared_statement_mode
+      unless validation_errors.empty?
+        raise(LogStash::ConfigurationError, "Prepared Statement Mode validation errors: " + validation_errors.join(", "))
+      end
+    else
+      # symbolise and wrap value in parameter handler
+      unless @parameters.values.all?{|v| v.is_a?(PluginMixins::JdbcStreaming::ParameterHandler)}
+        @parameters = parameters.inject({}) do |hash,(k,value)|
+          hash[k.to_sym] = PluginMixins::JdbcStreaming::ParameterHandler.build_parameter_handler(value)
+          hash
        end
      end
    end
+    @statement_handler = LogStash::PluginMixins::JdbcStreaming::StatementHandler.build_statement_handler(self)
+    prepare_jdbc_connection
   end
 
   def filter(event)
-    result = cache_lookup(event) # should return a
+    result = @statement_handler.cache_lookup(@database, event) # should return a CachePayload instance
 
     if result.failed?
       tag_failure(event)
@@ -140,35 +144,6 @@ module LogStash module Filters class JdbcStreaming < LogStash::Filters::Base
   # ----------------------------------------
   private
 
-  def cache_lookup(event)
-    params = prepare_parameters_from_event(event)
-    @cache.get(params) do
-      result = CachePayload.new
-      begin
-        query = @database[@statement, params] # returns a dataset
-        @logger.debug? && @logger.debug("Executing JDBC query", :statement => @statement, :parameters => params)
-        query.all do |row|
-          result.push row.inject({}){|hash,(k,v)| hash[k.to_s] = v; hash} #Stringify row keys
-        end
-      rescue ::Sequel::Error => e
-        # all sequel errors are a subclass of this, let all other standard or runtime errors bubble up
-        result.failed!
-        @logger.warn? && @logger.warn("Exception when executing JDBC query", :exception => e)
-      end
-      # if either of: no records or a Sequel exception occurs the payload is
-      # empty and the default can be substituted later.
-      result
-    end
-  end
-
-  def prepare_parameters_from_event(event)
-    @symbol_parameters.inject({}) do |hash,(k,v)|
-      value = event.get(event.sprintf(v))
-      hash[k] = value.is_a?(::LogStash::Timestamp) ? value.time : value
-      hash
-    end
-  end
-
   def tag_failure(event)
     @tag_on_failure.each do |tag|
       event.tag(tag)
@@ -190,13 +165,32 @@ module LogStash module Filters class JdbcStreaming < LogStash::Filters::Base
   def convert_config_options
     # create these object once they will be cloned for every filter call anyway,
     # lets not create a new object for each
-    @symbol_parameters = @parameters.inject({}) {|hash,(k,v)| hash[k.to_sym] = v ; hash }
     @default_array = [@default_hash]
   end
 
-  def
-
-
-
+  def validate_prepared_statement_mode
+    @prepared_statement_constant_warned = false
+    error_messages = []
+    if @prepared_statement_name.empty?
+      error_messages << "must provide a name for the Prepared Statement, it must be unique for the db session"
+    end
+    if @statement.count("?") != @prepared_statement_bind_values.size
+      # mismatch in number of bind value elements to placeholder characters
+      error_messages << "there is a mismatch between the number of statement `?` placeholders and :prepared_statement_bind_values array setting elements"
+    end
+    unless @prepared_statement_bind_values.all?{|v| v.is_a?(PluginMixins::JdbcStreaming::ParameterHandler)}
+      @prepared_statement_bind_values = prepared_statement_bind_values.map do |value|
+        ParameterHandler.build_bind_value_handler(value)
+      end
+    end
+    if prepared_statement_warn_on_constant_usage
+      warnables = @prepared_statement_bind_values.select {|handler| handler.is_a?(PluginMixins::JdbcStreaming::ConstantParameter) && handler.given_value.is_a?(String)}
+      unless warnables.empty?
+        @prepared_statement_constant_warned = true
+        msg = "When using prepared statements, the following `prepared_statement_bind_values` will be treated as constants, if you intend them to be field references please use the square bracket field reference syntax e.g. '[field]'"
+        logger.warn(msg, :constants => warnables)
+      end
+    end
+    error_messages
   end
-end end end # class LogStash::Filters::
+end end end # class LogStash::Filters::JdbcStreaming
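
The `register`/`validate_prepared_statement_mode` changes above reject a prepared-statement configuration whose `?` placeholder count does not match the length of `prepared_statement_bind_values`, or whose `prepared_statement_name` is empty. As a hedged illustration (the field name and SQL here are made up), a config like the following would raise a `LogStash::ConfigurationError` at register time because two placeholders are bound to a single value:

[source,ruby]
-----
filter {
  jdbc_streaming {
    # ...jdbc_* connection settings omitted...
    use_prepared_statements => true
    prepared_statement_name => "lookup_ip_and_code"
    statement => "SELECT name FROM reference_table WHERE ip = ? AND gcode = ?"
    # only one bind value for two '?' placeholders -> validation error at register time
    prepared_statement_bind_values => ["[src_ip]"]
    target => "server"
  }
}
-----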
data/lib/logstash/plugin_mixins/jdbc_streaming/cache_payload.rb
ADDED
@@ -0,0 +1,28 @@
+# encoding: utf-8
+
+module LogStash module PluginMixins module JdbcStreaming
+  class CachePayload
+    attr_reader :payload
+
+    def initialize
+      @failure = false
+      @payload = []
+    end
+
+    def push(data)
+      @payload << data
+    end
+
+    def failed!
+      @failure = true
+    end
+
+    def failed?
+      @failure
+    end
+
+    def empty?
+      @payload.empty?
+    end
+  end
+end end end
data/lib/logstash/plugin_mixins/jdbc_streaming/parameter_handler.rb
ADDED
@@ -0,0 +1,64 @@
+# encoding: utf-8
+
+module LogStash module PluginMixins module JdbcStreaming
+  class ParameterHandler
+
+    def self.build_parameter_handler(given_value)
+      # does it really make sense to deal with normal parameters differently?
+      handler = FieldParameter.new(given_value)
+      return handler unless given_value.is_a?(String)
+
+      first_percent_curly = given_value.index("%{")
+      if first_percent_curly && given_value.index("}", first_percent_curly)
+        return InterpolatedParameter.new(given_value)
+      end
+
+      handler
+    end
+
+    def self.build_bind_value_handler(given_value)
+      handler = ConstantParameter.new(given_value)
+
+      return handler unless given_value.is_a?(String) # allow non String constants
+
+      first_percent_curly = given_value.index("%{")
+      if first_percent_curly && given_value.index("}", first_percent_curly)
+        return InterpolatedParameter.new(given_value)
+      end
+
+      if given_value =~ /\A\s*\[[^\]]+\]\s*\z/
+        return FieldParameter.new(given_value)
+      end
+
+      handler
+    end
+
+    attr_reader :given_value
+
+    def initialize(given_value)
+      @given_value = given_value
+    end
+
+    def extract_from(event)
+      # override in subclass
+    end
+  end
+
+  class InterpolatedParameter < ParameterHandler
+    def extract_from(event)
+      event.sprintf(@given_value)
+    end
+  end
+
+  class FieldParameter < ParameterHandler
+    def extract_from(event)
+      event.get(@given_value)
+    end
+  end
+
+  class ConstantParameter < ParameterHandler
+    def extract_from(event)
+      @given_value
+    end
+  end
+end end end
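
A short sketch of how `ParameterHandler.build_bind_value_handler` classifies the three value styles described in the docs diff above (interpolated, field reference, constant). The `FakeEvent` stub below stands in for a real `LogStash::Event` and is an assumption made purely so the sketch can run outside Logstash:

[source,ruby]
-----
require "logstash/plugin_mixins/jdbc_streaming/parameter_handler" # assumes the gem's lib dir is on the load path

# Minimal stand-in for LogStash::Event, for illustration only.
class FakeEvent
  def initialize(data)
    @data = data
  end

  def get(reference)
    @data[reference.delete("[]")] # "[src_ip]" -> @data["src_ip"]
  end

  def sprintf(format)
    format.gsub(/%\{(\w+)\}/) { @data[$1] } # crude %{field} interpolation
  end
end

event = FakeEvent.new("src_ip" => "192.168.1.2", "username" => "alice", "domain" => "example.org")

["%{username}@%{domain}", "[src_ip]", "tokyo"].each do |value|
  handler = LogStash::PluginMixins::JdbcStreaming::ParameterHandler.build_bind_value_handler(value)
  puts "#{value.inspect} -> #{handler.class.name.split('::').last} -> #{handler.extract_from(event).inspect}"
end
# Expected classification: InterpolatedParameter, FieldParameter, ConstantParameter
-----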
data/lib/logstash/plugin_mixins/jdbc_streaming/statement_handler.rb
ADDED
@@ -0,0 +1,143 @@
+# encoding: utf-8
+require "logstash/util/loggable"
+
+module LogStash module PluginMixins module JdbcStreaming
+  # so as to not clash with the class of the same name and function in the jdbc input
+  # this is in the `module JdbcStreaming` namespace
+  # this duplication can be removed in a universal plugin
+
+  class StatementHandler
+    def self.build_statement_handler(plugin)
+      klass = plugin.use_prepared_statements ? PreparedStatementHandler : NormalStatementHandler
+      klass.new(plugin)
+    end
+
+    attr_reader :statement, :parameters, :cache
+
+    def initialize(plugin)
+      @statement = plugin.statement
+      klass = plugin.use_cache ? RowCache : NoCache
+      @cache = klass.new(plugin.cache_size, plugin.cache_expiration)
+      post_init(plugin)
+    end
+
+    # Get from cache or performs remote lookup and saves to cache
+    # @param db [Sequel::Database]
+    # @param event [LogStash::Event]
+    # @returnparam [CachePayload]
+    def cache_lookup(db, event)
+      # override in subclass
+    end
+
+    private
+
+    def common_cache_lookup(db, event)
+      params = prepare_parameters_from_event(event)
+      @cache.get(params) do
+        result = CachePayload.new
+        begin
+          logger.debug? && logger.debug("Executing JDBC query", :statement => statement, :parameters => params)
+          execute_extract_records(db, params, result)
+        rescue ::Sequel::Error => e
+          # all sequel errors are a subclass of this, let all other standard or runtime errors bubble up
+          result.failed!
+          logger.warn? && logger.warn("Exception when executing JDBC query", :statement => statement, :parameters => params, :exception => e)
+        end
+        # if either of: no records or a Sequel exception occurs the payload is
+        # empty and the default can be substituted later.
+        result
+      end
+    end
+
+    def execute_extract_records(db, params, result)
+      # override in subclass
+    end
+
+    def post_init(plugin)
+      # override in subclass, if needed
+    end
+
+    def prepare_parameters_from_event(event)
+      @parameters.inject({}) do |hash, (k, parameter_handler)|
+        # defer to appropriate parameter handler
+        value = parameter_handler.extract_from(event)
+        hash[k] = value.is_a?(::LogStash::Timestamp) ? value.time : value
+        hash
+      end
+    end
+  end
+
+  class NormalStatementHandler < StatementHandler
+    include LogStash::Util::Loggable
+
+    # Get from cache or performs remote lookup and saves to cache
+    # @param db [Sequel::Database]
+    # @param event [LogStash::Event]
+    # @returnparam [CachePayload]
+    def cache_lookup(db, event)
+      common_cache_lookup(db, event)
+    end
+
+    private
+
+    def execute_extract_records(db, params, result)
+      dataset = db[statement, params] # returns a Sequel dataset
+      dataset.all do |row|
+        result.push row.inject({}){|hash,(k,v)| hash[k.to_s] = v; hash} # Stringify row keys
+      end
+    end
+
+    def post_init(plugin)
+      @parameters = plugin.parameters
+    end
+  end
+
+  class PreparedStatementHandler < StatementHandler
+    include LogStash::Util::Loggable
+    attr_reader :name, :bind_values_array, :statement_prepared, :prepared
+
+    # Get from cache or performs remote lookup and saves to cache
+    # @param db [Sequel::Database]
+    # @param event [LogStash::Event]
+    # @returnparam [CachePayload]
+    def cache_lookup(db, event)
+      build_prepared_statement(db)
+      common_cache_lookup(db, event)
+    end
+
+    private
+
+    def execute_extract_records(db, params, result)
+      records = db.call(name, params) # returns an array of hashes
+      records.each do |row|
+        result.push row.inject({}){|hash,(k,v)| hash[k.to_s] = v; hash} #Stringify row keys
+      end
+    end
+
+    def post_init(plugin)
+      @name = plugin.prepared_statement_name.to_sym
+      @bind_values_array = plugin.prepared_statement_bind_values
+      @statement_prepared = Concurrent::AtomicBoolean.new(false)
+      @parameters = create_bind_values_hash
+    end
+
+    def build_prepared_statement(db)
+      # create prepared statement on first use
+      if statement_prepared.false?
+        prepended = parameters.keys.map{|v| v.to_s.prepend("$").to_sym}
+        @prepared = db[statement, *prepended].prepare(:select, name)
+        statement_prepared.make_true
+      end
+      # make sure the Sequel database instance has the prepared statement
+      if db.prepared_statement(name).nil?
+        db.set_prepared_statement(name, prepared)
+      end
+    end
+
+    def create_bind_values_hash
+      hash = {}
+      bind_values_array.each_with_index {|v,i| hash[:"p#{i}"] = v}
+      hash
+    end
+  end
+end end end
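
In `PreparedStatementHandler` above, the configured bind values are keyed as `:p0`, `:p1`, ... and the SQL is prepared against `$p0`, `$p1`, ... placeholders before Sequel's `prepare`/`call` cycle takes over. A tiny plain-Ruby sketch of just that naming step follows (no database involved; in the plugin the array actually holds `ParameterHandler` objects resolved per event, not the raw configured values used here):

[source,ruby]
-----
# Mirrors the naming done in PreparedStatementHandler#create_bind_values_hash
# and #build_prepared_statement, in isolation, for illustration.
bind_values_array = ["[ip]", 2] # as configured in prepared_statement_bind_values

params = {}
bind_values_array.each_with_index { |v, i| params[:"p#{i}"] = v }
puts params.inspect        # => {:p0=>"[ip]", :p1=>2}

placeholders = params.keys.map { |k| k.to_s.prepend("$").to_sym }
puts placeholders.inspect  # => [:"$p0", :"$p1"]

# The handler then does, roughly:
#   db[statement, *placeholders].prepare(:select, name)  on first use, and
#   db.call(name, params)                                per lookup, with per-event values in params.
-----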
data/lib/logstash/plugin_mixins/jdbc_streaming/wrapped_driver.rb
ADDED
@@ -0,0 +1,46 @@
+# encoding: utf-8
+
+module LogStash module PluginMixins module JdbcStreaming
+  class WrappedDriver
+    java_implements java.sql.Driver
+
+    def initialize(drv)
+      @driver = drv
+    end
+
+    java_signature 'boolean acceptsURL(String u) throws SQLException'
+    def accepts_url(u)
+      @driver.accepts_url(u)
+    end
+
+    java_signature 'Connection connect(String u, Properties p)'
+    def connect(url, props)
+      @driver.connect(url, props)
+    end
+
+    java_signature 'int getMajorVersion()'
+    def get_major_version()
+      @driver.get_major_version()
+    end
+
+    java_signature 'int getMinorVersion()'
+    def get_minor_version()
+      @driver.get_minor_version()
+    end
+
+    java_signature 'DriverPropertyInfo[] getPropertyInfo(String u, Properties p)'
+    def get_property_info(url, props)
+      @driver.get_property_info(url, props)
+    end
+
+    java_signature 'boolean jdbcCompliant()'
+    def jdbc_compliant()
+      @driver.jdbc_compliant
+    end
+
+    java_signature 'Logger getParentLogger() throws SQLFeatureNotSupportedException'
+    def get_parent_logger()
+      @driver.get_parent_logger
+    end
+  end
+end end end
data/lib/logstash/plugin_mixins/jdbc_streaming.rb
CHANGED
@@ -1,9 +1,27 @@
 # encoding: utf-8
 require "logstash/config/mixin"
+require_relative "jdbc_streaming/wrapped_driver"
 
 # Tentative of abstracting JDBC logic to a mixin
 # for potential reuse in other plugins (input/output)
 module LogStash module PluginMixins module JdbcStreaming
+  class RowCache
+    def initialize(size, ttl)
+      @cache = ::LruRedux::TTL::ThreadSafeCache.new(size, ttl)
+    end
+
+    def get(parameters)
+      @cache.getset(parameters) { yield }
+    end
+  end
+
+  class NoCache
+    def initialize(size, ttl) end
+
+    def get(statement)
+      yield
+    end
+  end
 
   # This method is called when someone includes this module
   def self.included(base)
@@ -38,19 +56,46 @@ module LogStash module PluginMixins module JdbcStreaming
     config :jdbc_validation_timeout, :validate => :number, :default => 3600
   end
 
+  private
+
+  def load_drivers
+    return if @jdbc_driver_library.nil? || @jdbc_driver_library.empty?
+    driver_jars = @jdbc_driver_library.split(",")
+
+    # Needed for JDK 11 as the DriverManager has a different ClassLoader than Logstash
+    urls = java.net.URL[driver_jars.length].new
+    driver_jars.each_with_index do |driver, idx|
+      urls[idx] = java.io.File.new(driver).toURI().toURL()
+    end
+    ucl = java.net.URLClassLoader.new_instance(urls)
+    begin
+      klass = java.lang.Class.forName(@jdbc_driver_class.to_java(:string), true, ucl);
+    rescue Java::JavaLang::ClassNotFoundException => e
+      raise LogStash::Error, "Unable to find driver class via URLClassLoader in given driver jars: #{@jdbc_driver_class}"
+    end
+    begin
+      driver = klass.getConstructor().newInstance();
+      java.sql.DriverManager.register_driver(WrappedDriver.new(driver.to_java(java.sql.Driver)).to_java(java.sql.Driver))
+    rescue Java::JavaSql::SQLException => e
+      raise LogStash::Error, "Unable to register driver with java.sql.DriverManager using WrappedDriver: #{@jdbc_driver_class}"
+    end
+
+  end
+
   public
   def prepare_jdbc_connection
     require "sequel"
     require "sequel/adapters/jdbc"
     require "java"
 
-
-
-
-
+    load_drivers
+
+    @sequel_opts_symbols = @sequel_opts.inject({}) {|hash, (k,v)| hash[k.to_sym] = v; hash}
+    @sequel_opts_symbols[:user] = @jdbc_user unless @jdbc_user.nil? || @jdbc_user.empty?
+    @sequel_opts_symbols[:password] = @jdbc_password.value unless @jdbc_password.nil?
 
     Sequel::JDBC.load_driver(@jdbc_driver_class)
-    @database = Sequel.connect(@jdbc_connection_string,
+    @database = Sequel.connect(@jdbc_connection_string, @sequel_opts_symbols)
     if @jdbc_validate_connection
       @database.extension(:connection_validator)
       @database.pool.connection_validation_timeout = @jdbc_validation_timeout
data/logstash-filter-jdbc_streaming.gemspec
CHANGED
@@ -1,6 +1,6 @@
 Gem::Specification.new do |s|
   s.name = 'logstash-filter-jdbc_streaming'
-  s.version = '1.0.
+  s.version = '1.0.9'
   s.licenses = ['Apache License (2.0)']
   s.summary = "Enrich events with your database data"
   s.description = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
data/spec/filters/jdbc_streaming_spec.rb
CHANGED
@@ -27,127 +27,91 @@ module LogStash module Filters
   let(:cache_expiration) { 3.0 }
   let(:use_cache) { true }
   let(:cache_size) { 10 }
-  let(:statement) { "SELECT name, location FROM reference_table WHERE ip = :ip" }
-  let(:settings) do
-    {
-      "statement" => statement,
-      "parameters" => {"ip" => "ip"},
-      "target" => "server",
-      "use_cache" => use_cache,
-      "cache_expiration" => cache_expiration,
-      "cache_size" => cache_size,
-      "tag_on_failure" => ["lookup_failed"],
-      "tag_on_default_use" => ["default_used_instead"],
-      "default_hash" => {"name" => "unknown", "location" => "unknown"}
-    }
-  end
 
   before :each do
     db.create_table :reference_table do
-      String
+      String :ip
       String :name
      String :location
+      Integer :gcode
    end
-    db[:reference_table].insert(:ip => "10.1.1.1", :name => "ldn-server-1", :location => "LDN-2-3-4")
-    db[:reference_table].insert(:ip => "10.2.1.1", :name => "nyc-server-1", :location => "NYC-5-2-8")
-    db[:reference_table].insert(:ip => "10.3.1.1", :name => "mv-server-1", :location => "MV-9-6-4")
-
+    db[:reference_table].insert(:ip => "10.1.1.1", :name => "ldn-server-1", :location => "LDN-2-3-4", :gcode => 3)
+    db[:reference_table].insert(:ip => "10.2.1.1", :name => "nyc-server-1", :location => "NYC-5-2-8", :gcode => 1)
+    db[:reference_table].insert(:ip => "10.3.1.1", :name => "mv-server-1", :location => "MV-9-6-4", :gcode => 1)
+    db[:reference_table].insert(:ip => "10.4.1.1", :name => "sf-server-1", :location => "SF-9-5-4", :gcode => 1)
+    db[:reference_table].insert(:ip => "10.4.1.1", :name => "mtl-server-1", :location => "MTL-9-3-4", :gcode => 2)
   end
 
   after :each do
     db.drop_table(:reference_table)
   end
 
-
-
-
-    it "fills in the target" do
-      plugin.filter(event)
-      expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
-      expect(event.get("tags") || []).not_to include("lookup_failed")
-      expect(event.get("tags") || []).not_to include("default_used_instead")
+  context "Normal Mode" do
+    before :each do
+      plugin.register
     end
-  end
-
-  describe "missing record - uses default" do
-    let(:ipaddr) { "192.168.1.1" }
 
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-    end
-  end
-
-  context "when fetching from cache" do
-    let(:plugin) { TestJdbcStreaming.new(mixin_settings.merge(settings)) }
-    let(:events) do
-      5.times.map{|i| ::LogStash::Event.new("message" => "some other text #{i}", "ip" => ipaddr) }
-    end
-    let(:call_count) { 1 }
-    before(:each) do
-      expect(plugin.database).to receive(:[]).exactly(call_count).times.and_call_original
-      plugin.filter(event)
+    let(:statement) { "SELECT name, location FROM reference_table WHERE ip = :ip" }
+    let(:settings) do
+      {
+        "statement" => statement,
+        "parameters" => {"ip" => "ip"},
+        "target" => "server",
+        "use_cache" => use_cache,
+        "cache_expiration" => cache_expiration,
+        "cache_size" => cache_size,
+        "tag_on_failure" => ["lookup_failed"],
+        "tag_on_default_use" => ["default_used_instead"],
+        "default_hash" => {"name" => "unknown", "location" => "unknown"},
+        "sequel_opts" => {"pool_timeout" => 600}
+      }
    end
 
-    describe "found record -
+    describe "found record - uses row" do
      let(:ipaddr) { "10.1.1.1" }
-
+
+      it "fills in the target" do
+        plugin.filter(event)
        expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
        expect(event.get("tags") || []).not_to include("lookup_failed")
        expect(event.get("tags") || []).not_to include("default_used_instead")
-      events.each do |evt|
-        plugin.filter(evt)
-        expect(evt.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
-      end
      end
    end
 
    describe "missing record - uses default" do
-      let(:ipaddr) { "
-
+      let(:ipaddr) { "192.168.1.1" }
+
+      it "fills in the target with the default" do
+        plugin.filter(event)
        expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
        expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
-      events.each do |evt|
-        plugin.filter(evt)
-        expect(evt.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
-      end
      end
    end
 
-
-
-
-
-
-
-
-      expect(event.get("tags") || []).not_to include("lookup_failed")
-      expect(event.get("tags") || []).not_to include("default_used_instead")
-      events.each do |evt|
-        plugin.filter(evt)
-        expect(evt.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
-      end
-    end
+    describe "database error - uses default" do
+      let(:ipaddr) { "10.1.1.1" }
+      let(:statement) { "SELECT name, location FROM reference_table WHERE ip = :address" }
+      it "fills in the target with the default" do
+        plugin.filter(event)
+        expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
+        expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["lookup_failed", "default_used_instead"])
      end
    end
 
-    context "when
-      let(:
-      let(:
-
+    context "when fetching from cache" do
+      let(:plugin) { TestJdbcStreaming.new(mixin_settings.merge(settings)) }
+      let(:events) do
+        5.times.map{|i| ::LogStash::Event.new("message" => "some other text #{i}", "ip" => ipaddr) }
+      end
+      let(:call_count) { 1 }
+      before(:each) do
+        expect(plugin.database).to receive(:[]).exactly(call_count).times.and_call_original
+        plugin.filter(event)
+      end
+
+      describe "found record - caches row" do
        let(:ipaddr) { "10.1.1.1" }
-        it "calls the database
+        it "calls the database once then uses the cache" do
          expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
          expect(event.get("tags") || []).not_to include("lookup_failed")
          expect(event.get("tags") || []).not_to include("default_used_instead")
@@ -158,9 +122,9 @@ module LogStash module Filters
        end
      end
 
-      describe "
-        let(:ipaddr) { "10.
-        it "calls the database
+      describe "missing record - uses default" do
+        let(:ipaddr) { "10.10.1.1" }
+        it "calls the database once then uses the cache" do
          expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
          expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
          events.each do |evt|
@@ -169,6 +133,113 @@ module LogStash module Filters
        end
      end
    end
+
+    context "extremely small cache expiration" do
+      describe "found record - cache always expires" do
+        let(:ipaddr) { "10.1.1.1" }
+        let(:call_count) { 6 }
+        let(:cache_expiration) { 0.0000001 }
+        it "calls the database each time because cache entry expired" do
+          expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
+          expect(event.get("tags") || []).not_to include("lookup_failed")
+          expect(event.get("tags") || []).not_to include("default_used_instead")
+          events.each do |evt|
+            plugin.filter(evt)
+            expect(evt.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
+          end
+        end
+      end
+    end
+
+    context "when cache is disabled" do
+      let(:call_count) { 6 }
+      let(:use_cache) { false }
+      describe "database is always called" do
+        let(:ipaddr) { "10.1.1.1" }
+        it "calls the database each time" do
+          expect(event.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
+          expect(event.get("tags") || []).not_to include("lookup_failed")
+          expect(event.get("tags") || []).not_to include("default_used_instead")
+          events.each do |evt|
+            plugin.filter(evt)
+            expect(evt.get("server")).to eq([{"name" => "ldn-server-1", "location" => "LDN-2-3-4"}])
+          end
+        end
+      end
+
+      describe "database is always called but record is missing and default is used" do
+        let(:ipaddr) { "10.11.1.1" }
+        it "calls the database each time" do
+          expect(event.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
+          expect(event.get("tags") & ["lookup_failed", "default_used_instead"]).to eq(["default_used_instead"])
+          events.each do |evt|
+            plugin.filter(evt)
+            expect(evt.get("server")).to eq([{"name" => "unknown", "location" => "unknown"}])
+          end
+        end
+      end
+    end
+  end
+
+  context "Prepared Statement Mode" do
+    let(:statement) { "SELECT name, location FROM reference_table WHERE (ip = ?) AND (gcode = ?)" }
+    let(:settings) do
+      {
+        "statement" => statement,
+        "use_prepared_statements" => true,
+        "prepared_statement_name" => "lookup_ip",
+        "prepared_statement_bind_values" => ["[ip]", 2],
+        "target" => "server",
+        "use_cache" => use_cache,
+        "cache_expiration" => cache_expiration,
+        "cache_size" => cache_size,
+        "tag_on_failure" => ["lookup_failed"],
+        "tag_on_default_use" => ["default_used_instead"],
+        "default_hash" => {"name" => "unknown", "location" => "unknown"},
+        "sequel_opts" => {"pool_timeout" => 600}
+      }
+    end
+
+    describe "using one variable and one constant, found record - uses row" do
+      let(:ipaddr) { "10.4.1.1" }
+
+      it "fills in the target" do
+        plugin.register
+        expect(plugin.prepared_statement_constant_warned).to be_falsey
+        plugin.filter(event)
+        expect(event.get("server")).to eq([{"name" => "mtl-server-1", "location" => "MTL-9-3-4"}])
+        expect(event.get("tags") || []).not_to include("lookup_failed")
+        expect(event.get("tags") || []).not_to include("default_used_instead")
+      end
+    end
+
+    describe "fails empty name validation" do
+      before :each do
+        settings["prepared_statement_name"] = ""
+      end
+      it "should fail to register" do
+        expect{ plugin.register }.to raise_error(LogStash::ConfigurationError)
+      end
+    end
+
+    describe "fails parameter mismatch validation" do
+      before :each do
+        settings["prepared_statement_bind_values"] = ["[ip]"]
+      end
+      it "should fail to register" do
+        expect{ plugin.register }.to raise_error(LogStash::ConfigurationError)
+      end
+    end
+
+    describe "warns on constant usage" do
+      before :each do
+        settings["prepared_statement_bind_values"] = ["ip", 2]
+      end
+      it "should set the warning logged flag" do
+        plugin.register
+        expect(plugin.prepared_statement_constant_warned).to be_truthy
+      end
    end
  end
  end
@@ -273,5 +344,6 @@ module LogStash module Filters
     expect(subject.get('new_field')).to eq([{"1" => 'from_database'}])
   end
 end
+
 end
 end end
data/spec/integration/jdbcstreaming_spec.rb
CHANGED
@@ -14,13 +14,18 @@ module LogStash module Filters
   ::Jdbc::Postgres.load_driver
 
   ENV["TZ"] = "Etc/UTC"
+
+  # For Travis and CI based on docker, we source from ENV
+  jdbc_connection_string = ENV.fetch("PG_CONNECTION_STRING",
+    "jdbc:postgresql://localhost:5432") + "/jdbc_streaming_db?user=postgres"
+
   let(:mixin_settings) do
-    { "
-      "jdbc_connection_string" =>
+    { "jdbc_driver_class" => "org.postgresql.Driver",
+      "jdbc_connection_string" => jdbc_connection_string
+    }
   end
-  let(:settings) { {} }
   let(:plugin) { JdbcStreaming.new(mixin_settings.merge(settings)) }
-  let
+  let(:db) do
    ::Sequel.connect(mixin_settings['jdbc_connection_string'])
  end
  let(:event) { ::LogStash::Event.new("message" => "some text", "ip" => ipaddr) }
@@ -38,7 +43,8 @@ module LogStash module Filters
      "cache_size" => cache_size,
      "tag_on_failure" => ["lookup_failed"],
      "tag_on_default_use" => ["default_used_instead"],
-      "default_hash" => {"name" => "unknown", "location" => "unknown"}
+      "default_hash" => {"name" => "unknown", "location" => "unknown"},
+      "sequel_opts" => {"pool_timeout" => 600}
    }
  end
  let(:ipaddr) { "10.#{idx}.1.1" }
@@ -67,6 +73,32 @@ module LogStash module Filters
    end
  end
 
+  describe "In Prepared Statement mode, found record - uses row" do
+    let(:idx) { 200 }
+    let(:statement) { "SELECT name, location FROM reference_table WHERE ip = ?" }
+    let(:settings) do
+      {
+        "statement" => statement,
+        "use_prepared_statements" => true,
+        "prepared_statement_name" => "lookup_ip",
+        "prepared_statement_bind_values" => ["[ip]"],
+        "target" => "server",
+        "use_cache" => use_cache,
+        "cache_expiration" => cache_expiration,
+        "cache_size" => cache_size,
+        "tag_on_failure" => ["lookup_failed"],
+        "tag_on_default_use" => ["default_used_instead"],
+        "default_hash" => {"name" => "unknown", "location" => "unknown"},
+        "sequel_opts" => {"pool_timeout" => 600}
+      }
+    end
+    it "fills in the target" do
+      plugin.filter(event)
+      expect(event.get("server")).to eq([{"name" => "ldn-server-#{idx}", "location" => "LDN-#{idx}-2-3"}])
+      expect((event.get("tags") || []) & ["lookup_failed", "default_used_instead"]).to be_empty
+    end
+  end
+
   context "when fetching from cache" do
     let(:plugin) { TestJdbcStreaming.new(mixin_settings.merge(settings)) }
     let(:events) do
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-filter-jdbc_streaming
 version: !ruby/object:Gem::Version
-  version: 1.0.
+  version: 1.0.9
 platform: ruby
 authors:
 - Elastic
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2019-
+date: 2019-10-24 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
@@ -115,6 +115,10 @@ files:
 - docs/index.asciidoc
 - lib/logstash/filters/jdbc_streaming.rb
 - lib/logstash/plugin_mixins/jdbc_streaming.rb
+- lib/logstash/plugin_mixins/jdbc_streaming/cache_payload.rb
+- lib/logstash/plugin_mixins/jdbc_streaming/parameter_handler.rb
+- lib/logstash/plugin_mixins/jdbc_streaming/statement_handler.rb
+- lib/logstash/plugin_mixins/jdbc_streaming/wrapped_driver.rb
 - logstash-filter-jdbc_streaming.gemspec
 - spec/filters/jdbc_streaming_spec.rb
 - spec/integration/jdbcstreaming_spec.rb