logstash-integration-jdbc 5.3.0 → 5.4.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: d9937a8c1c5ab26acb4cd887f4bdb9dfff51e9348c7187ef2d44ff6055d137ee
4
- data.tar.gz: 0042c3e29b5b08abe8baa4f84e4c240960bbc615778df9c77415f8db7985c831
3
+ metadata.gz: 71bdfca5d1e5e86b7d86b5c49caf237cfc882e31dab78d63e169241e1c96814d
4
+ data.tar.gz: 5c07e44042c4a6e6e29b8ecc7fb7631147100a27777c6fdd2a22015ac7fd3b4e
5
5
  SHA512:
6
- metadata.gz: dbc38871c106b045a500645cf6c4517b30ce15e52bfc68520845e22276efdb4a21df3e4a7709ae682cd6b55ff76e063a615b9fe397795272535aa7c5298a9a92
7
- data.tar.gz: 70b8e1dc84ba60d478affb22c93400cb153d4d0adc4904667970220ac03cb9d22b94e28f1b1a5504a30b8592c543d3234ee441d0ed57a31aefef70dfb6c2b9a3
6
+ metadata.gz: 53e45e47383b15f559e58969e96c0217c43ceb5ba3528db5d6ce3bad10628da6e4c14442aeb5a8795e17b667ac558e5b1377134bb0ecc5e60a375e6ebe106b2c
7
+ data.tar.gz: d33a2cdea75e56fbe338e175276679e2910b4023c956bc8df7d0795f58140b39da1b9167d668c7665e27b2577c086409b3e390b2a3f8d86c4f9a3e3cdef343e3
data/CHANGELOG.md CHANGED
@@ -1,3 +1,11 @@
1
+ ## 5.4.1
2
+ - Bugfix: fix a connection leak caused by creating a new database pool for every query. The pool is now created on registration and closed on the plugin's `stop` [#119](https://github.com/logstash-plugins/logstash-integration-jdbc/pull/119)
3
+
4
+ ## 5.4.0
5
+ - Ambiguous Timestamp Support [#92](https://github.com/logstash-plugins/logstash-integration-jdbc/pull/92)
6
+ - FIX: when encountering an ambiguous timestamp, the JDBC Input no longer crashes
7
+ - Added support for disambiguating timestamps in daylight saving time (DST) overlap periods
8
+
1
9
  ## 5.3.0
2
10
  - Refactor: start using scheduler mixin [#110](https://github.com/logstash-plugins/logstash-integration-jdbc/pull/110)
3
11
 
@@ -21,7 +21,7 @@ include::{include_path}/plugin_header-integration.asciidoc[]
21
21
 
22
22
  ==== Description
23
23
 
24
- This filter executes a SQL query and store the result set in the field
24
+ This filter executes a SQL query and stores the result set in the field
25
25
  specified as `target`.
26
26
  It will cache the results locally in an LRU cache with expiry.
27
27
 
@@ -286,10 +286,14 @@ JDBC connection string
286
286
  ===== `jdbc_default_timezone`
287
287
 
288
288
  * Value type is <<string,string>>
289
+ ** Value should be a canonical timezone or offset, such as `Europe/Paris` or `Etc/GMT+3`
290
+ ** Value _may_ include square-bracketed extensions, such as `America/Denver[dst_enabled_on_overlap:true]`
289
291
  * There is no default value for this setting.
290
292
 
291
- Timezone conversion.
292
- Logstash (and Elasticsearch) expects that timestamps are expressed in UTC terms.
293
+ [id="plugins-{type}s-{plugin}-jdbc_timezone_conv"]
294
+ ====== Timezone conversion
295
+
296
+ Logstash and Elasticsearch expect timestamps to be expressed in UTC terms.
293
297
  If your database has recorded timestamps that are relative to another timezone,
294
298
  the database timezone if you will, then set this setting to be the timezone that
295
299
  the database is using. However, as SQL does not allow for timezone data in
@@ -299,7 +303,15 @@ in relative UTC time in ISO8601 format.
299
303
 
300
304
  Using this setting will manually assign a specified timezone offset, instead
301
305
  of using the timezone setting of the local machine. You must use a canonical
302
- timezone, *America/Denver*, for example.
306
+ timezone, `America/Denver`, for example.
307
+
308
+ [id="plugins-{type}s-{plugin}-jdbc_ambiguous_timestamps"]
309
+ ===== Ambiguous timestamps
310
+
311
+ While it is common to store local times in SQL's timestamp column type, many timezones change their offset during the course of a calendar year and therefore cannot be used with SQL's timestamp type to represent an ordered, continuous timeline.
312
+ For example, in the `America/Chicago` zone, when daylight saving time (DST) ends in the autumn the clock rolls from `01:59:59` back to `01:00:00`, making any timestamp in the 2-hour period between `01:00:00 CDT` and `02:00:00 CST` on that day ambiguous.
313
+
314
+ When encountering an ambiguous timestamp caused by a DST transition, the query will fail unless the timezone specified here includes a square-bracketed instruction for how to handle overlapping periods (such as: `America/Chicago[dst_enabled_on_overlap:true]` or `Australia/Melbourne[dst_enabled_on_overlap:false]`).
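
To make the bracketed extension concrete, here is a minimal, editor-added Ruby sketch (not part of the plugin or its docs) using the `tzinfo` library the plugin already requires; it shows why the fall-back hour is ambiguous and how an explicit DST preference, which is what `dst_enabled_on_overlap` supplies, resolves it:

require 'tzinfo'

chicago = TZInfo::Timezone.get('America/Chicago')

# 2021-11-07 01:23:45 happened twice in this zone (once in CDT, once in CST),
# so a plain lookup cannot choose an offset and raises TZInfo::AmbiguousTime.
begin
  chicago.local_time(2021, 11, 7, 1, 23, 45)
rescue TZInfo::AmbiguousTime
  puts 'ambiguous without a DST preference'
end

# Supplying an explicit DST preference disambiguates the same wall-clock time,
# which is what the square-bracketed extension requests on the plugin's behalf:
puts chicago.local_time(2021, 11, 7, 1, 23, 45, 0, true).getutc   # 2021-11-07 06:23:45 UTC (CDT)
puts chicago.local_time(2021, 11, 7, 1, 23, 45, 0, false).getutc  # 2021-11-07 07:23:45 UTC (CST)

In pipeline configuration this corresponds to setting `jdbc_default_timezone => "America/Chicago[dst_enabled_on_overlap:true]"` (or `false`) on the input.
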
303
315
 
304
316
  [id="plugins-{type}s-{plugin}-plugin_timezone"]
305
317
  ===== `plugin_timezone`
@@ -261,8 +261,6 @@ module LogStash module Inputs class Jdbc < LogStash::Inputs::Base
261
261
  end
262
262
  end
263
263
 
264
- prepare_jdbc_connection
265
-
266
264
  if @use_column_value
267
265
  # Raise an error if @use_column_value is true, but no @tracking_column is set
268
266
  if @tracking_column.nil?
@@ -305,6 +303,20 @@ module LogStash module Inputs class Jdbc < LogStash::Inputs::Base
305
303
  converters[encoding] = converter
306
304
  end
307
305
  end
306
+
307
+ load_driver
308
+ begin
309
+ open_jdbc_connection
310
+ rescue Sequel::DatabaseConnectionError,
311
+ Sequel::DatabaseError,
312
+ Sequel::InvalidValue,
313
+ Java::JavaSql::SQLException => e
314
+ details = { exception: e.class, message: e.message }
315
+ details[:cause] = e.cause.inspect if e.cause
316
+ details[:backtrace] = e.backtrace if @logger.debug?
317
+ @logger.warn("Exception when executing JDBC query", details)
318
+ raise(LogStash::ConfigurationError, "Can't create a connection pool to the database")
319
+ end
308
320
  end # def register
309
321
 
310
322
  # test injection points
@@ -317,7 +329,6 @@ module LogStash module Inputs class Jdbc < LogStash::Inputs::Base
317
329
  end
318
330
 
319
331
  def run(queue)
320
- load_driver
321
332
  if @schedule
322
333
  # scheduler input thread name example: "[my-oracle]|input|jdbc|scheduler"
323
334
  scheduler.cron(@schedule) { execute_query(queue) }
@@ -4,6 +4,7 @@ require "logstash/config/mixin"
4
4
  require "time"
5
5
  require "date"
6
6
  require_relative "value_tracking"
7
+ require_relative "timezone_proxy"
7
8
  require_relative "statement_handler"
8
9
 
9
10
  java_import java.util.concurrent.locks.ReentrantLock
@@ -82,7 +83,8 @@ module LogStash module PluginMixins module Jdbc
82
83
  # Using this setting will manually assign a specified timezone offset, instead
83
84
  # of using the timezone setting of the local machine. You must use a canonical
84
85
  # timezone, *America/Denver*, for example.
85
- config :jdbc_default_timezone, :validate => :string
86
+ config :jdbc_default_timezone, :validate => :jdbc_timezone_spec
87
+ extend TimezoneProxy::JDBCTimezoneSpecValidator
86
88
 
87
89
  # General/Vendor-specific Sequel configuration options.
88
90
  #
@@ -157,7 +159,7 @@ module LogStash module PluginMixins module Jdbc
157
159
  @database.extension(:pagination)
158
160
  if @jdbc_default_timezone
159
161
  @database.extension(:named_timezones)
160
- @database.timezone = @jdbc_default_timezone
162
+ @database.timezone = TimezoneProxy.load(@jdbc_default_timezone)
161
163
  end
162
164
  if @jdbc_validate_connection
163
165
  @database.extension(:connection_validator)
@@ -186,22 +188,14 @@ module LogStash module PluginMixins module Jdbc
186
188
  end
187
189
  end
188
190
 
189
- public
190
- def prepare_jdbc_connection
191
- @connection_lock = ReentrantLock.new
192
- end
193
-
194
191
  public
195
192
  def close_jdbc_connection
196
193
  begin
197
194
  # pipeline restarts can also close the jdbc connection, block until the current executing statement is finished to avoid leaking connections
198
195
  # connections in use won't really get closed
199
- @connection_lock.lock
200
196
  @database.disconnect if @database
201
197
  rescue => e
202
198
  @logger.warn("Failed to close connection", :exception => e)
203
- ensure
204
- @connection_lock.unlock
205
199
  end
206
200
  end
207
201
 
@@ -209,8 +203,6 @@ module LogStash module PluginMixins module Jdbc
209
203
  def execute_statement
210
204
  success = false
211
205
  begin
212
- @connection_lock.lock
213
- open_jdbc_connection
214
206
  sql_last_value = @use_column_value ? @value_tracker.value : Time.now.utc
215
207
  @tracking_column_warning_sent = false
216
208
  @statement_handler.perform_query(@database, @value_tracker.value) do |row|
@@ -218,16 +210,16 @@ module LogStash module PluginMixins module Jdbc
218
210
  yield extract_values_from(row)
219
211
  end
220
212
  success = true
221
- rescue Sequel::DatabaseConnectionError, Sequel::DatabaseError, Java::JavaSql::SQLException => e
213
+ rescue Sequel::DatabaseConnectionError,
214
+ Sequel::DatabaseError,
215
+ Sequel::InvalidValue,
216
+ Java::JavaSql::SQLException => e
222
217
  details = { exception: e.class, message: e.message }
223
218
  details[:cause] = e.cause.inspect if e.cause
224
219
  details[:backtrace] = e.backtrace if @logger.debug?
225
220
  @logger.warn("Exception when executing JDBC query", details)
226
221
  else
227
222
  @value_tracker.set_value(sql_last_value)
228
- ensure
229
- close_jdbc_connection
230
- @connection_lock.unlock
231
223
  end
232
224
  return success
233
225
  end
@@ -0,0 +1,61 @@
1
+ # encoding: utf-8
2
+
3
+ require 'tzinfo'
4
+
5
+ module LogStash module PluginMixins module Jdbc
6
+ ##
7
+ # This `TimezoneProxy` allows timezone specs to include extensions indicating a preference for how ambiguous local times are handled.
8
+ # @see TimezoneProxy::load
9
+ module TimezoneProxy
10
+ ##
11
+ # @param timezone_spec [String]: a timezone spec, consisting of any valid timezone identifier
12
+ # followed by square-bracketed extensions. Currently-supported
13
+ # extensions are:
14
+ # `dst_enabled_on_overlap:(true|false)`: when encountering an ambiguous time
15
+ # due to daylight-savings transition,
16
+ # assume DST to be either enabled or
17
+ # disabled instead of raising an
18
+ # AmbiguousTime exception
19
+ # @return [TZInfo::Timezone]
20
+ def self.load(timezone_spec)
21
+ # re-load pass-through
22
+ return timezone_spec if timezone_spec.kind_of?(::TZInfo::Timezone)
23
+
24
+ parsed_spec = /\A(?<name>[^\[]+)(\[(?<extensions>[^\]]*)\])?\z/.match(timezone_spec)
25
+
26
+ timezone = ::TZInfo::Timezone.get(parsed_spec[:name])
27
+ return timezone unless parsed_spec[:extensions]
28
+
29
+ parsed_spec[:extensions].split(';').each do |extension_spec|
30
+ timezone = case extension_spec
31
+ when 'dst_enabled_on_overlap:true' then timezone.dup.extend(PeriodForLocalWithDSTPreference::ON)
32
+ when 'dst_enabled_on_overlap:false' then timezone.dup.extend(PeriodForLocalWithDSTPreference::OFF)
33
+ else fail(ArgumentError, "Invalid timezone extension `#{extension_spec}`")
34
+ end
35
+ end
36
+
37
+ timezone
38
+ end
39
+
40
+ module JDBCTimezoneSpecValidator
41
+ def validate_value(value, validator_name)
42
+ return super(value, validator_name) unless validator_name == :jdbc_timezone_spec
43
+
44
+ [true, TimezoneProxy.load(value)] rescue [false, $!.message]
45
+ end
46
+ end
47
+
48
+ ##
49
+ # @api private
50
+ class PeriodForLocalWithDSTPreference < Module
51
+ def initialize(default_dst_enabled_on_overlap)
52
+ define_method(:period_for_local) do |localtime, dst_enabled_on_overlap=nil, &disambiguation_block|
53
+ super(localtime, dst_enabled_on_overlap.nil? ? default_dst_enabled_on_overlap : dst_enabled_on_overlap, &disambiguation_block)
54
+ end
55
+ end
56
+
57
+ ON = new(true)
58
+ OFF = new(false)
59
+ end
60
+ end
61
+ end; end; end
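
A brief, editor-added usage sketch of the new `TimezoneProxy.load` helper, assuming the gem's `lib` directory is on the load path (as in the spec file added below); the expected results mirror the specs in this release:

require 'logstash/plugin_mixins/jdbc/timezone_proxy'

TimezoneProxy = LogStash::PluginMixins::Jdbc::TimezoneProxy

# A bare identifier resolves to the ordinary TZInfo timezone...
plain = TimezoneProxy.load('America/Chicago')
plain == TZInfo::Timezone.get('America/Chicago') # => true

# ...while a bracketed extension pre-sets the DST preference used when a local
# time falls in the DST overlap, so lookups no longer raise TZInfo::AmbiguousTime:
prefer_dst = TimezoneProxy.load('America/Chicago[dst_enabled_on_overlap:true]')
prefer_dst.local_time(2021, 11, 7, 1, 23, 45).getutc # => 2021-11-07 06:23:45 UTC

# Unsupported extensions are rejected up front:
TimezoneProxy.load('America/Chicago[nope:wrong]')
# ArgumentError: Invalid timezone extension `nope:wrong`
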
@@ -14,7 +14,7 @@ module LogStash module PluginMixins module Jdbc
14
14
  # use this irrespective of the jdbc_default_timezone setting
15
15
  NumericValueTracker.new(handler)
16
16
  else
17
- if plugin.jdbc_default_timezone.nil? || plugin.jdbc_default_timezone.empty?
17
+ if plugin.jdbc_default_timezone.nil?
18
18
  # no TZ stuff for Sequel, use Time
19
19
  TimeValueTracker.new(handler)
20
20
  else
@@ -1,6 +1,6 @@
1
1
  Gem::Specification.new do |s|
2
2
  s.name = 'logstash-integration-jdbc'
3
- s.version = '5.3.0'
3
+ s.version = '5.4.1'
4
4
  s.licenses = ['Apache License (2.0)']
5
5
  s.summary = "Integration with JDBC - input and filter plugins"
6
6
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -76,10 +76,8 @@ describe LogStash::Inputs::Jdbc, :integration => true do
76
76
  end
77
77
 
78
78
  it "should not register correctly" do
79
- plugin.register
80
- q = Queue.new
81
79
  expect do
82
- plugin.run(q)
80
+ plugin.register
83
81
  end.to raise_error(::LogStash::PluginLoadingError)
84
82
  end
85
83
  end
@@ -92,16 +90,13 @@ describe LogStash::Inputs::Jdbc, :integration => true do
92
90
  end
93
91
 
94
92
  it "log warning msg when plugin run" do
95
- plugin.register
96
93
  expect( plugin ).to receive(:log_java_exception)
97
94
  expect(plugin.logger).to receive(:warn).once.with("Exception when executing JDBC query",
98
95
  hash_including(:message => instance_of(String)))
99
- q = Queue.new
100
- expect{ plugin.run(q) }.not_to raise_error
96
+ expect{ plugin.register }.to raise_error(::LogStash::ConfigurationError)
101
97
  end
102
98
 
103
99
  it "should log (native) Java driver error" do
104
- plugin.register
105
100
  expect( org.apache.logging.log4j.LogManager ).to receive(:getLogger).and_wrap_original do |m, *args|
106
101
  logger = m.call(*args)
107
102
  expect( logger ).to receive(:error) do |_, e|
@@ -109,8 +104,7 @@ describe LogStash::Inputs::Jdbc, :integration => true do
109
104
  end.and_call_original
110
105
  logger
111
106
  end
112
- q = Queue.new
113
- expect{ plugin.run(q) }.not_to raise_error
107
+ expect{ plugin.register }.to raise_error(::LogStash::ConfigurationError)
114
108
  end
115
109
  end
116
110
  end
@@ -696,7 +696,7 @@ describe LogStash::Inputs::Jdbc do
696
696
  "last_run_metadata_path" => Stud::Temporary.pathname }
697
697
  end
698
698
 
699
- let(:nums) { [BigDecimal.new(10), BigDecimal.new(20), BigDecimal.new(30), BigDecimal.new(40), BigDecimal.new(50)] }
699
+ let(:nums) { [BigDecimal(10), BigDecimal(20), BigDecimal(30), BigDecimal(40), BigDecimal(50)] }
700
700
 
701
701
  before do
702
702
  plugin.register
@@ -1504,6 +1504,49 @@ describe LogStash::Inputs::Jdbc do
1504
1504
  end
1505
1505
  end
1506
1506
 
1507
+ context "when retrieving records with ambiguous timestamps" do
1508
+
1509
+ let(:settings) do
1510
+ {
1511
+ "statement" => "SELECT * from types_table",
1512
+ "jdbc_default_timezone" => jdbc_default_timezone
1513
+ }
1514
+ end
1515
+
1516
+ before(:each) do
1517
+ db << "INSERT INTO types_table (num, string, started_at, custom_time, ranking) VALUES (1, 'A test', '1999-12-31', '2021-11-07 01:23:45', 95.67)"
1518
+ plugin.register
1519
+ end
1520
+
1521
+ context "when initialized with a preference for DST being enabled" do
1522
+ let(:jdbc_default_timezone) { 'America/Chicago[dst_enabled_on_overlap:true]' }
1523
+
1524
+ it 'treats the timestamp column as if DST was enabled' do
1525
+ plugin.run(queue)
1526
+ event = queue.pop
1527
+ expect(event.get("custom_time")).to be_a_logstash_timestamp_equivalent_to("2021-11-07T06:23:45Z")
1528
+ end
1529
+ end
1530
+ context "when initialized with a preference for DST being disabled" do
1531
+ let(:jdbc_default_timezone) { 'America/Chicago[dst_enabled_on_overlap:false]' }
1532
+
1533
+ it 'treats the timestamp column as if DST was disabled' do
1534
+ plugin.run(queue)
1535
+ event = queue.pop
1536
+ expect(event.get("custom_time")).to be_a_logstash_timestamp_equivalent_to("2021-11-07T07:23:45Z")
1537
+ end
1538
+ end
1539
+ context "when initialized without a preference for DST being enabled or disabled" do
1540
+ before(:each) { allow(plugin.logger).to receive(:warn) }
1541
+ let(:jdbc_default_timezone) { 'America/Chicago' }
1542
+
1543
+ it 'results in a helpful log warning' do
1544
+ plugin.run(queue)
1545
+ expect(plugin.logger).to have_received(:warn).with(a_string_including("Exception when executing JDBC query"), a_hash_including(:message => a_string_including("2021-11-07 01:23:45 is an ambiguous local time")))
1546
+ end
1547
+ end
1548
+ end
1549
+
1507
1550
  context "when an unreadable jdbc_driver_path entry is present" do
1508
1551
  let(:driver_jar_path) do
1509
1552
  jar_file = $CLASSPATH.find { |name| name.index(Jdbc::Derby.driver_jar) }
@@ -1522,16 +1565,12 @@ describe LogStash::Inputs::Jdbc do
1522
1565
  { "statement" => "SELECT * from types_table", "jdbc_driver_library" => invalid_driver_jar_path }
1523
1566
  end
1524
1567
 
1525
- before do
1526
- plugin.register
1527
- end
1528
-
1529
1568
  after do
1530
1569
  plugin.stop
1531
1570
  end
1532
1571
 
1533
1572
  it "raise a loading error" do
1534
- expect { plugin.run(queue) }.
1573
+ expect { plugin.register }.
1535
1574
  to raise_error(LogStash::PluginLoadingError, /unable to load .*? from :jdbc_driver_library, file not readable/)
1536
1575
  end
1537
1576
  end
@@ -0,0 +1,68 @@
1
+ # encoding: utf-8
2
+ require "logstash/devutils/rspec/spec_helper"
3
+ require "logstash/plugin_mixins/jdbc/timezone_proxy"
4
+
5
+ describe LogStash::PluginMixins::Jdbc::TimezoneProxy do
6
+ subject(:timezone) { described_class.load(timezone_spec) }
7
+
8
+ context 'when handling a daylight-savings ambiguous time' do
9
+ context 'without extensions' do
10
+ let(:timezone_spec) { 'America/Los_Angeles[]' }
11
+ it 'raises an AmbiguousTime error' do
12
+ expect { timezone.local_time(2021,11,7,1,17) }.to raise_error(::TZInfo::AmbiguousTime)
13
+ end
14
+ end
15
+ context 'with extension `dst_enabled_on_overlap:true`' do
16
+ let(:timezone_spec) { 'America/Los_Angeles[dst_enabled_on_overlap:true]' }
17
+ it 'resolves as if DST were enabled' do
18
+ timestamp = timezone.local_time(2021,11,7,1,17)
19
+ aggregate_failures do
20
+ expect(timestamp.dst?).to be true
21
+ expect(timestamp.zone).to eq('PDT') # Pacific Daylight Time
22
+ expect(timestamp.getutc).to eq(Time.utc(2021,11,7,8,17))
23
+ expect(timestamp.utc_offset).to eq( -7 * 3600 )
24
+ end
25
+ end
26
+ end
27
+ context 'with extension `dst_enabled_on_overlap:false`' do
28
+ let(:timezone_spec) { 'America/Los_Angeles[dst_enabled_on_overlap:false]' }
29
+ it 'resolves as if DST were disabled' do
30
+ timestamp = timezone.local_time(2021,11,7,1,17)
31
+ aggregate_failures do
32
+ expect(timestamp.dst?).to be false
33
+ expect(timestamp.zone).to eq('PST') # Pacific Standard Time
34
+ expect(timestamp.getutc).to eq(Time.utc(2021,11,7,9,17))
35
+ expect(timestamp.utc_offset).to eq( -8 * 3600 )
36
+ end
37
+ end
38
+ end
39
+ end
40
+
41
+ context '#load' do
42
+ context 'when spec is a normal timezone instance' do
43
+ let(:timezone_spec) { ::TZInfo::Timezone.get('America/Los_Angeles') }
44
+ it 'returns that instance' do
45
+ expect(timezone).to be(timezone_spec)
46
+ end
47
+ end
48
+ context 'when spec is a valid unextended timezone spec' do
49
+ let(:timezone_spec) { 'America/Los_Angeles' }
50
+ it 'returns the canonical timezone' do
51
+ expect(timezone).to eq(::TZInfo::Timezone.get('America/Los_Angeles'))
52
+ end
53
+ end
54
+ context 'when spec is an invalid timezone spec' do
55
+ let(:timezone_spec) { 'NotAValidTimezoneIdentifier' }
56
+
57
+ it 'propagates the TZInfo exception' do
58
+ expect { timezone }.to raise_exception(::TZInfo::InvalidTimezoneIdentifier)
59
+ end
60
+ end
61
+ context 'with invalid extension' do
62
+ let(:timezone_spec) { 'America/Los_Angeles[dst_enabled_on_overlap:false;nope:wrong]' }
63
+ it 'raises an exception with a helpful message' do
64
+ expect { timezone }.to raise_exception(ArgumentError, a_string_including("Invalid timezone extension `nope:wrong`"))
65
+ end
66
+ end
67
+ end
68
+ end
metadata CHANGED
@@ -1,14 +1,14 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: logstash-integration-jdbc
3
3
  version: !ruby/object:Gem::Version
4
- version: 5.3.0
4
+ version: 5.4.1
5
5
  platform: ruby
6
6
  authors:
7
7
  - Elastic
8
8
  autorequire:
9
9
  bindir: bin
10
10
  cert_chain: []
11
- date: 2022-06-08 00:00:00.000000000 Z
11
+ date: 2022-11-04 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
14
  requirement: !ruby/object:Gem::Requirement
@@ -278,6 +278,7 @@ files:
278
278
  - lib/logstash/plugin_mixins/jdbc/common.rb
279
279
  - lib/logstash/plugin_mixins/jdbc/jdbc.rb
280
280
  - lib/logstash/plugin_mixins/jdbc/statement_handler.rb
281
+ - lib/logstash/plugin_mixins/jdbc/timezone_proxy.rb
281
282
  - lib/logstash/plugin_mixins/jdbc/value_tracking.rb
282
283
  - lib/logstash/plugin_mixins/jdbc_streaming.rb
283
284
  - lib/logstash/plugin_mixins/jdbc_streaming/cache_payload.rb
@@ -305,6 +306,7 @@ files:
305
306
  - spec/helpers/derbyrun.jar
306
307
  - spec/inputs/integration/integ_spec.rb
307
308
  - spec/inputs/jdbc_spec.rb
309
+ - spec/plugin_mixins/jdbc/timezone_proxy_spec.rb
308
310
  - spec/plugin_mixins/jdbc_streaming/parameter_handler_spec.rb
309
311
  - vendor/jar-dependencies/org/apache/derby/derby/10.14.1.0/derby-10.14.1.0.jar
310
312
  - vendor/jar-dependencies/org/apache/derby/derbyclient/10.14.1.0/derbyclient-10.14.1.0.jar
@@ -357,4 +359,5 @@ test_files:
357
359
  - spec/helpers/derbyrun.jar
358
360
  - spec/inputs/integration/integ_spec.rb
359
361
  - spec/inputs/jdbc_spec.rb
362
+ - spec/plugin_mixins/jdbc/timezone_proxy_spec.rb
360
363
  - spec/plugin_mixins/jdbc_streaming/parameter_handler_spec.rb