sequel 5.22.0 → 5.23.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: dab3e09e5e2003ee0c53b63f8b628b723be5aeb019bb5b1d2669f7583f589ef3
- data.tar.gz: 422d4fc893082e88256f0f2ba8b2f82288aea6f7438c0ef68793f30acc2307e0
+ metadata.gz: dfbb05761d35a09172ba5f1d4d7847ae58f152e50648e1fa63d4b284f1999ad5
+ data.tar.gz: f7d4b90e74df4096171baba005935648519a0333aff77c5ba840bb634be8a4fd
  SHA512:
- metadata.gz: 844854b68bf2b57c802422496b88dc54d9ec97549da18a5b1d5c04a2ac294286de8cff74c6e8e25f21b35e188c1d9abb97ab9eae427b1bce7c0040779f602da2
- data.tar.gz: 52317931ac3e82d63bc307c7c970bed264e1412af59ba979069ebb4867e89138901a388419a80b41c88b6d581d959ba2650bced6bc146911913d230fd890aec5
+ metadata.gz: 345e4d6691206c1a353a2d103dc835b1fcaebed9f274717aa7356a3caf0da237aa14f168c659774261d02621934229966f9f75ad901eb663806359cb924df532
+ data.tar.gz: 0db48e9f07d0dbd598742e85640b8dba965d27bf476e120bf1568e61f7fb19f513e8d9d82b8d69310d1bfde44333553490e14e354a69ce2623943297fe17649f
data/CHANGELOG CHANGED
@@ -1,3 +1,27 @@
+ === 5.23.0 (2019-08-01)
+
+ * Work around a bug in jdbc-sqlite3 3.27.2.1 when parsing schema for tables with columns with default values (jeremyevans)
+
+ * Work around a bug in jdbc-sqlite3 3.27.2.1 in Database#foreign_key_list in the jdbc/sqlite3 adapter (jeremyevans)
+
+ * Make Dataset#execute* private methods respect an explicit :server option, fixing Dataset#paged_each in the postgres adapter when sharding (jeremyevans) (#1632)
+
+ * Handle instances of subclasses of core classes when wrapping objects in the pg_json extension (jeremyevans) (#1631)
+
+ * Support :ansi Database option in the tinytds adapter (kenaniah) (#1629)
+
+ * Support cross-database and linked servers when parsing schema on Microsoft SQL Server (kenaniah) (#1629)
+
+ * Add insert_conflict plugin for automatically handling unique constraint conflicts when saving new model instances on PostgreSQL 9.5+ and SQLite 3.24.0+ (jeremyevans)
+
+ * Avoid errors when parsing schema in the mock sqlite adapter (jeremyevans)
+
+ * Avoid possible thread-safety issue in the timezones support (jeremyevans)
+
+ * Handle offsets when typecasting an array or hash to datetime when Sequel.datetime_class = Time (jeremyevans)
+
+ * Support Sequel.datetime_class = Time when using the named_timezones extension (jeremyevans)
+
  === 5.22.0 (2019-07-01)

  * Fix Dataset#multi_insert and #import with return: :primary_key on MSSQL when the dataset has a row_proc (jeremyevans) (#1627)
@@ -106,6 +106,21 @@ You can also use placeholders with :placeholder and a hash of placeholder values
  items.where(Sequel.lit('category = :category', category: "ruby")).sql
  # "SELECT * FROM items WHERE category = 'ruby'"

+ In order to combine AND and OR together, you have a few options:
+
+ items.where(category: nil).or(category: "ruby")
+ # SELECT * FROM items WHERE (category IS NULL) OR (category = 'ruby')
+
+ This won't work if you add other conditions:
+
+ items.where(name: "Programming in Ruby").where(category: nil).or(category: 'ruby')
+ # SELECT * FROM items WHERE ((name = 'Programming in Ruby') AND (category IS NULL)) OR (category = 'ruby')
+
+ The OR applies globally and not locally. To fix this, use & and |:
+
+ items.where(Sequel[name: "Programming in Ruby"] & (Sequel[category: nil] | Sequel[category: "ruby"]))
+ # SELECT * FROM items WHERE ((name = 'Programming in Ruby') AND ((category IS NULL) OR (category = 'ruby')))
+
  === Specifying SQL functions

  Sequel also allows you to specify functions by using the Sequel.function method:
@@ -401,6 +401,9 @@ options that you may want to set are :login_timeout, :timeout, :tds_version, :az

  Other Sequel specific options:

+ :ansi :: Set to true to enable the ANSI compatibility settings when connecting
+ (ANSI_NULLS, ANSI_PADDING, ANSI_WARNINGS, ANSI_NULL_DFLT_ON, QUOTED_IDENTIFIER,
+ CONCAT_NULL_YIELDS_NULL).
  :server_version :: Override the server version to use (9000000 = SQL Server 2005).
  This also works on any other adapter that connects to Microsoft
  SQL Server.
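
A minimal sketch of how the :ansi option described above would be passed to Sequel.connect; the connection parameters (host, database, user, password) here are placeholders:

  DB = Sequel.connect(
    adapter: 'tinytds',
    host: 'mssqlhost',
    database: 'mydb',
    user: 'user',
    password: 'password',
    ansi: true
  )
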
@@ -0,0 +1,56 @@
+ = New Features
+
+ * An insert_conflict plugin has been added for automatically handling
+ constraint conflicts when saving new model instances. It is
+ supported on PostgreSQL 9.5+ and SQLite 3.24.0+.
+
+ Album.new(name: 'Foo', copies_sold: 1000).
+ insert_conflict(
+ target: :name,
+ update: {copies_sold: Sequel[:excluded][:copies_sold]}
+ ).
+ save
+
+ * On Microsoft SQL Server, the Database :ansi option has been added,
+ which sets the following ANSI related options:
+
+ * ANSI_NULLS
+ * ANSI_PADDING
+ * ANSI_WARNINGS
+ * ANSI_NULL_DFLT_ON
+ * QUOTED_IDENTIFIER
+ * CONCAT_NULL_YIELDS_NULL
+
+ = Other Improvements
+
+ * Sequel.datetime_class = Time is now supported when using the
+ named_timezones extension. For backwards compatibility, the
+ named_timezones extension still sets
+ Sequel.datetime_class = DateTime. When using Ruby 2.6+, the
+ Time instances have the timezone set on them using Ruby 2.6+'s
+ timezone support, but basic support works correctly in earlier
+ versions of Ruby.
+
+ * On Microsoft SQL Server, Sequel now handles parsing schema for
+ tables in another database on the same server or in a database
+ on a linked server.
+
+ * The pg_json extension now correctly handles subclasses of core
+ classes when wrapping objects. This stopped working in Sequel
+ 5.21.0, when support for wrapping JSON primitives was added.
+
+ * Sequel now works around a couple of bugs in jdbc-sqlite3 3.27.2.1,
+ allowing schema parsing and foreign key parsing to work.
+
+ * Dataset#execute* private methods now respect an explicitly given
+ :server option, fixing Dataset#paged_each in the postgres adapter
+ when using sharding.
+
+ * Timezone offsets are now handled correctly when typecasting an array
+ or hash to datetime when Sequel.datetime_class = Time.
+
+ * Sequel now avoids errors when parsing schema when using the mock
+ SQLite adapter.
+
+ * A minor thread-safety issue has been fixed in the named_timezones
+ extension.
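
As a rough illustration of the insert_conflict example in the release notes above, on PostgreSQL the generated statement should look approximately like the following (assuming the conventional albums table for the Album model; exact quoting may differ):

  # INSERT INTO albums (name, copies_sold) VALUES ('Foo', 1000)
  # ON CONFLICT (name) DO UPDATE SET copies_sold = excluded.copies_sold
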
@@ -169,3 +169,4 @@ SEQUEL_NO_PENDING :: Don't skip any specs, try running all specs (note, can caus
  SEQUEL_PG_TIMESTAMPTZ :: Use the pg_timestamptz extension when running the postgres specs
  SEQUEL_SPLIT_SYMBOLS :: Turn on symbol splitting when running the adapter and integration specs
  SEQUEL_SYNCHRONIZE_SQL :: Use the synchronize_sql extension when running the specs
+ SEQUEL_TZINFO_VERSION :: Force the given tzinfo version when running the specs (e.g. '>=2')
@@ -15,6 +15,24 @@ module Sequel
  end

  module SQLite
+ module ForeignKeyListPragmaConvertorFix
+ # Force the use of the convertor for String, working around a bug
+ # in jdbc-sqlite3 that reports fields are of type
+ # java.sql.types.NUMERIC even though they contain non-numeric data.
+ def type_convertor(_, _, _, i)
+ i > 2 ? TypeConvertor::CONVERTORS[:String] : super
+ end
+ end
+
+ module TableInfoPragmaConvertorFix
+ # Force the use of the convertor for String, working around a bug
+ # in jdbc-sqlite3 that reports the dflt_value field is of type
+ # java.sql.types.NUMERIC even though it contains string data.
+ def type_convertor(_, _, _, i)
+ i == 5 ? TypeConvertor::CONVERTORS[:String] : super
+ end
+ end
+
  module DatabaseMethods
  include Sequel::SQLite::DatabaseMethods

@@ -37,6 +55,17 @@ module Sequel
  end

  private
+
+
+ # Add workaround for bug when running foreign_key_list pragma
+ def _foreign_key_list_ds(_)
+ super.with_extend(ForeignKeyListPragmaConvertorFix)
+ end
+
+ # Add workaround for bug when running table_info pragma
+ def _parse_pragma_ds(_, _)
+ super.with_extend(TableInfoPragmaConvertorFix)
+ end

  DATABASE_ERROR_REGEXPS = Sequel::SQLite::DatabaseMethods::DATABASE_ERROR_REGEXPS.merge(/Abort due to constraint violation/ => ConstraintViolation).freeze
  def database_error_regexps
@@ -427,20 +427,19 @@ module Sequel
  m = output_identifier_meth(opts[:dataset])
  m2 = input_identifier_meth(opts[:dataset])
  tn = m2.call(table_name.to_s)
- table_id = get(Sequel.function(:object_id, tn))
  info_sch_sch = opts[:information_schema_schema]
  inf_sch_qual = lambda{|s| info_sch_sch ? Sequel.qualify(info_sch_sch, s) : Sequel[s]}
- sys_qual = lambda{|s| info_sch_sch ? Sequel.qualify(info_sch_sch, Sequel.qualify(Sequel.lit(''), s)) : Sequel[s]}
+ table_id = metadata_dataset.from(inf_sch_qual.call(Sequel[:sys][:objects])).where(:name => tn).select_map(:object_id).first

- identity_cols = metadata_dataset.from(Sequel.lit('[sys].[columns]')).
+ identity_cols = metadata_dataset.from(inf_sch_qual.call(Sequel[:sys][:columns])).
  where(:object_id=>table_id, :is_identity=>true).
  select_map(:name)

- pk_index_id = metadata_dataset.from(sys_qual.call(Sequel.lit('sysindexes'))).
+ pk_index_id = metadata_dataset.from(inf_sch_qual.call(Sequel[:sys][:sysindexes])).
  where(:id=>table_id, :indid=>1..254){{(status & 2048)=>2048}}.
  get(:indid)
- pk_cols = metadata_dataset.from(sys_qual.call(Sequel.lit('sysindexkeys')).as(:sik)).
- join(sys_qual.call(Sequel.lit('syscolumns')).as(:sc), :id=>:id, :colid=>:colid).
+ pk_cols = metadata_dataset.from(inf_sch_qual.call(Sequel[:sys][:sysindexkeys]).as(:sik)).
+ join(inf_sch_qual.call(Sequel[:sys][:syscolumns]).as(:sc), :id=>:id, :colid=>:colid).
  where{{sik[:id]=>table_id, sik[:indid]=>pk_index_id}}.
  select_order_map{sc[:name]}

@@ -10,6 +10,10 @@ module Sequel
  def self.mock_adapter_setup(db)
  db.instance_exec do
  @sqlite_version = 30903
+
+ def schema_parse_table(*)
+ []
+ end
  end
  end

@@ -60,7 +64,7 @@ module Sequel
  def foreign_key_list(table, opts=OPTS)
  m = output_identifier_meth
  h = {}
- metadata_dataset.with_sql("PRAGMA foreign_key_list(?)", input_identifier_meth.call(table)).each do |row|
+ _foreign_key_list_ds(table).each do |row|
  if r = h[row[:id]]
  r[:columns] << m.call(row[:from])
  r[:key] << m.call(row[:to]) if r[:key]
@@ -173,6 +177,16 @@ module Sequel

  private

+ # Dataset used for parsing foreign key lists
+ def _foreign_key_list_ds(table)
+ metadata_dataset.with_sql("PRAGMA foreign_key_list(?)", input_identifier_meth.call(table))
+ end
+
+ # Dataset used for parsing schema
+ def _parse_pragma_ds(table_name, opts)
+ metadata_dataset.with_sql("PRAGMA table_info(?)", input_identifier_meth(opts[:dataset]).call(table_name))
+ end
+
  # Run all alter_table commands in a transaction. This is technically only
  # needed for drop column.
  def apply_alter_table(table, ops)
@@ -445,7 +459,7 @@ module Sequel
  # Parse the output of the table_info pragma
  def parse_pragma(table_name, opts)
  pks = 0
- sch = metadata_dataset.with_sql("PRAGMA table_info(?)", input_identifier_meth(opts[:dataset]).call(table_name)).map do |row|
+ sch = _parse_pragma_ds(table_name, opts).map do |row|
  row.delete(:cid)
  row[:allow_null] = row.delete(:notnull).to_i == 0
  row[:default] = row.delete(:dflt_value)
@@ -16,6 +16,18 @@ module Sequel
  c = TinyTds::Client.new(opts)
  c.query_options.merge!(:cache_rows=>false)

+ if opts[:ansi]
+ sql = %w(
+ ANSI_NULLS
+ ANSI_PADDING
+ ANSI_WARNINGS
+ ANSI_NULL_DFLT_ON
+ QUOTED_IDENTIFIER
+ CONCAT_NULL_YIELDS_NULL
+ ).map{|v| "SET #{v} ON"}.join(";")
+ log_connection_yield(sql, c){c.execute(sql)}
+ end
+
  if (ts = opts[:textsize])
  sql = "SET TEXTSIZE #{typecast_value_integer(ts)}"
  log_connection_yield(sql, c){c.execute(sql)}
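
For clarity, the %w(...).map{}.join(";") in the added code above sends all of the settings as a single batch, roughly:

  # SET ANSI_NULLS ON;SET ANSI_PADDING ON;SET ANSI_WARNINGS ON;SET ANSI_NULL_DFLT_ON ON;SET QUOTED_IDENTIFIER ON;SET CONCAT_NULL_YIELDS_NULL ON
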
@@ -333,6 +333,7 @@ module Sequel
  # after every 50 records.
  # :return :: When this is set to :primary_key, returns an array of
  # autoincremented primary key values for the rows inserted.
+ # This does not have an effect if +values+ is a Dataset.
  # :server :: Set the server/shard to use for the transaction and insert
  # queries.
  # :slice :: Same as :commit_every, :commit_every takes precedence.
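
A short sketch of the two cases the added note distinguishes (the tables, columns, and returned ids here are illustrative):

  # Array of rows: autoincremented primary keys can be returned
  DB[:albums].import([:name], [['A'], ['B']], return: :primary_key)
  # => e.g. [1, 2]

  # Dataset source: rows are inserted via INSERT ... SELECT,
  # so return: :primary_key has no effect
  DB[:albums].import([:name], DB[:old_albums].select(:name))
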
@@ -1069,7 +1070,7 @@ module Sequel

  # Set the server to use to :default unless it is already set in the passed opts
  def default_server_opts(opts)
- if @db.sharded?
+ if @db.sharded? && !opts.has_key?(:server)
  opts = Hash[opts]
  opts[:server] = @opts[:server] || :default
  end
@@ -1080,7 +1081,7 @@ module Sequel
  # :read_only server unless a specific server is set.
  def execute(sql, opts=OPTS, &block)
  db = @db
- if db.sharded?
+ if db.sharded? && !opts.has_key?(:server)
  opts = Hash[opts]
  opts[:server] = @opts[:server] || (@opts[:lock] ? :default : :read_only)
  opts
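
The two changes above are what let an explicitly chosen shard stick for the whole paged operation. A usage sketch, assuming a sharded Database with a hypothetical :read_replica shard configured:

  DB[:albums].server(:read_replica).paged_each do |row|
    # each page's query now runs against :read_replica instead of
    # being switched to the :read_only/:default shard
  end
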
@@ -2,18 +2,21 @@
  #
  # Allows the use of named timezones via TZInfo (requires tzinfo).
  # Forces the use of DateTime as Sequel's datetime_class, since
- # ruby's Time class doesn't support timezones other than local
- # and UTC.
+ # historically, Ruby's Time class doesn't support timezones other
+ # than local and UTC. To continue using Ruby's Time class when using
+ # the named_timezones extension:
+ #
+ # # Load the extension
+ # Sequel.extension :named_timezones
+ #
+ # # Set Sequel.datetime_class back to Time
+ # Sequel.datetime_class = Time
  #
  # This allows you to either pass strings or TZInfo::Timezone
  # instance to Sequel.database_timezone=, application_timezone=, and
  # typecast_timezone=. If a string is passed, it is converted to a
  # TZInfo::Timezone using TZInfo::Timezone.get.
  #
- # To load the extension:
- #
- # Sequel.extension :named_timezones
- #
  # Let's say you have the database server in New York and the
  # application server in Los Angeles. For historical reasons, data
  # is stored in local New York time, but the application server only
@@ -37,7 +40,8 @@
  # Note that typecasting from the database timezone to the application
  # timezone when fetching rows is dependent on the database adapter,
  # and only works on adapters where Sequel itself does the conversion.
- # It should work on mysql, postgres, sqlite, ibmdb, and jdbc.
+ # It should work with the mysql, postgres, sqlite, ibmdb, and jdbc
+ # adapters.
  #
  # Related module: Sequel::NamedTimezones

@@ -63,10 +67,48 @@ module Sequel

  private

- # Handle both TZInfo 1 and TZInfo 2
- if defined?(TZInfo::VERSION) && TZInfo::VERSION > '2'
+ if RUBY_VERSION >= '2.6'
+ # Convert the given input Time (which must be in UTC) to the given input timezone,
+ # which should be a TZInfo::Timezone instance.
+ def convert_input_time_other(v, input_timezone)
+ Time.new(v.year, v.mon, v.day, v.hour, v.min, (v.sec + Rational(v.nsec, 1000000000)), input_timezone)
+ rescue TZInfo::AmbiguousTime
+ raise unless disamb = tzinfo_disambiguator_for(v)
+ period = input_timezone.period_for_local(v, &disamb)
+ offset = period.utc_total_offset
+ Time.at(v.to_i - offset, :in => input_timezone)
+ end
+
+ # Convert the given input Time to the given output timezone,
+ # which should be a TZInfo::Timezone instance.
+ def convert_output_time_other(v, output_timezone)
+ Time.at(v.to_i, :in => output_timezone)
+ end
+ else
  # :nodoc:
  # :nocov:
+ def convert_input_time_other(v, input_timezone)
+ local_offset = input_timezone.period_for_local(v, &tzinfo_disambiguator_for(v)).utc_total_offset
+ Time.new(1970, 1, 1, 0, 0, 0, local_offset) + v.to_i
+ end
+
+ if defined?(TZInfo::VERSION) && TZInfo::VERSION > '2'
+ def convert_output_time_other(v, output_timezone)
+ v = output_timezone.utc_to_local(v.getutc)
+ local_offset = output_timezone.period_for_local(v, &tzinfo_disambiguator_for(v)).utc_total_offset
+ Time.new(1970, 1, 1, 0, 0, 0, local_offset) + v.to_i + local_offset
+ end
+ else
+ def convert_output_time_other(v, output_timezone)
+ v = output_timezone.utc_to_local(v.getutc)
+ local_offset = output_timezone.period_for_local(v, &tzinfo_disambiguator_for(v)).utc_total_offset
+ Time.new(1970, 1, 1, 0, 0, 0, local_offset) + v.to_i
+ end
+ end
+ end
+
+ # Handle both TZInfo 1 and TZInfo 2
+ if defined?(TZInfo::VERSION) && TZInfo::VERSION > '2'
  def convert_input_datetime_other(v, input_timezone)
  local_offset = Rational(input_timezone.period_for_local(v, &tzinfo_disambiguator_for(v)).utc_total_offset, 86400)
  (v - local_offset).new_offset(local_offset)
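
The RUBY_VERSION >= '2.6' branch above relies on Ruby 2.6 accepting a timezone object (such as a TZInfo::Timezone) as the zone argument to Time.new and Time.at. A standalone sketch of that core behavior, independent of Sequel (exact return values depend on the tzinfo version in use):

  require 'tzinfo'

  tz = TZInfo::Timezone.get('America/New_York')
  t = Time.new(2019, 8, 1, 12, 0, 0, tz)  # a Time in the named zone
  t.utc_offset                            # => -14400 during DST
  Time.at(t.to_i, :in => tz)              # convert an epoch value back into the same zone
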
@@ -340,14 +340,18 @@ module Sequel
  raise Sequel::Error, "invalid array, empty string" if eos?
  raise Sequel::Error, "invalid array, doesn't start with {" unless scan(/((\[\d+:\d+\])+=)?\{/)

+ # :nocov:
  while !eos?
+ # :nocov:
  char = scan(/[{}",]|[^{}",]+/)
  if char == ','
  # Comma outside quoted string indicates end of current entry
  new_entry
  elsif char == '"'
  raise Sequel::Error, "invalid array, opening quote with existing recorded data" unless @recorded.empty?
+ # :nocov:
  while true
+ # :nocov:
  char = scan(/["\\]|[^"\\]+/)
  if char == '\\'
  @recorded << getch
@@ -208,11 +208,19 @@ module Sequel
  JSONB_PRIMITIVE_WRAPPER_MAPPING.freeze

  JSON_COMBINED_WRAPPER_MAPPING = JSON_WRAPPER_MAPPING.merge(JSON_PRIMITIVE_WRAPPER_MAPPING).freeze
- JSON_WRAP_CLASSES = JSON_COMBINED_WRAPPER_MAPPING.keys.freeze
-
  JSONB_COMBINED_WRAPPER_MAPPING = JSONB_WRAPPER_MAPPING.merge(JSONB_PRIMITIVE_WRAPPER_MAPPING).freeze
  JSONB_WRAP_CLASSES = JSONB_COMBINED_WRAPPER_MAPPING.keys.freeze

+ Sequel::Deprecation.deprecate_constant(self, :JSON_WRAPPER_MAPPING)
+ Sequel::Deprecation.deprecate_constant(self, :JSONB_WRAPPER_MAPPING)
+ Sequel::Deprecation.deprecate_constant(self, :JSON_PRIMITIVE_WRAPPER_MAPPING)
+ Sequel::Deprecation.deprecate_constant(self, :JSONB_PRIMITIVE_WRAPPER_MAPPING)
+ Sequel::Deprecation.deprecate_constant(self, :JSON_COMBINED_WRAPPER_MAPPING)
+ Sequel::Deprecation.deprecate_constant(self, :JSONB_COMBINED_WRAPPER_MAPPING)
+ Sequel::Deprecation.deprecate_constant(self, :JSONB_WRAP_CLASSES)
+
+ JSON_WRAP_CLASSES = [Hash, Array, String, Integer, Float, NilClass, TrueClass, FalseClass].freeze
+
  # Methods enabling Database object integration with the json type.
  module JSONDatabaseMethods
  def self.extended(db)
@@ -228,6 +236,69 @@ module Sequel
  end
  end

+ # Return the wrapper class for the json type if value is Hash or Array.
+ def self.json_wrapper(value)
+ case value
+ when ::Hash
+ JSONHash
+ when ::Array
+ JSONArray
+ end
+ end
+
+ # Return the wrapper class for the jsonb type if value is Hash or Array.
+ def self.jsonb_wrapper(value)
+ case value
+ when ::Hash
+ JSONBHash
+ when ::Array
+ JSONBArray
+ end
+ end
+
+ # Return the wrapper class for the json type if value is a supported type.
+ def self.json_primitive_wrapper(value)
+ case value
+ when ::Hash
+ JSONHash
+ when ::Array
+ JSONArray
+ when ::String
+ JSONString
+ when ::Integer
+ JSONInteger
+ when ::Float
+ JSONFloat
+ when ::NilClass
+ JSONNull
+ when ::TrueClass
+ JSONTrue
+ when ::FalseClass
+ JSONFalse
+ end
+ end
+
+ # Return the wrapper class for the jsonb type if value is a supported type.
+ def self.jsonb_primitive_wrapper(value)
+ case value
+ when ::Hash
+ JSONBHash
+ when ::Array
+ JSONBArray
+ when ::String
+ JSONBString
+ when ::Integer
+ JSONBInteger
+ when ::Float
+ JSONBFloat
+ when ::NilClass
+ JSONBNull
+ when ::TrueClass
+ JSONBTrue
+ when ::FalseClass
+ JSONBFalse
+ end
+ end

  # Deprecated
  def self.db_parse_json(s)
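
A quick sketch of what the helpers added above return once the pg_json extension is loaded (the values here are just examples):

  Sequel::Postgres::JSONDatabaseMethods.json_wrapper({})              # => Sequel::Postgres::JSONHash
  Sequel::Postgres::JSONDatabaseMethods.json_primitive_wrapper(1.5)   # => Sequel::Postgres::JSONFloat
  Sequel::Postgres::JSONDatabaseMethods.jsonb_primitive_wrapper(nil)  # => Sequel::Postgres::JSONBNull
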
@@ -326,9 +397,9 @@ module Sequel
  # Wrap the parsed JSON value in the appropriate JSON wrapper class.
  # Only wrap primitive values if wrap_json_primitives is set.
  def _wrap_json(value)
- if klass = JSON_WRAPPER_MAPPING[value.class]
+ if klass = JSONDatabaseMethods.json_wrapper(value)
  klass.new(value)
- elsif klass = JSON_PRIMITIVE_WRAPPER_MAPPING[value.class]
+ elsif klass = JSONDatabaseMethods.json_primitive_wrapper(value)
  if wrap_json_primitives
  klass.new(value)
  else
@@ -342,9 +413,9 @@ module Sequel
  # Wrap the parsed JSON value in the appropriate JSONB wrapper class.
  # Only wrap primitive values if wrap_json_primitives is set.
  def _wrap_jsonb(value)
- if klass = JSONB_WRAPPER_MAPPING[value.class]
+ if klass = JSONDatabaseMethods.jsonb_wrapper(value)
  klass.new(value)
- elsif klass = JSONB_PRIMITIVE_WRAPPER_MAPPING[value.class]
+ elsif klass = JSONDatabaseMethods.jsonb_primitive_wrapper(value)
  if wrap_json_primitives
  klass.new(value)
  else
@@ -413,10 +484,10 @@ module Sequel
  _wrap_json(_parse_json(value))
  end
  when *JSON_WRAP_CLASSES
- JSON_COMBINED_WRAPPER_MAPPING[value.class].new(value)
+ JSONDatabaseMethods.json_primitive_wrapper(value).new(value)
  when JSONBObject
  value = value.__getobj__
- JSON_COMBINED_WRAPPER_MAPPING[value.class].new(value)
+ JSONDatabaseMethods.json_primitive_wrapper(value).new(value)
  else
  raise Sequel::InvalidValue, "invalid value for json: #{value.inspect}"
  end
@@ -433,11 +504,11 @@ module Sequel
  else
  _wrap_jsonb(_parse_json(value))
  end
- when *JSONB_WRAP_CLASSES
- JSONB_COMBINED_WRAPPER_MAPPING[value.class].new(value)
+ when *JSON_WRAP_CLASSES
+ JSONDatabaseMethods.jsonb_primitive_wrapper(value).new(value)
  when JSONObject
  value = value.__getobj__
- JSONB_COMBINED_WRAPPER_MAPPING[value.class].new(value)
+ JSONDatabaseMethods.jsonb_primitive_wrapper(value).new(value)
  else
  raise Sequel::InvalidValue, "invalid value for jsonb: #{value.inspect}"
  end
@@ -460,7 +531,7 @@ module Sequel
  Postgres::JSONHash.new(v)
  when Postgres::JSONBObject
  v = v.__getobj__
- Postgres::JSON_COMBINED_WRAPPER_MAPPING[v.class].new(v)
+ Postgres::JSONDatabaseMethods.json_primitive_wrapper(v).new(v)
  else
  Sequel.pg_json_op(v)
  end
@@ -472,7 +543,7 @@ module Sequel
  def pg_json_wrap(v)
  case v
  when *Postgres::JSON_WRAP_CLASSES
- Postgres::JSON_COMBINED_WRAPPER_MAPPING[v.class].new(v)
+ Postgres::JSONDatabaseMethods.json_primitive_wrapper(v).new(v)
  else
  raise Error, "invalid value passed to Sequel.pg_json_wrap: #{v.inspect}"
  end
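
The switch from a class-keyed hash lookup to json_primitive_wrapper is what makes subclasses of the core classes work again: case/when matches subclass instances, while JSON_COMBINED_WRAPPER_MAPPING[v.class] returned nil for them. A small sketch (MyString is a hypothetical subclass):

  DB.extension :pg_json

  class MyString < String; end

  Sequel.pg_json_wrap(MyString.new('x'))   # now wrapped as a JSON string value
  Sequel.pg_jsonb_wrap({'a' => 1})         # => a Postgres::JSONBHash, as before
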
@@ -492,7 +563,7 @@ module Sequel
  Postgres::JSONBHash.new(v)
  when Postgres::JSONObject
  v = v.__getobj__
- Postgres::JSONB_COMBINED_WRAPPER_MAPPING[v.class].new(v)
+ Postgres::JSONDatabaseMethods.jsonb_primitive_wrapper(v).new(v)
  else
  Sequel.pg_jsonb_op(v)
  end
@@ -503,8 +574,8 @@ module Sequel
  # other types.
  def pg_jsonb_wrap(v)
  case v
- when *Postgres::JSONB_WRAP_CLASSES
- Postgres::JSONB_COMBINED_WRAPPER_MAPPING[v.class].new(v)
+ when *Postgres::JSON_WRAP_CLASSES
+ Postgres::JSONDatabaseMethods.jsonb_primitive_wrapper(v).new(v)
  else
  raise Error, "invalid value passed to Sequel.pg_jsonb_wrap: #{v.inspect}"
  end