sequel 5.66.0 → 5.68.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 7284ac53d44e98a2ddaf7c0e8068fe0716a9071bef14c484bbe77cc553d775f0
- data.tar.gz: '0828c8d85355205f8b6ee8a58ba5f9b85ebec07381a687f08fa909e883ac8eed'
+ metadata.gz: e682467c455e044847fffb313b6d8a669e0dc89ca8ea330468841de642b84227
+ data.tar.gz: cd0687fd059a6b6d38cb31279d5cdacccd4a0a309f1155171f12551d43eacb0b
  SHA512:
- metadata.gz: d5e42c07446ea4cd3b42eb566a20cf81b4fb7e641d7b521ead6bcfc32a8ff0a0bf7b76d7a2a8ba654a0bdd80a67db6def93ba1efba61dd5eeebf68a14cdb722a
- data.tar.gz: ad7397637ab9bfb0903eff1c46c8ed5901bc2eff3a4ba5a2411ba61fb32f9e6d00871063cc55f6d21798f5790f0a5158c19f0343839a6d6750b53a3301395c90
+ metadata.gz: 3c16f027251aa67e04cab0973a57da21682e083c7362f37bdbf84937266ca6878e200fb7c44a0583b8ac6fab62d1fd4b55744929760fc63a9d1f850452db5df6
+ data.tar.gz: 60eee01297c47ee9c18ae76b6974b6eb3e492e58c7f408ee9c32476f1ae0ae5848ffc47cb80889f7070b02088e7e66509b64e460a1ef61c90ffb5a59da6a3c09
data/CHANGELOG CHANGED
@@ -1,3 +1,29 @@
+ === 5.68.0 (2023-05-01)
+
+ * Add validation_helpers_generic_type_messages plugin for more useful type validation failure messages (jeremyevans) (#2028)
+
+ * Make constraint_validations plugin not validate missing columns that have a default value (jeremyevans) (#2023)
+
+ * Skip normal type name parsing for enum/array/composite/range/multirange types on PostgreSQL (jeremyevans) (#2019)
+
+ * Fix corner case where pg_extended_date_support did not work correctly when using the jdbc/postgresql adapter (jeremyevans)
+
+ * Include :min_value and :max_value schema entries for date/timestamp/timestamptz columns on PostgreSQL 9.6+ (jeremyevans)
+
+ === 5.67.0 (2023-04-01)
+
+ * Fix dumping of string column sizes in the schema dumper on MSSQL (jeremyevans) (#2013)
+
+ * Improve dumping of tables in non-default schemas in the schema_dumper extension (jeremyevans) (#2006)
+
+ * Make Database#{tables,views} support :qualify option on Microsoft SQL Server (jeremyevans)
+
+ * Avoid use of singleton classes for dataset instances on Ruby 2.4+ (jeremyevans) (#2007)
+
+ * Deprecate registering dataset extensions using an object other than a module (jeremyevans)
+
+ * Add set_literalizer extension, for treating set usage in datasets similar to array usage (jeremyevans) (#1997)
+
  === 5.66.0 (2023-03-01)
 
  * Recognize SQLite error related to strict tables as a constraint violation when using the amalgalite adapter (jeremyevans)
@@ -670,8 +670,10 @@ polymorphic associations in Sequel about as easy as it is in ActiveRecord. Howe
  here's how they can be done using Sequel's custom associations (the sequel_polymorphic
  external plugin is just a generic version of this code):
 
+ Sequel.extension :inflector # for attachable_type.constantize
+
  class Asset < Sequel::Model
-   many_to_one :attachable, reciprocal: :assets,
+   many_to_one :attachable, reciprocal: :assets, reciprocal_type: :one_to_many,
      setter: (lambda do |attachable|
        self[:attachable_id] = (attachable.pk if attachable)
        self[:attachable_type] = (attachable.class.name if attachable)
doc/release_notes/5.67.0.txt ADDED
@@ -0,0 +1,32 @@
+ = New Features
+
+ * A set_literalizer extension has been added, for treating Set
+   instances in datasets similar to Array instances:
+
+     DB.extension :set_literalizer
+     DB[:table].where(column: Set.new([1, 2, 3]))
+     # SELECT FROM table WHERE (column IN (1, 2, 3))
+
+ = Improvements
+
+ * Sequel now avoids the use of singleton classes for datasets on Ruby
+   2.4+, instead creating a regular subclass whenever a dataset would
+   be extended via #extension or #with_extend. This significantly
+   improves performance, up to 20-40% for common dataset usage,
+   because it avoids creating new singleton classes for every dataset
+   clone, and it allows for cached method lookup.
+
+ * Database#tables and #views now support a :qualify option on Microsoft
+   SQL Server to return qualified identifiers.
+
+ * The schema_dumper extension can now dump tables in non-default schemas
+   when using Microsoft SQL Server.
+
+ * The schema_dumper extension now correctly dumps string column sizes
+   when using Microsoft SQL Server.
+
+ = Backwards Compatibility
+
+ * Calling Sequel::Dataset.register_extension where the second argument
+   is not a module now issues a deprecation warning. Support for this
+   will be removed in Sequel 6.
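A quick sketch of the new :qualify option described in the notes above (the schema and table names here are hypothetical, not taken from this diff):

    # On Microsoft SQL Server
    DB.tables(schema: :reports)
    # => [:invoices, :totals]
    DB.tables(schema: :reports, qualify: true)
    # => qualified identifiers, roughly Sequel[:reports][:invoices] and Sequel[:reports][:totals]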
doc/release_notes/5.68.0.txt ADDED
@@ -0,0 +1,61 @@
+ = New Features
+
+ * On PostgreSQL 9.6+, date, timestamp, and timestamptz columns now
+   have min_value and max_value column schema entries, allowing the
+   auto_validations plugin to automatically enforce minimum and
+   maximum values for these column types, raising a validation error
+   before saving, instead of a database error when the query is sent
+   to the database.
+
+ * A validation_helpers_generic_type_messages plugin has been added,
+   which improves the default type validation error messages in
+   validation_helpers. This change was not made directly to
+   validation_helpers for backwards compatibility reasons, but will
+   probably become the default behavior in Sequel 6. Some examples
+   of the improvements:
+
+     # :blob type
+     # validation_helpers default: "value is not a valid sequel::sql::blob"
+     # with this plugin: "value is not a blob"
+
+     # :boolean type
+     # validation_helpers default: "value is not a valid trueclass or falseclass"
+     # with this plugin: "value is not true or false"
+
+     # :datetime type
+     # validation_helpers default: "value is not a valid time or datetime"
+     # with this plugin: "value is not a valid timestamp"
+
+     # custom/database-specific types
+     # validation_helpers default: "value is not a valid sequel::class_name"
+     # with this plugin: "value is not the expected type"
+
+ = Improvements
+
+ * The constraint_validations plugin no longer raises validation
+   errors for missing columns that have a default value. If a column
+   is missing but has a default value, we can assume the default
+   value is valid. Additionally, the constraint validations are now
+   based on the underlying column value and not any deserialized
+   value, so that the validation matches the constraint.
+
+ * On PostgreSQL, normal type name parsing is skipped for enum,
+   array, composite, range, and multirange types, ensuring that
+   such types will not be treated incorrectly based on their
+   type name.
+
+ * The pg_extended_date_support extension now works correctly with
+   the jdbc/postgresql adapter if there is already an entry in the
+   oid_convertor_map for the timestamp and timestamptz types.
+
+ = Backwards Compatibility
+
+ * Database#schema_column_type is no longer called for all columns
+   on PostgreSQL when parsing schema. Enum, array, composite, range,
+   and multirange types each have their own separate type parsing
+   method. So if you were overriding Database#schema_column_type to
+   implement custom column schema parsing, you may need to adjust
+   your code.
+
+ * The Sequel::Postgres::ExtendedDateSupport::DATE_YEAR_1 constant
+   has been removed.
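A hedged sketch of how the new min_value/max_value schema entries surface through auto_validations, per the first item above (the model and column names are hypothetical):

    # PostgreSQL 9.6+, Album having a date column release_date
    Album.plugin :auto_validations
    album = Album.new(release_date: Date.new(6_000_000, 1, 1)) # past PostgreSQL's maximum date
    album.valid?  # => false, reported as a validation error on release_date
    # Previously, the problem would only surface as a database error when saving.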
@@ -404,10 +404,15 @@ module Sequel
  # Backbone of the tables and views support.
  def information_schema_tables(type, opts)
    m = output_identifier_meth
-   metadata_dataset.from(Sequel[:information_schema][:tables].as(:t)).
+   schema = opts[:schema]||'dbo'
+   tables = metadata_dataset.from(Sequel[:information_schema][:tables].as(:t)).
      select(:table_name).
-     where(:table_type=>type, :table_schema=>(opts[:schema]||'dbo').to_s).
+     where(:table_type=>type, :table_schema=>schema.to_s).
      map{|x| m.call(x[:table_name])}
+
+   tables.map!{|t| Sequel.qualify(m.call(schema).to_s, m.call(t).to_s)} if opts[:qualify]
+
+   tables
  end
 
  # Always quote identifiers in the metadata_dataset, so schema parsing works.
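For illustration, the opts[:qualify] branch above builds Sequel::SQL::QualifiedIdentifier values via Sequel.qualify; the schema and table names below are hypothetical:

    Sequel.qualify('reports', 'invoices')
    # => a Sequel::SQL::QualifiedIdentifier
    DB.literal(Sequel.qualify('reports', 'invoices'))
    # "[reports].[invoices]" with Microsoft SQL Server's bracket quoting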
@@ -1007,11 +1007,15 @@ module Sequel
    SQL::Function.new(:format_type, pg_type[:oid], pg_attribute[:atttypmod]).as(:db_type),
    SQL::Function.new(:pg_get_expr, pg_attrdef[:adbin], pg_class[:oid]).as(:default),
    SQL::BooleanExpression.new(:NOT, pg_attribute[:attnotnull]).as(:allow_null),
-   SQL::Function.new(:COALESCE, SQL::BooleanExpression.from_value_pairs(pg_attribute[:attnum] => SQL::Function.new(:ANY, pg_index[:indkey])), false).as(:primary_key)]}.
+   SQL::Function.new(:COALESCE, SQL::BooleanExpression.from_value_pairs(pg_attribute[:attnum] => SQL::Function.new(:ANY, pg_index[:indkey])), false).as(:primary_key),
+   Sequel[:pg_type][:typtype],
+   (~Sequel[Sequel[:elementtype][:oid]=>nil]).as(:is_array),
+   ]}.
  from(:pg_class).
  join(:pg_attribute, :attrelid=>:oid).
  join(:pg_type, :oid=>:atttypid).
  left_outer_join(Sequel[:pg_type].as(:basetype), :oid=>:typbasetype).
+ left_outer_join(Sequel[:pg_type].as(:elementtype), :typarray=>Sequel[:pg_type][:oid]).
  left_outer_join(:pg_attrdef, :adrelid=>Sequel[:pg_class][:oid], :adnum=>Sequel[:pg_attribute][:attnum]).
  left_outer_join(:pg_index, :indrelid=>Sequel[:pg_class][:oid], :indisprimary=>true).
  where{{pg_attribute[:attisdropped]=>false}}.
@@ -1538,11 +1542,12 @@ module Sequel
  end
 
  # SQL DDL statement for renaming a table. PostgreSQL doesn't allow you to change a table's schema in
- # a rename table operation, so speciying a new schema in new_name will not have an effect.
+ # a rename table operation, so specifying a new schema in new_name will not have an effect.
  def rename_table_sql(name, new_name)
    "ALTER TABLE #{quote_schema_table(name)} RENAME TO #{quote_identifier(schema_and_table(new_name).last)}"
  end
 
+ # Handle interval and citext types.
  def schema_column_type(db_type)
    case db_type
    when /\Ainterval\z/io
@@ -1554,6 +1559,43 @@ module Sequel
    end
  end
 
+ # The schema :type entry to use for array types.
+ def schema_array_type(db_type)
+   :array
+ end
+
+ # The schema :type entry to use for row/composite types.
+ def schema_composite_type(db_type)
+   :composite
+ end
+
+ # The schema :type entry to use for enum types.
+ def schema_enum_type(db_type)
+   :enum
+ end
+
+ # The schema :type entry to use for range types.
+ def schema_range_type(db_type)
+   :range
+ end
+
+ # The schema :type entry to use for multirange types.
+ def schema_multirange_type(db_type)
+   :multirange
+ end
+
+ MIN_DATE = Date.new(-4713, 11, 24)
+ MAX_DATE = Date.new(5874897, 12, 31)
+ MIN_TIMESTAMP = Time.utc(-4713, 11, 24).freeze
+ MAX_TIMESTAMP = (Time.utc(294277) - Rational(1, 1000000)).freeze
+ TYPTYPE_METHOD_MAP = {
+   'c' => :schema_composite_type,
+   'e' => :schema_enum_type,
+   'r' => :schema_range_type,
+   'm' => :schema_multirange_type,
+ }
+ TYPTYPE_METHOD_MAP.default = :schema_column_type
+ TYPTYPE_METHOD_MAP.freeze
  # The dataset used for parsing table schemas, using the pg_* system catalogs.
  def schema_parse_table(table_name, opts)
    m = output_identifier_meth(opts[:dataset])
@@ -1569,11 +1611,33 @@ module Sequel
    row.delete(:base_oid)
    row.delete(:db_base_type)
  end
- row[:type] = schema_column_type(row[:db_type])
+
+ db_type = row[:db_type]
+ row[:type] = if row.delete(:is_array)
+   schema_array_type(db_type)
+ else
+   send(TYPTYPE_METHOD_MAP[row.delete(:typtype)], db_type)
+ end
  identity = row.delete(:attidentity)
  if row[:primary_key]
    row[:auto_increment] = !!(row[:default] =~ /\A(?:nextval)/i) || identity == 'a' || identity == 'd'
  end
+
+ # :nocov:
+ if server_version >= 90600
+ # :nocov:
+   case row[:oid]
+   when 1082
+     row[:min_value] = MIN_DATE
+     row[:max_value] = MAX_DATE
+   when 1184, 1114
+     if Sequel.datetime_class == Time
+       row[:min_value] = MIN_TIMESTAMP
+       row[:max_value] = MAX_TIMESTAMP
+     end
+   end
+ end
+
    [m.call(row.delete(:name)), row]
  end
  end
@@ -30,19 +30,29 @@ module Sequel
    @dataset_class.new(self)
  end
 
- # Fetches records for an arbitrary SQL statement. If a block is given,
- # it is used to iterate over the records:
+ # Returns a dataset instance for the given SQL string:
  #
- #   DB.fetch('SELECT * FROM items'){|r| p r}
+ #   ds = DB.fetch('SELECT * FROM items')
+ #
+ # You can then call methods on the dataset to retrieve results:
  #
- # The +fetch+ method returns a dataset instance:
+ #   ds.all
+ #   # SELECT * FROM items
+ #   # => [{:column=>value, ...}, ...]
  #
- #   DB.fetch('SELECT * FROM items').all
+ # If a block is given, it is passed to #each on the resulting dataset to
+ # iterate over the records returned by the query:
+ #
+ #   DB.fetch('SELECT * FROM items'){|r| p r}
+ #   # {:column=>value, ...}
+ #   # ...
  #
  # +fetch+ can also perform parameterized queries for protection against SQL
  # injection:
  #
- #   DB.fetch('SELECT * FROM items WHERE name = ?', my_name).all
+ #   ds = DB.fetch('SELECT * FROM items WHERE name = ?', "my name")
+ #   ds.all
+ #   # SELECT * FROM items WHERE name = 'my name'
  #
  # See caveats listed in Dataset#with_sql regarding datasets using custom
  # SQL and the methods that can be called on them.
@@ -307,7 +307,7 @@ module Sequel
  # Examples:
  #   primary_key(:id)
  #   primary_key(:id, type: :Bignum, keep_order: true)
- #   primary_key([:street_number, :house_number], name: :some constraint_name)
+ #   primary_key([:street_number, :house_number], name: :some_constraint_name)
  def primary_key(name, *args)
    return composite_primary_key(name, *args) if name.is_a?(Array)
    column = @db.serial_primary_key_options.merge({:name => name})
@@ -900,7 +900,7 @@ module Sequel
  #
  # Any other object given is just converted to a string, with "_" converted to " " and upcased.
  def on_delete_clause(action)
-   action.to_s.gsub("_", " ").upcase
+   action.to_s.tr("_", " ").upcase
  end
 
  # Alias of #on_delete_clause, since the two usually behave the same.
lib/sequel/dataset/deprecated_singleton_class_methods.rb ADDED
@@ -0,0 +1,42 @@
+ # frozen-string-literal: true
+
+ module Sequel
+   class Dataset
+     # This module implements methods to support deprecated use of extensions registered
+     # not using a module. In such cases, for backwards compatibility, Sequel has to use
+     # a singleton class for the dataset.
+     module DeprecatedSingletonClassMethods
+       # Load the extension into a clone of the receiver.
+       def extension(*a)
+         c = _clone(:freeze=>false)
+         c.send(:_extension!, a)
+         c.freeze
+       end
+
+       # Extend a clone of the receiver with the given modules, instead of the default
+       # approach of creating a subclass of the receiver's class and including the modules
+       # into that.
+       def with_extend(*mods, &block)
+         c = _clone(:freeze=>false)
+         c.extend(*mods) unless mods.empty?
+         c.extend(DatasetModule.new(&block)) if block
+         c.freeze
+       end
+
+       private
+
+       # Load the extensions into the receiver.
+       def _extension!(exts)
+         Sequel.extension(*exts)
+         exts.each do |ext|
+           if pr = Sequel.synchronize{EXTENSIONS[ext]}
+             pr.call(self)
+           else
+             raise(Error, "Extension #{ext} does not have specific support handling individual datasets (try: Sequel.extension #{ext.inspect})")
+           end
+         end
+         self
+       end
+     end
+   end
+ end
@@ -12,6 +12,10 @@ module Sequel
  # in the extension).
  EXTENSIONS = {}
 
+ # Hash of extension name symbols to modules to load to implement the extension.
+ EXTENSION_MODULES = {}
+ private_constant :EXTENSION_MODULES
+
  EMPTY_ARRAY = [].freeze
 
  # The dataset options that require the removal of cached columns if changed.
@@ -45,12 +49,8 @@ module Sequel
  METHS
 
  # Register an extension callback for Dataset objects. ext should be the
- # extension name symbol, and mod should either be a Module that the
- # dataset is extended with, or a callable object called with the database
- # object. If mod is not provided, a block can be provided and is treated
- # as the mod object.
- #
- # If mod is a module, this also registers a Database extension that will
+ # extension name symbol, and mod should be a Module that will be
+ # included in the dataset's class. This also registers a Database extension that will
  # extend all of the database's datasets.
  def self.register_extension(ext, mod=nil, &block)
    if mod
@@ -58,10 +58,16 @@ module Sequel
      if mod.is_a?(Module)
        block = proc{|ds| ds.extend(mod)}
        Sequel::Database.register_extension(ext){|db| db.extend_datasets(mod)}
+       Sequel.synchronize{EXTENSION_MODULES[ext] = mod}
      else
        block = mod
      end
    end
+
+   unless mod.is_a?(Module)
+     Sequel::Deprecation.deprecate("Providing a block or non-module to Sequel::Dataset.register_extension is deprecated and support for it will be removed in Sequel 6.")
+   end
+
    Sequel.synchronize{EXTENSIONS[ext] = block}
  end
 
@@ -195,11 +201,15 @@ module Sequel
  if TRUE_FREEZE
    # Return a clone of the dataset loaded with the given dataset extensions.
    # If no related extension file exists or the extension does not have
-   # specific support for Dataset objects, an Error will be raised.
-   def extension(*a)
-     c = _clone(:freeze=>false)
-     c.send(:_extension!, a)
-     c.freeze
+   # specific support for Dataset objects, an error will be raised.
+   def extension(*exts)
+     Sequel.extension(*exts)
+     mods = exts.map{|ext| Sequel.synchronize{EXTENSION_MODULES[ext]}}
+     if mods.all?
+       with_extend(*mods)
+     else
+       with_extend(DeprecatedSingletonClassMethods).extension(*exts)
+     end
    end
  else
    # :nocov:
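With the EXTENSION_MODULES registry above, loading an extension that registered a module is effectively a with_extend call with that module. A rough sketch using the set_literalizer extension added elsewhere in this release:

    ds = DB[:items].extension(:set_literalizer)
    # behaves like:
    ds = DB[:items].with_extend(Sequel::Dataset::SetLiteralizer)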
@@ -1199,16 +1209,27 @@ module Sequel
  end
 
  if TRUE_FREEZE
-   # Return a clone of the dataset extended with the given modules.
+   # Create a subclass of the receiver's class, and include the given modules
+   # into it. If a block is provided, a DatasetModule is created using the block and
+   # is included into the subclass. Create an instance of the subclass using the
+   # same db and opts, so that the returned dataset operates similarly to a clone
+   # extended with the given modules. This approach is used to avoid singleton
+   # classes, which significantly improves performance.
+   #
    # Note that like Object#extend, when multiple modules are provided
-   # as arguments the cloned dataset is extended with the modules in reverse
-   # order. If a block is provided, a DatasetModule is created using the block and
-   # the clone is extended with that module after any modules given as arguments.
+   # as arguments the subclass includes the modules in reverse order.
    def with_extend(*mods, &block)
-     c = _clone(:freeze=>false)
-     c.extend(*mods) unless mods.empty?
-     c.extend(DatasetModule.new(&block)) if block
-     c.freeze
+     c = Class.new(self.class)
+     c.include(*mods) unless mods.empty?
+     c.include(DatasetModule.new(&block)) if block
+     o = c.freeze.allocate
+     o.instance_variable_set(:@db, @db)
+     o.instance_variable_set(:@opts, @opts)
+     o.instance_variable_set(:@cache, {})
+     if cols = cache_get(:_columns)
+       o.send(:columns=, cols)
+     end
+     o.freeze
    end
  else
    # :nocov:
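A usage sketch showing that the subclass-based with_extend behaves like the old clone-and-extend version from the caller's point of view (the module and table names are hypothetical):

    pagination = Module.new do
      def paginate(page, per_page)
        limit(per_page, (page - 1) * per_page)
      end
    end

    ds = DB[:items].with_extend(pagination)
    ds.paginate(3, 25)
    # SELECT * FROM items LIMIT 25 OFFSET 50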
@@ -1315,18 +1336,22 @@
 
  private
 
- # Load the extensions into the receiver, without checking if the receiver is frozen.
- def _extension!(exts)
-   Sequel.extension(*exts)
-   exts.each do |ext|
-     if pr = Sequel.synchronize{EXTENSIONS[ext]}
-       pr.call(self)
-     else
-       raise(Error, "Extension #{ext} does not have specific support handling individual datasets (try: Sequel.extension #{ext.inspect})")
+ # :nocov:
+ unless TRUE_FREEZE
+   # Load the extensions into the receiver, without checking if the receiver is frozen.
+   def _extension!(exts)
+     Sequel.extension(*exts)
+     exts.each do |ext|
+       if pr = Sequel.synchronize{EXTENSIONS[ext]}
+         pr.call(self)
+       else
+         raise(Error, "Extension #{ext} does not have specific support handling individual datasets (try: Sequel.extension #{ext.inspect})")
+       end
      end
+     self
    end
- self
  end
+ # :nocov:
 
  # If invert is true, invert the condition.
  def _invert_filter(cond, invert)
@@ -53,4 +53,8 @@ module Sequel
  require_relative "dataset/sql"
  require_relative "dataset/placeholder_literalizer"
  require_relative "dataset/dataset_module"
+
+ # :nocov:
+ require_relative "dataset/deprecated_singleton_class_methods" if Dataset::TRUE_FREEZE
+ # :nocov:
  end
@@ -28,7 +28,7 @@
  # connections on every checkout without setting up coarse
  # connection checkouts will hurt performance, in some cases
  # significantly. Note that setting up coarse connection
- # checkouts reduces the concurrency level acheivable. For
+ # checkouts reduces the concurrency level achievable. For
  # example, in a web application, using Database#synchronize
  # in a rack middleware will limit the number of concurrent
  # web requests to the number to connections in the database
@@ -257,7 +257,7 @@ module Sequel
  end
 
  # Make the column type detection handle registered array types.
- def schema_column_type(db_type)
+ def schema_array_type(db_type)
    if (db_type =~ /\A([^(]+)(?:\([^(]+\))?\[\]\z/io) && (type = pg_array_schema_type($1))
      type
    else
@@ -166,8 +166,7 @@ module Sequel
  def schema_post_process(_)
    super.each do |_, s|
      oid = s[:oid]
-     if values = Sequel.synchronize{@enum_labels[oid]}
-       s[:type] = :enum
+     if s[:type] == :enum && (values = Sequel.synchronize{@enum_labels[oid]})
        s[:enum_values] = values
      end
    end
@@ -22,7 +22,6 @@
  module Sequel
    module Postgres
      module ExtendedDateSupport
-       DATE_YEAR_1 = Date.new(1)
        DATETIME_YEAR_1 = DateTime.new(1)
        TIME_YEAR_1 = Time.at(-62135596800).utc
        INFINITE_TIMESTAMP_STRINGS = ['infinity'.freeze, '-infinity'.freeze].freeze
@@ -38,6 +37,15 @@ module Sequel
  procs = db.conversion_procs
  procs[1082] = ::Sequel.method(:string_to_date)
  procs[1184] = procs[1114] = db.method(:to_application_timestamp)
+ if ocps = db.instance_variable_get(:@oid_convertor_map)
+   # Clear the oid convertor map entries for timestamps if they
+   # exist, so it will regenerate new ones that use this extension.
+   # This is only taken when using the jdbc adapter.
+   Sequel.synchronize do
+     ocps.delete(1184)
+     ocps.delete(1114)
+   end
+ end
  end
 
  # Handle BC dates and times in bound variables. This is necessary for Date values
@@ -181,7 +189,7 @@
 
  # Handle BC Date objects.
  def literal_date(date)
-   if date < DATE_YEAR_1
+   if date.year < 1
      date <<= ((date.year) * 24 - 12)
      date.strftime("'%Y-%m-%d BC'")
    else
@@ -221,7 +221,7 @@
  private
 
  # Recognize the registered database multirange types.
- def schema_column_type(db_type)
+ def schema_multirange_type(db_type)
    @pg_multirange_schema_types[db_type] || super
  end
 
@@ -234,7 +234,7 @@
  private
 
  # Recognize the registered database range types.
- def schema_column_type(db_type)
+ def schema_range_type(db_type)
    @pg_range_schema_types[db_type] || super
  end
 
@@ -538,12 +538,8 @@
  private
 
  # Make the column type detection handle registered row types.
- def schema_column_type(db_type)
-   if type = @row_schema_types[db_type]
-     type
-   else
-     super
-   end
+ def schema_composite_type(db_type)
+   @row_schema_types[db_type] || super
  end
  end
  end
@@ -88,11 +88,11 @@ module Sequel
  # Note that the migration this produces does not have a down
  # block, so you cannot reverse it.
  def dump_foreign_key_migration(options=OPTS)
-   ts = tables(options)
+   ts = _dump_tables(options)
    <<END_MIG
  Sequel.migration do
  change do
-   #{ts.sort.map{|t| dump_table_foreign_keys(t)}.reject{|x| x == ''}.join("\n\n").gsub(/^/, ' ')}
+   #{ts.map{|t| dump_table_foreign_keys(t)}.reject{|x| x == ''}.join("\n\n").gsub(/^/, ' ')}
  end
  end
  END_MIG
@@ -106,11 +106,11 @@ END_MIG
  # set to :namespace, prepend the table name to the index name if the
  # database does not use a global index namespace.
  def dump_indexes_migration(options=OPTS)
-   ts = tables(options)
+   ts = _dump_tables(options)
    <<END_MIG
  Sequel.migration do
  change do
-   #{ts.sort.map{|t| dump_table_indexes(t, :add_index, options)}.reject{|x| x == ''}.join("\n\n").gsub(/^/, ' ')}
+   #{ts.map{|t| dump_table_indexes(t, :add_index, options)}.reject{|x| x == ''}.join("\n\n").gsub(/^/, ' ')}
  end
  end
  END_MIG
@@ -138,7 +138,7 @@ END_MIG
    options[:foreign_keys] = false
  end
 
- ts = sort_dumped_tables(tables(options), options)
+ ts = sort_dumped_tables(_dump_tables(options), options)
  skipped_fks = if sfk = options[:skipped_foreign_keys]
    # Handle skipped foreign keys by adding them at the end via
    # alter_table/add_foreign_key. Note that skipped foreign keys
@@ -166,6 +166,21 @@ END_MIG
 
  private
 
+ # Handle schema option to dump tables in a different schema. Such
+ # tables must be schema qualified for this to work correctly.
+ def _dump_tables(opts)
+   if opts[:schema]
+     _literal_table_sort(tables(opts.merge(:qualify=>true)))
+   else
+     tables(opts).sort
+   end
+ end
+
+ # Sort the given table by the literalized value.
+ def _literal_table_sort(tables)
+   tables.sort_by{|s| literal(s)}
+ end
+
  # If a database default exists and can't be converted, and we are dumping with :same_db,
  # return a string with the inspect method modified a literal string is created if the code is evaled.
  def column_schema_to_ruby_default_fallback(default, options)
@@ -204,12 +219,20 @@ END_MIG
  if database_type == :mysql && h[:type] =~ /\Atimestamp/
    h[:null] = true
  end
+ if database_type == :mssql && schema[:max_length]
+   h[:size] = schema[:max_length]
+ end
  h
  else
    column_schema_to_ruby_type(schema)
  end
  type = col_opts.delete(:type)
- col_opts.delete(:size) if col_opts[:size].nil?
+ if col_opts.key?(:size) && col_opts[:size].nil?
+   col_opts.delete(:size)
+   if max_length = schema[:max_length]
+     col_opts[:size] = max_length
+   end
+ end
  if schema[:generated]
    if options[:same_db] && database_type == :postgres
      col_opts[:generated_always_as] = column_schema_to_ruby_default_fallback(schema[:default], options)
@@ -352,7 +375,7 @@ END_MIG
  options[:skipped_foreign_keys] = skipped_foreign_keys
  tables
  else
-   tables.sort
+   tables
  end
  end
 
@@ -377,14 +400,14 @@ END_MIG
  # outstanding foreign keys and skipping those foreign keys.
  # The skipped foreign keys will be added at the end of the
  # migration.
- skip_table, skip_fks = table_fks.sort_by{|table, fks| [fks.length, table]}.first
+ skip_table, skip_fks = table_fks.sort_by{|table, fks| [fks.length, literal(table)]}.first
  skip_fks_hash = skipped_foreign_keys[skip_table] = {}
  skip_fks.each{|fk| skip_fks_hash[fk[:columns]] = fk}
  this_loop << skip_table
  end
 
  # Add sorted tables from this loop to the final list
- sorted_tables.concat(this_loop.sort)
+ sorted_tables.concat(_literal_table_sort(this_loop))
 
  # Remove tables that were handled this loop
  this_loop.each{|t| table_fks.delete(t)}
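If the :schema option flows through the dump methods as the release notes describe, dumping a non-default schema might look like this sketch (the schema name is hypothetical):

    DB.extension :schema_dumper
    puts DB.dump_schema_migration(schema: :reports)
    # emits create_table blocks for the tables in the reports schema,
    # using schema-qualified table names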
lib/sequel/extensions/set_literalizer.rb ADDED
@@ -0,0 +1,58 @@
+ # frozen-string-literal: true
+ #
+ # The set_literalizer extension allows for using Set instances in many of the
+ # same places that you would use Array instances:
+ #
+ #   DB[:table].where(column: Set.new([1, 2, 3]))
+ #   # SELECT FROM table WHERE (column IN (1, 2, 3))
+ #
+ # To load the extension into all datasets created from a given Database:
+ #
+ #   DB.extension :set_literalizer
+ #
+ # Related module: Sequel::Dataset::SetLiteralizer
+
+ require 'set'
+
+ module Sequel
+   class Dataset
+     module SetLiteralizer
+       # Try to generate the same SQL for Set instances used in datasets
+       # that would be used for equivalent Array instances.
+       def complex_expression_sql_append(sql, op, args)
+         # Array instances are treated specially by
+         # Sequel::SQL::BooleanExpression.from_value_pairs. That cannot
+         # be modified by a dataset extension, so this tries to convert
+         # the complex expression values generated by default to what would
+         # be the complex expression values used for the equivalent array.
+         case op
+         when :'=', :'!='
+           if (set = args[1]).is_a?(Set)
+             op = op == :'=' ? :IN : :'NOT IN'
+             col = args[0]
+             array = set.to_a
+             if Sequel.condition_specifier?(array) && col.is_a?(Array)
+               array = Sequel.value_list(array)
+             end
+             args = [col, array]
+           end
+         end
+
+         super
+       end
+
+       private
+
+       # Literalize Set instances by converting the set to array.
+       def literal_other_append(sql, v)
+         if Set === v
+           literal_append(sql, v.to_a)
+         else
+           super
+         end
+       end
+     end
+
+     register_extension(:set_literalizer, SetLiteralizer)
+   end
+ end
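Because the complex-expression handling above also covers :'!=', negated Set conditions should literalize as NOT IN over the equivalent array. A hedged sketch:

    DB.extension :set_literalizer
    DB[:table].exclude(column: Set.new([1, 2, 3]))
    # SELECT FROM table WHERE (column NOT IN (1, 2, 3))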
@@ -125,14 +125,15 @@ module Sequel
  ds = @dataset.with_quote_identifiers(false)
  table_name = ds.literal(ds.first_source_table)
  reflections = {}
- @constraint_validations = (Sequel.synchronize{hash[table_name]} || []).map{|r| constraint_validation_array(r, reflections)}
+ allow_missing_columns = db_schema.select{|col, sch| sch[:allow_null] == false && nil != sch[:default]}.map(&:first)
+ @constraint_validations = (Sequel.synchronize{hash[table_name]} || []).map{|r| constraint_validation_array(r, reflections, allow_missing_columns)}
  @constraint_validation_reflections = reflections
  end
  end
 
  # Given a specific database constraint validation metadata row hash, transform
  # it into a validation method call array suitable for splatting to send.
- def constraint_validation_array(r, reflections)
+ def constraint_validation_array(r, reflections, allow_missing_columns=EMPTY_ARRAY)
  opts = {}
  opts[:message] = r[:message] if r[:message]
  opts[:allow_nil] = true if db.typecast_value(:boolean, r[:allow_nil])
@@ -191,11 +192,13 @@ module Sequel
  reflection_opts[:argument] = arg
  end
 
- a << column
- unless opts.empty?
-   a << opts
+ opts[:from] = :values
+ if column.is_a?(Symbol) && allow_missing_columns.include?(column)
+   opts[:allow_missing] = true
  end
 
+ a << column << opts
+
  if column.is_a?(Array) && column.length == 1
    column = column.first
  end
@@ -28,7 +28,9 @@ module Sequel
  #
  # This plugin only works on the postgres adapter when using the pg 0.16+ driver,
  # PostgreSQL 9.3+ server, and PostgreSQL 9.3+ client library (libpq). In other cases
- # it will be a no-op.
+ # it will be a no-op. Additionally, the plugin only handles models that select
+ # from tables. It does not handle models that select from subqueries, such as
+ # subclasses of models using the class_table_inheritance plugin.
  #
  # Example:
  #
@@ -3,7 +3,8 @@
  module Sequel
    module Plugins
      # The prepared_statements plugin modifies the model to use prepared statements for
-     # instance level inserts and updates.
+     # instance level inserts and updates. This plugin exists for backwards compatibility
+     # and is not recommended for general use.
      #
      # Note that this plugin is unsafe in some circumstances, as it can allow up to
      # 2^N prepared statements to be created for each type of insert and update query, where
@@ -5,7 +5,8 @@ module Sequel
  # The prepared_statements_safe plugin modifies the model to reduce the number of
  # prepared statements that can be created, by setting as many columns as possible
  # before creating, and by changing +save_changes+ to save all columns instead of
- # just the changed ones.
+ # just the changed ones. This plugin exists for backwards compatibility
+ # and is not recommended for general use.
  #
  # This plugin depends on the +prepared_statements+ plugin.
  #
@@ -75,6 +75,10 @@ module Sequel
  #     "#{Array(attribute).join(I18n.t('errors.joiner'))} #{error_msg}"
  #   end
  # end
+ #
+ # It is recommended that users of this plugin that use validates_schema_types also use
+ # the validation_helpers_generic_type_messages plugin for more useful type validation
+ # failure messages.
  module ValidationHelpers
    DEFAULT_OPTIONS = {
      :exact_length=>{:message=>lambda{|exact| "is not #{exact} characters"}},
@@ -211,7 +215,7 @@ module Sequel
  klass = klass.to_s.constantize if klass.is_a?(String) || klass.is_a?(Symbol)
  validatable_attributes_for_type(:type, atts, opts) do |a,v,m|
    if klass.is_a?(Array) ? !klass.any?{|kls| v.is_a?(kls)} : !v.is_a?(klass)
-     validation_error_message(m, klass)
+     validates_type_error_message(m, klass)
    end
  end
  end
@@ -338,6 +342,9 @@ module Sequel
  def validation_error_message(message, *args)
    message.is_a?(Proc) ? message.call(*args) : message
  end
+
+ # The validation error message for type validations, for the given class.
+ alias validates_type_error_message validation_error_message
  end
  end
  end
lib/sequel/plugins/validation_helpers_generic_type_messages.rb ADDED
@@ -0,0 +1,73 @@
+ # frozen-string-literal: true
+
+ require_relative 'validation_helpers'
+
+ module Sequel
+   module Plugins
+     # The validation_helpers_generic_type_messages plugin overrides the default
+     # type validation failure messages in the validation_helpers plugin to be
+     # more generic and understandable by the average user, instead of always
+     # being based on the names of the allowed classes for the type. For example:
+     #
+     #   # :blob type
+     #   # validation_helpers default: "value is not a valid sequel::sql::blob"
+     #   # with this plugin: "value is not a blob"
+     #
+     #   # :boolean type
+     #   # validation_helpers default: "value is not a valid trueclass or falseclass"
+     #   # with this plugin: "value is not true or false"
+     #
+     #   # :datetime type
+     #   # validation_helpers default: "value is not a valid time or datetime"
+     #   # with this plugin: "value is not a valid timestamp"
+     #
+     #   # custom/database-specific types
+     #   # validation_helpers default: "value is not a valid sequel::class_name"
+     #   # with this plugin: "value is not the expected type"
+     #
+     # It is expected that this plugin will become the default behavior of
+     # validation_helpers in Sequel 6.
+     #
+     # To enable the use of generic type messages for all models, load this
+     # plugin into Sequel::Model.
+     #
+     #   Sequel::Model.plugin :validation_helpers_generic_type_messages
+     module ValidationHelpersGenericTypeMessages
+       OVERRIDE_PROC = ValidationHelpers::DEFAULT_OPTIONS[:type][:message]
+       private_constant :OVERRIDE_PROC
+
+       TYPE_ERROR_STRINGS = {
+         String => 'is not a string'.freeze,
+         Integer => 'is not an integer'.freeze,
+         Date => 'is not a valid date'.freeze,
+         [Time, DateTime].freeze => 'is not a valid timestamp'.freeze,
+         Sequel::SQLTime => 'is not a valid time'.freeze,
+         [TrueClass, FalseClass].freeze => 'is not true or false'.freeze,
+         Float => 'is not a number'.freeze,
+         BigDecimal => 'is not a number'.freeze,
+         Sequel::SQL::Blob => 'is not a blob'.freeze,
+       }
+       TYPE_ERROR_STRINGS.default = "is not the expected type".freeze
+       TYPE_ERROR_STRINGS.freeze
+       private_constant :TYPE_ERROR_STRINGS
+
+       def self.apply(mod)
+         mod.plugin :validation_helpers
+       end
+
+       module InstanceMethods
+         private
+
+         # Use a generic error message for type validations.
+         def validates_type_error_message(m, klass)
+           # SEQUEL6: Make this the default behavior in validation_helpers
+           if OVERRIDE_PROC.equal?(m)
+             TYPE_ERROR_STRINGS[klass]
+           else
+             super
+           end
+         end
+       end
+     end
+   end
+ end
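A short usage sketch (the model name is hypothetical); per the apply hook above, loading this plugin also loads validation_helpers:

    Album.plugin :validation_helpers_generic_type_messages

    class Album
      def validate
        super
        validates_schema_types  # now reports the generic messages shown above
      end
    end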
@@ -6,7 +6,7 @@ module Sequel
 
  # The minor version of Sequel. Bumped for every non-patch level
  # release, generally around once a month.
- MINOR = 66
+ MINOR = 68
 
  # The tiny version of Sequel. Usually 0, only bumped for bugfix
  # releases that fix regressions from previous versions.
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: sequel
  version: !ruby/object:Gem::Version
- version: 5.66.0
+ version: 5.68.0
  platform: ruby
  authors:
  - Jeremy Evans
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-03-01 00:00:00.000000000 Z
+ date: 2023-05-01 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: minitest
@@ -198,6 +198,8 @@ extra_rdoc_files:
  - doc/release_notes/5.64.0.txt
  - doc/release_notes/5.65.0.txt
  - doc/release_notes/5.66.0.txt
+ - doc/release_notes/5.67.0.txt
+ - doc/release_notes/5.68.0.txt
  - doc/release_notes/5.7.0.txt
  - doc/release_notes/5.8.0.txt
  - doc/release_notes/5.9.0.txt
@@ -292,6 +294,8 @@ files:
  - doc/release_notes/5.64.0.txt
  - doc/release_notes/5.65.0.txt
  - doc/release_notes/5.66.0.txt
+ - doc/release_notes/5.67.0.txt
+ - doc/release_notes/5.68.0.txt
  - doc/release_notes/5.7.0.txt
  - doc/release_notes/5.8.0.txt
  - doc/release_notes/5.9.0.txt
@@ -376,6 +380,7 @@ files:
  - lib/sequel/dataset.rb
  - lib/sequel/dataset/actions.rb
  - lib/sequel/dataset/dataset_module.rb
+ - lib/sequel/dataset/deprecated_singleton_class_methods.rb
  - lib/sequel/dataset/features.rb
  - lib/sequel/dataset/graph.rb
  - lib/sequel/dataset/misc.rb
@@ -461,6 +466,7 @@ files:
  - lib/sequel/extensions/sequel_4_dataset_methods.rb
  - lib/sequel/extensions/server_block.rb
  - lib/sequel/extensions/server_logging.rb
+ - lib/sequel/extensions/set_literalizer.rb
  - lib/sequel/extensions/split_array_nil.rb
  - lib/sequel/extensions/sql_comments.rb
  - lib/sequel/extensions/sql_expr.rb
@@ -579,6 +585,7 @@ files:
  - lib/sequel/plugins/validation_class_methods.rb
  - lib/sequel/plugins/validation_contexts.rb
  - lib/sequel/plugins/validation_helpers.rb
+ - lib/sequel/plugins/validation_helpers_generic_type_messages.rb
  - lib/sequel/plugins/whitelist_security.rb
  - lib/sequel/plugins/xml_serializer.rb
  - lib/sequel/sql.rb
@@ -615,7 +622,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
    version: '0'
  requirements: []
- rubygems_version: 3.4.6
+ rubygems_version: 3.4.10
  signing_key:
  specification_version: 4
  summary: The Database Toolkit for Ruby