sequel 4.15.0 → 4.16.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: bed2efa61dbcb53c3a0ceb08e9ca858368631d35
- data.tar.gz: 11da3e14b94c21e9719cd0d8a700e7f0727bc33b
+ metadata.gz: 505bc666a3de9a394dd0bb31e07afff6feca86da
+ data.tar.gz: 2082ac9b3731827b5c4e0c39859e8471e706a99d
  SHA512:
- metadata.gz: a132e45671dda78cb933d50f21942ace99b938c3133d54b8dd82babe8fb5657f8c4f63ebb28dc1cd4c13bc08ee5a9736514a3be5ceb2c3dbcc4c7eeb09a5177f
- data.tar.gz: a1e4be8380d22afda924e1fec932c47fb5be3f258279010753c12fc5dca9220f38b18b657e84a03875f9108572614292c150d8963ee8b0249eceff06df627d1b
+ metadata.gz: f728df92423ad2e207a2db67d89cc7c79819351973c348845978d7ba93826f33caf21b4251dec609c5fcc48c0c13b55460ac2800f7cd36b328d87463c804cc3a
+ data.tar.gz: 836ab1fb7bc23c595070f15dff4cf300b80d4b4eaf3e024c305d073a84a3a84c19782bef11359e9e96c0cb739206995ac8a4828ece1a5057c66a8c50fee0f87b
data/CHANGELOG CHANGED
@@ -1,3 +1,25 @@
+ === 4.16.0 (2014-11-01)
+
+ * Make Database#create_table? and #create_join_table? not use IF NOT EXISTS if indexes are being added (jeremyevans) (#904)
+
+ * Dataset#distinct now accepts virtual row blocks (chanks) (#901)
+
+ * Recognize disconnect errors in the postgres adapter when SSL is used (jeremyevans) (#900)
+
+ * Stop converting '' default values to nil default values on MySQL (jeremyevans)
+
+ * Add Model#qualified_pk_hash, for returning a hash with qualified pk keys (jeremyevans)
+
+ * Make validates_unique use a qualified primary key if the dataset is joined (jeremyevans) (#895)
+
+ * Make Sequel::Model.cache_associations = false skip the database's schema cache when loading the schema (jeremyevans)
+
+ * Make Database#foreign_key_list work on Microsoft SQL Server 2005 (jeremyevans)
+
+ * Make create_table with :foreign option reversible on PostgreSQL (jeremyevans)
+
+ * Make drop_table with :foreign option on PostgreSQL drop a foreign table (johnnyt) (#892)
+
  === 4.15.0 (2014-10-01)
 
  * Make AssociationReflection#reciprocal not raise error if associated class contains association with invalid associated class (jeremyevans)
@@ -191,7 +191,7 @@ specify an explicit :key option:
  many_to_one :artist, :key=>:artistid
  end
  class Artist
- one_to_many :albumst, :key=>:artistid
+ one_to_many :albums, :key=>:artistid
  end
 
  For many_to_many associations, the :left_key and :right_key options can be
@@ -358,7 +358,7 @@ you should give it some thought before using it.
 
  == Ignoring missing migrations
 
- In some cases, you may want to allow a migration in the database that does not exist in the filesystem (deploying to an older version of code without running a down migration when deploy auto-migrates, for example). If required, you can pass <tt>:allow_missing_migration_files => true<tt> as an option. This will stop errors from being raised if there are migrations in the database that do not exist in the filesystem.
+ In some cases, you may want to allow a migration in the database that does not exist in the filesystem (deploying to an older version of code without running a down migration when deploy auto-migrates, for example). If required, you can pass <tt>:allow_missing_migration_files => true</tt> as an option. This will stop errors from being raised if there are migrations in the database that do not exist in the filesystem.
 
  == Modifying existing migrations
 
@@ -0,0 +1,36 @@
+ = New Features
+
+ * Model#qualified_pk_hash has been added, which is similar to
+ Model#pk_hash, but uses qualified keys.
+
+ * Dataset#distinct now accepts a virtual row block.
+
+ * Database#drop_table with :foreign=>true option now drops foreign
+ tables on PostgreSQL. Database#create_table with :foreign option
+ is now reversible on PostgreSQL.
+
+ = Other Improvements
+
+ * Sequel::Model.cache_associations = false now skips the database's
+ schema cache when loading the schema for a model. This fixes
+ some issues in environments that use code reloading.
+
+ * Database#create_table? and #create_join_table? no longer use
+ IF NOT EXISTS if indexes are being created.
+
+ * Model.primary_key_hash and .qualified_primary_key_hash have been
+ optimized.
+
+ * validates_unique in the validation_helpers plugin now uses a
+ qualified primary key if the model's dataset is joined. This fixes
+ a case when the auto_validations and class_table_inheritance
+ plugins are used together.
+
+ * Disconnect errors are now recognized in the postgres adapter when
+ SSL is used for connecting.
+
+ * Empty string default values are no longer converted to nil default
+ values on MySQL.
+
+ * Database#foreign_key_list now works correctly on Microsoft SQL
+ Server 2005.
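The first note above describes Model#qualified_pk_hash only in prose. As a rough illustration of the shape of the returned hash, here is a plain-Ruby sketch; the Qualified struct and the free-standing method are hypothetical stand-ins, not Sequel's API (Sequel uses Sequel::SQL::QualifiedIdentifier internally):

```ruby
# Hypothetical stand-in for Sequel::SQL::QualifiedIdentifier.
Qualified = Struct.new(:table, :column)

# Sketch of the hash shape qualified_pk_hash produces: each primary key
# column is paired with its value under a table-qualified key.
def qualified_pk_hash(table, pk, value)
  if pk.is_a?(Array)
    hash = {}
    pk.zip(Array(value)) { |k, v| hash[Qualified.new(table, k)] = v }
    hash
  else
    { Qualified.new(table, pk) => value }
  end
end

qualified_pk_hash(:artists, :id, 1)
# => {#<struct Qualified table=:artists, column=:id>=>1}
```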
@@ -109,7 +109,7 @@ module Sequel
  def execute_insert(sql, opts=OPTS)
  synchronize(opts[:server]) do |conn|
  begin
- r = log_yield(sql){conn.Execute(sql)}
+ log_yield(sql){conn.Execute(sql)}
  res = log_yield(LAST_INSERT_ID){conn.Execute(LAST_INSERT_ID)}
  res.getRows.transpose.each{|r| return r.shift}
  rescue ::WIN32OLERuntimeError => e
@@ -124,7 +124,6 @@ module Sequel
  def fetch_rows(sql)
  execute(sql) do |stmt|
  begin
- procs =
  cols = stmt.column_info.map{|c| [output_identifier(c[COLUMN_INFO_NAME]), CUBRID_TYPE_PROCS[c[COLUMN_INFO_TYPE]]]}
  @columns = cols.map{|c| c.first}
  stmt.each do |r|
@@ -38,7 +38,7 @@ module Sequel
  set_adapter_scheme :db2
 
  TEMPORARY = 'GLOBAL TEMPORARY '.freeze
- rc, NullHandle = DB2CLI.SQLAllocHandle(DB2CLI::SQL_HANDLE_ENV, DB2CLI::SQL_NULL_HANDLE)
+ _, NullHandle = DB2CLI.SQLAllocHandle(DB2CLI::SQL_HANDLE_ENV, DB2CLI::SQL_NULL_HANDLE)
 
  # Hash of connection procs for converting
  attr_reader :conversion_procs
@@ -64,7 +64,7 @@ module Sequel
  log_connection_execute(conn, sql)
  sql = "SELECT IDENTITY_VAL_LOCAL() FROM SYSIBM.SYSDUMMY1"
  log_connection_execute(conn, sql) do |sth|
- name, buflen, datatype, size, digits, nullable = checked_error("Could not describe column"){DB2CLI.SQLDescribeCol(sth, 1, 256)}
+ _, _, datatype, size, _, _ = checked_error("Could not describe column"){DB2CLI.SQLDescribeCol(sth, 1, 256)}
  if DB2CLI.SQLFetch(sth) != DB2CLI::SQL_NO_DATA_FOUND
  v, _ = checked_error("Could not get data"){DB2CLI.SQLGetData(sth, 1, datatype, size)}
  if v.is_a?(String)
@@ -180,7 +180,6 @@ module Sequel
  def fetch_rows(sql)
  execute(sql) do |sth|
  db = @db
- i = 1
  column_info = get_column_info(sth)
  cols = column_info.map{|c| c.at(1)}
  @columns = cols
@@ -213,7 +212,7 @@ module Sequel
  cps = db.conversion_procs
 
  (1..column_count).map do |i|
- name, buflen, datatype, size, digits, nullable = db.checked_error("Could not describe column"){DB2CLI.SQLDescribeCol(sth, i, MAX_COL_SIZE)}
+ name, _, datatype, size, digits, _ = db.checked_error("Could not describe column"){DB2CLI.SQLDescribeCol(sth, i, MAX_COL_SIZE)}
  pr = if datatype == DB2CLI::SQL_SMALLINT && convert && size <= 5 && digits <= 1
  cps[:boolean]
  elsif datatype == DB2CLI::SQL_CLOB && Sequel::DB2.use_clob_as_blob
@@ -595,7 +595,6 @@ module Sequel
  # Parse the table schema for the given table.
  def schema_parse_table(table, opts=OPTS)
  m = output_identifier_meth(opts[:dataset])
- ds = dataset
  schema, table = metadata_schema_and_table(table, opts)
  pks, ts = [], []
  metadata(:getPrimaryKeys, nil, schema, table) do |h|
@@ -781,8 +780,8 @@ module Sequel
 
  while result.next
  row = {}
- cols.each do |n, i, pr|
- row[n] = pr.call(result, i)
+ cols.each do |n, j, pr|
+ row[n] = pr.call(result, j)
  end
  yield row
  end
@@ -54,8 +54,8 @@ module Sequel
  rs.close
  end
  else
- statement(conn) do |stmt|
- rs = stmt.executeQuery(LAST_INSERT_ID)
+ statement(conn) do |st|
+ rs = st.executeQuery(LAST_INSERT_ID)
  rs.next
  rs.getInt(1)
  end
@@ -133,12 +133,12 @@ module Sequel
  # depending on the value given.
  def convert_invalid_date_time=(v)
  m0 = ::Sequel.method(:string_to_time)
- @conversion_procs[11] = (v != false) ? lambda{|v| convert_date_time(v, &m0)} : m0
+ @conversion_procs[11] = (v != false) ? lambda{|val| convert_date_time(val, &m0)} : m0
  m1 = ::Sequel.method(:string_to_date)
- m = (v != false) ? lambda{|v| convert_date_time(v, &m1)} : m1
+ m = (v != false) ? lambda{|val| convert_date_time(val, &m1)} : m1
  [10, 14].each{|i| @conversion_procs[i] = m}
  m2 = method(:to_application_timestamp)
- m = (v != false) ? lambda{|v| convert_date_time(v, &m2)} : m2
+ m = (v != false) ? lambda{|val| convert_date_time(val, &m2)} : m2
  [7, 12].each{|i| @conversion_procs[i] = m}
  @convert_invalid_date_time = v
  end
@@ -100,7 +100,7 @@ module Sequel
  if rows = s.fetch_all
  rows.each do |row|
  hash = {}
- cols.each{|n,i| hash[n] = convert_odbc_value(row[i])}
+ cols.each{|n,j| hash[n] = convert_odbc_value(row[j])}
  yield hash
  end
  end
@@ -111,7 +111,6 @@ module Sequel
  'decimal'.freeze=>Float, 'date'.freeze=>Time, 'datetime'.freeze=>Time,
  'time'.freeze=>Time, 'boolean'.freeze=>String, 'blob'.freeze=>OCI8::BLOB}
  def cursor_bind_params(conn, cursor, args)
- cursor
  i = 0
  args.map do |arg, type|
  i += 1
@@ -108,7 +108,14 @@ module Sequel
  # PGconn subclass for connection specific methods used with the
  # pg, postgres, or postgres-pr driver.
  class Adapter < ::PGconn
- DISCONNECT_ERROR_RE = /\A(?:could not receive data from server|no connection to the server|connection not open|terminating connection due to administrator command)/
+ disconnect_errors = [
+ 'could not receive data from server',
+ 'no connection to the server',
+ 'connection not open',
+ 'terminating connection due to administrator command',
+ 'PQconsumeInput() SSL SYSCALL error'
+ ]
+ DISCONNECT_ERROR_RE = /\A#{Regexp.union(disconnect_errors)}/
 
  self.translate_results = false if respond_to?(:translate_results=)
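The rewritten constant above builds the disconnect regexp from a plain string list. One reason the list form matters: Regexp.union escapes regexp metacharacters, so the parentheses in the newly added 'PQconsumeInput() SSL SYSCALL error' message are matched literally. A minimal standalone sketch of the same construction:

```ruby
# Regexp.union escapes each string, so '()' in the SSL error message
# is matched literally rather than treated as a regexp group.
disconnect_errors = [
  'could not receive data from server',
  'no connection to the server',
  'connection not open',
  'terminating connection due to administrator command',
  'PQconsumeInput() SSL SYSCALL error'
]
DISCONNECT_ERROR_RE = /\A#{Regexp.union(disconnect_errors)}/

DISCONNECT_ERROR_RE.match?('PQconsumeInput() SSL SYSCALL error: EOF detected')
# => true
```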
 
@@ -144,7 +144,7 @@ module Sequel
  :rc__name___referenced_column,
  object_schema_name(:fk__referenced_object_id).as(:schema),
  object_name(:fk__referenced_object_id).as(:table)]}.
- order(:name, :fkc__constraint_column_id)
+ order(:fk__name, :fkc__constraint_column_id)
  h = {}
  ds.each do |row|
  if r = h[row[:name]]
@@ -464,7 +464,6 @@ module Sequel
  row[:allow_null] = row.delete(:Null) == 'YES'
  row[:default] = row.delete(:Default)
  row[:primary_key] = row.delete(:Key) == 'PRI'
- row[:default] = nil if blank_object?(row[:default])
  row[:db_type] = row.delete(:Type)
  row[:type] = schema_column_type(row[:db_type])
  [m.call(row.delete(:Field)), row]
@@ -906,6 +906,11 @@ module Sequel
  "DROP TRIGGER#{' IF EXISTS' if opts[:if_exists]} #{name} ON #{quote_schema_table(table)}#{' CASCADE' if opts[:cascade]}"
  end
 
+ # Support :foreign tables
+ def drop_table_sql(name, options)
+ "DROP#{' FOREIGN' if options[:foreign]} TABLE#{' IF EXISTS' if options[:if_exists]} #{quote_schema_table(name)}#{' CASCADE' if options[:cascade]}"
+ end
+
  # SQL for dropping a view from the database.
  def drop_view_sql(name, opts=OPTS)
  "DROP #{'MATERIALIZED ' if opts[:materialized]}VIEW#{' IF EXISTS' if opts[:if_exists]} #{quote_schema_table(name)}#{' CASCADE' if opts[:cascade]}"
@@ -142,7 +142,7 @@ module Sequel
 
  # Creates the join table unless it already exists.
  def create_join_table?(hash, options=OPTS)
- if supports_create_table_if_not_exists?
+ if supports_create_table_if_not_exists? && options[:no_index]
  create_join_table(hash, options.merge(:if_not_exists=>true))
  elsif !table_exists?(join_table_name(hash, options))
  create_join_table(hash, options)
@@ -215,10 +215,12 @@ module Sequel
  # # SELECT NULL FROM a LIMIT 1 -- check existence
  # # CREATE TABLE a (a integer) -- if it doesn't already exist
  def create_table?(name, options=OPTS, &block)
- if supports_create_table_if_not_exists?
- create_table(name, options.merge(:if_not_exists=>true), &block)
+ options = options.dup
+ generator = options[:generator] ||= create_table_generator(&block)
+ if generator.indexes.empty? && supports_create_table_if_not_exists?
+ create_table(name, options.merge!(:if_not_exists=>true))
  elsif !table_exists?(name)
- create_table(name, options, &block)
+ create_table(name, options)
  end
  end
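The reworked create_table? above only takes the IF NOT EXISTS path when the generator defines no indexes, because the follow-up CREATE INDEX statements have no such guard on most databases. A simplified sketch of that decision; Generator here is a hypothetical stub, not Sequel's CreateTableGenerator:

```ruby
# Hypothetical stub standing in for Sequel's create table generator.
Generator = Struct.new(:indexes)

# Sketch of the branch logic: IF NOT EXISTS is only safe when no
# separate CREATE INDEX statements would follow.
def create_table_strategy(generator, supports_if_not_exists, table_exists)
  if generator.indexes.empty? && supports_if_not_exists
    :create_with_if_not_exists
  elsif !table_exists
    :create_plain
  else
    :skip
  end
end

create_table_strategy(Generator.new([:a_index]), true, false)
# => :create_plain
```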
 
@@ -82,16 +82,18 @@ module Sequel
  c
  end
 
- # Returns a copy of the dataset with the SQL DISTINCT clause.
- # The DISTINCT clause is used to remove duplicate rows from the
- # output. If arguments are provided, uses a DISTINCT ON clause,
- # in which case it will only be distinct on those columns, instead
- # of all returned columns. Raises an error if arguments
- # are given and DISTINCT ON is not supported.
+ # Returns a copy of the dataset with the SQL DISTINCT clause. The DISTINCT
+ # clause is used to remove duplicate rows from the output. If arguments
+ # are provided, uses a DISTINCT ON clause, in which case it will only be
+ # distinct on those columns, instead of all returned columns. If a block
+ # is given, it is treated as a virtual row block, similar to +where+.
+ # Raises an error if arguments are given and DISTINCT ON is not supported.
  #
  # DB[:items].distinct # SQL: SELECT DISTINCT * FROM items
  # DB[:items].order(:id).distinct(:id) # SQL: SELECT DISTINCT ON (id) * FROM items ORDER BY id
- def distinct(*args)
+ # DB[:items].order(:id).distinct{func(:id)} # SQL: SELECT DISTINCT ON (func(id)) * FROM items ORDER BY id
+ def distinct(*args, &block)
+ virtual_row_columns(args, block)
  raise(InvalidOperation, "DISTINCT ON not supported") if !args.empty? && !supports_distinct_on?
  clone(:distinct => args)
  end
@@ -203,12 +203,12 @@ module Sequel
  @actions << [:drop_join_table, *args]
  end
 
- def create_table(*args)
- @actions << [:drop_table, args.first]
+ def create_table(name, opts=OPTS)
+ @actions << [:drop_table, name, opts]
  end
 
- def create_view(name, _, options={})
- @actions << [:drop_view, name, options]
+ def create_view(name, _, opts=OPTS)
+ @actions << [:drop_view, name, opts]
  end
 
  def rename_column(table, name, new_name)
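The reverser change above records the creation options alongside the table name, so a reversed migration can hand :foreign (and any other options) back to drop_table. A toy stand-in for the reverser illustrates the recorded action:

```ruby
# Toy stand-in for Sequel's migration reverser: each create_table call
# records the inverse drop_table action, now keeping the options hash.
class Reverser
  attr_reader :actions

  def initialize
    @actions = []
  end

  def create_table(name, opts = {})
    @actions << [:drop_table, name, opts]
  end
end

r = Reverser.new
r.create_table(:a, :foreign => :f)
r.actions
# => [[:drop_table, :a, {:foreign=>:f}]]
```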
@@ -259,7 +259,7 @@ module Sequel
  ds.send(:post_load, objects)
  else
  loader = placeholder_eager_loader
- loader = loader.with_dataset{|ds| ds.eager(cascade)} if cascade
+ loader = loader.with_dataset{|dataset| dataset.eager(cascade)} if cascade
  objects = loader.all(eo[:id_map].keys)
  end
 
@@ -2036,6 +2036,12 @@ module Sequel
  end
  ds
  end
+
+ # If not caching associations, reload the database schema by default,
+ # ignoring any cached values.
+ def reload_db_schema?
+ !@cache_associations
+ end
  end
 
  # Instance methods used to implement the associations support.
@@ -580,14 +580,15 @@ module Sequel
  # Artist.primary_key_hash(1) # => {:id=>1}
  # Artist.primary_key_hash([1, 2]) # => {:id1=>1, :id2=>2}
  def primary_key_hash(value)
- raise(Error, "#{self} does not have a primary key") unless key = @primary_key
- case key
+ case key = @primary_key
+ when Symbol
+ {key => value}
  when Array
  hash = {}
- key.each_with_index{|k,i| hash[k] = value[i]}
+ key.zip(Array(value)){|k,v| hash[k] = v}
  hash
  else
- {key => value}
+ raise(Error, "#{self} does not have a primary key")
  end
  end
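The switch from each_with_index to zip in the Array branch pairs key columns with values in a single pass, and Array(value) wraps a scalar so the same code tolerates a non-array argument. In isolation:

```ruby
# The zip-based pairing used by the new primary_key_hash Array branch,
# extracted into a free-standing sketch.
def pk_hash(key, value)
  hash = {}
  key.zip(Array(value)) { |k, v| hash[k] = v }
  hash
end

pk_hash([:id1, :id2], [1, 2])
# => {:id1=>1, :id2=>2}
```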
 
@@ -599,9 +600,16 @@ module Sequel
  # Artist.filter(Artist.qualified_primary_key_hash(1))
  # # SELECT * FROM artists WHERE (artists.id = 1)
  def qualified_primary_key_hash(value, qualifier=table_name)
- h = primary_key_hash(value)
- h.to_a.each{|k,v| h[SQL::QualifiedIdentifier.new(qualifier, k)] = h.delete(k)}
- h
+ case key = @primary_key
+ when Symbol
+ {SQL::QualifiedIdentifier.new(qualifier, key) => value}
+ when Array
+ hash = {}
+ key.zip(Array(value)){|k,v| hash[SQL::QualifiedIdentifier.new(qualifier, k)] = v}
+ hash
+ else
+ raise(Error, "#{self} does not have a primary key")
+ end
  end
 
  # Similar to finder, but uses a prepared statement instead of a placeholder
@@ -891,7 +899,7 @@ module Sequel
  # Get the schema from the database, fall back on checking the columns
  # via the database if that will return inaccurate results or if
  # it raises an error.
- def get_db_schema(reload = false)
+ def get_db_schema(reload = reload_db_schema?)
  set_columns(nil)
  return nil unless @dataset
  schema_hash = {}
@@ -933,7 +941,7 @@ module Sequel
  end
  schema_hash
  end
-
+
  # Uncached version of setter_methods, to be overridden by plugins
  # that want to modify the methods used.
  def get_setter_methods
@@ -1045,6 +1053,11 @@ module Sequel
  end
  end
 
+ # Whether to reload the database schema by default, ignoring any cached value.
+ def reload_db_schema?
+ false
+ end
+
  # Reset the cached fast primary lookup SQL if a simple table and primary key
  # are used, or set it to nil if not used.
  def reset_fast_pk_lookup_sql
@@ -1445,6 +1458,16 @@ module Sequel
  model.primary_key_hash(pk)
  end
 
+ # Returns a hash mapping the receivers primary key column(s) to their values.
+ #
+ # Artist[1].qualified_pk_hash
+ # # => {Sequel.qualify(:artists, :id)=>1}
+ # Artist[[1, 2]].qualified_pk_hash
+ # # => {Sequel.qualify(:artists, :id1)=>1, Sequel.qualify(:artists, :id2)=>2}
+ def qualified_pk_hash(qualifier=model.table_name)
+ model.qualified_primary_key_hash(pk, qualifier)
+ end
+
  # Reloads attributes from database and returns self. Also clears all
  # changed_columns information. Raises an +Error+ if the record no longer
  # exists in the database.
@@ -1631,7 +1654,7 @@ module Sequel
  raise Error, "No dataset for model #{model}" unless ds = model.instance_dataset
 
  cond = if ds.joined_dataset?
- model.qualified_primary_key_hash(pk)
+ qualified_pk_hash
  else
  pk_hash
  end
@@ -2283,11 +2306,12 @@ module Sequel
  # value. If no records matches, returns nil.
  #
  # # Single primary key
- # Artist.dataset.with_pk(1) # SELECT * FROM artists WHERE (id = 1) LIMIT 1
+ # Artist.dataset.with_pk(1)
+ # # SELECT * FROM artists WHERE (artists.id = 1) LIMIT 1
  #
  # # Composite primary key
- # Artist.dataset.with_pk([1, 2]) # SELECT * FROM artists
- # # WHERE ((id1 = 1) AND (id2 = 2)) LIMIT 1
+ # Artist.dataset.with_pk([1, 2])
+ # # SELECT * FROM artists WHERE ((artists.id1 = 1) AND (artists.id2 = 2)) LIMIT 1
  def with_pk(pk)
  first(model.qualified_primary_key_hash(pk))
  end
@@ -83,7 +83,7 @@ module Sequel
  end
 
  if base_ds = opts[:dataset]
- ds = base_ds.where(model.qualified_primary_key_hash(pk, table))
+ ds = base_ds.where(qualified_pk_hash(table))
  else
  base_ds = model.dataset
  ds = this
@@ -241,7 +241,10 @@ module Sequel
  ds.where(arr.zip(vals))
  end
  ds = yield(ds) if block_given?
- ds = ds.exclude(pk_hash) unless new?
+ unless new?
+ h = ds.joined_dataset? ? qualified_pk_hash : pk_hash
+ ds = ds.exclude(h)
+ end
  errors.add(a, message) unless ds.count == 0
  end
  end
@@ -3,7 +3,7 @@ module Sequel
  MAJOR = 4
  # The minor version of Sequel. Bumped for every non-patch level
  # release, generally around once a month.
- MINOR = 15
+ MINOR = 16
  # The tiny version of Sequel. Usually 0, only bumped for bugfix
  # releases that fix regressions from previous versions.
  TINY = 0
@@ -277,10 +277,11 @@ describe "A PostgreSQL database" do
  a.should == "WARNING: foo\n"
  end if DB.adapter_scheme == :postgres && SEQUEL_POSTGRES_USES_PG && DB.server_version >= 90000
 
- specify "should support creating foreign tables" do
- # This only tests the SQL created, because a true test using file_fdw or postgres_fdw
- # requires superuser permissions, and you should not be running the tests as a superuser.
+ # These only test the SQL created, because a true test using file_fdw or postgres_fdw
+ # requires superuser permissions, and you should not be running the tests as a superuser.
+ specify "should support creating and dropping foreign tables" do
  DB.send(:create_table_sql, :t, DB.create_table_generator{Integer :a}, :foreign=>:f, :options=>{:o=>1}).should == 'CREATE FOREIGN TABLE "t" ("a" integer) SERVER "f" OPTIONS (o \'1\')'
+ DB.send(:drop_table_sql, :t, :foreign=>true).should == 'DROP FOREIGN TABLE "t"'
  end
  end
 
@@ -1820,6 +1820,11 @@ describe "Dataset#distinct" do
  @dataset.distinct(Sequel.cast(:stamp, :integer), :node_id=>nil).sql.should == 'SELECT DISTINCT ON (CAST(stamp AS integer), (node_id IS NULL)) name FROM test'
  end
 
+ specify "should use DISTINCT ON if columns are given in a virtual row block and DISTINCT ON is supported" do
+ meta_def(@dataset, :supports_distinct_on?){true}
+ @dataset.distinct{func(:id)}.sql.should == 'SELECT DISTINCT ON (func(id)) name FROM test'
+ end
+
  specify "should do a subselect for count" do
  @dataset.distinct.count
  @db.sqls.should == ['SELECT count(*) AS count FROM (SELECT DISTINCT name FROM test) AS t1 LIMIT 1']
@@ -731,6 +731,17 @@ describe "DB#create_table?" do
  @db.create_table?(:cats){|*a|}
  @db.sqls.should == ['CREATE TABLE IF NOT EXISTS cats ()']
  end
+
+ specify "should not use IF NOT EXISTS if the indexes are created" do
+ meta_def(@db, :table_exists?){|a| false}
+ meta_def(@db, :supports_create_table_if_not_exists?){true}
+ @db.create_table?(:cats){|*a| Integer :a, :index=>true}
+ @db.sqls.should == ['CREATE TABLE cats (a integer)', 'CREATE INDEX cats_a_index ON cats (a)']
+
+ meta_def(@db, :table_exists?){|a| true}
+ @db.create_table?(:cats){|*a| Integer :a, :index=>true}
+ @db.sqls.should == []
+ end
  end
 
  describe "DB#create_join_table" do
@@ -798,10 +809,21 @@ describe "DB#create_join_table?" do
  @db.sqls.should == []
  end
 
- specify "should use IF NOT EXISTS if the database supports it" do
+ specify "should not use IF NOT EXISTS" do
+ meta_def(@db, :table_exists?){|a| false}
  meta_def(@db, :supports_create_table_if_not_exists?){true}
  @db.create_join_table?(:cat_id=>:cats, :dog_id=>:dogs)
- @db.sqls.should == ['CREATE TABLE IF NOT EXISTS cats_dogs (cat_id integer NOT NULL REFERENCES cats, dog_id integer NOT NULL REFERENCES dogs, PRIMARY KEY (cat_id, dog_id))', 'CREATE INDEX cats_dogs_dog_id_cat_id_index ON cats_dogs (dog_id, cat_id)']
+ @db.sqls.should == ['CREATE TABLE cats_dogs (cat_id integer NOT NULL REFERENCES cats, dog_id integer NOT NULL REFERENCES dogs, PRIMARY KEY (cat_id, dog_id))', 'CREATE INDEX cats_dogs_dog_id_cat_id_index ON cats_dogs (dog_id, cat_id)']
+
+ meta_def(@db, :table_exists?){|a| true}
+ @db.create_join_table?(:cat_id=>:cats, :dog_id=>:dogs)
+ @db.sqls.should == []
+ end
+
+ specify "should not use IF NOT EXISTS if no_index is used" do
+ meta_def(@db, :supports_create_table_if_not_exists?){true}
+ @db.create_join_table?({:cat_id=>:cats, :dog_id=>:dogs}, :no_index=>true)
+ @db.sqls.should == ['CREATE TABLE IF NOT EXISTS cats_dogs (cat_id integer NOT NULL REFERENCES cats, dog_id integer NOT NULL REFERENCES dogs, PRIMARY KEY (cat_id, dog_id))']
  end
  end
 
@@ -121,7 +121,7 @@ describe "Reversible Migrations with Sequel.migration{change{}}" do
  end
  @db = @c.new
  @p = Proc.new do
- create_table(:a){Integer :a}
+ create_table(:a, :foo=>:bar){Integer :a}
  add_column :a, :b, String
  add_index :a, :b
  rename_column :a, :b, :c
@@ -146,7 +146,7 @@ describe "Reversible Migrations with Sequel.migration{change{}}" do
  specify "should apply up with normal actions in normal order" do
  p = @p
  Sequel.migration{change(&p)}.apply(@db, :up)
- @db.actions.should == [[:create_table, :a],
+ @db.actions.should == [[:create_table, :a, {:foo=>:bar}],
  [:add_column, :a, :b, String],
  [:add_index, :a, :b],
  [:rename_column, :a, :b, :c],
@@ -189,7 +189,7 @@ describe "Reversible Migrations with Sequel.migration{change{}}" do
  [:rename_column, :a, :c, :b],
  [:drop_index, :a, :b],
  [:drop_column, :a, :b],
- [:drop_table, :a]]
+ [:drop_table, :a, {:foo=>:bar}]]
  end
 
  specify "should raise in the down direction if migration uses unsupported method" do
@@ -477,6 +477,20 @@ describe "Sequel::Plugins::ValidationHelpers" do
  "SELECT count(*) AS count FROM items WHERE ((a IN (1, 2, 3)) AND (username = '0records') AND (id != 3)) LIMIT 1"]
  end
 
+ it "should use qualified primary keys for validates_unique when the dataset is joined" do
+ @c.columns(:id, :username, :password)
+ @c.set_dataset DB[:items]
+ c = @c
+ @c.set_validations{validates_unique(:username, :dataset=>c.cross_join(:a))}
+ @c.dataset._fetch = {:v=>0}
+
+ DB.reset
+ @c.new(:username => "0records", :password => "anothertest").should be_valid
+ @c.load(:id=>3, :username => "0records", :password => "anothertest").should be_valid
+ DB.sqls.should == ["SELECT count(*) AS count FROM items CROSS JOIN a WHERE (username = '0records') LIMIT 1",
+ "SELECT count(*) AS count FROM items CROSS JOIN a WHERE ((username = '0records') AND (items.id != 3)) LIMIT 1"]
+ end
+
  it "should support :only_if_modified option for validates_unique, and not check uniqueness for existing records if values haven't changed" do
  @c.columns(:id, :username, :password)
  @c.set_dataset DB[:items]
@@ -479,22 +479,38 @@ end
 
  describe Sequel::Model::Associations::AssociationReflection, "with caching disabled" do
  before do
- @c = Class.new(Sequel::Model(:foo))
- end
- after do
- Object.send(:remove_const, :ParParent)
+ @db = Sequel.mock
+ @c = Class.new(Sequel::Model)
+ @c.dataset = @db[:foo]
+ @c.cache_associations = false
  end
 
  it "should not cache metadata" do
- class ::ParParent < Sequel::Model; end
- c = ParParent
- @c.cache_associations = false
- @c.many_to_one :c, :class=>:ParParent
- @c.association_reflection(:c).associated_class.should == c
- Object.send(:remove_const, :ParParent)
- class ::ParParent < Sequel::Model; end
- c = ParParent
- @c.association_reflection(:c).associated_class.should == c
+ begin
+ class ::ParParent < Sequel::Model; end
+ c = ParParent
+ @c.many_to_one :c, :class=>:ParParent
+ @c.association_reflection(:c).associated_class.should == c
+ Object.send(:remove_const, :ParParent)
+ class ::ParParent < Sequel::Model; end
+ c = ParParent
+ @c.association_reflection(:c).associated_class.should == c
+ ensure
+ Object.send(:remove_const, :ParParent)
+ end
+ end
+
+ it "should not used cached schema" do
+ def @db.supports_schema_parsing?; true end
+ def @db.schema(table, opts={})
+ [[opts[:reload] ? :reload : :id, {}]]
+ end
+ @c.dataset = @db[:items]
+ @c.columns.should == [:reload]
+
+ @c.cache_associations = true
+ @c.dataset = @db[:items]
+ @c.columns.should == [:id]
  end
  end
 
@@ -873,11 +873,50 @@ describe "Model#pk_hash" do
  it "should raise if no primary key" do
  @m.set_primary_key nil
  m = @m.new(:id => 111, :x => 2, :y => 3)
- proc {m.pk_hash}.should raise_error(Sequel::Error)
+ proc{m.pk_hash}.should raise_error(Sequel::Error)
 
  @m.no_primary_key
  m = @m.new(:id => 111, :x => 2, :y => 3)
- proc {m.pk_hash}.should raise_error(Sequel::Error)
+ proc{m.pk_hash}.should raise_error(Sequel::Error)
+ end
+ end
+
+ describe "Model#qualified_pk_hash" do
+ before do
+ @m = Class.new(Sequel::Model(:items))
+ @m.columns :id, :x, :y
+ end
+
+ it "should by default return a hash with the value of the :id column" do
+ m = @m.load(:id => 111, :x => 2, :y => 3)
+ m.qualified_pk_hash.should == {Sequel.qualify(:items, :id) => 111}
+ end
+
+ it "should accept a custom qualifier" do
+ m = @m.load(:id => 111, :x => 2, :y => 3)
+ m.qualified_pk_hash(:foo).should == {Sequel.qualify(:foo, :id) => 111}
+ end
+
+ it "should return a hash with the primary key value for custom primary key" do
+ @m.set_primary_key :x
+ m = @m.load(:id => 111, :x => 2, :y => 3)
+ m.qualified_pk_hash.should == {Sequel.qualify(:items, :x) => 2}
+ end
+
+ it "should return a hash with the primary key values for composite primary key" do
+ @m.set_primary_key [:y, :x]
+ m = @m.load(:id => 111, :x => 2, :y => 3)
+ m.qualified_pk_hash.should == {Sequel.qualify(:items, :y) => 3, Sequel.qualify(:items, :x) => 2}
+ end
+
+ it "should raise if no primary key" do
+ @m.set_primary_key nil
+ m = @m.new(:id => 111, :x => 2, :y => 3)
+ proc{m.qualified_pk_hash}.should raise_error(Sequel::Error)
+
+ @m.no_primary_key
+ m = @m.new(:id => 111, :x => 2, :y => 3)
+ proc{m.qualified_pk_hash}.should raise_error(Sequel::Error)
  end
  end
 
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: sequel
  version: !ruby/object:Gem::Version
- version: 4.15.0
+ version: 4.16.0
  platform: ruby
  authors:
  - Jeremy Evans
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2014-10-01 00:00:00.000000000 Z
+ date: 2014-11-01 00:00:00.000000000 Z
  dependencies: []
  description: The Database Toolkit for Ruby
  email: code@jeremyevans.net
@@ -132,6 +132,7 @@ extra_rdoc_files:
  - doc/release_notes/4.13.0.txt
  - doc/release_notes/4.14.0.txt
  - doc/release_notes/4.15.0.txt
+ - doc/release_notes/4.16.0.txt
  files:
  - CHANGELOG
  - MIT-LICENSE
@@ -234,6 +235,7 @@ files:
  - doc/release_notes/4.13.0.txt
  - doc/release_notes/4.14.0.txt
  - doc/release_notes/4.15.0.txt
+ - doc/release_notes/4.16.0.txt
  - doc/release_notes/4.2.0.txt
  - doc/release_notes/4.3.0.txt
  - doc/release_notes/4.4.0.txt