sequel 3.15.0 → 3.16.0

data/CHANGELOG CHANGED
@@ -1,3 +1,17 @@
+ === 3.16.0 (2010-10-01)
+
+ * Support composite foreign keys for associations in the identity_map plugin (harukizaemon, jeremyevans) (#310)
+
+ * Handle INTERSECT and EXCEPT on Microsoft SQL Server 2005+ (jfirebaugh)
+
+ * Add :replace option to Database#create_language in the postgresql adapter (jeremyevans)
+
+ * Make rcte_tree plugin work when not all columns are selected (jeremyevans)
+
+ * Add swift adapter (jeremyevans)
+
+ * Fix literalization of DateTime objects on 1.9 for databases that support fractional seconds (jeremyevans)
+
  === 3.15.0 (2010-09-01)

  * Make emulated alter_table tasks on SQLite correctly preserve foreign keys (DirtYiCE, jeremyevans)
@@ -13,7 +13,7 @@ toolkit for Ruby.
  configurations, and database sharding.
  * Sequel currently has adapters for ADO, Amalgalite, DataObjects,
  DB2, DBI, Firebird, Informix, JDBC, MySQL, Mysql2, ODBC, OpenBase,
- Oracle, PostgreSQL and SQLite3.
+ Oracle, PostgreSQL, SQLite3, and Swift.

  == Resources

data/Rakefile CHANGED
@@ -12,7 +12,7 @@ VERS = lambda do
  require File.expand_path("../lib/sequel/version", __FILE__)
  Sequel.version
  end
- CLEAN.include ["**/.*.sw?", "sequel-*.gem", ".config", "rdoc", "coverage", "www/public/*.html", "www/public/rdoc*"]
+ CLEAN.include ["**/.*.sw?", "sequel-*.gem", ".config", "rdoc", "coverage", "www/public/*.html", "www/public/rdoc*", '**/*.rbc']
  RDOC_DEFAULT_OPTS = ["--quiet", "--line-numbers", "--inline-source", '--title', 'Sequel: The Database Toolkit for Ruby']
  RDOC_OPTS = RDOC_DEFAULT_OPTS + ['--main', 'README.rdoc']

@@ -165,3 +165,10 @@ One of the best features of Sequel is the ability to use datasets as sub-queries
  #=> "SELECT * FROM consumers WHERE (id IN (SELECT consumer_id FROM consumer_refs WHERE logged_in))"

  Note that if you are checking for the inclusion of a single column in a subselect, the subselect should only select a single column.
+
+ == Using OR instead of AND
+
+ By default, if you chain calls to +filter+, the conditions get ANDed together. If you want to use an OR for a condition, you can use the +or+ method:
+
+ items.filter(:name=>'Food').or(:vendor=>1).sql
+ #=> "SELECT * FROM items WHERE ((name = 'Food') OR (vendor = 1))"
@@ -0,0 +1,45 @@
+ = New Adapter
+
+ * A swift adapter was added to Sequel. Swift is a relatively new
+ ruby database library, built on top of a relatively new backend
+ called dbic++. While not yet considered production ready, it is
+ very fast. The swift adapter is about 33% faster and 40% more
+ memory efficient for selects than the postgres adapter using pg
+ with sequel_pg, though it is slower and less memory efficient
+ for inserts and updates.
+
+ Sequel's swift adapter currently supports only PostgreSQL and
+ MySQL, but support for other databases will probably be added in
+ the future.
+
+ = Other Improvements
+
+ * Sequel now correctly literalizes DateTime objects on ruby 1.9 for
+ databases that support fractional seconds.
+
+ * The identity_map plugin now handles composite keys in many_to_one
+ associations.
+
+ * The rcte_tree plugin now works when the model's dataset does not
+ select all columns. This can happen when using the lazy_attributes
+ plugin on the same model.
+
+ * Sequel now supports INTERSECT and EXCEPT on Microsoft SQL Server
+ 2005+.
+
+ * The Database#create_language method in the shared PostgreSQL
+ adapter now accepts a :replace option to replace the currently
+ loaded procedural language if it already exists. This option
+ is ignored for PostgreSQL versions before 9.0.
+
+ * The identity_map plugin now handles cases where the plugin is
+ loaded separately by two different models.
+
+ = Backwards Compatibility
+
+ * While not technically backwards compatibility related, it was
+ discovered that the identity_map plugin is incompatible with
+ the standard eager loading of many_to_many and many_through_many
+ associations. If you want to eagerly load those associations and
+ use the identity_map plugin, you should use eager_graph instead
+ of eager.
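A minimal sketch of the eager_graph workaround described in the backwards compatibility note above; the Album/Tag models and their many_to_many association are hypothetical:

  class Album < Sequel::Model
    plugin :identity_map
    many_to_many :tags   # hypothetical association
  end

  Sequel::Model.with_identity_map do
    # eager(:tags) is incompatible with the identity map for many_to_many,
    # so load the association with a single joined query instead:
    Album.eager_graph(:tags).all
  end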
@@ -339,9 +339,9 @@ module Sequel
  db.server_version(@opts[:server])
  end

- # Microsoft SQL Server does not support INTERSECT or EXCEPT
+ # MSSQL 2005+ supports INTERSECT and EXCEPT
  def supports_intersect_except?
- false
+ is_2005_or_later?
  end

  # MSSQL does not support IS TRUE
@@ -356,7 +356,7 @@ module Sequel

  # MSSQL 2005+ supports modifying joined datasets
  def supports_modifying_joins?
- true
+ is_2005_or_later?
  end

  # MSSQL does not support multiple columns for the IN/NOT IN operators
@@ -364,9 +364,9 @@ module Sequel
  false
  end

- # Only 2005+ supports the output clause.
+ # MSSQL 2005+ supports the output clause.
  def supports_output_clause?
- server_version >= 9000000
+ is_2005_or_later?
  end

  # MSSQL 2005+ supports window functions
@@ -381,6 +381,9 @@
  end

  private
+ def is_2005_or_later?
+ server_version >= 9000000
+ end

  # MSSQL supports the OUTPUT clause for DELETE statements.
  # It also allows prepending a WITH clause.
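With supports_intersect_except? now tied to the server version, compound operators become usable on SQL Server 2005+ through Sequel's standard Dataset#intersect and Dataset#except. A brief sketch (table and column names are made up):

  # rows present in both SELECTs
  DB[:customers].select(:email).intersect(DB[:subscribers].select(:email)).all
  # rows in the first SELECT that are absent from the second
  DB[:customers].select(:email).except(DB[:unsubscribed].select(:email)).all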
@@ -196,6 +196,7 @@ module Sequel
  # * name : Name of the procedural language (e.g. plpgsql)
  # * opts : options hash:
  # * :handler : The name of a previously registered function used as a call handler for this language.
+ # * :replace: Replace the installed language if it already exists (on PostgreSQL 9.0+).
  # * :trusted : Marks the language being created as trusted, allowing unprivileged users to create functions using this language.
  # * :validator : The name of previously registered function used as a validator of functions defined in this language.
  def create_language(name, opts={})
@@ -428,7 +429,7 @@ module Sequel

  # SQL for creating a procedural language.
  def create_language_sql(name, opts={})
- "CREATE#{' TRUSTED' if opts[:trusted]} LANGUAGE #{name}#{" HANDLER #{opts[:handler]}" if opts[:handler]}#{" VALIDATOR #{opts[:validator]}" if opts[:validator]}"
+ "CREATE#{' OR REPLACE' if opts[:replace] && server_version >= 90000}#{' TRUSTED' if opts[:trusted]} LANGUAGE #{name}#{" HANDLER #{opts[:handler]}" if opts[:handler]}#{" VALIDATOR #{opts[:validator]}" if opts[:validator]}"
  end

  # SQL for creating a database trigger.
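A usage sketch for the new :replace option documented above; the generated SQL depends on the server version:

  DB.create_language(:plpgsql, :replace=>true, :trusted=>true)
  # PostgreSQL 9.0+:  CREATE OR REPLACE TRUSTED LANGUAGE plpgsql
  # before 9.0:       CREATE TRUSTED LANGUAGE plpgsql  (:replace is ignored)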
@@ -0,0 +1,145 @@
+ require 'swift'
+
+ module Sequel
+ # Module holding the Swift support for Sequel. Swift is a
+ # ruby front-end for dbic++, a fast database access library
+ # written in C++.
+ #
+ # The Swift adapter currently supports PostgreSQL and MySQL:
+ #
+ # Sequel.connect('swift://user:password@host/database?db_type=postgres')
+ # Sequel.connect('swift://user:password@host/database?db_type=mysql')
+ module Swift
+ # Contains procs keyed on sub adapter type that extend the
+ # given database object so it supports the correct database type.
+ DATABASE_SETUP = {:postgres=>proc do |db|
+ Sequel.ts_require 'adapters/swift/postgres'
+ db.extend(Sequel::Swift::Postgres::DatabaseMethods)
+ db.swift_class = ::Swift::DB::Postgres
+ end,
+ :mysql=>proc do |db|
+ Sequel.ts_require 'adapters/swift/mysql'
+ db.extend(Sequel::Swift::MySQL::DatabaseMethods)
+ db.swift_class = ::Swift::DB::Mysql
+ end,
+ }
+
+ class Database < Sequel::Database
+ set_adapter_scheme :swift
+
+ # The Swift adapter class being used by this database. Connections
+ # in this database's connection pool will be instances of this class.
+ attr_accessor :swift_class
+
+ # Call the DATABASE_SETUP proc directly after initialization,
+ # so the object always uses sub adapter specific code. Also,
+ # raise an error immediately if the connection doesn't have a
+ # db_type specified, since one is required to include the correct
+ # subadapter.
+ def initialize(opts)
+ super
+ if db_type = opts[:db_type] and !db_type.to_s.empty?
+ if prok = DATABASE_SETUP[db_type.to_s.to_sym]
+ prok.call(self)
+ else
+ raise(Error, "No :db_type option specified")
+ end
+ else
+ raise(Error, ":db_type option not valid, should be postgres or mysql")
+ end
+ end
+
+ # Create an instance of swift_class for the given options.
+ def connect(server)
+ setup_connection(swift_class.new(server_opts(server)))
+ end
+
+ # Return a Sequel::Swift::Dataset object for this database.
+ def dataset(opts = nil)
+ Swift::Dataset.new(self, opts)
+ end
+
+ # Execute the given SQL, yielding a Swift::Result if a block is given.
+ def execute(sql, opts={})
+ synchronize(opts[:server]) do |conn|
+ begin
+ res = nil
+ log_yield(sql){res = conn.prepare(sql).execute}
+ yield res if block_given?
+ nil
+ rescue SwiftError => e
+ raise_error(e)
+ ensure
+ res.finish if res
+ end
+ end
+ end
+
+ # Execute the SQL on the this database, returning the number of affected
+ # rows.
+ def execute_dui(sql, opts={})
+ synchronize(opts[:server]) do |conn|
+ begin
+ log_yield(sql){conn.execute(sql)}
+ rescue SwiftError => e
+ raise_error(e)
+ end
+ end
+ end
+
+ # Execute the SQL on this database, returning the primary key of the
+ # table being inserted to.
+ def execute_insert(sql, opts={})
+ synchronize(opts[:server]) do |conn|
+ begin
+ log_yield(sql){conn.prepare(sql).execute.insert_id}
+ rescue SwiftError => e
+ raise_error(e)
+ end
+ end
+ end
+
+ private
+
+ # Method to call on a statement object to execute SQL that does
+ # not return any rows.
+ def connection_execute_method
+ :execute
+ end
+
+ # Close the given database connection.
+ def disconnect_connection(c)
+ end
+
+ # Execute SQL on the connection
+ def log_connection_execute(conn, sql)
+ log_yield(sql){conn.execute(sql)}
+ end
+
+ # Set the :db entry to the same as the :database entry, since
+ # Swift uses :db.
+ def server_opts(o)
+ o = super
+ o[:db] ||= o[:database]
+ o
+ end
+
+ # Allow extending the given connection when it is first created.
+ # By default, just returns the connection.
+ def setup_connection(conn)
+ conn
+ end
+ end
+
+ class Dataset < Sequel::Dataset
+ # Set the columns and yield the hashes to the block.
+ def fetch_rows(sql, &block)
+ execute(sql) do |res|
+ @columns = res.fields
+ res.each(&block)
+ end
+ self
+ end
+ end
+ end
+ end
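A connection sketch based on the adapter's own documentation above; the host, credentials, database, and table names are placeholders:

  DB = Sequel.connect('swift://user:password@localhost/mydb?db_type=postgres')
  DB[:items].insert(:name=>'abc')
  DB[:items].all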
@@ -0,0 +1,48 @@
+ Sequel.require 'adapters/shared/mysql'
+
+ module Sequel
+ module Swift
+ # Database and Dataset instance methods for MySQL specific
+ # support via Swift.
+ module MySQL
+ # Database instance methods for MySQL databases accessed via Swift.
+ module DatabaseMethods
+ include Sequel::MySQL::DatabaseMethods
+
+ # Return instance of Sequel::Swift::MySQL::Dataset with the given opts.
+ def dataset(opts=nil)
+ Sequel::Swift::MySQL::Dataset.new(self, opts)
+ end
+
+ private
+
+ # The database name for the given database.
+ def database_name
+ opts[:database]
+ end
+
+ # Consider tinyint(1) columns as boolean.
+ def schema_column_type(db_type)
+ db_type == 'tinyint(1)' ? :boolean : super
+ end
+ end
+
+ # Dataset class for MySQL datasets accessed via Swift.
+ class Dataset < Swift::Dataset
+ include Sequel::MySQL::DatasetMethods
+
+ # Use execute_insert to execute the replace_sql.
+ def replace(*args)
+ execute_insert(replace_sql(*args))
+ end
+
+ private
+
+ # Use Swift's escape method for quoting.
+ def literal_string(s)
+ db.synchronize{|c| "'#{c.escape(s)}'"}
+ end
+ end
+ end
+ end
+ end
@@ -0,0 +1,92 @@
+ Sequel.require 'adapters/shared/postgres'
+
+ module Sequel
+ Postgres::CONVERTED_EXCEPTIONS << ::SwiftError
+
+ module Swift
+ # Adapter, Database, and Dataset support for accessing a PostgreSQL
+ # database via Swift.
+ module Postgres
+ # Methods to add to the Swift adapter/connection to allow it to work
+ # with the shared PostgreSQL code.
+ module AdapterMethods
+ include Sequel::Postgres::AdapterMethods
+
+ # Log all SQL that goes through the execute method to the related
+ # database object.
+ def execute(sql, *args, &block)
+ @db.log_yield(sql){super}
+ rescue SwiftError => e
+ @db.send(:raise_error, e)
+ end
+
+ private
+
+ # Swift specific method of getting specific values from a result set.
+ def single_value(row)
+ row.values.at(0)
+ end
+ end
+
+ # Methods to add to Database instances that access PostgreSQL via Swift.
+ module DatabaseMethods
+ include Sequel::Postgres::DatabaseMethods
+
+ # Add the primary_keys and primary_key_sequences instance variables,
+ # so we can get the correct return values for inserted rows.
+ def self.extended(db)
+ db.instance_eval do
+ @primary_keys = {}
+ @primary_key_sequences = {}
+ end
+ end
+
+ # Return instance of Sequel::Swift::Postgres::Dataset with the given opts.
+ def dataset(opts=nil)
+ Sequel::Swift::Postgres::Dataset.new(self, opts)
+ end
+
+ # Run the DELETE/UPDATE SQL on the database and return the number
+ # of matched rows.
+ def execute_dui(sql, opts={})
+ synchronize(opts[:server]) do |conn|
+ begin
+ conn.execute(sql)
+ rescue SwiftError => e
+ raise_error(e)
+ end
+ end
+ end
+
+ # Run the INSERT SQL on the database and return the primary key
+ # for the record.
+ def execute_insert(sql, opts={})
+ synchronize(opts[:server]) do |conn|
+ conn.execute(sql)
+ insert_result(conn, opts[:table], opts[:values])
+ end
+ end
+
+ private
+
+ # Execute SQL on the connection.
+ def log_connection_execute(conn, sql)
+ conn.execute(sql)
+ end
+
+ # Extend the adapter with the Swift PostgreSQL AdapterMethods.
+ def setup_connection(conn)
+ conn = super(conn)
+ conn.extend(Sequel::Swift::Postgres::AdapterMethods)
+ conn.db = self
+ conn.apply_connection_settings
+ conn
+ end
+ end
+
+ class Dataset < Swift::Dataset
+ include Sequel::Postgres::DatasetMethods
+ end
+ end
+ end
+ end
@@ -6,7 +6,7 @@ module Sequel
  # ---------------------

  # Array of supported database adapters
- ADAPTERS = %w'ado amalgalite db2 dbi do firebird informix jdbc mysql mysql2 odbc openbase oracle postgres sqlite'.collect{|x| x.to_sym}
+ ADAPTERS = %w'ado amalgalite db2 dbi do firebird informix jdbc mysql mysql2 odbc openbase oracle postgres sqlite swift'.collect{|x| x.to_sym}

  # Whether to use the single threaded connection pool by default
  @@single_threaded = false
@@ -631,7 +631,7 @@ module Sequel
  def select(*columns, &block)
  columns += Array(Sequel.virtual_row(&block)) if block
  m = []
- columns.map do |i|
+ columns.each do |i|
  i.is_a?(Hash) ? m.concat(i.map{|k, v| SQL::AliasedExpression.new(k,v)}) : m << i
  end
  clone(:select => m)
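For reference, the loop above turns hash arguments into aliased expressions; switching map to each only avoids building a throwaway array. A small sketch of the behavior (table and column names are hypothetical):

  DB[:items].select(:price, :name=>:title).sql
  # => "SELECT price, name AS title FROM items"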
@@ -576,7 +576,7 @@ module Sequel
  v2 = Sequel.application_to_database_timestamp(v)
  fmt = default_timestamp_format.gsub(/%[Nz]/) do |m|
  if m == '%N'
- format_timestamp_usec(v.is_a?(DateTime) ? v.sec_fraction*86400000000 : v.usec) if supports_timestamp_usecs?
+ format_timestamp_usec(v.is_a?(DateTime) ? v.sec_fraction*(RUBY_VERSION < '1.9.0' ? 86400000000 : 1000000) : v.usec) if supports_timestamp_usecs?
  else
  if supports_timestamp_timezones?
  # Would like to just use %z format, but it doesn't appear to work on Windows
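The new multiplier reflects a Ruby change: DateTime#sec_fraction returns a fraction of a day on 1.8 but a fraction of a second on 1.9, so converting it to microseconds needs a different factor:

  # Ruby 1.8: 0.5s => sec_fraction == 0.5/86400, usec = sec_fraction * 86_400_000_000
  # Ruby 1.9: 0.5s => sec_fraction == 0.5,       usec = sec_fraction * 1_000_000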
@@ -236,8 +236,8 @@ module Sequel
  # # INSERT INTO artists (name) VALUES ('Bob')
  #
  # Artist.find_or_create(:name=>'Jim'){|a| a.hometown = 'Sactown'}
- # # SELECT * FROM artists WHERE (name = 'Bob') LIMIT 1
- # # INSERT INTO artists (name, hometown) VALUES ('Bob', 'Sactown')
+ # # SELECT * FROM artists WHERE (name = 'Jim') LIMIT 1
+ # # INSERT INTO artists (name, hometown) VALUES ('Jim', 'Sactown')
  def find_or_create(cond, &block)
  find(cond) || create(cond, &block)
  end
@@ -3,7 +3,7 @@ module Sequel
  # The identity_map plugin allows the user to create temporary identity maps
  # via the with_identity_map method, which takes a block. Inside the block,
  # objects have a 1-1 correspondence with rows in the database.
- #
+ #
  # For example, the following is true, and wouldn't be true if you weren't
  # using the identity map:
  # Sequel::Model.with_identity_map do
@@ -19,9 +19,13 @@ module Sequel
  # in the identity map, the record is returned without a database query being
  # issued.
  #
- # Identity maps are thread-local and only presist for the duration of the block,
+ # Identity maps are thread-local and only persist for the duration of the block,
  # so they should only be considered as a possible performance enhancer.
- #
+ #
+ # The identity_map plugin is not compatible with the standard eager loading of
+ # many_to_many and many_through_many associations. If you want to use the identity_map plugin,
+ # you should use +eager_graph+ instead of +eager+ for those associations.
+ #
  # Usage:
  #
  # # Use an identity map that will affect all model classes (called before loading subclasses)
@@ -37,13 +41,13 @@ module Sequel
  def identity_map
  Thread.current[:sequel_identity_map]
  end
-
+
  # The identity map key for an object of the current class with the given pk.
  # May not always be correct for a class which uses STI.
  def identity_map_key(pk)
  "#{self}:#{pk ? Array(pk).join(',') : "nil:#{rand}"}"
  end
-
+
  # If the identity map is in use, check it for a current copy of the object.
  # If a copy does not exist, create a new object and add it to the identity map.
  # If a copy exists, add any values in the given row that aren't currently
@@ -61,7 +65,7 @@ module Sequel
  end
  o
  end
-
+
  # Take a block and inside that block use an identity map to ensure a 1-1
  # correspondence of objects to the database row they represent.
  def with_identity_map
@@ -73,14 +77,14 @@ module Sequel
  self.identity_map = nil
  end
  end
-
+
  private

- # Set the thread local identity map to the given value.
- def identity_map=(v)
+ # Set the thread local identity map to the given value.
+ def identity_map=(v)
  Thread.current[:sequel_identity_map] = v
  end
-
+
  # Check the current identity map if it exists for the object with
  # the matching pk. If one is found, return it, otherwise call super.
  def primary_key_lookup(pk)
@@ -105,15 +109,21 @@ module Sequel
  end

  private
-
+
+ # The primary keys values of the associated object, given the foreign
+ # key columns(s).
+ def _associated_object_pk(fk)
+ fk.is_a?(Array) ? fk.map{|c| send(c)} : send(fk)
+ end
+
  # If the association is a many_to_one and it has a :key option and the
  # key option has a value and the association uses the primary key of
  # the associated class as the :primary_key option, check the identity
  # map for the associated object and return it if present.
  def _load_associated_objects(opts)
  klass = opts.associated_class
- if idm = model.identity_map and opts[:type] == :many_to_one and opts[:primary_key] == klass.primary_key and
- opts[:key] and pk = send(opts[:key]) and o = idm[klass.identity_map_key(pk)]
+ if klass.respond_to?(:identity_map) && idm = klass.identity_map and opts[:type] == :many_to_one and opts[:primary_key] == klass.primary_key and
+ opts[:key] and pk = _associated_object_pk(opts[:key]) and o = idm[klass.identity_map_key(pk)]
  o
  else
  super
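A sketch of the composite-key case the new _associated_object_pk helper covers; the models and key columns are made up:

  class Invoice < Sequel::Model
    plugin :identity_map
    # composite foreign key matching Customer's composite primary key
    many_to_one :customer, :key=>[:customer_region, :customer_number]
  end

  Sequel::Model.with_identity_map do
    inv = Invoice.first
    inv.customer   # both key columns are used to build the identity map key
  end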
@@ -125,7 +125,7 @@ module Sequel
  end
  table_alias = model.dataset.schema_and_table(model.table_name)[1].to_sym
  model.from(t => table_alias).
- with_recursive(t, base_ds,
+ with_recursive(t, base_ds.select_all,
  recursive_ds.
  select(c_all))
  end
@@ -172,12 +172,14 @@ module Sequel
  end
  end
  table_alias = model.dataset.schema_and_table(model.table_name)[1].to_sym
- model.eager_loading_dataset(r,
+ elds = model.eager_loading_dataset(r,
  model.from(t => table_alias).
  with_recursive(t, base_case,
  recursive_case),
  r.select,
- eo[:associations], eo).all do |obj|
+ eo[:associations], eo)
+ elds = elds.select_append(ka) unless elds.opts[:select] == nil
+ elds.all do |obj|
  opk = obj[prkey]
  if in_pm = parent_map.has_key?(opk)
  if idm_obj = parent_map[opk]
@@ -220,7 +222,7 @@ module Sequel
  end
  table_alias = model.dataset.schema_and_table(model.table_name)[1].to_sym
  model.from(t => table_alias).
- with_recursive(t, base_ds,
+ with_recursive(t, base_ds.select_all,
  recursive_ds.
  select(SQL::ColumnAll.new(model.table_name)))
  end
@@ -273,10 +275,12 @@ module Sequel
  recursive_case = recursive_case.select_more(SQL::AliasedExpression.new(SQL::QualifiedIdentifier.new(t, la) + 1, la)).filter(SQL::QualifiedIdentifier.new(t, la) < level - 1)
  end
  table_alias = model.dataset.schema_and_table(model.table_name)[1].to_sym
- model.eager_loading_dataset(r,
+ elds = model.eager_loading_dataset(r,
  model.from(t => table_alias).with_recursive(t, base_case, recursive_case),
  r.select,
- associations, eo).all do |obj|
+ associations, eo)
+ elds = elds.select_append(ka) unless elds.opts[:select] == nil
+ elds.all do |obj|
  if level
  no_cache = no_cache_level == obj.values.delete(la)
  end
@@ -3,7 +3,7 @@ module Sequel
  MAJOR = 3
  # The minor version of Sequel. Bumped for every non-patch level
  # release, generally around once a month.
- MINOR = 15
+ MINOR = 16
  # The tiny version of Sequel. Usually 0, only bumped for bugfix
  # releases that fix regressions from previous versions.
  TINY = 0
@@ -905,7 +905,7 @@ context "MySQL::Dataset#complex_expression_sql" do
  end
  end

- unless MYSQL_DB.adapter_scheme == :do or MYSQL_DB.adapter_scheme == :mysql2
+ if MYSQL_DB.adapter_scheme == :mysql or MYSQL_DB.adapter_scheme == :jdbc
  context "MySQL Stored Procedures" do
  before do
  MYSQL_DB.create_table(:items){Integer :id; Integer :value}
@@ -918,7 +918,7 @@ unless MYSQL_DB.adapter_scheme == :do or MYSQL_DB.adapter_scheme == :mysql2
  end

  specify "should be callable on the database object" do
- MYSQL_DB.execute('CREATE PROCEDURE test_sproc() BEGIN DELETE FROM items; END')
+ MYSQL_DB.execute_ddl('CREATE PROCEDURE test_sproc() BEGIN DELETE FROM items; END')
  MYSQL_DB[:items].delete
  MYSQL_DB[:items].insert(:value=>1)
  MYSQL_DB[:items].count.should == 1
@@ -927,7 +927,7 @@ unless MYSQL_DB.adapter_scheme == :do or MYSQL_DB.adapter_scheme == :mysql2
  end

  specify "should be callable on the dataset object" do
- MYSQL_DB.execute('CREATE PROCEDURE test_sproc(a INTEGER) BEGIN SELECT *, a AS b FROM items; END')
+ MYSQL_DB.execute_ddl('CREATE PROCEDURE test_sproc(a INTEGER) BEGIN SELECT *, a AS b FROM items; END')
  MYSQL_DB[:items].delete
  @d = MYSQL_DB[:items]
  @d.call_sproc(:select, :test_sproc, 3).should == []
@@ -938,7 +938,7 @@ unless MYSQL_DB.adapter_scheme == :do or MYSQL_DB.adapter_scheme == :mysql2
  end

  specify "should be callable on the dataset object with multiple arguments" do
- MYSQL_DB.execute('CREATE PROCEDURE test_sproc(a INTEGER, c INTEGER) BEGIN SELECT *, a AS b, c AS d FROM items; END')
+ MYSQL_DB.execute_ddl('CREATE PROCEDURE test_sproc(a INTEGER, c INTEGER) BEGIN SELECT *, a AS b, c AS d FROM items; END')
  MYSQL_DB[:items].delete
  @d = MYSQL_DB[:items]
  @d.call_sproc(:select, :test_sproc, 3, 4).should == []
@@ -189,7 +189,6 @@ if POSTGRES_DB.pool.respond_to?(:max_size) and POSTGRES_DB.pool.max_size > 1
  String :name
  end
  @ds = POSTGRES_DB[:items]
- clear_sqls
  end
  after do
  POSTGRES_DB.drop_table(:items)
@@ -241,11 +240,22 @@ context "A PostgreSQL dataset with a timestamp field" do
  @d.delete
  end

- cspecify "should store milliseconds in time fields", :do do
+ cspecify "should store milliseconds in time fields for Time objects", :do do
  t = Time.now
  @d << {:value=>1, :time=>t}
- @d.literal(@d[:value =>'1'][:time]).should == @d.literal(t)
- @d[:value=>'1'][:time].usec.should == t.usec
+ t2 = @d[:value =>'1'][:time]
+ @d.literal(t2).should == @d.literal(t)
+ t2.strftime('%Y-%m-%d %H:%M:%S').should == t.strftime('%Y-%m-%d %H:%M:%S')
+ t2.is_a?(Time) ? t2.usec : t2.strftime('%N').to_i/1000 == t.usec
+ end
+
+ cspecify "should store milliseconds in time fields for DateTime objects", :do do
+ t = DateTime.now
+ @d << {:value=>1, :time=>t}
+ t2 = @d[:value =>'1'][:time]
+ @d.literal(t2).should == @d.literal(t)
+ t2.strftime('%Y-%m-%d %H:%M:%S').should == t.strftime('%Y-%m-%d %H:%M:%S')
+ t2.is_a?(Time) ? t2.usec : t2.strftime('%N').to_i/1000 == t.strftime('%N').to_i/1000
  end
  end

@@ -930,12 +940,12 @@ context "Postgres::Database functions, languages, and triggers" do

  specify "#create_language and #drop_language should create and drop languages" do
  @d.send(:create_language_sql, :plpgsql).should == 'CREATE LANGUAGE plpgsql'
- @d.create_language(:plpgsql)
+ @d.create_language(:plpgsql, :replace=>true)
  proc{@d.create_language(:plpgsql)}.should raise_error(Sequel::DatabaseError)
  @d.send(:drop_language_sql, :plpgsql).should == 'DROP LANGUAGE plpgsql'
  @d.drop_language(:plpgsql)
  proc{@d.drop_language(:plpgsql)}.should raise_error(Sequel::DatabaseError)
- @d.send(:create_language_sql, :plpgsql, :trusted=>true, :handler=>:a, :validator=>:b).should == 'CREATE TRUSTED LANGUAGE plpgsql HANDLER a VALIDATOR b'
+ @d.send(:create_language_sql, :plpgsql, :replace=>true, :trusted=>true, :handler=>:a, :validator=>:b).should == (@d.server_version >= 90000 ? 'CREATE OR REPLACE TRUSTED LANGUAGE plpgsql HANDLER a VALIDATOR b' : 'CREATE TRUSTED LANGUAGE plpgsql HANDLER a VALIDATOR b')
  @d.send(:drop_language_sql, :plpgsql, :if_exists=>true, :cascade=>true).should == 'DROP LANGUAGE IF EXISTS plpgsql CASCADE'
  # Make sure if exists works
  @d.drop_language(:plpgsql, :if_exists=>true, :cascade=>true)
@@ -973,7 +973,7 @@ context "Dataset#literal" do
  specify "should literalize DateTime properly" do
  t = DateTime.now
  s = t.strftime("'%Y-%m-%d %H:%M:%S")
- @dataset.literal(t).should == "#{s}.#{sprintf('%06i', t.sec_fraction* 86400000000)}'"
+ @dataset.literal(t).should == "#{s}.#{sprintf('%06i', t.sec_fraction * (RUBY_VERSION < '1.9.0' ? 86400000000 : 1000000))}'"
  end

  specify "should literalize Date properly" do
@@ -998,7 +998,7 @@ context "Dataset#literal" do

  t = DateTime.now
  s = t.strftime("TIMESTAMP '%Y-%m-%d %H:%M:%S")
- @dataset.literal(t).should == "#{s}.#{sprintf('%06i', t.sec_fraction* 86400000000)}'"
+ @dataset.literal(t).should == "#{s}.#{sprintf('%06i', t.sec_fraction* (RUBY_VERSION < '1.9.0' ? 86400000000 : 1000000))}'"

  d = Date.today
  s = d.strftime("DATE '%Y-%m-%d'")
@@ -1014,7 +1014,7 @@ context "Dataset#literal" do

  t = DateTime.now.new_offset(0)
  s = t.strftime("'%Y-%m-%d %H:%M:%S")
- @dataset.literal(t).should == "#{s}.#{sprintf('%06i', t.sec_fraction* 86400000000)}+0000'"
+ @dataset.literal(t).should == "#{s}.#{sprintf('%06i', t.sec_fraction* (RUBY_VERSION < '1.9.0' ? 86400000000 : 1000000))}+0000'"
  end

  specify "should literalize Time and DateTime properly if the database doesn't support usecs in timestamps" do
@@ -5,7 +5,6 @@ describe "Sequel::Plugins::IdentityMap" do
  class ::IdentityMapModel < Sequel::Model
  plugin :identity_map
  attr_accessor :foo
- columns :id
  ds = dataset
  def ds.fetch_rows(sql)
  c = @opts[:where].args.first
@@ -16,7 +15,7 @@
  end
  end
  class ::IdentityMapAlbum < ::IdentityMapModel
- columns :id, :artist_id
+ columns :artist_id
  end
  class ::IdentityMapArtist < ::IdentityMapModel
  end
@@ -73,7 +72,7 @@ describe "Sequel::Plugins::IdentityMap" do
  end
  end

- it "#load should should store the object in the current identity map if it isn't already there" do
+ it "#load should store the object in the current identity map if it isn't already there" do
  @c.with_identity_map do
  @c.identity_map[@c.identity_map_key(1)].should == nil
  o = @c.load(:id=>1)
@@ -138,6 +137,24 @@ describe "Sequel::Plugins::IdentityMap" do
  end
  end

+ it "should use the identity map as a lookup cache when retrieving many_to_one associated records via a composite key" do
+ @c1.columns :another_id
+ @c1.many_to_one :artist, :class=>@c2, :key=>[:id, :another_id]
+ @c.with_identity_map do
+ MODEL_DB.sqls.length.should == 0
+ o = @c1.load(:id=>1, :another_id=>1, :artist_id=>2)
+ a = o.artist
+ a.should be_a_kind_of(@c2)
+ MODEL_DB.sqls.length.should == 1
+ o = @c1.load(:id=>1, :another_id=>2, :artist_id=>2)
+ o.artist.should == a
+ MODEL_DB.sqls.length.should == 1
+ o = @c1.load(:id=>3, :another_id=>3, :artist_id=>3)
+ o.artist.should_not == a
+ MODEL_DB.sqls.length.should == 2
+ end
+ end
+
  it "should use the identity map as a lookup cache when retrieving many_to_one associated records" do
  @c1.many_to_one :artist, :class=>@c2
  @c.with_identity_map do
@@ -170,7 +187,7 @@ describe "Sequel::Plugins::IdentityMap" do
  MODEL_DB.sqls.length.should == 3
  end
  end
-
+
  it "should not use the identity map as a lookup cache if the assocation has a nil :key option" do
  c = @c2
  @c1.many_to_one :artist, :class=>@c2, :key=>nil, :dataset=>proc{c.filter(:artist_id=>artist_id)}
@@ -472,7 +472,9 @@ describe Sequel::SQL::Constants do
  cspecify "should have working CURRENT_DATE", [:odbc, :mssql], [:jdbc, :sqlite] do
  @db.create_table!(:constants){Date :d}
  @ds.insert(:d=>Sequel::CURRENT_DATE)
- Date.today.should == @c2[@ds.get(:d)]
+ d = @c2[@ds.get(:d)]
+ d.should be_a_kind_of(Date)
+ d.to_s.should == Date.today.to_s
  end

  cspecify "should have working CURRENT_TIME", [:do, :mysql], [:jdbc, :sqlite], [:mysql2] do
@@ -925,7 +927,7 @@ describe "Dataset identifier methods" do
  @db.drop_table(:a)
  end

- cspecify "#identifier_output_method should change how identifiers are output", [:mysql2] do
+ cspecify "#identifier_output_method should change how identifiers are output", [:mysql2], [:swift] do
  @ds.identifier_output_method = :upcase
  @ds.first.should == {:AB=>1}
  @ds.identifier_output_method = :uprev
@@ -413,13 +413,22 @@ describe "Touch plugin" do
  @album.updated_at.to_i.should be_close(Time.now.to_i, 2)
  end

- cspecify "should update the timestamp column for associated records when the record is updated or destroyed", [:do], [:jdbc, :sqlite] do
+ cspecify "should update the timestamp column for associated records when the record is updated or destroyed", [:do, :sqlite], [:jdbc, :sqlite] do
  @artist.updated_at.should == nil
  @album.update(:name=>'B')
- @artist.reload.updated_at.to_i.should be_close(Time.now.to_i, 2)
+ ua = @artist.reload.updated_at
+ if ua.is_a?(Time)
+ ua.to_i.should be_close(Time.now.to_i, 2)
+ else
+ (DateTime.now - ua).should be_close(0, 2.0/86400)
+ end
  @artist.update(:updated_at=>nil)
  @album.destroy
- @artist.reload.updated_at.to_i.should be_close(Time.now.to_i, 2)
+ if ua.is_a?(Time)
+ ua.to_i.should be_close(Time.now.to_i, 2)
+ else
+ (DateTime.now - ua).should be_close(0, 2.0/86400)
+ end
  end
  end

@@ -644,6 +653,16 @@ if INTEGRATION_DB.dataset.supports_cte?
  nodes[1].ancestors.should == [@a, @aa]
  nodes[2].ancestors.should == []
  end
+
+ specify "should work correctly if not all columns are selected" do
+ Node.plugin :lazy_attributes, :name
+ @aaaa.descendants.should == [Node.load(:parent_id=>11, :id=>13)]
+ @aa.ancestors.should == [Node.load(:parent_id=>nil, :id=>1)]
+ nodes = Node.filter(:id=>[@a.id, @b.id, @aaa.id]).order(:name).eager(:ancestors, :descendants).all
+ nodes.should == [{:parent_id=>nil, :id=>1}, {:parent_id=>3, :id=>7}, {:parent_id=>nil, :id=>2}].map{|x| Node.load(x)}
+ nodes[2].descendants.should == [{:parent_id=>2, :id=>5}, {:parent_id=>2, :id=>6}].map{|x| Node.load(x)}
+ nodes[1].ancestors.should == [{:parent_id=>nil, :id=>1}, {:parent_id=>1, :id=>3}].map{|x| Node.load(x)}
+ end

  specify "should eagerly load descendants to a given level" do
  nodes = Node.filter(:id=>[@a.id, @b.id, @aaa.id]).order(:name).eager(:descendants=>1).all
@@ -37,19 +37,19 @@ describe "Sequel timezone support" do
  Sequel.datetime_class = Time
  end

- cspecify "should support using UTC for database storage and local time for the application", [:do, proc{|db| db.database_type != :sqlite}] do
+ cspecify "should support using UTC for database storage and local time for the application", [:swift], [:do, proc{|db| db.database_type != :sqlite}] do
  Sequel.database_timezone = :utc
  Sequel.application_timezone = :local
  test_timezone
  end

- cspecify "should support using local time for database storage and UTC for the application", [:do, proc{|db| db.database_type != :sqlite}] do
+ cspecify "should support using local time for database storage and UTC for the application", [:swift], [:do, proc{|db| db.database_type != :sqlite}] do
  Sequel.database_timezone = :local
  Sequel.application_timezone = :utc
  test_timezone
  end

- cspecify "should support using UTC for both database storage and for application", [:do, proc{|db| db.database_type != :sqlite}] do
+ cspecify "should support using UTC for both database storage and for application", [:swift], [:do, proc{|db| db.database_type != :sqlite}] do
  Sequel.default_timezone = :utc
  test_timezone
  end
@@ -64,7 +64,8 @@ describe "Supported types" do
  ds = create_items_table_with_column(:dat, Date)
  d = Date.today
  ds.insert(:dat => d)
- ds.first[:dat].should == d
+ ds.first[:dat].should be_a_kind_of(Date)
+ ds.first[:dat].to_s.should == d.to_s
  end

  cspecify "should support generic datetime type", [:do, :sqlite], [:jdbc, :sqlite] do
@@ -78,7 +79,7 @@ describe "Supported types" do
  ds.first[:tim].strftime('%Y%m%d%H%M%S').should == t.strftime('%Y%m%d%H%M%S')
  end

- cspecify "should support generic file type", [:do], [:odbc, :mssql], [:mysql2] do
+ cspecify "should support generic file type", [:do], [:odbc, :mssql], [:mysql2], [:swift] do
  ds = create_items_table_with_column(:name, File)
  ds.insert(:name => ("a\0"*300).to_sequel_blob)
  ds.all.should == [{:name=>("a\0"*300).to_sequel_blob}]
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: sequel
  version: !ruby/object:Gem::Version
- hash: 59
+ hash: 71
  prerelease: false
  segments:
  - 3
- - 15
+ - 16
  - 0
- version: 3.15.0
+ version: 3.16.0
  platform: ruby
  authors:
  - Jeremy Evans
@@ -15,7 +15,7 @@ autorequire:
  bindir: bin
  cert_chain: []

- date: 2010-09-01 00:00:00 -07:00
+ date: 2010-10-01 00:00:00 -07:00
  default_executable:
  dependencies: []

@@ -79,6 +79,7 @@ extra_rdoc_files:
  - doc/release_notes/3.13.0.txt
  - doc/release_notes/3.14.0.txt
  - doc/release_notes/3.15.0.txt
+ - doc/release_notes/3.16.0.txt
  files:
  - COPYING
  - CHANGELOG
@@ -125,6 +126,7 @@ files:
  - doc/release_notes/3.13.0.txt
  - doc/release_notes/3.14.0.txt
  - doc/release_notes/3.15.0.txt
+ - doc/release_notes/3.16.0.txt
  - doc/sharding.rdoc
  - doc/sql.rdoc
  - doc/virtual_rows.rdoc
@@ -300,6 +302,9 @@ files:
  - lib/sequel/adapters/sqlite.rb
  - lib/sequel/adapters/utils/stored_procedures.rb
  - lib/sequel/adapters/mysql2.rb
+ - lib/sequel/adapters/swift.rb
+ - lib/sequel/adapters/swift/mysql.rb
+ - lib/sequel/adapters/swift/postgres.rb
  - lib/sequel/connection_pool.rb
  - lib/sequel/connection_pool/sharded_single.rb
  - lib/sequel/connection_pool/sharded_threaded.rb