activerecord-import 0.25.0 → 0.26.0
- checksums.yaml +5 -5
- data/CHANGELOG.md +13 -0
- data/Gemfile +1 -1
- data/README.markdown +51 -3
- data/Rakefile +1 -1
- data/lib/activerecord-import/adapters/mysql_adapter.rb +1 -1
- data/lib/activerecord-import/adapters/sqlite3_adapter.rb +125 -2
- data/lib/activerecord-import/import.rb +62 -21
- data/lib/activerecord-import/version.rb +1 -1
- data/test/import_test.rb +2 -2
- data/test/schema/sqlite3_schema.rb +13 -0
- data/test/support/factories.rb +8 -8
- data/test/support/generate.rb +6 -6
- data/test/support/postgresql/import_examples.rb +25 -1
- data/test/support/shared_examples/on_duplicate_key_update.rb +24 -0
- data/test/support/shared_examples/recursive_import.rb +1 -1
- data/test/support/sqlite3/import_examples.rb +187 -10
- data/test/test_helper.rb +1 -1
- metadata +5 -3
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
-SHA256:
-  metadata.gz:
-  data.tar.gz:
+SHA256:
+  metadata.gz: ecfc0672f1042ae828bb3125c0675364d1af8e1b4aefb82af633d3656a8d5fc6
+  data.tar.gz: 8b03926293827932a61342fded8e5400fbe8261e92817ce32c09eda7fa16860f
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 874795a3c02d59b9411ab831ba6257c0c4dab7bad2b478e69451b3d5344652f4d2f906f727189170dc8417ae64a5a2f236a61da1b24a8c1112a7e731a087a7ca
+  data.tar.gz: 00abfb0263c84d2c687451bd1344ae73935291b8db074388c64220d1d6e5ee9dccc515f62217cc4a61c5912acd1de60d5a57ac7229b3734fa145bfd7936f5b56
data/CHANGELOG.md
CHANGED
@@ -1,3 +1,16 @@
+## Changes in 0.26.0
+
+### New Features
+
+* Add on_duplicate_key_update for SQLite. Thanks to @jkowens via \#542.
+* Add option to update all fields on_duplicate_key_update. Thanks to @aimerald, @jkowens via \#543.
+
+### Fixes
+
+* Handle deeply frozen options hashes. Thanks to @jturkel via \#546.
+* Switch from FactoryGirl to FactoryBot. Thanks to @koic via \#547.
+* Allow import to work with ProxySQL. Thanks to @GregFarrell via \#550.
+
 ## Changes in 0.25.0
 
 ### New Features
data/Gemfile
CHANGED
data/README.markdown
CHANGED
@@ -21,11 +21,53 @@ and then the reviews:
 That would be about 4M SQL insert statements vs 3, which results in vastly improved performance. In our case, it converted
 an 18 hour batch process to <2 hrs.
 
-
+## Table of Contents
 
-
+* [Callbacks](#callbacks)
+* [Additional Adapters](#additional-adapters)
+* [Load Path Setup](#load-path-setup)
+* [More Information](#more-information)
+
+### Callbacks
+
+ActiveRecord callbacks related to [creating](http://guides.rubyonrails.org/active_record_callbacks.html#creating-an-object), [updating](http://guides.rubyonrails.org/active_record_callbacks.html#updating-an-object), or [destroying](http://guides.rubyonrails.org/active_record_callbacks.html#destroying-an-object) records (other than `before_validation` and `after_validation`) will NOT be called when calling the import method. This is because it is mass importing rows of data and doesn't necessarily have access to in-memory ActiveRecord objects.
+
+If you do have a collection of in-memory ActiveRecord objects you can do something like this:
+
+```
+books.each do |book|
+  book.run_callbacks(:save) { false }
+  book.run_callbacks(:create) { false }
+end
+Book.import(books)
+```
+
+This will run before_create and before_save callbacks on each item. The `false` argument is needed to prevent after_save being run, which wouldn't make sense prior to bulk import. Something to note in this example is that the before_create and before_save callbacks will run before the validation callbacks.
+
+If that is an issue, another possible approach is to loop through your models first to do validations and then only run callbacks on and import the valid models.
+
+```
+valid_books = []
+invalid_books = []
+
+books.each do |book|
+  if book.valid?
+    valid_books << book
+  else
+    invalid_books << book
+  end
+end
+
+valid_books.each do |book|
+  book.run_callbacks(:save) { false }
+  book.run_callbacks(:create) { false }
+end
+
+Book.import valid_books, validate: false
+```
+
+### Additional Adapters
 
-## Additional Adapters
 Additional adapters can be provided by gems external to activerecord-import by providing an adapter that matches the naming convention setup by activerecord-import (and subsequently activerecord) for dynamically loading adapters. This involves also providing a folder on the load path that follows the activerecord-import naming convention to allow activerecord-import to dynamically load the file.
 
 When `ActiveRecord::Import.require_adapter("fake_name")` is called the require will be:
@@ -63,6 +105,12 @@ activerecord-import-fake_name/
 
 When rubygems pushes the `lib` folder onto the load path a `require` will now find `activerecord-import/active_record/adapters/fake_name_adapter` as it runs through the lookup process for a ruby file under that path in `$LOAD_PATH`
 
+### More Information
+
+For more information on activerecord-import please see its wiki: https://github.com/zdennis/activerecord-import/wiki
+
+To document new information, please add to the README instead of the wiki. See https://github.com/zdennis/activerecord-import/issues/397 for discussion.
+
 # License
 
 This is licensed under the ruby license.
data/Rakefile
CHANGED
@@ -32,7 +32,7 @@ ADAPTERS.each do |adapter|
   namespace :test do
     desc "Runs #{adapter} database tests."
     Rake::TestTask.new(adapter) do |t|
-      #
+      # FactoryBot has an issue with warnings, so turn off, so noisy
       # t.warning = true
       t.test_files = FileList["test/adapters/#{adapter}.rb", "test/*_test.rb", "test/active_record/*_test.rb", "test/#{adapter}/**/*_test.rb"]
     end
data/lib/activerecord-import/adapters/mysql_adapter.rb
CHANGED
@@ -56,7 +56,7 @@ module ActiveRecord::Import::MysqlAdapter
     # in a single packet
     def max_allowed_packet # :nodoc:
       @max_allowed_packet ||= begin
-        result = execute( "SHOW VARIABLES like 'max_allowed_packet'
+        result = execute( "SHOW VARIABLES like 'max_allowed_packet'" )
         # original Mysql gem responds to #fetch_row while Mysql2 responds to #first
         val = result.respond_to?(:fetch_row) ? result.fetch_row[1] : result.first[1]
         val.to_i
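The `max_allowed_packet` lookup above works against either MySQL driver by probing the result object. A self-contained sketch of that duck-typing, with hypothetical `LegacyResult`/`Mysql2Result` stubs standing in for the real driver result objects:

```ruby
# Sketch of the adapter's driver-agnostic reading of the query result:
# the legacy mysql gem exposes #fetch_row, while mysql2 exposes #first.
# LegacyResult and Mysql2Result are hypothetical stand-ins, not the
# real driver classes.
LegacyResult = Struct.new(:row) do
  def fetch_row
    row
  end
end

Mysql2Result = Struct.new(:rows) do
  def first
    rows.first
  end
end

def max_allowed_packet_from(result)
  # Mirrors the adapter's branch on #fetch_row vs #first
  val = result.respond_to?(:fetch_row) ? result.fetch_row[1] : result.first[1]
  val.to_i
end

puts max_allowed_packet_from(LegacyResult.new(["max_allowed_packet", "4194304"]))
puts max_allowed_packet_from(Mysql2Result.new([["max_allowed_packet", "4194304"]]))
```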
data/lib/activerecord-import/adapters/sqlite3_adapter.rb
CHANGED
@@ -1,7 +1,9 @@
 module ActiveRecord::Import::SQLite3Adapter
   include ActiveRecord::Import::ImportSupport
+  include ActiveRecord::Import::OnDuplicateKeyUpdateSupport
 
   MIN_VERSION_FOR_IMPORT = "3.7.11".freeze
+  MIN_VERSION_FOR_UPSERT = "3.24.0".freeze
   SQLITE_LIMIT_COMPOUND_SELECT = 500
 
   # Override our conformance to ActiveRecord::Import::ImportSupport interface
@@ -15,6 +17,10 @@ module ActiveRecord::Import::SQLite3Adapter
     end
   end
 
+  def supports_on_duplicate_key_update?(current_version = sqlite_version)
+    current_version >= MIN_VERSION_FOR_UPSERT
+  end
+
   # +sql+ can be a single string or an array. If it is an array all
   # elements that are in position >= 1 will be appended to the final SQL.
   def insert_many( sql, values, _options = {}, *args ) # :nodoc:
@@ -40,16 +46,133 @@ module ActiveRecord::Import::SQLite3Adapter
     ActiveRecord::Import::Result.new([], number_of_inserts, [], [])
   end
 
-  def pre_sql_statements( options)
+  def pre_sql_statements( options )
     sql = []
     # Options :recursive and :on_duplicate_key_ignore are mutually exclusive
-    if (options[:ignore] || options[:on_duplicate_key_ignore])
+    if !supports_on_duplicate_key_update? && (options[:ignore] || options[:on_duplicate_key_ignore])
       sql << "OR IGNORE"
     end
     sql + super
   end
 
+  def post_sql_statements( table_name, options ) # :nodoc:
+    sql = []
+
+    if supports_on_duplicate_key_update?
+      # Options :recursive and :on_duplicate_key_ignore are mutually exclusive
+      if (options[:ignore] || options[:on_duplicate_key_ignore]) && !options[:on_duplicate_key_update]
+        sql << sql_for_on_duplicate_key_ignore( options[:on_duplicate_key_ignore] )
+      end
+    end
+
+    sql + super
+  end
+
   def next_value_for_sequence(sequence_name)
     %{nextval('#{sequence_name}')}
   end
+
+  # Add a column to be updated on duplicate key update
+  def add_column_for_on_duplicate_key_update( column, options = {} ) # :nodoc:
+    arg = options[:on_duplicate_key_update]
+    if arg.is_a?( Hash )
+      columns = arg.fetch( :columns ) { arg[:columns] = [] }
+      case columns
+      when Array then columns << column.to_sym unless columns.include?( column.to_sym )
+      when Hash then columns[column.to_sym] = column.to_sym
+      end
+    elsif arg.is_a?( Array )
+      arg << column.to_sym unless arg.include?( column.to_sym )
+    end
+  end
+
+  # Returns a generated ON CONFLICT DO NOTHING statement given the passed
+  # in +args+.
+  def sql_for_on_duplicate_key_ignore( *args ) # :nodoc:
+    arg = args.first
+    conflict_target = sql_for_conflict_target( arg ) if arg.is_a?( Hash )
+    " ON CONFLICT #{conflict_target}DO NOTHING"
+  end
+
+  # Returns a generated ON CONFLICT DO UPDATE statement given the passed
+  # in +args+.
+  def sql_for_on_duplicate_key_update( _table_name, *args ) # :nodoc:
+    arg, primary_key, locking_column = args
+    arg = { columns: arg } if arg.is_a?( Array ) || arg.is_a?( String )
+    return unless arg.is_a?( Hash )
+
+    sql = ' ON CONFLICT '
+    conflict_target = sql_for_conflict_target( arg )
+
+    columns = arg.fetch( :columns, [] )
+    condition = arg[:condition]
+    if columns.respond_to?( :empty? ) && columns.empty?
+      return sql << "#{conflict_target}DO NOTHING"
+    end
+
+    conflict_target ||= sql_for_default_conflict_target( primary_key )
+    unless conflict_target
+      raise ArgumentError, 'Expected :conflict_target to be specified'
+    end
+
+    sql << "#{conflict_target}DO UPDATE SET "
+    if columns.is_a?( Array )
+      sql << sql_for_on_duplicate_key_update_as_array( locking_column, columns )
+    elsif columns.is_a?( Hash )
+      sql << sql_for_on_duplicate_key_update_as_hash( locking_column, columns )
+    elsif columns.is_a?( String )
+      sql << columns
+    else
+      raise ArgumentError, 'Expected :columns to be an Array or Hash'
+    end
+
+    sql << " WHERE #{condition}" if condition.present?
+
+    sql
+  end
+
+  def sql_for_on_duplicate_key_update_as_array( locking_column, arr ) # :nodoc:
+    results = arr.map do |column|
+      qc = quote_column_name( column )
+      "#{qc}=EXCLUDED.#{qc}"
+    end
+    increment_locking_column!(results, locking_column)
+    results.join( ',' )
+  end
+
+  def sql_for_on_duplicate_key_update_as_hash( locking_column, hsh ) # :nodoc:
+    results = hsh.map do |column1, column2|
+      qc1 = quote_column_name( column1 )
+      qc2 = quote_column_name( column2 )
+      "#{qc1}=EXCLUDED.#{qc2}"
+    end
+    increment_locking_column!(results, locking_column)
+    results.join( ',' )
+  end
+
+  def sql_for_conflict_target( args = {} )
+    conflict_target = args[:conflict_target]
+    index_predicate = args[:index_predicate]
+    if conflict_target.present?
+      '(' << Array( conflict_target ).reject( &:blank? ).join( ', ' ) << ') '.tap do |sql|
+        sql << "WHERE #{index_predicate} " if index_predicate
+      end
+    end
+  end
+
+  def sql_for_default_conflict_target( primary_key )
+    conflict_target = Array(primary_key).join(', ')
+    "(#{conflict_target}) " if conflict_target.present?
+  end
+
+  # Return true if the statement is a duplicate key record error
+  def duplicate_key_update_error?(exception) # :nodoc:
+    exception.is_a?(ActiveRecord::StatementInvalid) && exception.to_s.include?('duplicate key')
+  end
+
+  def increment_locking_column!(results, locking_column)
+    if locking_column.present?
+      results << "\"#{locking_column}\"=EXCLUDED.\"#{locking_column}\"+1"
+    end
+  end
 end
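Stripped of the adapter plumbing, the upsert clause generation added here is plain string assembly. A condensed, self-contained sketch of the Array branch, with identifier quoting simplified to bare double quotes (the real adapter goes through `quote_column_name` and also supports Hash and String forms):

```ruby
# Simplified sketch of the SQLite adapter's ON CONFLICT ... DO UPDATE
# clause assembly for the Array form of :columns. Quoting is reduced
# to double quotes here; not the gem's actual implementation.
def sql_for_upsert(conflict_target, columns, locking_column = nil)
  assignments = columns.map do |column|
    qc = %("#{column}")
    "#{qc}=EXCLUDED.#{qc}"
  end
  # Mirrors increment_locking_column!: bump the lock column on update
  assignments << %("#{locking_column}"=EXCLUDED."#{locking_column}"+1) if locking_column
  target = "(#{Array(conflict_target).join(', ')}) "
  " ON CONFLICT #{target}DO UPDATE SET #{assignments.join(',')}"
end

puts sql_for_upsert(:id, [:title, :author_name])
puts sql_for_upsert(:id, [:title], :lock_version)
```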
data/lib/activerecord-import/import.rb
CHANGED
@@ -298,8 +298,8 @@ class ActiveRecord::Base
   # recursive import. For database adapters that normally support
   # setting primary keys on imported objects, this option prevents
   # that from occurring.
-  # * +on_duplicate_key_update+ - an Array or Hash, tells import to
-  #   use MySQL's ON DUPLICATE KEY UPDATE or Postgres
+  # * +on_duplicate_key_update+ - :all, an Array, or Hash, tells import to
+  #   use MySQL's ON DUPLICATE KEY UPDATE or Postgres/SQLite ON CONFLICT
   #   DO UPDATE ability. See On Duplicate Key Update below.
   # * +synchronize+ - an array of ActiveRecord instances for the model
   #   that you are currently importing data into. This synchronizes
@@ -358,7 +358,15 @@
   #
   # == On Duplicate Key Update (MySQL)
   #
-  # The :on_duplicate_key_update option can be either an Array or a Hash.
+  # The :on_duplicate_key_update option can be either :all, an Array, or a Hash.
+  #
+  # ==== Using :all
+  #
+  # The :on_duplicate_key_update option can be set to :all. All columns
+  # other than the primary key are updated. If a list of column names is
+  # supplied, only those columns will be updated. Below is an example:
+  #
+  #   BlogPost.import columns, values, on_duplicate_key_update: :all
   #
   # ==== Using an Array
   #
@@ -377,11 +385,19 @@
   #
   #   BlogPost.import columns, attributes, on_duplicate_key_update: { title: :title }
   #
-  # == On Duplicate Key Update (Postgres 9.5+)
+  # == On Duplicate Key Update (Postgres 9.5+ and SQLite 3.24+)
   #
-  # The :on_duplicate_key_update option can be an Array or a Hash with up to
+  # The :on_duplicate_key_update option can be :all, an Array, or a Hash with up to
   #   three attributes, :conflict_target (and optionally :index_predicate) or
-  #   :constraint_name, and :columns.
+  #   :constraint_name (Postgres), and :columns.
+  #
+  # ==== Using :all
+  #
+  # The :on_duplicate_key_update option can be set to :all. All columns
+  # other than the primary key are updated. If a list of column names is
+  # supplied, only those columns will be updated. Below is an example:
+  #
+  #   BlogPost.import columns, values, on_duplicate_key_update: :all
   #
   # ==== Using an Array
   #
@@ -439,7 +455,15 @@
   #
   # ===== :columns
   #
-  # The :columns attribute can be either an Array or a Hash.
+  # The :columns attribute can be either :all, an Array, or a Hash.
+  #
+  # ===== Using :all
+  #
+  # The :columns attribute can be :all. All columns other than the primary key will be updated.
+  # If a list of column names is supplied, only those columns will be updated.
+  # Below is an example:
+  #
+  #   BlogPost.import columns, values, on_duplicate_key_update: { conflict_target: :slug, columns: :all }
   #
   # ===== Using an Array
   #
@@ -495,16 +519,6 @@
     options[:primary_key] = primary_key
     options[:locking_column] = locking_column if attribute_names.include?(locking_column)
 
-    # Don't modify incoming arguments
-    on_duplicate_key_update = options[:on_duplicate_key_update]
-    if on_duplicate_key_update && on_duplicate_key_update.duplicable?
-      options[:on_duplicate_key_update] = if on_duplicate_key_update.is_a?(Hash)
-        on_duplicate_key_update.each { |k, v| on_duplicate_key_update[k] = v.dup if v.duplicable? }
-      else
-        on_duplicate_key_update.dup
-      end
-    end
-
     is_validating = options[:validate_with_context].present? ? true : options[:validate]
     validator = ActiveRecord::Import::Validator.new(options)
 
@@ -522,8 +536,12 @@
       end
     end
 
-    if models.first.id.nil?
-
+    if models.first.id.nil?
+      Array(primary_key).each do |c|
+        if column_names.include?(c) && columns_hash[c].type == :uuid
+          column_names.delete(c)
+        end
+      end
     end
 
     default_values = column_defaults
@@ -611,6 +629,29 @@
       array_of_attributes.each { |a| a.concat(new_fields) }
     end
 
+    # Don't modify incoming arguments
+    on_duplicate_key_update = options[:on_duplicate_key_update]
+    if on_duplicate_key_update
+      updatable_columns = symbolized_column_names.reject { |c| symbolized_primary_key.include? c }
+      options[:on_duplicate_key_update] = if on_duplicate_key_update.is_a?(Hash)
+        on_duplicate_key_update.each_with_object({}) do |(k, v), duped_options|
+          duped_options[k] = if k == :columns && v == :all
+            updatable_columns
+          elsif v.duplicable?
+            v.dup
+          else
+            v
+          end
+        end
+      elsif on_duplicate_key_update == :all
+        updatable_columns
+      elsif on_duplicate_key_update.duplicable?
+        on_duplicate_key_update.dup
+      else
+        on_duplicate_key_update
+      end
+    end
+
     timestamps = {}
 
     # record timestamps unless disabled in ActiveRecord::Base
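The normalization added above does two things at once: it builds a fresh copy of the caller's option (the deeply-frozen-hash fix from \#546) and expands `:all` into the list of updatable columns (\#543). A standalone sketch of just the `:all` expansion, with hypothetical column names:

```ruby
# Sketch of the :all expansion performed during import: :all becomes
# every column except the primary key(s), both at the top level and
# for a Hash's :columns entry. Simplified from the gem's logic.
def expand_on_duplicate_key_update(option, column_names, primary_key)
  updatable = column_names.map(&:to_sym) - Array(primary_key).map(&:to_sym)
  case option
  when :all then updatable
  when Hash
    # Build a fresh hash rather than mutating the (possibly frozen) input
    option.each_with_object({}) do |(k, v), duped|
      duped[k] = (k == :columns && v == :all) ? updatable : v
    end
  else
    option
  end
end

columns = %w(id title author_name lock_version)
p expand_on_duplicate_key_update(:all, columns, :id)
p expand_on_duplicate_key_update({ conflict_target: :id, columns: :all }.freeze, columns, :id)
```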
@@ -646,7 +687,7 @@
     end
 
     if options[:synchronize]
-      sync_keys = options[:synchronize_keys] ||
+      sync_keys = options[:synchronize_keys] || Array(primary_key)
       synchronize( options[:synchronize], sync_keys)
     end
     return_obj.num_inserts = 0 if return_obj.num_inserts.nil?
@@ -871,7 +912,7 @@
       column = columns[j]
 
       # be sure to query sequence_name *last*, only if cheaper tests fail, because it's costly
-      if val.nil? &&
+      if val.nil? && Array(primary_key).first == column.name && !sequence_name.blank?
         connection_memo.next_value_for_sequence(sequence_name)
       elsif val.respond_to?(:to_sql)
         "(#{val.to_sql})"
data/test/import_test.rb
CHANGED
@@ -567,7 +567,7 @@ describe "#import" do
 
   context "importing through an association scope" do
     { has_many: :chapters, polymorphic: :discounts }.each do |association_type, association|
-      book =
+      book = FactoryBot.create :book
       scope = book.public_send association
       klass = { chapters: Chapter, discounts: Discount }[association]
       column = { chapters: :title, discounts: :amount }[association]
@@ -609,7 +609,7 @@ describe "#import" do
 
   context "importing model with polymorphic belongs_to" do
     it "works without error" do
-      book =
+      book = FactoryBot.create :book
       discount = Discount.new(discountable: book)
 
       Discount.import([discount])
data/test/schema/sqlite3_schema.rb
ADDED
@@ -0,0 +1,13 @@
+ActiveRecord::Schema.define do
+  create_table :alarms, force: true do |t|
+    t.column :device_id, :integer, null: false
+    t.column :alarm_type, :integer, null: false
+    t.column :status, :integer, null: false
+    t.column :metadata, :text
+    t.column :secret_key, :binary
+    t.datetime :created_at
+    t.datetime :updated_at
+  end
+
+  add_index :alarms, [:device_id, :alarm_type], unique: true, where: 'status <> 0'
+end
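The `add_index ... unique: true, where:` line above defines a partial unique index, which is what the new SQLite `ON CONFLICT (device_id, alarm_type)` tests target. A runnable sketch of the DDL that call implies, with the index name derived per Rails' default naming convention (an assumption):

```ruby
# Sketch of the DDL implied by the add_index call above. The index
# name below follows Rails' default convention (assumed); the WHERE
# clause is what makes it a partial unique index.
def ddl_for_partial_unique_index(table, columns, where)
  name = "index_#{table}_on_#{columns.join('_and_')}"
  "CREATE UNIQUE INDEX #{name} ON #{table} (#{columns.join(', ')}) WHERE #{where}"
end

puts ddl_for_partial_unique_index(:alarms, [:device_id, :alarm_type], 'status <> 0')
```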
data/test/support/factories.rb
CHANGED
@@ -1,4 +1,4 @@
-
+FactoryBot.define do
   sequence(:book_title) { |n| "Book #{n}" }
   sequence(:chapter_title) { |n| "Chapter #{n}" }
   sequence(:end_note) { |n| "Endnote #{n}" }
@@ -9,7 +9,7 @@ FactoryGirl.define do
 
   factory :invalid_topic, class: "Topic" do
     sequence(:title) { |n| "Title #{n}" }
-    author_name nil
+    author_name { nil }
   end
 
   factory :topic do
@@ -27,7 +27,7 @@ FactoryGirl.define do
 
   trait :with_rule do
     after(:build) do |question|
-      question.build_rule(
+      question.build_rule(FactoryBot.attributes_for(:rule))
     end
   end
 end
@@ -40,21 +40,21 @@ FactoryGirl.define do
   factory :topic_with_book, parent: :topic do
     after(:build) do |topic|
       2.times do
-        book = topic.books.build(title:
+        book = topic.books.build(title: FactoryBot.generate(:book_title), author_name: 'Stephen King')
         3.times do
-          book.chapters.build(title:
+          book.chapters.build(title: FactoryBot.generate(:chapter_title))
         end
 
         4.times do
-          book.end_notes.build(note:
+          book.end_notes.build(note: FactoryBot.generate(:end_note))
         end
       end
     end
   end
 
   factory :book do
-    title 'Tortilla Flat'
-    author_name 'John Steinbeck'
+    title { 'Tortilla Flat' }
+    author_name { 'John Steinbeck' }
   end
 
   factory :car do
data/test/support/generate.rb
CHANGED
@@ -2,28 +2,28 @@ class ActiveSupport::TestCase
   def Build(*args) # rubocop:disable Style/MethodName
     n = args.shift if args.first.is_a?(Numeric)
     factory = args.shift
-
+    factory_bot_args = args.shift || {}
 
     if n
       [].tap do |collection|
-        n.times.each { collection <<
+        n.times.each { collection << FactoryBot.build(factory.to_s.singularize.to_sym, factory_bot_args) }
       end
     else
-
+      FactoryBot.build(factory.to_s.singularize.to_sym, factory_bot_args)
     end
   end
 
   def Generate(*args) # rubocop:disable Style/MethodName
     n = args.shift if args.first.is_a?(Numeric)
     factory = args.shift
-
+    factory_bot_args = args.shift || {}
 
     if n
       [].tap do |collection|
-        n.times.each { collection <<
+        n.times.each { collection << FactoryBot.create(factory.to_s.singularize.to_sym, factory_bot_args) }
      end
    else
-
+      FactoryBot.create(factory.to_s.singularize.to_sym, factory_bot_args)
    end
  end
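The `Build`/`Generate` helpers accept an optional leading count, then a factory name, then an optional attributes hash. A runnable sketch of that argument shuffling, with a hypothetical `StubFactory` in place of FactoryBot and the `singularize` call dropped (it requires ActiveSupport):

```ruby
# Sketch of the test suite's Build helper argument handling: an
# optional leading count, a factory name, and an optional attributes
# hash. StubFactory is a hypothetical stand-in for FactoryBot.
module StubFactory
  def self.build(name, attrs = {})
    { factory: name }.merge(attrs)
  end
end

def Build(*args)
  n = args.shift if args.first.is_a?(Numeric)
  factory = args.shift
  factory_args = args.shift || {}
  if n
    Array.new(n) { StubFactory.build(factory.to_s.to_sym, factory_args) }
  else
    StubFactory.build(factory.to_s.to_sym, factory_args)
  end
end

p Build(:book, title: 'Dune')
p Build(3, :book).length
```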
data/test/support/postgresql/import_examples.rb
CHANGED
@@ -295,6 +295,30 @@ def should_support_postgresql_upsert_functionality
     end
   end
 
   context "using a hash" do
+    context "with :columns :all" do
+      let(:columns) { %w( id title author_name author_email_address parent_id ) }
+      let(:updated_values) { [[99, "Book - 2nd Edition", "Jane Doe", "janedoe@example.com", 57]] }
+
+      macro(:perform_import) do |*opts|
+        Topic.import columns, updated_values, opts.extract_options!.merge(on_duplicate_key_update: { conflict_target: :id, columns: :all }, validate: false)
+      end
+
+      setup do
+        values = [[99, "Book", "John Doe", "john@doe.com", 17, 3]]
+        Topic.import columns + ['replies_count'], values, validate: false
+      end
+
+      it "should update all specified columns" do
+        perform_import
+        updated_topic = Topic.find(99)
+        assert_equal 'Book - 2nd Edition', updated_topic.title
+        assert_equal 'Jane Doe', updated_topic.author_name
+        assert_equal 'janedoe@example.com', updated_topic.author_email_address
+        assert_equal 57, updated_topic.parent_id
+        assert_equal 3, updated_topic.replies_count
+      end
+    end
+
     context "with :columns a hash" do
       let(:columns) { %w( id title author_name author_email_address parent_id ) }
       let(:values) { [[99, "Book", "John Doe", "john@doe.com", 17]] }
@@ -312,7 +336,7 @@ def should_support_postgresql_upsert_functionality
     it "should not modify the passed in :on_duplicate_key_update columns array" do
       assert_nothing_raised do
         columns = %w(title author_name).freeze
-        Topic.import columns, [%w(foo, bar)], on_duplicate_key_update: { columns: columns }
+        Topic.import columns, [%w(foo, bar)], { on_duplicate_key_update: { columns: columns }.freeze }.freeze
       end
     end
 
data/test/support/shared_examples/on_duplicate_key_update.rb
CHANGED
@@ -213,6 +213,30 @@ def should_support_basic_on_duplicate_key_update
   end
 
   context "with :on_duplicate_key_update" do
+    describe 'using :all' do
+      let(:columns) { %w( id title author_name author_email_address parent_id ) }
+      let(:updated_values) { [[99, "Book - 2nd Edition", "Jane Doe", "janedoe@example.com", 57]] }
+
+      macro(:perform_import) do |*opts|
+        Topic.import columns, updated_values, opts.extract_options!.merge(on_duplicate_key_update: :all, validate: false)
+      end
+
+      setup do
+        values = [[99, "Book", "John Doe", "john@doe.com", 17, 3]]
+        Topic.import columns + ['replies_count'], values, validate: false
+      end
+
+      it 'updates all specified columns' do
+        perform_import
+        updated_topic = Topic.find(99)
+        assert_equal 'Book - 2nd Edition', updated_topic.title
+        assert_equal 'Jane Doe', updated_topic.author_name
+        assert_equal 'janedoe@example.com', updated_topic.author_email_address
+        assert_equal 57, updated_topic.parent_id
+        assert_equal 3, updated_topic.replies_count
+      end
+    end
+
     describe "argument safety" do
       it "should not modify the passed in :on_duplicate_key_update array" do
         assert_nothing_raised do
data/test/support/shared_examples/recursive_import.rb
CHANGED
@@ -11,7 +11,7 @@ def should_support_recursive_import
   let(:num_chapters) { 18 }
   let(:num_endnotes) { 24 }
 
-  let(:new_question_with_rule) {
+  let(:new_question_with_rule) { FactoryBot.build :question, :with_rule }
 
   it 'imports top level' do
     assert_difference "Topic.count", +num_topics do
data/test/support/sqlite3/import_examples.rb
CHANGED
@@ -1,6 +1,8 @@
 # encoding: UTF-8
 def should_support_sqlite3_import_functionality
-
+  if ActiveRecord::Base.connection.supports_on_duplicate_key_update?
+    should_support_sqlite_upsert_functionality
+  end
 
   describe "#supports_imports?" do
     context "and SQLite is 3.7.11 or higher" do
@@ -49,18 +51,193 @@ def should_support_sqlite3_import_functionality
       assert_equal 2500, Topic.count, "Failed to insert all records. Make sure you have a supported version of SQLite3 (3.7.11 or higher) installed"
     end
   end
+  end
+end
+
+def should_support_sqlite_upsert_functionality
+  should_support_basic_on_duplicate_key_update
+  should_support_on_duplicate_key_ignore
+
+  describe "#import" do
+    extend ActiveSupport::TestCase::ImportAssertions
+
+    macro(:perform_import) { raise "supply your own #perform_import in a context below" }
+    macro(:updated_topic) { Topic.find(@topic.id) }
+
+    context "with :on_duplicate_key_ignore and validation checks turned off" do
+      let(:columns) { %w( id title author_name author_email_address parent_id ) }
+      let(:values) { [[99, "Book", "John Doe", "john@doe.com", 17]] }
+      let(:updated_values) { [[99, "Book - 2nd Edition", "Author Should Not Change", "johndoe@example.com", 57]] }
+
+      setup do
+        Topic.import columns, values, validate: false
+      end
+
+      it "should not update any records" do
+        result = Topic.import columns, updated_values, on_duplicate_key_ignore: true, validate: false
+        assert_equal [], result.ids
+      end
+    end
+
+    context "with :on_duplicate_key_update and validation checks turned off" do
+      asssertion_group(:should_support_on_duplicate_key_update) do
+        should_not_update_fields_not_mentioned
+        should_update_foreign_keys
+        should_not_update_created_at_on_timestamp_columns
+        should_update_updated_at_on_timestamp_columns
+      end
+
+      context "using a hash" do
+        context "with :columns a hash" do
+          let(:columns) { %w( id title author_name author_email_address parent_id ) }
+          let(:values) { [[99, "Book", "John Doe", "john@doe.com", 17]] }
+          let(:updated_values) { [[99, "Book - 2nd Edition", "Author Should Not Change", "johndoe@example.com", 57]] }
+
+          macro(:perform_import) do |*opts|
+            Topic.import columns, updated_values, opts.extract_options!.merge(on_duplicate_key_update: { conflict_target: :id, columns: update_columns }, validate: false)
+          end
+
+          setup do
+            Topic.import columns, values, validate: false
+            @topic = Topic.find 99
+          end
+
+          it "should not modify the passed in :on_duplicate_key_update columns array" do
+            assert_nothing_raised do
+              columns = %w(title author_name).freeze
+              Topic.import columns, [%w(foo, bar)], on_duplicate_key_update: { columns: columns }
+            end
+          end
+
+          context "using string hash map" do
+            let(:update_columns) { { "title" => "title", "author_email_address" => "author_email_address", "parent_id" => "parent_id" } }
+            should_support_on_duplicate_key_update
+            should_update_fields_mentioned
+          end
+
+          context "using string hash map, but specifying column mismatches" do
+            let(:update_columns) { { "title" => "author_email_address", "author_email_address" => "title", "parent_id" => "parent_id" } }
+            should_support_on_duplicate_key_update
+            should_update_fields_mentioned_with_hash_mappings
+          end
+
+          context "using symbol hash map" do
+            let(:update_columns) { { title: :title, author_email_address: :author_email_address, parent_id: :parent_id } }
+            should_support_on_duplicate_key_update
+            should_update_fields_mentioned
+          end
+
+          context "using symbol hash map, but specifying column mismatches" do
+            let(:update_columns) { { title: :author_email_address, author_email_address: :title, parent_id: :parent_id } }
+            should_support_on_duplicate_key_update
+            should_update_fields_mentioned_with_hash_mappings
+          end
+        end
+
+        context 'with :index_predicate' do
+          let(:columns) { %w( id device_id alarm_type status metadata ) }
+          let(:values) { [[99, 17, 1, 1, 'foo']] }
+          let(:updated_values) { [[99, 17, 1, 2, 'bar']] }
+
+          macro(:perform_import) do |*opts|
+            Alarm.import columns, updated_values, opts.extract_options!.merge(on_duplicate_key_update: { conflict_target: [:device_id, :alarm_type], index_predicate: 'status <> 0', columns: [:status] }, validate: false)
+          end
+
+          macro(:updated_alarm) { Alarm.find(@alarm.id) }
+
+          setup do
+            Alarm.import columns, values, validate: false
+            @alarm = Alarm.find 99
+          end
+
+          context 'supports on duplicate key update for partial indexes' do
+            it 'should not update created_at timestamp columns' do
+              Timecop.freeze Chronic.parse("5 minutes from now") do
+                perform_import
+                assert_in_delta @alarm.created_at.to_i, updated_alarm.created_at.to_i, 1
+              end
+            end
+
+            it 'should update updated_at timestamp columns' do
+              time = Chronic.parse("5 minutes from now")
+              Timecop.freeze time do
+                perform_import
+                assert_in_delta time.to_i, updated_alarm.updated_at.to_i, 1
+              end
+            end
+
+            it 'should not update fields not mentioned' do
+              perform_import
+              assert_equal 'foo', updated_alarm.metadata
+            end
+
+            it 'should update fields mentioned with hash mappings' do
+              perform_import
+              assert_equal 2, updated_alarm.status
+            end
+          end
+        end
+
+        context 'with :condition' do
|
182
|
+
let(:columns) { %w( id device_id alarm_type status metadata) }
|
183
|
+
let(:values) { [[99, 17, 1, 1, 'foo']] }
|
184
|
+
let(:updated_values) { [[99, 17, 1, 1, 'bar']] }
|
185
|
+
|
186
|
+
macro(:perform_import) do |*opts|
|
187
|
+
Alarm.import(
|
188
|
+
columns,
|
189
|
+
updated_values,
|
190
|
+
opts.extract_options!.merge(
|
191
|
+
on_duplicate_key_update: {
|
192
|
+
conflict_target: [:id],
|
193
|
+
condition: "alarms.metadata NOT LIKE '%foo%'",
|
194
|
+
columns: [:metadata]
|
195
|
+
},
|
196
|
+
validate: false
|
197
|
+
)
|
198
|
+
)
|
199
|
+
end
|
200
|
+
|
201
|
+
macro(:updated_alarm) { Alarm.find(@alarm.id) }
|
202
|
+
|
203
|
+
setup do
|
204
|
+
Alarm.import columns, values, validate: false
|
205
|
+
@alarm = Alarm.find 99
|
206
|
+
end
|
207
|
+
|
208
|
+
it 'should not update fields not matched' do
|
209
|
+
perform_import
|
210
|
+
assert_equal 'foo', updated_alarm.metadata
|
211
|
+
end
|
212
|
+
end
|
213
|
+
|
214
|
+
context "with no :conflict_target" do
|
215
|
+
context "with no primary key" do
|
216
|
+
it "raises ArgumentError" do
|
217
|
+
error = assert_raises ArgumentError do
|
218
|
+
Rule.import Build(3, :rules), on_duplicate_key_update: [:condition_text], validate: false
|
219
|
+
end
|
220
|
+
assert_match(/Expected :conflict_target to be specified/, error.message)
|
221
|
+
end
|
222
|
+
end
|
223
|
+
end
|
224
|
+
|
225
|
+
context "with no :columns" do
|
226
|
+
let(:columns) { %w( id title author_name author_email_address ) }
|
227
|
+
let(:values) { [[100, "Book", "John Doe", "john@doe.com"]] }
|
228
|
+
let(:updated_values) { [[100, "Title Should Not Change", "Author Should Not Change", "john@nogo.com"]] }
|
52
229
|
|
53
|
-
|
54
|
-
|
230
|
+
macro(:perform_import) do |*opts|
|
231
|
+
Topic.import columns, updated_values, opts.extract_options!.merge(on_duplicate_key_update: { conflict_target: :id }, validate: false)
|
232
|
+
end
|
55
233
|
|
56
|
-
|
57
|
-
|
58
|
-
|
59
|
-
|
60
|
-
ActiveRecord::Base.connection.stubs(:logger).returns(logger)
|
234
|
+
setup do
|
235
|
+
Topic.import columns, values, validate: false
|
236
|
+
@topic = Topic.find 100
|
237
|
+
end
|
61
238
|
|
62
|
-
|
63
|
-
|
239
|
+
should_update_updated_at_on_timestamp_columns
|
240
|
+
end
|
64
241
|
end
|
65
242
|
end
|
66
243
|
end
|
data/test/test_helper.rb
CHANGED
@@ -58,7 +58,7 @@ ActiveSupport::Notifications.subscribe(/active_record.sql/) do |_, _, _, _, hsh|
   ActiveRecord::Base.logger.info hsh[:sql]
 end
 
-require "
+require "factory_bot"
 Dir[File.dirname(__FILE__) + "/support/**/*.rb"].each { |file| require file }
 
 # Load base/generic schema
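For context on the SQLite `on_duplicate_key_update` support added in this release: the hash form (`conflict_target:`, `columns:`) maps onto SQLite's native upsert clause, `ON CONFLICT ... DO UPDATE`, available since SQLite 3.24. A simplified sketch of that rendering, where `on_conflict_clause` is a hypothetical illustration and not the adapter's actual implementation:

```ruby
# Rough sketch: render an on_duplicate_key_update hash into SQLite's
# upsert clause. The real adapter handles many more cases (index
# predicates, conditions, timestamp columns, etc.).
def on_conflict_clause(conflict_target:, columns:)
  target = Array(conflict_target).join(", ")
  # "excluded" refers to the row that failed to insert
  assignments = columns.map { |c| "#{c}=excluded.#{c}" }.join(", ")
  "ON CONFLICT(#{target}) DO UPDATE SET #{assignments}"
end

puts on_conflict_clause(conflict_target: :id, columns: [:title])
# ON CONFLICT(id) DO UPDATE SET title=excluded.title
```

This mirrors the PostgreSQL syntax the gem already supported, which is why the SQLite test examples above could be adapted from the PostgreSQL ones.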
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: activerecord-import
 version: !ruby/object:Gem::Version
-  version: 0.
+  version: 0.26.0
 platform: ruby
 authors:
 - Zach Dennis
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2018-
+date: 2018-10-05 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: activerecord
@@ -146,6 +146,7 @@ files:
 - test/schema/mysql2_schema.rb
 - test/schema/postgis_schema.rb
 - test/schema/postgresql_schema.rb
+- test/schema/sqlite3_schema.rb
 - test/schema/version.rb
 - test/sqlite3/import_test.rb
 - test/support/active_support/test_case_extensions.rb
@@ -183,7 +184,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
     version: '0'
 requirements: []
 rubyforge_project:
-rubygems_version: 2.
+rubygems_version: 2.7.7
 signing_key:
 specification_version: 4
 summary: Bulk insert extension for ActiveRecord
@@ -236,6 +237,7 @@ test_files:
 - test/schema/mysql2_schema.rb
 - test/schema/postgis_schema.rb
 - test/schema/postgresql_schema.rb
+- test/schema/sqlite3_schema.rb
 - test/schema/version.rb
 - test/sqlite3/import_test.rb
 - test/support/active_support/test_case_extensions.rb