searchkick 2.0.4 → 2.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 275674a7d9ef11e2faba8426a3a85172d84cf16c
- data.tar.gz: 3134c16ed33df803187e4b8b75dd3f8167745e62
+ metadata.gz: 95d822316595ca2806bc9f2e3e44644ee2ab5fd8
+ data.tar.gz: 27f86dc98a28378ee489b5adf7bfd11be054baf1
  SHA512:
- metadata.gz: bf4ab3f4fb5557051f9e5bbc25fd9aaf26f3c5403e473e86f6108f5418a53cc3ee391ec0cf204cc717ca0df7f96ba5e83deeb42bb725bc6025f6258c067d9c7d
- data.tar.gz: dba920c1264cf007eb5a4759c83f3bf0269a8b36bf381b4e4f6c8b23089abfb1a90c373a3ae6853f4145bae751b1181f7bcd77fb06cd839cd45c8e350c0f22ea
+ metadata.gz: fb18990e391306eb6d3f9a6193b8c14acf8cb22691f0a470d7c47b40c6146a58bc7babc6250378dfadf86b85f5f6846bd79f5302969b54de632df77aa7a55809
+ data.tar.gz: 0983c46e8eda2d402cc632a9849e97fca61f3036cfa96378ff81921dc6901416671b6192a184a23dc04615534422f79a7f00c2694c3b9cb46d7b38f7a2410345
data/CHANGELOG.md CHANGED
@@ -1,3 +1,8 @@
+ ## 2.1.0
+
+ - Background reindexing and queues are officially supported
+ - Log updates and deletes
+
  ## 2.0.4

  - Added support for queuing updates [experimental]
data/README.md CHANGED
@@ -29,7 +29,18 @@ Plus:

  **Searchkick 2.0 was just released!** See [notable changes](#200).

- ## Get Started
+ ## Contents
+
+ - [Getting Started](#getting-started)
+ - [Querying](#querying)
+ - [Indexing](#indexing)
+ - [Aggregations](#aggregations)
+ - [Deployment](#deployment)
+ - [Performance](#performance)
+ - [Elasticsearch DSL](#advanced)
+ - [Reference](#reference)
+
+ ## Getting Started

  [Install Elasticsearch](https://www.elastic.co/guide/en/elasticsearch/reference/current/setup.html). For Homebrew, use:

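For orientation, the basic setup the renamed Getting Started section goes on to cover looks roughly like this (an editor's sketch of standard searchkick usage, not part of the diff; the model and attributes are illustrative):

```ruby
# Gemfile
gem "searchkick"

# add searchkick to the models you want to search
class Product < ActiveRecord::Base
  searchkick
end

# build the index, then query it
Product.reindex
Product.search "apples"
```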
@@ -73,7 +84,7 @@ end

  Searchkick supports the complete [Elasticsearch Search API](https://www.elastic.co/guide/en/elasticsearch/reference/current/search-search.html). As your search becomes more advanced, we recommend you use the [Elasticsearch DSL](#advanced) for maximum flexibility.

- ### Queries
+ ## Querying

  Query like SQL

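The "Query like SQL" material that follows this hunk demonstrates options such as `where`, `order`, `limit`, and `offset`; a representative call (an editor's sketch, values are illustrative):

```ruby
Product.search "milk",
  where: {in_stock: true},
  order: {name: :asc},
  limit: 10,
  offset: 50
```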
@@ -381,7 +392,7 @@ And use:
  Product.search "🍨🍰", emoji: true
  ```

- ### Indexing
+ ## Indexing

  Control what data is indexed with the `search_data` method. Call `Product.reindex` after changing this method.

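A minimal `search_data` implementation, for reference (an editor's sketch; the attributes are illustrative):

```ruby
class Product < ActiveRecord::Base
  searchkick

  def search_data
    {
      name: name,
      brand: brand # any attribute or computed value you want searchable
    }
  end
end
```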
@@ -425,6 +436,8 @@ If a reindex is interrupted, you can resume it with:
  Product.reindex(resume: true)
  ```

+ For large data sets, try [parallel reindexing](#parallel-reindexing).
+
  ### To Reindex, or Not to Reindex

  #### Reindex
@@ -439,7 +452,7 @@ Product.reindex(resume: true)

  ### Stay Synced

- There are three strategies for keeping the index synced with your database.
+ There are four strategies for keeping the index synced with your database.

  1. Immediate (default)

@@ -457,7 +470,11 @@ There are three strategies for keeping the index synced with your database.

  And [install Active Job](https://github.com/ankane/activejob_backport) for Rails 4.1 and below. Jobs are added to a queue named `searchkick`.

- 3. Manual
+ 3. Queuing
+
+ Push the ids of records that need to be updated to a queue and reindex them in the background in batches. This is more performant than the asynchronous method, which updates records individually. See [how to set it up](#queuing).
+
+ 4. Manual

  Turn off automatic syncing

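For orientation, the queuing strategy (3) added above is wired together roughly like this (an editor's sketch combining pieces shown later in this diff; the initializer location and Redis URL are assumptions):

```ruby
# config/initializers/searchkick.rb
Searchkick.redis = Redis.new(url: ENV["REDIS_URL"])

class Product < ActiveRecord::Base
  searchkick callbacks: :queue # push ids to the reindex queue instead of updating immediately
end

# process queued ids in bulk, e.g. from a recurring job
Searchkick::ProcessQueueJob.perform_later(class_name: "Product")
```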
@@ -543,6 +560,8 @@ Reindex and set up a cron job to add new conversions daily.
  rake searchkick:reindex CLASS=Product
  ```

+ **Note:** For a more performant (but more advanced) approach, check out [performant conversions](#performant-conversions).
+
  ### Personalized Results

  Order results differently for each user. For example, show a user’s previously purchased products before other results.
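One common way to do this uses searchkick's `boost_where` option (an editor's sketch; the `orderer_ids` field and the `orders` association are assumptions about your model):

```ruby
class Product < ActiveRecord::Base
  searchkick

  def search_data
    {
      name: name,
      orderer_ids: orders.pluck(:user_id) # users who previously bought this product
    }
  end
end

# boost products the current user has ordered before
Product.search "milk", boost_where: {orderer_ids: current_user.id}
```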
@@ -962,23 +981,23 @@ Product.search("soap", explain: true).response
  See how Elasticsearch tokenizes your queries with:

  ```ruby
- Product.searchkick_index.tokens("Dish Washer Soap", analyzer: "searchkick_index")
+ Product.search_index.tokens("Dish Washer Soap", analyzer: "searchkick_index")
  # ["dish", "dishwash", "washer", "washersoap", "soap"]

- Product.searchkick_index.tokens("dishwasher soap", analyzer: "searchkick_search")
+ Product.search_index.tokens("dishwasher soap", analyzer: "searchkick_search")
  # ["dishwashersoap"] - no match

- Product.searchkick_index.tokens("dishwasher soap", analyzer: "searchkick_search2")
+ Product.search_index.tokens("dishwasher soap", analyzer: "searchkick_search2")
  # ["dishwash", "soap"] - match!!
  ```

  Partial matches

  ```ruby
- Product.searchkick_index.tokens("San Diego", analyzer: "searchkick_word_start_index")
+ Product.search_index.tokens("San Diego", analyzer: "searchkick_word_start_index")
  # ["s", "sa", "san", "d", "di", "die", "dieg", "diego"]

- Product.searchkick_index.tokens("dieg", analyzer: "searchkick_word_search")
+ Product.search_index.tokens("dieg", analyzer: "searchkick_word_search")
  # ["dieg"] - match!!
  ```

@@ -1136,29 +1155,7 @@ class Product < ActiveRecord::Base
  end
  ```

- ### Routing
-
- Searchkick supports [Elasticsearch’s routing feature](https://www.elastic.co/blog/customizing-your-document-routing), which can significantly speed up searches.
-
- ```ruby
- class Business < ActiveRecord::Base
-   searchkick routing: true
-
-   def search_routing
-     city_id
-   end
- end
- ```
-
- Reindex and search with:
-
- ```ruby
- Business.search "ice cream", routing: params[:city_id]
- ```
-
- ## Large Data Sets
-
- ### Background Reindexing [experimental, ActiveRecord only]
+ ### Parallel Reindexing

  For large data sets, you can use background jobs to parallelize reindexing.

@@ -1170,7 +1167,7 @@ Product.reindex(async: true)

  Once the jobs complete, promote the new index with:

  ```ruby
- Product.searchkick_index.promote(index_name)
+ Product.search_index.promote(index_name)
  ```

  You can optionally track the status with Redis:
@@ -1185,9 +1182,9 @@ And use:
  Searchkick.reindex_status(index_name)
  ```

- ### Queues [master, experimental]
+ ### Queuing

- You can also queue updates and do them in bulk for better performance. First, set up Redis in an initializer.
+ Push the ids of records that need to be reindexed to a queue and reindex them in bulk for better performance. First, set up Redis in an initializer.

  ```ruby
  Searchkick.redis = Redis.new
@@ -1210,14 +1207,121 @@ Searchkick::ProcessQueueJob.perform_later(class_name: "Product")
  You can check the queue length with:

  ```ruby
- Product.searchkick_index.reindex_queue.length
+ Product.search_index.reindex_queue.length
  ```

  For more tips, check out [Keeping Elasticsearch in Sync](https://www.elastic.co/blog/found-keeping-elasticsearch-in-sync).

+ ### Routing
+
+ Searchkick supports [Elasticsearch’s routing feature](https://www.elastic.co/blog/customizing-your-document-routing), which can significantly speed up searches.
+
+ ```ruby
+ class Business < ActiveRecord::Base
+   searchkick routing: true
+
+   def search_routing
+     city_id
+   end
+ end
+ ```
+
+ Reindex and search with:
+
+ ```ruby
+ Business.search "ice cream", routing: params[:city_id]
+ ```
+
+ ### Partial Reindexing
+
+ Reindex a subset of attributes to reduce time spent generating search data and cut down on network traffic.
+
+ ```ruby
+ class Product < ActiveRecord::Base
+   def search_data
+     {
+       name: name
+     }.merge(search_prices)
+   end
+
+   def search_prices
+     {
+       price: price,
+       sale_price: sale_price
+     }
+   end
+ end
+ ```
+
+ And use:
+
+ ```ruby
+ Product.reindex(:search_prices)
+ ```
+
+ ### Performant Conversions
+
+ Split out conversions into a separate method so you can use partial reindexing, and cache conversions to prevent N+1 queries. Be sure to use a centralized cache store like Memcached or Redis.
+
+ ```ruby
+ class Product < ActiveRecord::Base
+   def search_data
+     {
+       name: name
+     }.merge(search_conversions)
+   end
+
+   def search_conversions
+     {
+       conversions: Rails.cache.read("search_conversions:#{self.class.name}:#{id}") || {}
+     }
+   end
+ end
+ ```
+
+ Create a job to update the cache and reindex records with new conversions.
+
+ ```ruby
+ class ReindexConversionsJob < ActiveJob::Base
+   def perform(class_name)
+     # get records that have a recent conversion
+     recently_converted_ids =
+       Searchjoy::Search.where("convertable_type = ? AND converted_at > ?", class_name, 1.day.ago)
+         .order(:convertable_id).uniq.pluck(:convertable_id)
+
+     # split into groups
+     recently_converted_ids.in_groups_of(1000, false) do |ids|
+       # fetch conversions and group by record
+       conversions_by_record = {}
+       conversions =
+         Searchjoy::Search.where(convertable_id: ids, convertable_type: class_name)
+           .group(:convertable_id, :query).uniq.count(:user_id)
+
+       conversions.each do |(id, query), count|
+         (conversions_by_record[id] ||= {})[query] = count
+       end
+
+       # write to cache
+       conversions_by_record.each do |id, conversions|
+         Rails.cache.write("search_conversions:#{class_name}:#{id}", conversions)
+       end
+
+       # partial reindex
+       class_name.constantize.where(id: ids).reindex(:search_conversions)
+     end
+   end
+ end
+ ```
+
+ Run the job with:
+
+ ```ruby
+ ReindexConversionsJob.perform_later("Product")
+ ```
+
  ## Advanced

- Prefer to use the [Elasticsearch DSL](https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-queries.html) but still want awesome features like zero-downtime reindexing?
+ Searchkick makes it easy to use the Elasticsearch DSL on its own.

  ### Advanced Mapping

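The hunk ends at the Advanced Mapping heading. For reference, a custom mapping is passed through the model roughly like this (an editor's sketch assuming searchkick's `mappings` option; the `keyword` analyzer on `name` is illustrative):

```ruby
class Product < ActiveRecord::Base
  searchkick mappings: {
    product: {
      properties: {
        name: {type: "string", analyzer: "keyword"}
      }
    }
  }
end
```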
@@ -1334,25 +1438,10 @@ Reindex associations
  store.products.reindex
  ```

- Reindex a subset of attributes (partial reindex)
-
- ```ruby
- class Product < ActiveRecord::Base
-   def search_prices
-     {
-       price: price,
-       sale_price: sale_price
-     }
-   end
- end
-
- Product.reindex(:search_prices)
- ```
-
  Remove old indices

  ```ruby
- Product.searchkick_index.clean_indices
+ Product.search_index.clean_indices
  ```

  Use custom settings
@@ -6,7 +6,12 @@ module Searchkick
        klass = class_name.constantize
        index = index_name ? Searchkick::Index.new(index_name) : klass.searchkick_index
        record_ids ||= min_id..max_id
-       index.import_scope(Searchkick.load_records(klass, record_ids), method_name: method_name, batch: true, batch_id: batch_id)
+       index.import_scope(
+         Searchkick.load_records(klass, record_ids),
+         method_name: method_name,
+         batch: true,
+         batch_id: batch_id
+       )
      end
    end
  end
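Editor's note, not part of the diff: this job is normally enqueued by the reindexing code shown later in this hunk set, but the keyword arguments are visible from a direct call; the id range and index name below are purely illustrative.

```ruby
Searchkick::BulkReindexJob.perform_later(
  class_name: "Product",
  index_name: "products_development_20170116000000000",
  batch_id: 1,
  min_id: 1,
  max_id: 1000
)
```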
@@ -248,37 +248,14 @@ module Searchkick
    end

    def import_scope(scope, resume: false, method_name: nil, async: false, batch: false, batch_id: nil, full: false)
-     batch_size = @options[:batch_size] || 1000
-
      # use scope for import
      scope = scope.search_import if scope.respond_to?(:search_import)

      if batch
        import_or_update scope.to_a, method_name, async
-       Searchkick.redis.srem(batches_key, batch_id) if batch_id && Searchkick.redis
+       redis.srem(batches_key, batch_id) if batch_id && redis
      elsif full && async
-       if scope.respond_to?(:primary_key)
-         # TODO expire Redis key
-         primary_key = scope.primary_key
-         starting_id = scope.minimum(primary_key) || 0
-         max_id = scope.maximum(primary_key) || 0
-         batches_count = ((max_id - starting_id + 1) / batch_size.to_f).ceil
-
-         batches_count.times do |i|
-           batch_id = i + 1
-           min_id = starting_id + (i * batch_size)
-           Searchkick::BulkReindexJob.perform_later(
-             class_name: scope.model_name.name,
-             min_id: min_id,
-             max_id: min_id + batch_size - 1,
-             index_name: name,
-             batch_id: batch_id
-           )
-           Searchkick.redis.sadd(batches_key, batch_id) if Searchkick.redis
-         end
-       else
-         raise Searchkick::Error, "async option only supported for ActiveRecord"
-       end
+       full_reindex_async(scope)
      elsif scope.respond_to?(:find_in_batches)
        if resume
          # use total docs instead of max id since there's not a great way
@@ -294,23 +271,14 @@ module Searchkick
          import_or_update batch, method_name, async
        end
      else
-       # https://github.com/karmi/tire/blob/master/lib/tire/model/import.rb
-       # use cursor for Mongoid
-       items = []
-       # TODO add resume
-       scope.all.each do |item|
-         items << item
-         if items.length == batch_size
-           import_or_update items, method_name, async
-           items = []
-         end
+       each_batch(scope) do |items|
+         import_or_update items, method_name, async
        end
-       import_or_update items, method_name, async
      end
    end

    def batches_left
-     Searchkick.redis.scard(batches_key) if Searchkick.redis
+     redis.scard(batches_key) if redis
    end

    # other
@@ -432,21 +400,85 @@ module Searchkick
            method_name: method_name ? method_name.to_s : nil
          )
        else
-         retries = 0
          records = records.select(&:should_index?)
-         begin
-           method_name ? bulk_update(records, method_name) : import(records)
-         rescue Faraday::ClientError => e
-           if retries < 1
-             retries += 1
-             retry
+         if records.any?
+           with_retries do
+             method_name ? bulk_update(records, method_name) : import(records)
            end
-           raise e
          end
        end
      end
    end

+   def full_reindex_async(scope)
+     if scope.respond_to?(:primary_key)
+       # TODO expire Redis key
+       primary_key = scope.primary_key
+       starting_id = scope.minimum(primary_key) || 0
+       max_id = scope.maximum(primary_key) || 0
+       batches_count = ((max_id - starting_id + 1) / batch_size.to_f).ceil
+
+       batches_count.times do |i|
+         batch_id = i + 1
+         min_id = starting_id + (i * batch_size)
+         bulk_reindex_job scope, batch_id, min_id: min_id, max_id: min_id + batch_size - 1
+       end
+     else
+       batch_id = 1
+       # TODO remove any eager loading
+       scope = scope.only(:_id) if scope.respond_to?(:only)
+       each_batch(scope) do |items|
+         bulk_reindex_job scope, batch_id, record_ids: items.map { |i| i.id.to_s }
+         batch_id += 1
+       end
+     end
+   end
+
+   def each_batch(scope)
+     # https://github.com/karmi/tire/blob/master/lib/tire/model/import.rb
+     # use cursor for Mongoid
+     items = []
+     scope.all.each do |item|
+       items << item
+       if items.length == batch_size
+         yield items
+         items = []
+       end
+     end
+     yield items if items.any?
+   end
+
+   def bulk_reindex_job(scope, batch_id, options)
+     Searchkick::BulkReindexJob.perform_later({
+       class_name: scope.model_name.name,
+       index_name: name,
+       batch_id: batch_id
+     }.merge(options))
+     redis.sadd(batches_key, batch_id) if redis
+   end
+
+   def batch_size
+     @batch_size ||= @options[:batch_size] || 1000
+   end
+
+   def with_retries
+     retries = 0
+
+     begin
+       yield
+     rescue Faraday::ClientError => e
+       if retries < 1
+         retries += 1
+         retry
+       end
+       raise e
+     end
+   end
+
+   def redis
+     Searchkick.redis
+   end
+
    def batches_key
      "searchkick:reindex:#{name}:batches"
    end
@@ -51,8 +51,44 @@ module Searchkick
          name: "#{records.first.searchkick_klass.name} Import",
          count: records.size
        }
-       ActiveSupport::Notifications.instrument("request.searchkick", event) do
-         super(records)
+       if Searchkick.callbacks_value == :bulk
+         super
+       else
+         ActiveSupport::Notifications.instrument("request.searchkick", event) do
+           super
+         end
+       end
+     end
+   end
+
+   def bulk_update(records, *args)
+     if records.any?
+       event = {
+         name: "#{records.first.searchkick_klass.name} Update",
+         count: records.size
+       }
+       if Searchkick.callbacks_value == :bulk
+         super
+       else
+         ActiveSupport::Notifications.instrument("request.searchkick", event) do
+           super
+         end
+       end
+     end
+   end
+
+   def bulk_delete(records)
+     if records.any?
+       event = {
+         name: "#{records.first.searchkick_klass.name} Delete",
+         count: records.size
+       }
+       if Searchkick.callbacks_value == :bulk
+         super
+       else
+         ActiveSupport::Notifications.instrument("request.searchkick", event) do
+           super
+         end
        end
      end
    end
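These instrumented import, update, and delete events can also be observed outside searchkick's own log subscriber; a minimal sketch using ActiveSupport notifications (the `name` and `count` payload keys come from the event hashes above):

```ruby
ActiveSupport::Notifications.subscribe("request.searchkick") do |_name, start, finish, _id, payload|
  duration_ms = ((finish - start) * 1000).round(1)
  Rails.logger.info "[searchkick] #{payload[:name]} (#{payload[:count]} records) took #{duration_ms}ms"
end
```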
@@ -15,8 +15,8 @@ module Searchkick
      # bulk reindex
      index = klass.searchkick_index
      Searchkick.callbacks(:bulk) do
-       index.bulk_index(records)
-       index.bulk_delete(delete_records)
+       index.bulk_index(records) if records.any?
+       index.bulk_delete(delete_records) if delete_records.any?
      end
    end
  end
@@ -5,7 +5,7 @@ module Searchkick
    def perform(class_name:)
      model = class_name.constantize

-     limit = 1000
+     limit = model.searchkick_index.options[:batch_size] || 1000
      record_ids = Searchkick::ReindexQueue.new(model.searchkick_index.name).reserve(limit: limit)
      if record_ids.any?
        Searchkick::ProcessBatchJob.perform_later(
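With this change, the queue is drained in batches of the model's `batch_size` option rather than a hard-coded 1000. A sketch of setting it (500 is an illustrative value):

```ruby
class Product < ActiveRecord::Base
  searchkick callbacks: :queue, batch_size: 500
end

# each run of the job now reserves up to 500 queued ids
Searchkick::ProcessQueueJob.perform_later(class_name: "Product")
```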
@@ -4,6 +4,8 @@ module Searchkick

    def initialize(name)
      @name = name
+
+     raise Searchkick::Error, "Searchkick.redis not set" unless redis
    end

    def push(record_id)
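The new guard means queue-backed features fail fast with a clear message instead of erroring deep inside a Redis call; roughly:

```ruby
Searchkick.redis = nil
Searchkick::ReindexQueue.new(Product.searchkick_index.name)
# => raises Searchkick::Error, "Searchkick.redis not set"
```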
@@ -1,3 +1,3 @@
  module Searchkick
-   VERSION = "2.0.4"
+   VERSION = "2.1.0"
  end
@@ -35,7 +35,7 @@ class CallbacksTest < Minitest::Test
      store_names ["Product A", "Product B"]
    end
    Product.searchkick_index.refresh
-   assert_search "product", [], load: false
+   assert_search "product", [], load: false, conversions: false
    assert_equal 2, reindex_queue.length

    Searchkick::ProcessQueueJob.perform_later(class_name: "Product")
@@ -4,4 +4,4 @@ source 'https://rubygems.org'
  gemspec path: "../../"

  gem "mongoid", "~> 5.0.0"
- gem "activejob_backport"
+ gem "activejob"
data/test/index_test.rb CHANGED
@@ -87,7 +87,7 @@ class IndexTest < Minitest::Test

  def test_remove_blank_id
    store_names ["Product A"]
-   Product.searchkick_index.remove(OpenStruct.new)
+   Product.searchkick_index.remove(Product.new)
    assert_search "product", ["Product A"]
  ensure
    Product.reindex
data/test/reindex_test.rb CHANGED
@@ -1,12 +1,9 @@
  require_relative "test_helper"

  class ReindexTest < Minitest::Test
-   def setup
+   def test_scoped
      skip if nobrainer?
-     super
-   end

-   def test_scoped
      store_names ["Product A"]
      Searchkick.callbacks(false) do
        store_names ["Product B", "Product C"]
@@ -16,6 +13,8 @@ class ReindexTest < Minitest::Test
    end

    def test_associations
+     skip if nobrainer?
+
      store_names ["Product A"]
      store = Store.create!(name: "Test")
      Product.create!(name: "Product B", store_id: store.id)
@@ -24,13 +23,13 @@ class ReindexTest < Minitest::Test
    end

    def test_async
-     skip unless defined?(ActiveJob) && defined?(ActiveRecord)
+     skip if !defined?(ActiveJob)

      Searchkick.callbacks(false) do
        store_names ["Product A"]
      end
      reindex = Product.reindex(async: true)
-     assert_search "product", []
+     assert_search "product", [], conversions: false

      index = Searchkick::Index.new(reindex[:index_name])
      index.refresh
data/test/routing_test.rb CHANGED
@@ -10,4 +10,14 @@ class RoutingTest < Minitest::Test
      index_options = Store.searchkick_index.index_options
      assert_equal index_options[:mappings][:_default_][:_routing], required: true
    end
+
+   def test_routing_correct_node
+     store_names ["Dollar Tree"], Store
+     assert_search "dollar", ["Dollar Tree"], {routing: "Dollar Tree"}, Store
+   end
+
+   def test_routing_incorrect_node
+     store_names ["Dollar Tree"], Store
+     assert_search "dollar", ["Dollar Tree"], {routing: "Boom"}, Store
+   end
  end
data/test/test_helper.rb CHANGED
@@ -25,7 +25,7 @@ if defined?(ActiveJob)
    ActiveJob::Base.queue_adapter = :inline
  end

- ActiveSupport::LogSubscriber.logger = Logger.new(STDOUT) if ENV["NOTIFICATIONS"]
+ ActiveSupport::LogSubscriber.logger = ActiveSupport::Logger.new(STDOUT) if ENV["NOTIFICATIONS"]

  def elasticsearch_below50?
    Searchkick.server_below?("5.0.0-alpha1")
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: searchkick
  version: !ruby/object:Gem::Version
- version: 2.0.4
+ version: 2.1.0
  platform: ruby
  authors:
  - Andrew Kane
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-01-15 00:00:00.000000000 Z
+ date: 2017-01-16 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: activemodel
@@ -189,7 +189,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.6.8
+ rubygems_version: 2.5.1
  signing_key:
  specification_version: 4
  summary: Searchkick learns what your users are looking for. As more people search,