batch-loader 0.3.0 → 1.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
-   metadata.gz: e17f7617e248aa7b911c64dac882cdb192bb9b2c
-   data.tar.gz: 6b9d612ae58f95a0715c0d23427176375d8450c4
+   metadata.gz: 76a4b89eafc3e0421ef17d5b4fc8147e56058307
+   data.tar.gz: d146eea3db7f1e5fe2656484ab56dbf135a44333
  SHA512:
-   metadata.gz: b34fa00da8e309c99b7bae84e8155dcd847722f5da30d60c19a318a114459d825645ef3377259219e5fa9c64c1ffd1495e4ac78bf4907ab811f66b8a8b0ba178
-   data.tar.gz: 7e97b3aa40becb1515a3a26d2a1992f2ccbfda2a4f5f090fc7c329e898319aa765edec379f8a0070d555276a0ab00cb24d9cf4235b230d0e7fcb6a0e0bce6625
+   metadata.gz: b110668f7022e179dace6e64200a05dada22ad8cb0064a50b5dcfaf960e5ae8884ca1fb6d9b8e8b2fb84cd16e15637633c7dfaa7c69d1f82bca74be2bddab11e
+   data.tar.gz: 46f3e19c98b9adebe2bb24c6bfaf59a25566da8b735317ac6b671721867394f4a193e5faceab7b8b7c4d8bee10db26dbfcf51a85dfade8133bda474ab17b9f39
data/CHANGELOG.md CHANGED
@@ -8,10 +8,50 @@ one of the following labels: `Added`, `Changed`, `Deprecated`,
  to manage the versions of this gem so
  that you can set version constraints properly.

- #### [Unreleased](https://github.com/exAspArk/batch-loader/compare/v0.3.0...HEAD)
+ #### [Unreleased](https://github.com/exAspArk/batch-loader/compare/v1.0.0...HEAD)

  * WIP

+ #### [v1.0.0](https://github.com/exAspArk/batch-loader/compare/v0.3.0...v1.0.0) – 2017-08-21
+
+ * `Removed`: `BatchLoader.sync!` and `BatchLoader#sync`. Now syncing is done implicitly when you call any method on the lazy object.
+
+ Before:
+
+ ```ruby
+ def load_user(user_id)
+   BatchLoader.for(user_id).batch { ... }
+ end
+
+ users = [load_user(1), load_user(2), load_user(3)]
+ puts BatchLoader.sync!(users) # or users.map!(&:sync)
+ ```
+
+ After:
+
+ ```ruby
+ users = [load_user(1), load_user(2), load_user(3)]
+ puts users
+ ```
+
+ * `Removed`: `BatchLoader#load`. Use `loader` lambda instead:
+
+ Before:
+
+ ```ruby
+ BatchLoader.for(user_id).batch do |user_ids, batch_loader|
+   user_ids.each { |user_id| batch_loader.load(user_id, user_id) }
+ end
+ ```
+
+ After:
+
+ ```ruby
+ BatchLoader.for(user_id).batch do |user_ids, loader|
+   user_ids.each { |user_id| loader.call(user_id, user_id) }
+ end
+ ```
+
  #### [v0.3.0](https://github.com/exAspArk/batch-loader/compare/v0.2.0...v0.3.0) – 2017-08-03

  * `Added`: `BatchLoader::Executor.clear_current` to clear cache manually.
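
Putting the two `Removed` entries together, a typical loader migrates to 1.0.0 like the following minimal sketch (it assumes an ActiveRecord-style `User` model and is not taken from the changelog itself):

```ruby
def user_lazy(user_id)
  BatchLoader.for(user_id).batch do |user_ids, loader|
    # One query for all collected ids; `loader` maps each id back to its record.
    User.where(id: user_ids).each { |user| loader.call(user.id, user) }
  end
end

users = [user_lazy(1), user_lazy(2), user_lazy(3)] # no queries yet
puts users.map(&:name)                             # SELECT * FROM users WHERE id IN (1, 2, 3)
```
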
data/README.md CHANGED
@@ -6,7 +6,7 @@
  [![Downloads](https://img.shields.io/gem/dt/batch-loader.svg)](https://rubygems.org/gems/batch-loader)
  [![Latest Version](https://img.shields.io/gem/v/batch-loader.svg)](https://rubygems.org/gems/batch-loader)

- Simple tool to avoid N+1 DB queries, HTTP requests, etc.
+ This gem provides a generic lazy batching mechanism to avoid N+1 DB queries, HTTP queries, etc.

  ## Contents

@@ -15,26 +15,30 @@ Simple tool to avoid N+1 DB queries, HTTP requests, etc.
  * [Why?](#why)
  * [Basic example](#basic-example)
  * [How it works](#how-it-works)
- * [REST API example](#rest-api-example)
+ * [RESTful API example](#restful-api-example)
  * [GraphQL example](#graphql-example)
  * [Caching](#caching)
  * [Installation](#installation)
  * [Implementation details](#implementation-details)
  * [Development](#development)
  * [Contributing](#contributing)
+ * [Alternatives](#alternatives)
  * [License](#license)
  * [Code of Conduct](#code-of-conduct)

+ <a href="https://www.universe.com/" target="_blank" rel="noopener noreferrer">
+   <img src="images/universe.png" height="41" width="153" alt="Sponsored by Universe" style="max-width:100%;">
+ </a>
+
  ## Highlights

  * Generic utility to avoid N+1 DB queries, HTTP requests, etc.
  * Adapted Ruby implementation of battle-tested tools like [Haskell Haxl](https://github.com/facebook/Haxl), [JS DataLoader](https://github.com/facebook/dataloader), etc.
- * Parent objects don't have to know about children's requirements, batching is isolated.
- * Automatically caches previous queries.
- * Doesn't require to create custom classes.
- * Thread-safe (`BatchLoader#load`).
- * Has zero dependencies.
- * Works with any Ruby code, including REST APIs and GraphQL.
+ * Batching is isolated and lazy, load data in batch where and when it's needed.
+ * Automatically caches previous queries (identity map).
+ * Thread-safe (`loader`).
+ * No need to share batching through variables or custom defined classes.
+ * No dependencies, no monkey-patches, no extra primitives such as Promises.

  ## Usage

@@ -47,19 +51,15 @@ def load_posts(ids)
    Post.where(id: ids)
  end

- def load_users(posts)
-   posts.map { |post| post.user }
- end
-
  posts = load_posts([1, 2, 3]) # Posts  SELECT * FROM posts WHERE id IN (1, 2, 3)
  #                               _ ↓ _
  #                              ↙  ↓  ↘
- #                             U   ↓   ↓  SELECT * FROM users WHERE id = 1
- users = load_users(post)    # ↓   U   ↓  SELECT * FROM users WHERE id = 2
- #                             ↓   ↓   U  SELECT * FROM users WHERE id = 3
+ users = posts.map do |post| # U   ↓   ↓  SELECT * FROM users WHERE id = 1
+   post.user                 # ↓   U   ↓  SELECT * FROM users WHERE id = 2
+ end                         # ↓   ↓   U  SELECT * FROM users WHERE id = 3
  #                              ↘  ↓  ↙
  #                               ¯ ↓ ¯
- users.map { |u| user.name } # Users
+ puts users                  # Users
  ```

  The naive approach would be to preload dependent objects on the top level:

@@ -84,22 +84,18 @@ def load_posts(ids)
    posts.each { |post| post.user = user_by_id[post.user_id] }
  end

- def load_users(posts)
-   posts.map { |post| post.user }
- end
-
  posts = load_posts([1, 2, 3]) # Posts  SELECT * FROM posts WHERE id IN (1, 2, 3)
  #                               _ ↓ _  SELECT * FROM users WHERE id IN (1, 2, 3)
  #                              ↙  ↓  ↘
- #                             U   ↓   ↓
- users = load_posts(post.user) # ↓   U   ↓
- #                             ↓   ↓   U
+ users = posts.map do |post| # U   ↓   ↓
+   post.user                 # ↓   U   ↓
+ end                         # ↓   ↓   U
  #                              ↘  ↓  ↙
  #                               ¯ ↓ ¯
- users.map { |u| user.name } # Users
+ puts users                  # Users
  ```

- But the problem here is that `load_posts` now depends on the child association and knows that it has to preload the data for `load_users`. And it'll do it every time, even if it's not necessary. Can we do better? Sure!
+ But the problem here is that `load_posts` now depends on the child association and knows that it has to preload data for future use. And it'll do it every time, even if it's not necessary. Can we do better? Sure!

  ### Basic example

@@ -110,56 +106,54 @@ def load_posts(ids)
    Post.where(id: ids)
  end

- def load_users(posts)
-   posts.map do |post|
-     BatchLoader.for(post.user_id).batch do |user_ids, batch_loader|
-       User.where(id: user_ids).each { |u| batch_loader.load(u.id, user) }
-     end
+ def load_user(post)
+   BatchLoader.for(post.user_id).batch do |user_ids, loader|
+     User.where(id: user_ids).each { |user| loader.call(user.id, user) }
    end
  end

- posts = load_posts([1, 2, 3])          # Posts  SELECT * FROM posts WHERE id IN (1, 2, 3)
- #                                        _ ↓ _
- #                                       ↙  ↓  ↘
- #                                      BL   ↓   ↓
- users = load_users(posts)             # ↓   BL  ↓
- #                                      ↓   ↓   BL
- #                                       ↘  ↓  ↙
- #                                        ¯ ↓ ¯
- BatchLoader.sync!(users).map(&:name)  # Users  SELECT * FROM users WHERE id IN (1, 2, 3)
+ posts = load_posts([1, 2, 3]) # Posts  SELECT * FROM posts WHERE id IN (1, 2, 3)
+ #                               _ ↓ _
+ #                              ↙  ↓  ↘
+ users = posts.map do |post| # BL  ↓   ↓
+   load_user(post)           # ↓   BL  ↓
+ end                         # ↓   ↓   BL
+ #                              ↘  ↓  ↙
+ #                               ¯ ↓ ¯
+ puts users                  # Users  SELECT * FROM users WHERE id IN (1, 2, 3)
  ```

  As we can see, batching is isolated and described right in a place where it's needed.

  ### How it works

- In general, `BatchLoader` returns a lazy object. In other programming languages it usually called Promise, but I personally prefer to call it lazy, since Ruby already uses the name in standard library :) Each lazy object knows which data it needs to load and how to batch the query. When all the lazy objects are collected it's possible to resolve them once without N+1 queries.
+ In general, `BatchLoader` returns a lazy object. Each lazy object knows which data it needs to load and how to batch the query. As soon as you need to use the lazy objects, they will be automatically loaded once without N+1 queries.

- So, when we call `BatchLoader.for` we pass an item (`user_id`) which should be batched. For the `batch` method, we pass a block which uses all the collected items (`user_ids`):
+ So, when we call `BatchLoader.for` we pass an item (`user_id`) which should be collected and used for batching later. For the `batch` method, we pass a block which will use all the collected items (`user_ids`):

  <pre>
- BatchLoader.for(post.<b>user_id</b>).batch do |<b>user_ids</b>, batch_loader|
+ BatchLoader.for(post.<b>user_id</b>).batch do |<b>user_ids</b>, loader|
    ...
  end
  </pre>

- Inside the block we execute a batch query for our items (`User.where`). After that, all we have to do is to call `load` method and pass an item which was used in `BatchLoader.for` method (`user_id`) and the loaded object itself (`user`):
+ Inside the block we execute a batch query for our items (`User.where`). After that, all we have to do is to call `loader` by passing an item which was used in `BatchLoader.for` method (`user_id`) and the loaded object itself (`user`):

  <pre>
- BatchLoader.for(post.<b>user_id</b>).batch do |user_ids, batch_loader|
-   User.where(id: user_ids).each { |u| batch_loader.load(<b>u.id</b>, <b>user</b>) }
+ BatchLoader.for(post.<b>user_id</b>).batch do |user_ids, loader|
+   User.where(id: user_ids).each { |user| loader.call(<b>user.id</b>, <b>user</b>) }
  end
  </pre>

- Now we can resolve all the collected `BatchLoader` objects:
+ When we call any method on the lazy object, it'll be automatically loaded through batching for all instantiated `BatchLoader`s:

  <pre>
- BatchLoader.sync!(users) # => SELECT * FROM users WHERE id IN (1, 2, 3)
+ puts users # => SELECT * FROM users WHERE id IN (1, 2, 3)
  </pre>

  For more information, see the [Implementation details](#implementation-details) section.

- ### REST API example
+ ### RESTful API example

  Now imagine we have a regular Rails app with N+1 HTTP requests:
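
Before the Rails example below, the implicit syncing described in the rewritten "How it works" section above can be seen in isolation. A hedged sketch assuming an ActiveRecord-style `User` model (not part of the README diff itself):

```ruby
lazy_users = [1, 2, 3].map do |id|
  BatchLoader.for(id).batch do |ids, loader|
    User.where(id: ids).each { |user| loader.call(user.id, user) }
  end
end

# Nothing is queried yet; each element is still a lazy BatchLoader.
first = lazy_users.first

# The first real method call batches all collected ids in one query
# (SELECT * FROM users WHERE id IN (1, 2, 3)) and then delegates to the User.
puts first.name

# Already loaded through the same batch, so no extra query here.
puts lazy_users.last.name
```
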

@@ -187,8 +181,8 @@ As we can see, the code above will make N+1 HTTP requests, one for each post. Le
  ```ruby
  class Post < ApplicationRecord
    def rating_lazy
-     BatchLoader.for(post).batch do |posts, batch_loader|
-       Parallel.each(posts, in_threads: 10) { |post| batch_loader.load(post, post.rating) }
+     BatchLoader.for(post).batch do |posts, loader|
+       Parallel.each(posts, in_threads: 10) { |post| loader.call(post, post.rating) }
      end
    end

@@ -196,21 +190,22 @@ class Post < ApplicationRecord
  end
  ```

- `BatchLoader#load` is thread-safe. So, if `HttpClient` is also thread-safe, then with `parallel` gem we can execute all HTTP requests concurrently in threads (there are some benchmarks for [concurrent HTTP requests](https://github.com/exAspArk/concurrent_http_requests) in Ruby). Thanks to Matz, MRI releases GIL when thread hits blocking I/O – HTTP request in our case.
+ `loader` is thread-safe. So, if `HttpClient` is also thread-safe, then with `parallel` gem we can execute all HTTP requests concurrently in threads (there are some benchmarks for [concurrent HTTP requests](https://github.com/exAspArk/concurrent_http_requests) in Ruby). Thanks to Matz, MRI releases GIL when thread hits blocking I/O – HTTP request in our case.

- Now we can resolve all `BatchLoader` objects in the controller:
+ In the controller, all we have to do is to replace `post.rating` with the lazy `post.rating_lazy`:

  ```ruby
  class PostsController < ApplicationController
    def index
      posts = Post.limit(10)
      serialized_posts = posts.map { |post| {id: post.id, rating: post.rating_lazy} }
-     render json: BatchLoader.sync!(serialized_posts)
+
+     render json: serialized_posts
    end
  end
  ```

- `BatchLoader` caches the resolved values. To ensure that the cache is purged between requests in the app add the following middleware to your `config/application.rb`:
+ `BatchLoader` caches the loaded values. To ensure that the cache is purged between requests in the app add the following middleware to your `config/application.rb`:

  ```ruby
  config.middleware.use BatchLoader::Middleware
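
The `rating_lazy` hunk above batches `post.rating`, which in the README is an HTTP call through an `HttpClient` class that is not part of this diff. A hypothetical stand-in using Ruby's standard `Net::HTTP`, just to make the threading discussion concrete (the endpoint URL and JSON shape are assumptions):

```ruby
require "net/http"
require "json"

class Post < ApplicationRecord
  # Hypothetical blocking HTTP call; while one thread waits on I/O here,
  # MRI releases the GIL so the other Parallel threads can keep working.
  def rating
    body = Net::HTTP.get(URI("https://ratings.example.com/posts/#{id}"))
    JSON.parse(body)["rating"]
  end
end
```
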
@@ -220,8 +215,7 @@ See the [Caching](#caching) section for more information.

  ### GraphQL example

- With GraphQL using batching is particularly useful. You can't use usual techniques such as preloading associations in advance to avoid N+1 queries.
- Since you don't know which fields user is going to ask in a query.
+ Batching is particularly useful with GraphQL. Using such techniques as preloading data in advance to avoid N+1 queries can be very complicated, since a user can ask for any available fields in a query.

  Let's take a look at the simple [graphql-ruby](https://github.com/rmosolgo/graphql-ruby) schema example:

@@ -246,7 +240,7 @@ UserType = GraphQL::ObjectType.define do
  end
  ```

- If we want to execute a simple query like:
+ If we want to execute a simple query like the following, we will get N+1 queries for each `post.user`:

  ```ruby
  query = "
@@ -258,81 +252,79 @@ query = "
    }
  }
  "
- Schema.execute(query, variables: {}, context: {})
+ Schema.execute(query)
  ```

- We will get N+1 queries for each `post.user`. To avoid this problem, all we have to do is to change the resolver to use `BatchLoader`:
+ To avoid this problem, all we have to do is to change the resolver to return `BatchLoader`:

  ```ruby
  PostType = GraphQL::ObjectType.define do
    name "Post"
    field :user, !UserType, resolve: ->(post, args, ctx) do
-     BatchLoader.for(post.user_id).batch do |user_ids, batch_loader|
-       User.where(id: user_ids).each { |user| batch_loader.load(user.id, user) }
+     BatchLoader.for(post.user_id).batch do |user_ids, loader|
+       User.where(id: user_ids).each { |user| loader.call(user.id, user) }
      end
    end
  end
  ```

- And setup GraphQL with built-in `lazy_resolve` method:
+ And setup GraphQL to use the built-in `lazy_resolve` method:

  ```ruby
  Schema = GraphQL::Schema.define do
    query QueryType
-   lazy_resolve BatchLoader, :sync
+   use BatchLoader::GraphQL
  end
  ```

+ That's it.
+
  ### Caching

- By default `BatchLoader` caches the resolved values. You can test it by running something like:
+ By default `BatchLoader` caches the loaded values. You can test it by running something like:

  ```ruby
  def user_lazy(id)
-   BatchLoader.for(id).batch do |ids, batch_loader|
-     User.where(id: ids).each { |user| batch_loader.load(user.id, user) }
+   BatchLoader.for(id).batch do |ids, loader|
+     User.where(id: ids).each { |user| loader.call(user.id, user) }
    end
  end

- user_lazy(1) # no request
- # => <#BatchLoader>
+ puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
+ # => <#User:...>
+
+ puts user_lazy(1) # no request
+ # => <#User:...>
+ ```

- user_lazy(1).sync # SELECT * FROM users WHERE id IN (1)
- # => <#User>
+ Usually, it's just enough to clear the cache between HTTP requests in the app. To do so, simply add the middleware:

- user_lazy(1).sync # no request
- # => <#User>
+ ```ruby
+ use BatchLoader::Middleware
  ```

  To drop the cache manually you can run:

  ```ruby
- user_lazy(1).sync # SELECT * FROM users WHERE id IN (1)
- user_lazy(1).sync # no request
+ puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
+ puts user_lazy(1) # no request

  BatchLoader::Executor.clear_current

- user_lazy(1).sync # SELECT * FROM users WHERE id IN (1)
- ```
-
- Usually, it's just enough to clear the cache between HTTP requests in the app. To do so, simply add the middleware:
-
- ```ruby
- # calls "BatchLoader::Executor.clear_current" after each request
- use BatchLoader::Middleware
+ puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
  ```

  In some rare cases it's useful to disable caching for `BatchLoader`. For example, in tests or after data mutations:

  ```ruby
  def user_lazy(id)
-   BatchLoader.for(id).batch(cache: false) do |ids, batch_loader|
+   BatchLoader.for(id).batch(cache: false) do |ids, loader|
      # ...
    end
  end

- user_lazy(1).sync # SELECT * FROM users WHERE id IN (1)
- user_lazy(1).sync # SELECT * FROM users WHERE id IN (1)
+ puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
+ puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
  ```

  ## Installation
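
For the test scenario mentioned in the Caching section above, the cache can also be dropped between examples instead of (or in addition to) passing `cache: false`. A hedged RSpec sketch, mirroring what `BatchLoader::Middleware` does per request:

```ruby
RSpec.configure do |config|
  # Purge batch-loader's cache after every example, the same way the
  # Rack middleware purges it after every request.
  config.after do
    BatchLoader::Executor.clear_current
  end
end
```
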
@@ -353,7 +345,7 @@ Or install it yourself as:

  ## Implementation details

- Coming soon
+ See the [slides](https://speakerdeck.com/exaspark/batching-a-powerful-way-to-solve-n-plus-1-queries) [37-42].

  ## Development

@@ -365,6 +357,22 @@ To install this gem onto your local machine, run `bundle exec rake install`. To

  Bug reports and pull requests are welcome on GitHub at https://github.com/exAspArk/batch-loader. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [Contributor Covenant](http://contributor-covenant.org) code of conduct.

+ ## Alternatives
+
+ There are some other Ruby implementations for batching such as:
+
+ * [shopify/graphql-batch](https://github.com/shopify/graphql-batch)
+ * [sheerun/dataloader](https://github.com/sheerun/dataloader)
+
+ However, `batch-loader` has some differences:
+
+ * It is implemented for general usage and can be used not only with GraphQL. In fact, we use it for RESTful APIs and GraphQL on production at the same time.
+ * It doesn't try to mimic implementations in other programming languages which have an asynchronous nature. So, it doesn't load extra dependencies to bring such primitives as Promises, which are not very popular in Ruby community.
+   Instead, it uses the idea of lazy objects, which are included in the [Ruby standard library](https://ruby-doc.org/core-2.4.1/Enumerable.html#method-i-lazy). These lazy objects allow one to return the necessary data at the end when it's necessary.
+ * It doesn't force you to share batching through variables or custom defined classes, just pass a block to the `batch` method.
+ * It doesn't require to return an array of the loaded objects in the same order as the passed items. I find it difficult to satisfy these constraints: to sort the loaded objects and add `nil` values for the missing ones. Instead, it provides the `loader` lambda which simply maps an item to the loaded object.
+ * It doesn't depend on any other external dependencies. For example, no need to load huge external libraries for thread-safety, the gem is thread-safe out of the box.
+
  ## License

  The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
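
One consequence of the `loader` design called out in the Alternatives section above: items that never get mapped to a loaded object simply resolve to `nil`, so there is no need to order results or pad them manually (the new `ensure_batched` in `lib/batch_loader.rb` below pre-loads every collected item with `nil` before calling the batch block). A hedged sketch with a hypothetical `User` model:

```ruby
user = BatchLoader.for(404).batch do |ids, loader|
  # No record matches id 404, so loader is never called for it.
  User.where(id: ids).each { |u| loader.call(u.id, u) }
end

user.nil? # => true, after a single SELECT * FROM users WHERE id IN (404)
```
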
data/batch-loader.gemspec CHANGED
@@ -9,13 +9,13 @@ Gem::Specification.new do |spec|
    spec.authors = ["exAspArk"]
    spec.email = ["exaspark@gmail.com"]

-   spec.summary = %q{Simple tool to avoid N+1 DB queries, HTTP requests, etc.}
-   spec.description = %q{Simple tool to avoid N+1 DB queries, HTTP requests, etc.}
+   spec.summary = %q{Powerful tool to avoid N+1 DB or HTTP queries}
+   spec.description = %q{Powerful tool to avoid N+1 DB or HTTP queries}
    spec.homepage = "https://github.com/exAspArk/batch-loader"
    spec.license = "MIT"

    spec.files = `git ls-files -z`.split("\x0").reject do |f|
-     f.match(%r{^(test|spec|features)/})
+     f.match(%r{^(spec|images)/})
    end
    spec.bindir = "exe"
    spec.executables = spec.files.grep(%r{^exe/}) { |f| File.basename(f) }
data/lib/batch_loader.rb CHANGED
@@ -1,67 +1,115 @@
+ # frozen_string_literal: true
+
  require "batch_loader/version"
  require "batch_loader/executor_proxy"
  require "batch_loader/middleware"
+ require "batch_loader/graphql"

  class BatchLoader
    NoBatchError = Class.new(StandardError)
-   BatchAlreadyExistsError = Class.new(StandardError)

    def self.for(item)
      new(item: item)
    end

-   def self.sync!(value)
-     case value
-     when Array
-       value.map! { |v| sync!(v) }
-     when Hash
-       value.each { |k, v| value[k] = sync!(v) }
-     when BatchLoader
-       sync!(value.sync)
-     else
-       value
-     end
-   end
-
-   attr_reader :item, :batch_block, :cache
-
    def initialize(item:)
      @item = item
    end

    def batch(cache: true, &batch_block)
-     raise BatchAlreadyExistsError if @batch_block
      @cache = cache
      @batch_block = batch_block
-     executor_for_block.add(item: item)
+     executor_proxy.add(item: @item)
+
+     singleton_class.class_eval { undef_method(:batch) }
+
      self
    end

-   def load(item, value)
-     executor_for_block.load(item: item, value: value)
+   def batch_loader?
+     true
    end

-   def sync
-     unless executor_for_block.value_loaded?(item: item)
-       batch_block.call(executor_for_block.list_items, self)
-       executor_for_block.delete_items
-     end
-     result = executor_for_block.loaded_value(item: item)
-     purge_cache unless cache
-     result
+   def respond_to?(method_name)
+     method_name == :batch_loader? || method_missing(:respond_to?, method_name)
    end

    private

-   def executor_for_block
-     @executor_for_block ||= begin
-       raise NoBatchError.new("Please provide a batch block first") unless batch_block
-       BatchLoader::ExecutorProxy.new(&batch_block)
+   def method_missing(method_name, *args, &block)
+     sync!.public_send(method_name, *args, &block)
+   end
+
+   def sync!
+     return self if @synced
+
+     ensure_batched
+     loaded_value = executor_proxy.loaded_value(item: @item)
+
+     if @cache
+       replace_with!(loaded_value)
+       @synced = true
+       self
+     else
+       purge_cache
+       loaded_value
+     end
+   end
+
+   def ensure_batched
+     return if executor_proxy.value_loaded?(item: @item)
+
+     items = executor_proxy.list_items
+     loader = ->(item, value) { executor_proxy.load(item: item, value: value) }
+     items.each { |item| loader.call(item, nil) }
+     @batch_block.call(items, loader)
+     executor_proxy.delete(items: items)
+   end
+
+   def singleton_class
+     class << self
+       self
+     end
+   end
+
+   def replace_with!(value)
+     BatchLoader.send(:without_warnings) do
+       ignore_method_names = %i[singleton_method_added].freeze
+       singleton_class.class_eval do
+         (value.methods - ignore_method_names).each do |method_name|
+           define_method(method_name) do |*args, &block|
+             value.public_send(method_name, *args, &block)
+           end
+         end
+       end
      end
    end

    def purge_cache
-     executor_for_block.unload_value(item: item)
-     executor_for_block.add(item: item)
+     executor_proxy.unload_value(item: @item)
+     executor_proxy.add(item: @item)
+   end
+
+   def executor_proxy
+     @executor_proxy ||= begin
+       raise NoBatchError.new("Please provide a batch block first") unless @batch_block
+       BatchLoader::ExecutorProxy.new(&@batch_block)
+     end
+   end
+
+   class << self
+     private
+
+     def without_warnings(&block)
+       warning_level = $VERBOSE
+       $VERBOSE = nil
+       block.call
+       $VERBOSE = warning_level
+     end
+   end
+
+   without_warnings do
+     leave_method_names = %i[batch batch_loader? respond_to?].freeze
+     (instance_methods - leave_method_names).each { |method_name| undef_method(method_name) }
    end
  end
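
The rewritten `lib/batch_loader.rb` above builds its lazy object from two plain-Ruby techniques: delegate every call through `method_missing`, and, once the value is loaded, copy the value's methods onto the proxy's singleton class (`replace_with!`) so that later calls skip `method_missing`. A distilled, standalone sketch of that pattern — not the gem's code, and without the blank-slate `undef_method` step that makes even `#class` and `#inspect` delegate:

```ruby
class LazyValue
  def initialize(&compute)
    @compute = compute
  end

  private

  def method_missing(method_name, *args, &block)
    __resolve!.public_send(method_name, *args, &block)
  end

  def respond_to_missing?(_method_name, _include_private = false)
    true
  end

  def __resolve!
    return @value if defined?(@value)

    value = @value = @compute.call
    # Skip hooks and the methods Ruby warns about redefining.
    skipped = %i[singleton_method_added __send__ __id__ object_id]
    # Copy the value's methods onto this object's singleton class so that
    # subsequent calls bypass method_missing, as replace_with! does above.
    singleton_class.class_eval do
      (value.methods - skipped).each do |name|
        define_method(name) { |*args, &block| value.public_send(name, *args, &block) }
      end
    end
    value
  end
end

lazy = LazyValue.new { Time.now } # nothing computed yet
puts lazy.year                    # computed on the first call, then delegated
```
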
data/lib/batch_loader/executor.rb CHANGED
@@ -1,3 +1,5 @@
+ # frozen_string_literal: true
+
  class BatchLoader
    class Executor
      NAMESPACE = :batch_loader
data/lib/batch_loader/executor_proxy.rb CHANGED
@@ -1,3 +1,5 @@
+ # frozen_string_literal: true
+
  require "batch_loader/executor"

  class BatchLoader
@@ -11,15 +13,15 @@ class BatchLoader
      end

      def add(item:)
-       items << item
+       items_to_load << item
      end

      def list_items
-       items.to_a
+       items_to_load.to_a
      end

-     def delete_items
-       global_executor.items_by_block[@block_hash_key] = Set.new
+     def delete(items:)
+       global_executor.items_by_block[@block_hash_key] = items_to_load - items
      end

      def load(item:, value:)
@@ -40,7 +42,7 @@ class BatchLoader

      private

-     def items
+     def items_to_load
        global_executor.items_by_block[@block_hash_key]
      end

data/lib/batch_loader/graphql.rb ADDED
@@ -0,0 +1,30 @@
+ # frozen_string_literal: true
+
+ class BatchLoader
+   class GraphQL
+     class Wrapper
+       def initialize(batch_loader)
+         @batch_loader = batch_loader
+       end
+
+       def sync
+         @batch_loader
+       end
+     end
+
+     def self.use(schema_definition)
+       schema_definition.lazy_resolve(BatchLoader::GraphQL::Wrapper, :sync)
+       schema_definition.instrument(:field, self)
+     end
+
+     def self.instrument(type, field)
+       old_resolve_proc = field.resolve_proc
+       new_resolve_proc = ->(object, arguments, context) do
+         result = old_resolve_proc.call(object, arguments, context)
+         result.respond_to?(:batch_loader?) ? BatchLoader::GraphQL::Wrapper.new(result) : result
+       end
+
+       field.redefine { resolve(new_resolve_proc) }
+     end
+   end
+ end
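
Since `BatchLoader#sync` no longer exists as a public method in 1.0.0, graphql-ruby's `lazy_resolve` can no longer point at `BatchLoader` directly; the field instrumentation above wraps every lazy result in a `Wrapper` whose `sync` simply hands the lazy object back. A hedged sketch of the resulting flow (hypothetical `User` model, no graphql-ruby required):

```ruby
lazy_user = BatchLoader.for(1).batch do |ids, loader|
  User.where(id: ids).each { |user| loader.call(user.id, user) }
end

# What the :field instrumentation does with a lazy resolver result:
wrapped = BatchLoader::GraphQL::Wrapper.new(lazy_user)

# graphql-ruby later calls #sync (via lazy_resolve), which returns the
# BatchLoader itself; the first method call on it then triggers the batch.
user = wrapped.sync
puts user.name # SELECT * FROM users WHERE id IN (1)
```
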
data/lib/batch_loader/middleware.rb CHANGED
@@ -1,3 +1,5 @@
+ # frozen_string_literal: true
+
  class BatchLoader
    class Middleware
      def initialize(app)
data/lib/batch_loader/version.rb CHANGED
@@ -1,3 +1,5 @@
+ # frozen_string_literal: true
+
  class BatchLoader
-   VERSION = "0.3.0"
+   VERSION = "1.0.0"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: batch-loader
  version: !ruby/object:Gem::Version
-   version: 0.3.0
+   version: 1.0.0
  platform: ruby
  authors:
  - exAspArk
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2017-08-03 00:00:00.000000000 Z
+ date: 2017-08-21 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: bundler
@@ -80,7 +80,7 @@ dependencies:
      - - "~>"
        - !ruby/object:Gem::Version
          version: '3.4'
- description: Simple tool to avoid N+1 DB queries, HTTP requests, etc.
+ description: Powerful tool to avoid N+1 DB or HTTP queries
  email:
  - exaspark@gmail.com
  executables: []
@@ -104,6 +104,7 @@ files:
  - lib/batch_loader.rb
  - lib/batch_loader/executor.rb
  - lib/batch_loader/executor_proxy.rb
+ - lib/batch_loader/graphql.rb
  - lib/batch_loader/middleware.rb
  - lib/batch_loader/version.rb
  homepage: https://github.com/exAspArk/batch-loader
@@ -129,5 +130,5 @@ rubyforge_project:
  rubygems_version: 2.5.2
  signing_key:
  specification_version: 4
- summary: Simple tool to avoid N+1 DB queries, HTTP requests, etc.
+ summary: Powerful tool to avoid N+1 DB or HTTP queries
  test_files: []