dataloader 0.0.1 → 1.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: f032c75315465da12e47780bd7cd70ef20cf42d7
- data.tar.gz: e065585b95c677ad39a5f84a069a6afc0412ee7e
+ metadata.gz: 1245888f26c565a88d6e2868c45218014e883ece
+ data.tar.gz: 25fc613bfa5ae2af41c6a9b31d44c77428e5d214
  SHA512:
- metadata.gz: 2278ac3002ff9122bf74a3c0df593e8741b0042be119bc7d8249cdcc6906cbfe811c0101f97d330485a572c176913f3046d87336c13fb0b7e6e127c85abe8819
- data.tar.gz: 5f8c7e2fecfc3fd1d9365d9d220a9d4ad45764e8bc259db06eac42f231ace5cdbe2708a26b12361132b85f5d93be174b291577c3a79f5ee2ce81c017fc3b3fce
+ metadata.gz: 68f9ea657f150d09ed9aeec7789e076a46865897d79b150f7303c8e6b2c860a4ae4e001cb05c9b76626c875942ad6f86cce81f33744e1ff0159cfa719b2cc105
+ data.tar.gz: 3dc95561576e7961454b734852619f81f87a2ba4c9e56b65ac4fc758ba5b2d49921a6a59ea1ed59cb7b03a9b6e3220318d922a89934630cdcec64814e14b068a
data/LICENSE ADDED
@@ -0,0 +1,22 @@
+ Copyright (c) 2017-present, Adam Stankiewicz
+
+ MIT License
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+ LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+ WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,259 @@
+ # ![](http://i.imgur.com/ZdJKtj1.png) Dataloader
+
+ [![Build Status](https://travis-ci.org/sheerun/dataloader.svg?branch=master)](https://travis-ci.org/sheerun/dataloader) [![codecov](https://codecov.io/gh/sheerun/dataloader/branch/master/graph/badge.svg)](https://codecov.io/gh/sheerun/dataloader)
+
+ Dataloader is a generic utility to be used as part of your application's data fetching layer to provide a simplified and consistent API for batching and caching within a request. It is heavily inspired by [Facebook's dataloader](https://github.com/facebook/dataloader).
+
+ ## Getting started
+
+ First, install Dataloader using bundler:
+
+ ```ruby
+ gem "dataloader"
+ ```
+
+ To get started, create a `Dataloader`. Each `Dataloader` instance represents a unique cache. Typically instances are created per request when used within a web server. To see how to use it with a GraphQL server, see the section below.
+
+ ## Basic usage
+
+ ```ruby
+ # The block is called only once, with ids = [0, 1, 2]
+ loader = Dataloader.new do |ids|
+   User.find(*ids)
+ end
+
+ # Schedule data to load
+ promise_one = loader.load(0)
+ promise_two = loader.load_many([1, 2])
+
+ # Get the promises' results
+ user0 = promise_one.sync
+ user1, user2 = promise_two.sync
+ ```
+
+ ## Using with GraphQL
+
+ You can pass loaders inside the [`context`](https://rmosolgo.github.io/graphql-ruby/queries/executing_queries):
+
+ ```ruby
+ UserType = GraphQL::ObjectType.define do
+   field :name, types.String
+ end
+
+ QueryType = GraphQL::ObjectType.define do
+   name "Query"
+   description "The query root of this schema"
+
+   field :user do
+     type UserType
+     argument :id, !types.ID
+     resolve ->(obj, args, ctx) {
+       ctx[:user_loader].load(args["id"])
+     }
+   end
+ end
+
+ Schema = GraphQL::Schema.define do
+   lazy_resolve(Promise, :sync)
+
+   query QueryType
+ end
+
+ context = {
+   user_loader: Dataloader.new do |ids|
+     User.find(*ids)
+   end
+ }
+
+ Schema.execute("{ user(id: 12) { name } }", context: context)
+ ```
+
+ ## Batching
+
+ You can create loaders by providing a batch loading function.
+
+ ```ruby
+ user_loader = Dataloader.new { |ids| User.find(*ids) }
+ ```
+
+ A batch loading block accepts an Array of keys and returns a Promise which resolves to an Array or Hash of values.
+
+ Dataloader will coalesce all individual loads which occur until the first `.sync` is called on any promise returned by `#load` or `#load_many`, and then call your batch function with all requested keys.
+
+ ```ruby
+ user_loader.load(1)
+   .then { |user| user_loader.load(user.invited_by_id) }
+   .then { |invited_by| "User 1 was invited by #{invited_by[:name]}" }
+
+ # Elsewhere in your backend
+ user_loader.load(2)
+   .then { |user| user_loader.load(user.invited_by_id) }
+   .then { |invited_by| "User 2 was invited by #{invited_by[:name]}" }
+ ```
+
+ A naive solution would issue four SQL queries to get the required information, but with `Dataloader` this application will make at most two queries (one to load the users, and a second one to load the invites).
+
+ `Dataloader` allows you to decouple unrelated parts of your application without sacrificing the performance of batch data loading. While the loader presents an API that loads individual values, all concurrent requests are coalesced and presented to your batch loading function. This allows your application to safely distribute data fetching requirements throughout your application and maintain minimal outgoing data requests.
+
+ ### Batch function
+
+ A batch loading function accepts an Array of keys and returns an Array of values or a Hash that maps from keys to values (or a [Promise](https://github.com/lgierth/promise.rb) that resolves to such an Array or Hash). There are a few constraints that must be upheld:
+
+ * The Array of values must be the same length as the Array of keys.
+ * Each index in the Array of values must correspond to the same index in the Array of keys.
+ * If a Hash is returned, it must include all keys passed to the batch loading function.
+
+ For example, if your batch function was provided the Array of keys `[2, 9, 6]`, you could return one of the following:
+
+ ```ruby
+ [
+   { id: 2, name: "foo" },
+   { id: 9, name: "bar" },
+   { id: 6, name: "baz" }
+ ]
+ ```
+
+ ```ruby
+ {
+   2 => { id: 2, name: "foo" },
+   9 => { id: 9, name: "bar" },
+   6 => { id: 6, name: "baz" }
+ }
+ ```
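
As an illustration of how the two return shapes relate, here is a plain-Ruby sketch, independent of the gem — `normalize_result` is a hypothetical helper written for this example, not part of the gem's API:

```ruby
# Hypothetical helper (not the gem's API): normalize a batch result to a
# Hash keyed by the requested keys. An Array is zipped with the keys in
# order; a Hash is returned as-is.
def normalize_result(keys, values)
  unless values.is_a?(Array) || values.is_a?(Hash)
    raise TypeError, "batch result must be an Array or Hash"
  end
  return values if values.is_a?(Hash)
  raise ArgumentError, "size mismatch" if keys.size != values.size
  Hash[keys.zip(values)]
end

normalize_result([2, 9, 6], [{ name: "foo" }, { name: "bar" }, { name: "baz" }])
# => { 2 => { name: "foo" }, 9 => { name: "bar" }, 6 => { name: "baz" } }
```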
+
+ ## Caching
+
+ Dataloader provides a memoization cache for all loads which occur within a single instance of it. After `#load` is called once with a given key, the resulting Promise is cached to eliminate redundant loads.
+
+ In addition to relieving pressure on your data storage, caching results per request also creates fewer objects, which may relieve memory pressure on your application:
+
+ ```ruby
+ promise1 = user_loader.load(1)
+ promise2 = user_loader.load(1)
+ promise1 == promise2 # => true
+ ```
+
+ ### Caching per-request
+
+ `Dataloader` caching does not replace Redis, Memcache, or any other shared application-level cache. Dataloader is first and foremost a data loading mechanism, and its cache only serves the purpose of not repeatedly loading the same data in the context of a single request to your application. To do this, it maintains a simple in-memory memoization cache (more accurately: `#load` is a memoized function).
+
+ Avoid letting multiple requests from different users use the same `Dataloader` instance, as that could result in cached data incorrectly appearing in each request. Typically, `Dataloader` instances are created when a request begins and are not used once the request ends.
+
+ See the "Using with GraphQL" section above for how you can pass dataloader instances using the context.
+
+ ### Caching errors
+
+ If a batch load fails (that is, the batch function raises or returns a rejected Promise), then the requested values will not be cached. However, if the batch function returns an Error instance for an individual value, that Error will be cached to avoid frequently loading the same Error.
+
+ In some circumstances you may wish to clear the cache for these individual Errors:
+
+ ```ruby
+ user_loader.load(1).rescue do |error|
+   user_loader.cache.delete(1)
+   raise error
+ end
+ ```
+
+ ### Disabling cache
+
+ In certain uncommon cases, a Dataloader which does not cache may be desirable. Calling `Dataloader.new({ cache: nil }) { ... }` will ensure that every call to `#load` produces a new Promise, and requested keys will not be saved in memory.
+
+ However, when the memoization cache is disabled, your batch function will receive an array of keys which may contain duplicates! Each key will be associated with each call to `#load`. Your batch loader should provide a value for each instance of the requested key.
+
+ ```ruby
+ loader = Dataloader.new({ cache: nil }) do |keys|
+   puts keys.inspect
+   some_loading_function(keys)
+ end
+
+ loader.load('A')
+ loader.load('B')
+ loader.load('A')
+
+ # Prints ["A", "B", "A"] once any of the promises is resolved with `.sync`
+ ```
+
+ ## API
+
+ ### `Dataloader`
+
+ `Dataloader` is a class for fetching data given unique keys, such as the id column (or any other key).
+
+ Each `Dataloader` instance contains a unique memoized cache. Because of this, it is recommended to use one `Dataloader` instance **per web request**. You can use longer-lived instances, but then you need to take care of manually clearing the cache.
+
+ You shouldn't share the same dataloader instance across different threads. This behavior is currently undefined.
+
+ ### `Dataloader.new(options = {}, &batch_load)`
+
+ Creates a new `Dataloader` given a batch loading block and options.
+
+ * `batch_load`: A block which accepts an Array of keys and returns an Array of values or a Hash that maps from keys to values (or a [Promise](https://github.com/lgierth/promise.rb) that resolves to such a value).
+ * `options`: An optional hash of options:
+   * `:key` A function to produce a cache key for a given load key. Defaults to `proc { |key| key }`. Useful to provide when objects are keys and two similarly shaped objects should be considered equivalent.
+   * `:cache` An instance of a cache used for caching promises. Defaults to `Concurrent::Map.new`.
+     - The only required API is `#compute_if_absent(key)`.
+     - You can pass `nil` if you want to disable the cache.
+     - You can pass a pre-populated cache as well. The values can be Promises.
+   * `:max_batch_size` Limits the number of items that get passed in to the batch loading block. Defaults to `Float::INFINITY`. You can pass `1` to disable batching.
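
To make the `:key` option concrete, here is a minimal plain-Ruby sketch — a hypothetical `TinyLoader` written only for this example (batching and promises omitted), showing how a key function controls cache hits:

```ruby
# Hypothetical, simplified loader (not the gem's implementation): it only
# demonstrates how a :key function controls memoization.
class TinyLoader
  def initialize(key: ->(k) { k })
    @key = key
    @cache = {}
  end

  # Memoized load: the block runs only on a cache miss.
  def load(key)
    @cache[@key.call(key)] ||= yield(key)
  end
end

# Two differently shaped hashes with the same :id share one cache entry.
loader = TinyLoader.new(key: ->(obj) { obj[:id] })
first  = loader.load(id: 1, name: "foo") { |o| o[:name] }
second = loader.load(id: 1, name: "bar") { |o| o[:name] }
# first == second == "foo" — the second call hit the cache
```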
+
+ ### `#load(key)`
+
+ **key** [Object] a key to load using `batch_load`
+
+ Returns a [Promise](https://github.com/lgierth/promise.rb) of the computed value.
+
+ You can resolve this promise when you actually need the value with `promise.sync`.
+
+ All calls to `#load` are batched until the first `#sync` is encountered. Then it starts batching again, and so on.
+
+ ### `#load_many(keys)`
+
+ **keys** [Array<Object>] a list of keys to load using `batch_load`
+
+ Returns a [Promise](https://github.com/lgierth/promise.rb) of an Array of computed values.
+
+ For example, to load multiple keys:
+
+ ```ruby
+ promise = loader.load_many(['a', 'b'])
+ object_a, object_b = promise.sync
+ ```
+
+ This is equivalent to the more verbose:
+
+ ```ruby
+ promise = Promise.all([loader.load('a'), loader.load('b')])
+ object_a, object_b = promise.sync
+ ```
+
+ ### `#cache`
+
+ Returns the internal cache that can be overridden with the `:cache` option (see the constructor).
+
+ This field is writable, so you can reset the cache with something like:
+
+ ```ruby
+ loader.cache = Concurrent::Map.new
+ ```
+
+ ### `#wait`
+
+ Triggers all batched loaders until there are no keys left to resolve.
+
+ This method is invoked automatically when the value of any promise is requested with `#sync`.
+
+ Here is the implementation that Dataloader sets as a default for [Promise](https://github.com/lgierth/promise.rb):
+
+ ```ruby
+ class Promise
+   def wait
+     Dataloader.wait
+   end
+ end
+ ```
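
The draining behavior of `Dataloader.wait` can be illustrated with a standalone sketch (plain Ruby; simple lambdas stand in here for the gem's pending batches):

```ruby
# Standalone sketch of the wait loop: pending batches are stored per
# thread; each pass dispatches them all, and dispatching may queue new
# batches, so we loop until the thread-local list stays empty.
Thread.current[:pending_batches] = [-> { "users batch" }, -> { "invites batch" }]

dispatched = []
until Thread.current[:pending_batches].empty?
  pending = Thread.current[:pending_batches]
  Thread.current[:pending_batches] = []
  pending.each { |batch| dispatched << batch.call }
end
# dispatched now holds ["users batch", "invites batch"]
```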
+
+ ## License
+
+ MIT
data/lib/dataloader.rb CHANGED
@@ -1,5 +1,237 @@
- class DataLoader
-   def self.test
-     "test"
+ require "concurrent"
+ require "promise"
+
+ # @!visibility private
+ class Promise
+   alias_method :wait_old, :wait
+
+   def wait
+     Dataloader.wait
+     wait_old
+   end
+ end
+
+ class Dataloader
+   # @!visibility private
+   VERSION = "1.0.0".freeze
+
+   # @!visibility private
+   class NoCache
+     def compute_if_absent(key)
+       yield
+     end
+   end
+
+   # @!visibility private
+   class Batch
+     def initialize(dataloader)
+       # Used for accessing the cache of promises and the batch load method
+       @dataloader = dataloader
+       # A batch can be dispatched only once (it loads all queued promises)
+       @dispatched = false
+       # This is where the result of executing the batch is stored
+       @result = Promise.new
+       # This is where the keys to batch load are stored
+       @queue = Concurrent::Array.new
+       # We store pending batches to load per thread
+       Thread.current[:pending_batches].unshift(self)
+     end
+
+     def dispatch
+       @dispatched = true
+
+       result = @dataloader.batch_load.call(@queue)
+
+       if result.is_a?(Promise)
+         result.then do |values|
+           @result.fulfill(handle_result(@queue, values))
+         end
+       else
+         @result.fulfill(handle_result(@queue, result))
+       end
+     end
+
+     def dispatched?
+       @dispatched
+     end
+
+     def queue(key)
+       raise StandardError, "Cannot queue elements after the batch is dispatched. Queued key: #{key}. This error shouldn't happen. Please raise an issue on https://github.com/sheerun/dataloader with steps to reproduce" if @dispatched
+
+       @queue.push(key)
+
+       if @queue.size >= @dataloader.max_batch_size
+         dispatch
+       end
+
+       @result.then do |values|
+         unless values.key?(key)
+           raise StandardError, "Batch loader didn't resolve a key: #{key}. Resolved keys: #{values.keys}"
+         end
+
+         values[key]
+       end
+     end
+
+     protected
+
+     def handle_result(keys, values)
+       unless values.is_a?(Array) || values.is_a?(Hash)
+         raise TypeError, "Batch loader must return an Array or Hash, but returned: #{values.class.name}"
+       end
+
+       if keys.size != values.size
+         raise StandardError, "Batch loader must be instantiated with a function that returns an Array or Hash " \
+           "of the same size as the provided Array of keys" \
+           "\n\nProvided keys:\n#{keys}" \
+           "\n\nReturned values:\n#{values}"
+       end
+
+       values = Hash[keys.zip(values)] if values.is_a?(Array)
+
+       values
+     end
+   end
+
+   # Returns the internal cache that can be overridden with the `:cache` option (see the constructor).
+   # This field is writable, so you can reset the cache with something like:
+   #
+   #   loader.cache = Concurrent::Map.new
+   #
+   # Defaults to Concurrent::Map.new
+   attr_accessor :cache
+
+   # @!visibility private
+   attr_reader :batch_load
+
+   # @!visibility private
+   attr_reader :max_batch_size
+
+   # Creates a new dataloader
+   #
+   # @option options [Proc] :key A function to produce a cache key for a given load key. Defaults to proc { |key| key }. Useful to provide when objects are keys and two similarly shaped objects should be considered equivalent.
+   # @option options [Object] :cache An instance of a cache used for caching promises (the only required API is `#compute_if_absent`). Defaults to Concurrent::Map.new. Values can be either promises or actual values. You can pass `nil` if you don't want caching.
+   # @option options [Object] :max_batch_size Limits the number of items that get passed in to the batch load block. Defaults to Float::INFINITY. You can pass 1 to disable batching.
+   # @yieldparam [Array] array of batched keys to load
+   # @yieldreturn [Promise] a promise of the values loaded with the batch_load block
+   def initialize(options = {}, &batch_load)
+     Thread.current[:pending_batches] ||= []
+
+     unless block_given?
+       raise TypeError, "Dataloader must be constructed with a block which accepts " \
+         "an Array and returns either an Array or Hash of the same size (or a Promise)"
+     end
+
+     @batch_promise = Batch.new(self)
+     @batch_load = batch_load
+
+     @key = options.fetch(:key, lambda { |key| key })
+     @cache = options.fetch(:cache, Concurrent::Map.new)
+     @max_batch_size = options.fetch(:max_batch_size, Float::INFINITY)
+
+     if @cache.nil?
+       @cache = NoCache.new
+     end
+   end
+
+   # Forces all currently pending promises to be executed and resolved
+   #
+   # This method is invoked automatically when the value of any promise is requested with `.sync`
+   # @example Promise.rb implementation that waits for all batched promises (default):
+   #   class Promise
+   #     def wait
+   #       Dataloader.wait
+   #     end
+   #   end
+   def self.wait
+     until Thread.current[:pending_batches].empty?
+       pending = Thread.current[:pending_batches]
+       Thread.current[:pending_batches] = []
+       pending.each(&:dispatch)
+     end
+   end
+
+   # Loads a key, returning a [Promise](https://github.com/lgierth/promise.rb) for the value represented by that key.
+   #
+   # You can resolve this promise when you actually need the value with `promise.sync`.
+   #
+   # All calls to `#load` are batched until the first `#sync` is encountered. Then it starts batching again, and so on.
+   #
+   # @param key [Object] key to load using the `batch_load` proc
+   # @return [Promise<Object>] a Promise of the computed value
+   # @example Load promises of two users and resolve them:
+   #   user_loader = Dataloader.new do |ids|
+   #     User.find(*ids)
+   #   end
+   #
+   #   user1_promise = user_loader.load(1)
+   #   user2_promise = user_loader.load(2)
+   #
+   #   user1 = user1_promise.sync
+   #   user2 = user2_promise.sync
+   def load(key)
+     if key.nil?
+       raise TypeError, "#load must be called with a key, but got: nil"
+     end
+
+     result = compute_if_absent(key) do
+       batch_promise.queue(key)
+     end
+
+     unless result.is_a?(Promise)
+       return Promise.new.fulfill(result)
+     end
+
+     result
+   end
+
+   # Loads multiple keys, promising an array of values:
+   #
+   #   promise = loader.load_many(['a', 'b'])
+   #   object_a, object_b = promise.sync
+   #
+   # This is equivalent to the more verbose:
+   #
+   #   promise = Promise.all([loader.load('a'), loader.load('b')])
+   #   object_a, object_b = promise.sync
+   #
+   # @param keys [Array<Object>] list of keys to load using the `batch_load` proc
+   # @return [Promise<Array>] a Promise of the computed values
+   # @example Load promises of two users and resolve them:
+   #   user_loader = Dataloader.new do |ids|
+   #     User.find(*ids)
+   #   end
+   #
+   #   users_promise = user_loader.load_many([1, 2])
+   #
+   #   user1, user2 = users_promise.sync
+   def load_many(keys)
+     unless keys.is_a?(Array)
+       raise TypeError, "#load_many must be called with an Array, but got: #{keys.class.name}"
+     end
+
+     Promise.all(keys.map(&method(:load)))
+   end
+
+   protected
+
+   def compute_if_absent(key)
+     @cache.compute_if_absent(@key.call(key)) do
+       yield
+     end
+   end
+
+   def batch_promise
+     if @batch_promise.dispatched?
+       @batch_promise = Batch.new(self)
+     end
+
+     @batch_promise
    end
  end
metadata CHANGED
@@ -1,26 +1,57 @@
  --- !ruby/object:Gem::Specification
  name: dataloader
  version: !ruby/object:Gem::Version
- version: 0.0.1
+ version: 1.0.0
  platform: ruby
  authors:
- - Samer Buna
+ - Adam Stankiewicz
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2016-06-06 00:00:00.000000000 Z
- dependencies: []
- description: DataLoader for Ruby
- email: samer@agilelabs.com
+ date: 2017-07-04 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   name: concurrent-ruby
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1'
+ - !ruby/object:Gem::Dependency
+   name: promise.rb
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: 0.7.3
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: 0.7.3
+ description: A data loading utility to batch loading of promises. It can be used with
+   graphql gem.
+ email:
+ - sheerun@sher.pl
  executables: []
  extensions: []
  extra_rdoc_files: []
  files:
- - Rakefile
+ - LICENSE
+ - README.md
  - lib/dataloader.rb
- - test/test_dataloader.rb
- homepage: http://rubygems.org/gems/dataloader
- licenses: []
+ homepage: https://github.com/sheerun/dataloader
+ licenses:
+ - MIT
  metadata: {}
  post_install_message:
  rdoc_options: []
@@ -28,19 +59,18 @@ require_paths:
  - lib
  required_ruby_version: !ruby/object:Gem::Requirement
    requirements:
-     - - '>='
+     - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  required_rubygems_version: !ruby/object:Gem::Requirement
    requirements:
-     - - '>='
+     - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.0.14.1
+ rubygems_version: 2.6.11
  signing_key:
- specification_version: 3
- summary: DataLoader
- test_files:
- - test/test_dataloader.rb
+ specification_version: 4
+ summary: Batch data loading, works great with graphql
+ test_files: []
data/Rakefile DELETED
@@ -1,8 +0,0 @@
- require 'rake/testtask'
-
- Rake::TestTask.new do |t|
-   t.libs << 'test'
- end
-
- desc "Run tests"
- task :default => :test
data/test/test_dataloader.rb DELETED
@@ -1,8 +0,0 @@
- require 'test/unit'
- require 'dataloader'
-
- class DataLoaderTest < Test::Unit::TestCase
-   def test_test
-     assert_equal "test", DataLoader.test
-   end
- end