sidekiq_bulk_job 0.1.2 → 0.1.6

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA256:
- metadata.gz: e3f6ce66f186f36e576e2611b892ecbb013f888b54f27bbc36f1a463aedc7b1f
- data.tar.gz: 8deff846fe66b2fa565af200678aa20a67ba0ef1550ef920fb5930da0d76bd22
+ SHA1:
+ metadata.gz: b24f02dbcf87a2fe41c7b73c64c92c698b8c6411
+ data.tar.gz: ddc7ec50e7055fe406c286309f599c07b4940390
  SHA512:
- metadata.gz: 1199ea0291f826b28e7eb7e3fac6c5c00e98c69d37adc46e110e567ec82977ad3519c15fada8df89b07d5237161f457ab1c71271797897eff37f7677c0e26359
- data.tar.gz: 1cb8a1f0c85caa3c05c994e1412b679251293d333b9d0b30831c13130e68a480035e484d73249e7be28645e01c7f237ea9dfd5348bdb1f97e16f73375faaa144
+ metadata.gz: 647ed0d297ed21e29729520a9f21a8dd6e7d54467b8ff7dcdac957f8cb4d48b0d879c71de33db9e51c152daa6d2a2ad24fa8a4491e7ee892b0e0f3eb791ac3d3
+ data.tar.gz: d401dd44b727304ca2e84acbc2a2d772eb8a81381e5f6ff0b609b72e3012ac9f12db4d708a953893ca03a84a60719ec7a7ff46766fe21f32bf74350d0d8f75f1
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- sidekiq_bulk_job (0.1.2)
+ sidekiq_bulk_job (0.1.5)
  sidekiq (~> 5.2.7)

  GEM
@@ -52,4 +52,4 @@ DEPENDENCIES
  sidekiq_bulk_job!

  BUNDLED WITH
- 2.1.4
+ 2.2.25
data/README.md CHANGED
@@ -1,8 +1,6 @@
  # SidekiqBulkJob

- Welcome to your new gem! In this directory, you'll find the files you need to be able to package up your Ruby library into a gem. Put your Ruby code in the file `lib/sidekiq_bulk_job`. To experiment with that code, run `bin/console` for an interactive prompt.
-
- TODO: Delete this and the text above, and describe your gem
+ A tool that collects jobs of the same class and runs them together in batches.

  ## Installation

@@ -22,26 +20,226 @@ Or install it yourself as:

  ## Usage

- ###
+ ### Initialization:
+
+ ##### Parameters:
+
+ * redis: Redis client.
+ * logger: log object, default Logger.new(STDOUT).
+ * process_fail: a callback invoked when a job fails.
+ * async_delay: delay before collected jobs are flushed when batch_size is not reached, default 60 seconds.
+ * scheduled_delay: merge window for scheduled jobs, default 10 seconds.
+ * queue: default Sidekiq queue. The batch job runs on the queue the Sidekiq worker defines; this value is the fallback when no queue is set.
+ * batch_size: number of jobs of the same class per batch, default 3000.
+ * prefix: redis key prefix, default SidekiqBulkJob.
+
+ ```ruby
+ process_fail = lambda do |job_class_name, args, exception|
+   # do something
+   # send email
+ end
+ SidekiqBulkJob.config({
+   redis: Redis.new,
+   logger: Logger.new(STDOUT),
+   process_fail: process_fail,
+   async_delay: ASYNC_DELAY,
+   scheduled_delay: SCHEDULED_DELAY,
+   queue: :test,
+   batch_size: BATCH_SIZE,
+   prefix: "SidekiqBulkJob"
+ })
+ # push a job
+ SidekiqBulkJob.perform_async(TestJob, 10)
+ ```
+
+ ### Usage
+
+ First, define a TestJob as an example:
  ```ruby
- process_fail = lambda do |job_class_name, args, exception|
-   # do somethine
-   # send email
+ # create a sidekiq worker, use default queue
+ class TestJob
+   include Sidekiq::Worker
+   sidekiq_options queue: :default
+
+   def perform(*args)
+     puts args
+   end
  end
- client = Redis.new
- logger = Logger.new(STDOUT)
- logger.level = Logger::WARN
- SidekiqBulkJob.config redis: client, logger: logger, process_fail: process_fail, queue: :default, batch_size: 3000, prefix: "SidekiqBulkJob"
+ ```
+
+ ##### Use SidekiqBulkJob perform_async to run jobs asynchronously

- // push a job
+ SidekiqBulkJob collects jobs of the same class into a list. When more than `batch_size` jobs accumulate within `async_delay`, a batch job is created immediately to run them and the list is cleared; the emptied list keeps collecting newly pushed jobs. If `batch_size` is not reached, a batch job is created after `async_delay` to run everything collected so far.
+
+ ```ruby
+ # create a sidekiq worker, use default queue
+ class TestJob
+   include Sidekiq::Worker
+   sidekiq_options queue: :default
+
+   def perform(*args)
+     puts args
+   end
+ end
+
+ # simple use
  SidekiqBulkJob.perform_async(TestJob, 10)
+
+ # this will not create 1001 jobs in Sidekiq
+ # only two jobs are created: one batch containing 1000 TestJob runs and another containing the remaining one
+ (BATCH_SIZE + 1).times do |i|
+   SidekiqBulkJob.perform_async(TestJob, i)
+ end
  ```

- ## Development
+ ##### Use a Sidekiq worker's batch_perform_async to run an async task
+
+ ```ruby
+ # same as SidekiqBulkJob.perform_async(TestJob, 10)
+ TestJob.batch_perform_async(10)
+ ```
+
+ ##### Use SidekiqBulkJob perform_at/perform_in to schedule a task
+
+ ```ruby
+ # run 1 minute later as a single job
+ SidekiqBulkJob.perform_at(1.minutes.after, TestJob, 10)
+ # equivalent to the call above
+ SidekiqBulkJob.perform_in(1 * 60, TestJob, 10)
+ ```
+
+ ##### Use a Sidekiq worker's batch_perform_at/batch_perform_in to schedule a task
+
+ ```ruby
+ # same as SidekiqBulkJob.perform_at(1.minutes.after, TestJob, 10)
+ TestJob.batch_perform_at(1.minutes.after, 10)
+ # same as SidekiqBulkJob.perform_in(1 * 60, TestJob, 10)
+ TestJob.batch_perform_in(1.minute, 10)
+ ```
+
+ ##### Use the setter to configure a task
+
+ ```ruby
+ # set queue to test and run async
+ TestJob.set(queue: :test).batch_perform_async(10)
+ # set queue to test and run after 90 seconds
+ TestJob.set(queue: :test, in: 90).batch_perform_async(10)
+
+ # batch_perform_in's first parameter (the interval) is overridden by the 'in'/'at' option given to the setter
+ # run after 90 seconds instead of 10 seconds
+ TestJob.set(queue: :test, in: 90).batch_perform_in(10, 10)
+ ```
+
+ ## Chinese
+
+ ### Initialization:
+
+ ##### Parameters:

- After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
+ * redis: Redis client
+ * logger: log object, default Logger.new(STDOUT)
+ * process_fail: common callback invoked when a job fails
+ * async_delay: flush delay, default 60 seconds
+ * scheduled_delay: scheduled job merge window, default 10 seconds
+ * queue: default queue. Jobs run on the queue configured on the job itself; when none is set, this queue is used
+ * batch_size: number of same-class jobs run per batch, default 3000
+ * prefix: prefix for keys stored in redis, default SidekiqBulkJob

- To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).
+ ```ruby
+ process_fail = lambda do |job_class_name, args, exception|
+   # do something
+   # send email
+ end
+ SidekiqBulkJob.config({
+   redis: Redis.new,
+   logger: Logger.new(STDOUT),
+   process_fail: process_fail,
+   async_delay: ASYNC_DELAY,
+   scheduled_delay: SCHEDULED_DELAY,
+   queue: :test,
+   batch_size: BATCH_SIZE,
+   prefix: "SidekiqBulkJob"
+ })
+ # push a job
+ SidekiqBulkJob.perform_async(TestJob, 10)
+ ```
+
+ ### Usage
+
+ Define a TestJob as an example:
+ ```ruby
+ # create a sidekiq worker, use default queue
+ class TestJob
+   include Sidekiq::Worker
+   sidekiq_options queue: :default
+
+   def perform(*args)
+     puts args
+   end
+ end
+ ```
+
+ ##### Using the SidekiqBulkJob async interface
+
+ SidekiqBulkJob collects jobs of the same class into a list. When more than `batch_size` jobs accumulate within `async_delay`, a batch job is created immediately to run all collected jobs and the list is cleared; the emptied list keeps collecting subsequently pushed jobs. If `batch_size` is not reached within `async_delay`, a batch job is created `async_delay` after the last job is pushed and runs all collected jobs.
+ ```ruby
+ # create a sidekiq worker, use default queue
+ class TestJob
+   include Sidekiq::Worker
+   sidekiq_options queue: :default
+
+   def perform(*args)
+     puts args
+   end
+ end
+
+ # simple use
+ SidekiqBulkJob.perform_async(TestJob, 10)
+
+ # this will not create 1001 jobs in Sidekiq
+ # only two jobs are created: one batch containing 1000 TestJob runs and another containing the remaining one
+ (BATCH_SIZE + 1).times do |i|
+   SidekiqBulkJob.perform_async(TestJob, i)
+ end
+ ```
+
+ ##### Using a Sidekiq worker's batch_perform_async interface to run an async task
+
+ ```ruby
+ # same as SidekiqBulkJob.perform_async(TestJob, 10)
+ TestJob.batch_perform_async(10)
+ ```
+
+ ##### Using SidekiqBulkJob's perform_at/perform_in interfaces to schedule tasks
+
+ ```ruby
+ # run 1 minute later as a single job
+ SidekiqBulkJob.perform_at(1.minutes.after, TestJob, 10)
+ # equivalent to the call above
+ SidekiqBulkJob.perform_in(1 * 60, TestJob, 10)
+ ```
+
+ ##### Using a Sidekiq worker's batch_perform_at/batch_perform_in interfaces to schedule tasks
+
+ ```ruby
+ # same as SidekiqBulkJob.perform_at(1.minutes.after, TestJob, 10)
+ TestJob.batch_perform_at(1.minutes.after, 10)
+ # same as SidekiqBulkJob.perform_in(1 * 60, TestJob, 10)
+ TestJob.batch_perform_in(1.minute, 10)
+ ```
+
+ ##### Using the setter
+
+ ```ruby
+ # set queue to test and run async
+ TestJob.set(queue: :test).batch_perform_async(10)
+ # set queue to test and run after 90 seconds
+ TestJob.set(queue: :test, in: 90).batch_perform_async(10)
+
+ # batch_perform_in's first parameter (the interval) is overridden by the 'in'/'at' option given to the setter
+ # run after 90 seconds instead of 10 seconds
+ TestJob.set(queue: :test, in: 90).batch_perform_in(10, 10)
+ ```

  ## Contributing

@@ -29,7 +29,7 @@ module SidekiqBulkJob
  job_class_name: job_class.to_s,
  perfrom_args: args,
  queue: options[:queue] || SidekiqBulkJob.queue
- }.compact
+ }.select { |_, value| !value.nil? }
  SidekiqBulkJob.process payload
  else
  perform_in(options[:at] || options[:in], job_class, *args)
@@ -51,7 +51,7 @@ module SidekiqBulkJob
  at: ts,
  perfrom_args: args,
  queue: options[:queue] || SidekiqBulkJob.queue
- }.compact
+ }.select { |_, value| !value.nil? }
  SidekiqBulkJob.process payload
  else
  perform_async(job_class, *args)
@@ -64,18 +64,37 @@ module SidekiqBulkJob

  class << self

- attr_accessor :prefix, :redis, :queue, :batch_size, :logger, :process_fail
+ attr_accessor :prefix, :redis, :queue, :scheduled_delay, :async_delay, :batch_size, :logger, :process_fail

- def config(redis: , logger: , process_fail: , queue: :default, batch_size: 3000, prefix: "SidekiqBulkJob")
+ def config(redis: , logger: , process_fail: , async_delay: 60, scheduled_delay: 10, queue: :default, batch_size: 3000, prefix: "SidekiqBulkJob")
  if redis.nil?
  raise ArgumentError.new("redis not allow nil")
  end
+ if logger.nil?
+ raise ArgumentError.new("logger not allow nil")
+ end
+ if process_fail.nil?
+ raise ArgumentError.new("process_fail not allow nil")
+ end
+ if async_delay.to_f < 2
+ raise ArgumentError.new("async_delay not allow less than 2 seconds.")
+ elsif async_delay.to_f > 5 * 60
+ raise ArgumentError.new("async_delay not allow greater than 5 minutes.")
+ end
+ if scheduled_delay.to_f < 2
+ raise ArgumentError.new("scheduled_delay not allow less than 2 seconds.")
+ elsif scheduled_delay.to_f > 30
+ raise ArgumentError.new("scheduled_delay not allow greater than 2 seconds.")
88
+ end
89
+
73
90
  self.redis = redis
74
91
  self.queue = queue
75
92
  self.batch_size = batch_size
76
93
  self.prefix = prefix
77
94
  self.logger = logger
78
95
  self.process_fail = process_fail
96
+ self.async_delay = async_delay.to_f
97
+ self.scheduled_delay = scheduled_delay.to_f
79
98
  end
80
99
 
81
100
  def set(options)
@@ -113,7 +132,7 @@ module SidekiqBulkJob
  def process(job_class_name: , at: nil, perfrom_args: [], queue: self.queue)
  if at.nil?
  key = generate_key(job_class_name)
- client.lpush key, perfrom_args.to_json
+ client.lpush key, SidekiqBulkJob::Utils.dump(perfrom_args)
  bulk_run(job_class_name, key, queue: queue) if need_flush?(key)
  monitor(job_class_name, queue: queue)
  else
@@ -121,18 +140,18 @@ module SidekiqBulkJob
  args_redis_key = nil
  target = scheduled_set.find do |job|
  if job.klass == SidekiqBulkJob::ScheduledJob.to_s &&
- job.at.to_i.between?((at - 5).to_i, (at + 30).to_i) # allow a 30-second delay
+ job.at.to_i.between?((at - self.scheduled_delay).to_i, (at + self.scheduled_delay).to_i) # allow scheduled_delay seconds of drift
  _job_class_name, args_redis_key = job.args
  _job_class_name == job_class_name
  end
  end
  if !target.nil? && !args_redis_key.nil? && !args_redis_key.empty?
  # append the args to the existing job's argument set
- client.lpush args_redis_key, perfrom_args.to_json
+ client.lpush args_redis_key, SidekiqBulkJob::Utils.dump(perfrom_args)
  else
  # create a new scheduled job
  args_redis_key = SecureRandom.hex
- client.lpush args_redis_key, perfrom_args.to_json
+ client.lpush args_redis_key, SidekiqBulkJob::Utils.dump(perfrom_args)
  SidekiqBulkJob::ScheduledJob.client_push("queue" => queue, "class" => SidekiqBulkJob::ScheduledJob, "at" => at, "args" => [job_class_name, args_redis_key])
  end
  end
@@ -197,7 +216,7 @@ module SidekiqBulkJob
  if !_monitor.nil?
  # TODO debug log
  else
- SidekiqBulkJob::Monitor.client_push("queue" => queue, "at" => (time_now + 60).to_f, "class" => SidekiqBulkJob::Monitor, "args" => [time_now.to_f, job_class_name])
+ SidekiqBulkJob::Monitor.client_push("queue" => queue, "at" => (time_now + self.async_delay).to_f, "class" => SidekiqBulkJob::Monitor, "args" => [time_now.to_f, job_class_name])
  end
  end

@@ -0,0 +1,59 @@
+ module SidekiqBulkJob
+   class BulkErrorHandler
+
+     ErrorCollection = Struct.new(:args, :exception) do
+       def message
+         exception.message
+       end
+
+       def backtrace
+         exception.backtrace
+       end
+     end
+
+     attr_accessor :job_class_name, :errors, :jid
+
+     def initialize(job_class_name, jid)
+       @jid = jid
+       @job_class_name = job_class_name
+       @errors = []
+     end
+
+     def add(job_args, exception)
+       errors << ErrorCollection.new(job_args, exception)
+     end
+
+     def backtrace
+       errors.map(&:backtrace).flatten
+     end
+
+     def args
+       [job_class_name, errors.map(&:args)]
+     end
+
+     def failed?
+       !errors.empty?
+     end
+
+     def raise_error
+       error = BulkError.new(errors.map(&:message).join('; '))
+       error.set_backtrace self.backtrace
+       error
+     end
+
+     def retry_count
+       SidekiqBulkJob.redis.incr jid
+     end
+
+     def clear
+       SidekiqBulkJob.redis.del jid
+     end
+
+     class BulkError < StandardError
+       def initialize(message)
+         super(message)
+       end
+     end
+
+   end
+ end
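The new file above introduces BulkErrorHandler, which BulkJob uses to aggregate per-argument failures from one batch and hand them to JobRetry as a single error. A minimal usage sketch based only on the methods shown above (the jid string is hypothetical, and retry_count/clear are skipped because they require SidekiqBulkJob.redis to be configured):

```ruby
require "sidekiq_bulk_job/bulk_error_handler"

handler = SidekiqBulkJob::BulkErrorHandler.new("TestJob", "hypothetical-jid")

# simulate a batch where every set of args raises
[[1], [2]].each do |args|
  begin
    raise ArgumentError, "boom for #{args.inspect}"
  rescue StandardError => e
    handler.add(args, e)  # store the failing args together with the exception
  end
end

handler.failed?             # => true
handler.args                # => ["TestJob", [[1], [2]]] (what JobRetry re-enqueues)
error = handler.raise_error # builds (but does not raise) the combined BulkError
error.message               # => "boom for [1]; boom for [2]"
```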
@@ -1,30 +1,37 @@
  require "sidekiq"

  require "sidekiq_bulk_job/job_retry"
+ require "sidekiq_bulk_job/bulk_error_handler"
  require "sidekiq_bulk_job/utils"

  module SidekiqBulkJob
  class BulkJob
  include Sidekiq::Worker
- sidekiq_options queue: :default, retry: false
+ sidekiq_options queue: :default, retry: true

  def perform(job_class_name, args_array)
  target_name, method_name = SidekiqBulkJob::Utils.split_class_name_with_method job_class_name
  job = SidekiqBulkJob::Utils.constantize(target_name)
+ error_handle = BulkErrorHandler.new(job_class_name, self.jid)
  args_array.each do |_args|
  begin
- args = JSON.parse _args
+ args = SidekiqBulkJob::Utils.load _args
  if SidekiqBulkJob::Utils.class_with_method?(job_class_name)
  job.send(method_name, *args)
  else
  job.new.send(method_name, *args)
  end
  rescue Exception => e
- SidekiqBulkJob.logger.error("#{job_class_name} Args: #{args}, Error: #{e.full_message}")
+ error_handle.add _args, e
+ SidekiqBulkJob.logger.error("#{job_class_name} Args: #{args}, Error: #{e.respond_to?(:full_message) ? e.full_message : e.message}")
  SidekiqBulkJob.fail_callback(job_class_name: job_class_name, args: args, exception: e)
- SidekiqBulkJob::JobRetry.new(job, args, e).push
  end
  end
+ if error_handle.failed?
+ SidekiqBulkJob::JobRetry.new(job, error_handle).push
+ else
+ error_handle.clear
+ end
  end
  end
  end
@@ -1,24 +1,26 @@
  require "sidekiq"
+ require "sidekiq/job_retry"

  require "sidekiq_bulk_job/utils"
- require 'sidekiq/job_retry'
+ require "sidekiq_bulk_job/bulk_error_handler"

  module SidekiqBulkJob
  class JobRetry

- def initialize(klass, args, exception, options={})
+ def initialize(klass, error_handle, options={})
  @handler = Sidekiq::JobRetry.new(options)
  @klass = klass
- @args = args
- @exception = exception
+ @error_handle = error_handle
+ @retry_count = 0
  end

  def push(options={})
+ @retry_count = SidekiqBulkJob.redis.incr @error_handle.jid
  opts = job_options(options)
  queue_as = queue(@klass) || :default
  begin
  @handler.local(SidekiqBulkJob::BulkJob, opts, queue_as) do
- raise @exception
+ raise @error_handle.raise_error
  end
  rescue Exception => e
  end
28
30
 
29
31
  def job_options(options={})
30
32
  # 0 retry: no retry and dead queue
31
- opts = { 'class' => @klass.to_s, 'args' => @args, 'retry' => 0 }.merge(options)
33
+ opts = {
34
+ 'class' => SidekiqBulkJob::BulkJob.to_s,
35
+ 'args' => @error_handle.args,
36
+ 'retry' => true,
37
+ 'retry_count' => @retry_count.to_i
38
+ }.merge(options)
32
39
  if Sidekiq::VERSION >= "6.0.2"
33
40
  Sidekiq.dump_json(opts)
34
41
  else
@@ -14,14 +14,14 @@ module SidekiqBulkJob
  args_array = SidekiqBulkJob.flush args_redis_key
  args_array.each do |_args|
  begin
- args = JSON.parse _args
+ args = SidekiqBulkJob::Utils.load _args
  if SidekiqBulkJob::Utils.class_with_method?(job_class_name)
  job.send(method_name, *args)
  else
  job.new.send(method_name, *args)
  end
  rescue Exception => e
- SidekiqBulkJob.logger.error("#{job_class_name} Args: #{args}, Error: #{e.full_message}")
+ SidekiqBulkJob.logger.error("#{job_class_name} Args: #{args}, Error: #{e.respond_to?(:full_message) ? e.full_message : e.message}")
  SidekiqBulkJob.fail_callback(job_class_name: job_class_name, args: args, exception: e)
  SidekiqBulkJob::JobRetry.new(job, args, e).push
  end
@@ -1,3 +1,6 @@
+ require 'yaml'
+ require "sidekiq/extensions/active_record"
+
  module SidekiqBulkJob
  module Utils

@@ -80,6 +83,18 @@ module SidekiqBulkJob
  end
  end

+ def load yaml, legacy_filename = Object.new, filename: nil, fallback: false, symbolize_names: false
+ YAML.load yaml, legacy_filename, filename: filename, fallback: fallback, symbolize_names: symbolize_names
+ end
+
+ def dump o, io = nil, options = {}
+ marshalled = YAML.dump o, io, options
+ if marshalled.size > Sidekiq::Extensions::SIZE_LIMIT
+ SidekiqBulkJob.logger.warn { "job argument is #{marshalled.bytesize} bytes, you should refactor it to reduce the size" }
+ end
+ marshalled
+ end
+
  end

  end
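Job arguments are now serialized with these YAML-backed helpers instead of JSON. A minimal round-trip sketch of the new pair, called the same way the rest of the diff calls them (assumes the gem and Sidekiq are loaded, on the Ruby/Psych versions whose YAML.load signature this code targets):

```ruby
require "sidekiq_bulk_job/utils"

# what SidekiqBulkJob.process now stores in Redis for one pushed job
payload = SidekiqBulkJob::Utils.dump([10, { retries: 3 }])

# what BulkJob#perform does with each stored entry before invoking the worker
args = SidekiqBulkJob::Utils.load(payload)
# => [10, {:retries=>3}]  (symbols survive the round trip, unlike JSON.parse)
```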
@@ -1,3 +1,3 @@
  module SidekiqBulkJob
- VERSION = "0.1.2"
+ VERSION = "0.1.6"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: sidekiq_bulk_job
  version: !ruby/object:Gem::Version
- version: 0.1.2
+ version: 0.1.6
  platform: ruby
  authors:
  - scalaview
- autorequire:
+ autorequire:
  bindir: bin
  cert_chain: []
- date: 2020-11-11 00:00:00.000000000 Z
+ date: 2021-08-17 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: sidekiq
@@ -73,6 +73,7 @@ files:
  - bin/setup
  - lib/sidekiq_bulk_job.rb
  - lib/sidekiq_bulk_job/batch_runner.rb
+ - lib/sidekiq_bulk_job/bulk_error_handler.rb
  - lib/sidekiq_bulk_job/bulk_job.rb
  - lib/sidekiq_bulk_job/job_retry.rb
  - lib/sidekiq_bulk_job/monitor.rb
@@ -84,7 +85,7 @@ homepage: https://github.com/scalaview/sidekiq_bulk_job
  licenses:
  - MIT
  metadata: {}
- post_install_message:
+ post_install_message:
  rdoc_options: []
  require_paths:
  - lib
@@ -99,8 +100,9 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.0.3
- signing_key:
+ rubyforge_project:
+ rubygems_version: 2.5.2
+ signing_key:
  specification_version: 4
  summary: Collect same jobs to single worker, reduce job number and improve thread
  utilization.