sidekiq 6.0.0 → 6.0.5


Potentially problematic release: this version of sidekiq has been flagged for review.

Files changed (50)
  1. checksums.yaml +4 -4
  2. data/6.0-Upgrade.md +3 -1
  3. data/Changes.md +110 -1
  4. data/Ent-Changes.md +7 -1
  5. data/Gemfile +1 -1
  6. data/Gemfile.lock +105 -93
  7. data/Pro-Changes.md +9 -1
  8. data/README.md +3 -1
  9. data/bin/sidekiqload +8 -4
  10. data/bin/sidekiqmon +4 -5
  11. data/lib/generators/sidekiq/worker_generator.rb +11 -1
  12. data/lib/sidekiq.rb +12 -0
  13. data/lib/sidekiq/api.rb +124 -91
  14. data/lib/sidekiq/cli.rb +29 -18
  15. data/lib/sidekiq/client.rb +18 -4
  16. data/lib/sidekiq/fetch.rb +7 -7
  17. data/lib/sidekiq/job_logger.rb +11 -3
  18. data/lib/sidekiq/job_retry.rb +23 -10
  19. data/lib/sidekiq/launcher.rb +3 -5
  20. data/lib/sidekiq/logger.rb +107 -11
  21. data/lib/sidekiq/middleware/chain.rb +11 -2
  22. data/lib/sidekiq/monitor.rb +1 -16
  23. data/lib/sidekiq/paginator.rb +7 -2
  24. data/lib/sidekiq/processor.rb +18 -20
  25. data/lib/sidekiq/redis_connection.rb +3 -0
  26. data/lib/sidekiq/scheduled.rb +13 -12
  27. data/lib/sidekiq/testing.rb +12 -0
  28. data/lib/sidekiq/util.rb +0 -2
  29. data/lib/sidekiq/version.rb +1 -1
  30. data/lib/sidekiq/web/application.rb +19 -18
  31. data/lib/sidekiq/web/helpers.rb +23 -11
  32. data/lib/sidekiq/worker.rb +4 -4
  33. data/sidekiq.gemspec +2 -2
  34. data/web/assets/javascripts/dashboard.js +2 -2
  35. data/web/assets/stylesheets/application-dark.css +125 -0
  36. data/web/assets/stylesheets/application.css +9 -0
  37. data/web/locales/de.yml +14 -2
  38. data/web/locales/en.yml +2 -0
  39. data/web/locales/ja.yml +2 -0
  40. data/web/views/_job_info.erb +2 -1
  41. data/web/views/busy.erb +4 -1
  42. data/web/views/dead.erb +2 -2
  43. data/web/views/layout.erb +1 -0
  44. data/web/views/morgue.erb +4 -1
  45. data/web/views/queue.erb +10 -1
  46. data/web/views/queues.erb +8 -0
  47. data/web/views/retries.erb +4 -1
  48. data/web/views/retry.erb +2 -2
  49. data/web/views/scheduled.erb +4 -1
  50. metadata +9 -8
data/Pro-Changes.md CHANGED
@@ -4,12 +4,20 @@
 
 Please see [http://sidekiq.org/](http://sidekiq.org/) for more details and how to buy.
 
+5.0.1
+---------
+
+- Rejigger batch failures UI to add direct links to retries and scheduled jobs [#4209]
+- Delete batch data with `UNLINK` [#4155]
+- Fix bug where a scheduled job can lose its scheduled time when using reliable push [#4267]
+- Sidekiq::JobSet#scan and #find_job APIs have been promoted to Sidekiq OSS. [#4259]
+
 5.0.0
 ---------
 
 - There is no significant migration from Sidekiq Pro 4.0 to 5.0
 but make sure you read the [update notes for Sidekiq
-6.0](/mperham/sidekiq/blob/master/6.0-Upgrade.md).
+6.0](https://github.com/mperham/sidekiq/blob/master/6.0-Upgrade.md).
 - Removed various deprecated APIs and associated warnings.
 - **BREAKING CHANGE** Remove the `Sidekiq::Batch::Status#dead_jobs` API in favor of
 `Sidekiq::Batch::Status#dead_jids`. [#4217]
data/README.md CHANGED
@@ -3,6 +3,7 @@ Sidekiq
 
 [![Gem Version](https://badge.fury.io/rb/sidekiq.svg)](https://rubygems.org/gems/sidekiq)
 [![Code Climate](https://codeclimate.com/github/mperham/sidekiq.svg)](https://codeclimate.com/github/mperham/sidekiq)
+[![Test Coverage](https://codeclimate.com/github/mperham/sidekiq/badges/coverage.svg)](https://codeclimate.com/github/mperham/sidekiq/coverage)
 [![Build Status](https://circleci.com/gh/mperham/sidekiq/tree/master.svg?style=svg)](https://circleci.com/gh/mperham/sidekiq/tree/master)
 [![Gitter Chat](https://badges.gitter.im/mperham/sidekiq.svg)](https://gitter.im/mperham/sidekiq)
 
@@ -18,7 +19,8 @@ Performance
 
 Version | Latency | Garbage created for 10k jobs | Time to process 100k jobs | Throughput | Ruby
 -----------------|------|---------|---------|------------------------|-----
-Sidekiq 6.0.0 | 3 ms | 156 MB | 19 sec | **5200 jobs/sec** | MRI 2.6.3
+Sidekiq 6.0.2 | 3 ms | 156 MB | 14.0 sec| **7100 jobs/sec** | MRI 2.6.3
+Sidekiq 6.0.0 | 3 ms | 156 MB | 19 sec | 5200 jobs/sec | MRI 2.6.3
 Sidekiq 4.0.0 | 10 ms | 151 MB | 22 sec | 4500 jobs/sec |
 Sidekiq 3.5.1 | 22 ms | 1257 MB | 125 sec | 800 jobs/sec |
 Resque 1.25.2 | - | - | 420 sec | 240 jobs/sec |
data/bin/sidekiqload CHANGED
@@ -5,7 +5,8 @@
 $TESTING = false
 
 #require 'ruby-prof'
-Bundler.require(:default)
+require 'bundler/setup'
+Bundler.require(:default, :load_test)
 
 require_relative '../lib/sidekiq/cli'
 require_relative '../lib/sidekiq/launcher'
@@ -102,17 +103,20 @@ iter.times do
 end
 Sidekiq.logger.error "Created #{count*iter} jobs"
 
+start = Time.now
+
 Monitoring = Thread.new do
   watchdog("monitor thread") do
     while true
-      sleep 0.5
+      sleep 0.2
       qsize = Sidekiq.redis do |conn|
        conn.llen "queue:default"
       end
       total = qsize
-      Sidekiq.logger.error("RSS: #{Process.rss} Pending: #{total}")
+      #Sidekiq.logger.error("RSS: #{Process.rss} Pending: #{total}")
       if total == 0
-        Sidekiq.logger.error("Done, now here's the latency for three jobs")
+        Sidekiq.logger.error("Done, #{iter * count} jobs in #{Time.now - start} sec")
+        Sidekiq.logger.error("Now here's the latency for three jobs")
 
         LoadWorker.perform_async(1, Time.now.to_f)
         LoadWorker.perform_async(2, Time.now.to_f)
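For context on the numbers above: the load script now records wall-clock time for the whole run, so throughput is simply total jobs divided by elapsed seconds. A rough sketch with illustrative values (the real `iter`/`count` come from the script itself):

```ruby
# Illustrative only: 100k jobs in 14 seconds works out to roughly the
# 7100 jobs/sec figure quoted in the README table for 6.0.2.
iter, count = 10, 10_000
elapsed = 14.0 # seconds
puts "Done, #{iter * count} jobs in #{elapsed} sec (#{(iter * count / elapsed).round} jobs/sec)"
# => Done, 100000 jobs in 14.0 sec (7143 jobs/sec)
```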
data/bin/sidekiqmon CHANGED
@@ -2,8 +2,7 @@
 
 require 'sidekiq/monitor'
 
-if ARGV[0] == 'status'
-  Sidekiq::Monitor::Status.new.display(ARGV[1])
-else
-  Sidekiq::Monitor.print_usage
-end
+section = "all"
+section = ARGV[0] if ARGV.size == 1
+
+Sidekiq::Monitor::Status.new.display(section)
data/lib/generators/sidekiq/worker_generator.rb CHANGED
@@ -16,7 +16,9 @@ module Sidekiq
       end
 
       def create_test_file
-        if defined?(RSpec)
+        return unless test_framework
+
+        if test_framework == :rspec
           create_worker_spec
         else
           create_worker_test
@@ -42,6 +44,14 @@ module Sidekiq
         )
         template "worker_test.rb.erb", template_file
       end
+
+      def file_name
+        @_file_name ||= super.sub(/_?worker\z/i, "")
+      end
+
+      def test_framework
+        ::Rails.application.config.generators.options[:rails][:test_framework]
+      end
     end
   end
 end
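The new `file_name` override is just a string substitution; a quick sketch of what it does to typical generator arguments (the names below are made up):

```ruby
# Strips a trailing "worker"/"_worker" so `rails g sidekiq:worker HardWorker`
# produces hard_worker.rb rather than hard_worker_worker.rb.
"hard_worker".sub(/_?worker\z/i, "")  # => "hard"
"HardWorker".sub(/_?worker\z/i, "")   # => "Hard"
"payments".sub(/_?worker\z/i, "")     # => "payments"
```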
data/lib/sidekiq.rb CHANGED
@@ -192,6 +192,7 @@ module Sidekiq
 
   def self.log_formatter=(log_formatter)
     @log_formatter = log_formatter
+    logger.formatter = log_formatter
   end
 
   def self.logger
@@ -199,9 +200,20 @@
   end
 
   def self.logger=(logger)
+    if logger.nil?
+      self.logger.level = Logger::FATAL
+      return self.logger
+    end
+
+    logger.extend(Sidekiq::LoggingUtils)
+
     @logger = logger
   end
 
+  def self.pro?
+    defined?(Sidekiq::Pro)
+  end
+
   # How frequently Redis should be checked by a random Sidekiq process for
   # scheduled and retriable jobs. Each individual process will take turns by
   # waiting some multiple of this value.
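A minimal sketch of what the logger changes mean for callers (assumes the gem's default logger is in place; `Logger::Formatter` stands in for any formatter object):

```ruby
require "sidekiq"

# Assigning nil no longer breaks logging; it keeps the current logger and
# silences it by raising the level to FATAL.
Sidekiq.logger = nil
Sidekiq.logger.level == Logger::FATAL  # => true

# Setting a formatter now also updates the active logger, not just the stored default.
Sidekiq.log_formatter = Logger::Formatter.new
```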
data/lib/sidekiq/api.rb CHANGED
@@ -2,23 +2,11 @@
 
 require "sidekiq"
 
-module Sidekiq
-  module RedisScanner
-    def sscan(conn, key)
-      cursor = "0"
-      result = []
-      loop do
-        cursor, values = conn.sscan(key, cursor)
-        result.push(*values)
-        break if cursor == "0"
-      end
-      result
-    end
-  end
+require "zlib"
+require "base64"
 
+module Sidekiq
   class Stats
-    include RedisScanner
-
     def initialize
       fetch_stats!
     end
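The hand-rolled `RedisScanner#sscan` helper is gone; every former call site now uses redis-rb's built-in SSCAN enumerator, which pages through the set the same way. A one-line sketch (assumes a reachable Redis):

```ruby
require "sidekiq"

# Equivalent to the old sscan(conn, "queues") helper.
queue_names = Sidekiq.redis { |conn| conn.sscan_each("queues").to_a }
```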
@@ -77,11 +65,11 @@ module Sidekiq
       }
 
       processes = Sidekiq.redis { |conn|
-        sscan(conn, "processes")
+        conn.sscan_each("processes").to_a
       }
 
       queues = Sidekiq.redis { |conn|
-        sscan(conn, "queues")
+        conn.sscan_each("queues").to_a
       }
 
       pipe2_res = Sidekiq.redis { |conn|
@@ -92,8 +80,8 @@
       }
 
       s = processes.size
-      workers_size = pipe2_res[0...s].map(&:to_i).inject(0, &:+)
-      enqueued = pipe2_res[s..-1].map(&:to_i).inject(0, &:+)
+      workers_size = pipe2_res[0...s].sum(&:to_i)
+      enqueued = pipe2_res[s..-1].sum(&:to_i)
 
       default_queue_latency = if (entry = pipe1_res[6].first)
         job = begin
@@ -142,11 +130,9 @@ module Sidekiq
     end
 
     class Queues
-      include RedisScanner
-
       def lengths
         Sidekiq.redis do |conn|
-          queues = sscan(conn, "queues")
+          queues = conn.sscan_each("queues").to_a
 
           lengths = conn.pipelined {
             queues.each do |queue|
@@ -154,13 +140,8 @@
             end
           }
 
-          i = 0
-          array_of_arrays = queues.each_with_object({}) { |queue, memo|
-            memo[queue] = lengths[i]
-            i += 1
-          }.sort_by { |_, size| size }
-
-          Hash[array_of_arrays.reverse]
+          array_of_arrays = queues.zip(lengths).sort_by { |_, size| -size }
+          Hash[array_of_arrays]
         end
       end
     end
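The `Queues#lengths` rewrite replaces the manual index bookkeeping with `zip` plus a descending sort. A pure-Ruby sketch with made-up queue data:

```ruby
queues  = ["default", "critical", "low"]
lengths = [12, 40, 3]

Hash[queues.zip(lengths).sort_by { |_, size| -size }]
# => {"critical"=>40, "default"=>12, "low"=>3}
```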
@@ -182,18 +163,12 @@ module Sidekiq
     private
 
     def date_stat_hash(stat)
-      i = 0
       stat_hash = {}
-      keys = []
-      dates = []
-
-      while i < @days_previous
-        date = @start_date - i
-        datestr = date.strftime("%Y-%m-%d")
-        keys << "stat:#{stat}:#{datestr}"
-        dates << datestr
-        i += 1
-      end
+      dates = @start_date.downto(@start_date - @days_previous + 1).map { |date|
+        date.strftime("%Y-%m-%d")
+      }
+
+      keys = dates.map { |datestr| "stat:#{stat}:#{datestr}" }
 
       begin
         Sidekiq.redis do |conn|
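The new key generation walks backwards from the start date in a single pass instead of a counter loop. A standalone sketch (stat name and dates are illustrative):

```ruby
require "date"

start_date    = Date.new(2019, 10, 1)
days_previous = 3

dates = start_date.downto(start_date - days_previous + 1).map { |date|
  date.strftime("%Y-%m-%d")
}
keys = dates.map { |datestr| "stat:processed:#{datestr}" }
# dates => ["2019-10-01", "2019-09-30", "2019-09-29"]
# keys  => ["stat:processed:2019-10-01", "stat:processed:2019-09-30", "stat:processed:2019-09-29"]
```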
@@ -225,13 +200,12 @@
   #
   class Queue
     include Enumerable
-    extend RedisScanner
 
     ##
     # Return all known queues within Redis.
     #
     def self.all
-      Sidekiq.redis { |c| sscan(c, "queues") }.sort.map { |q| Sidekiq::Queue.new(q) }
+      Sidekiq.redis { |c| c.sscan_each("queues").to_a }.sort.map { |q| Sidekiq::Queue.new(q) }
     end
 
     attr_reader :name
@@ -299,7 +273,7 @@ module Sidekiq
     def clear
       Sidekiq.redis do |conn|
         conn.multi do
-          conn.del(@rname)
+          conn.unlink(@rname)
           conn.srem("queues", name)
         end
       end
@@ -349,7 +323,7 @@
        end
      when "ActiveJob::QueueAdapters::SidekiqAdapter::JobWrapper"
        job_class = @item["wrapped"] || args[0]
-       if job_class == "ActionMailer::DeliveryJob"
+       if job_class == "ActionMailer::DeliveryJob" || job_class == "ActionMailer::MailDeliveryJob"
          # MailerClass#mailer_method
          args[0]["arguments"][0..1].join("#")
        else
@@ -372,6 +346,9 @@
        if (self["wrapped"] || args[0]) == "ActionMailer::DeliveryJob"
          # remove MailerClass, mailer_method and 'deliver_now'
          job_args.drop(3)
+       elsif (self["wrapped"] || args[0]) == "ActionMailer::MailDeliveryJob"
+         # remove MailerClass, mailer_method and 'deliver_now'
+         job_args.drop(3).first["args"]
        else
          job_args
        end
@@ -400,6 +377,20 @@
       Time.at(self["created_at"] || self["enqueued_at"] || 0).utc
     end
 
+    def tags
+      self["tags"] || []
+    end
+
+    def error_backtrace
+      # Cache nil values
+      if defined?(@error_backtrace)
+        @error_backtrace
+      else
+        value = self["error_backtrace"]
+        @error_backtrace = value && uncompress_backtrace(value)
+      end
+    end
+
     attr_reader :queue
 
     def latency
@@ -433,6 +424,23 @@
       Sidekiq.logger.warn "Unable to load YAML: #{ex.message}" unless Sidekiq.options[:environment] == "development"
       default
     end
+
+    def uncompress_backtrace(backtrace)
+      if backtrace.is_a?(Array)
+        # Handle old jobs with raw Array backtrace format
+        backtrace
+      else
+        decoded = Base64.decode64(backtrace)
+        uncompressed = Zlib::Inflate.inflate(decoded)
+        begin
+          Sidekiq.load_json(uncompressed)
+        rescue
+          # Handle old jobs with marshalled backtrace format
+          # TODO Remove in 7.x
+          Marshal.load(uncompressed)
+        end
+      end
+    end
   end
 
   class SortedEntry < Job
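The new `error_backtrace` reader expects a Base64-encoded, zlib-deflated JSON array, with fallbacks for older raw-Array and Marshal payloads. The compression side below is an assumption for illustration (it lives in the retry code, not in this file); the decode half mirrors `uncompress_backtrace`:

```ruby
require "zlib"
require "base64"
require "json"

backtrace  = ["app/workers/hard_worker.rb:12:in `perform'"]  # made-up frame
compressed = Base64.encode64(Zlib::Deflate.deflate(JSON.generate(backtrace)))

# What uncompress_backtrace does on the way back out:
decoded      = Base64.decode64(compressed)
uncompressed = Zlib::Inflate.inflate(decoded)
JSON.parse(uncompressed)  # => ["app/workers/hard_worker.rb:12:in `perform'"]
```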
@@ -458,8 +466,9 @@ module Sidekiq
       end
 
       def reschedule(at)
-        delete
-        @parent.schedule(at, item)
+        Sidekiq.redis do |conn|
+          conn.zincrby(@parent.name, at.to_f - @score, Sidekiq.dump_json(@item))
+        end
       end
 
       def add_to_queue
@@ -503,7 +512,7 @@
         else
           # multiple jobs with the same score
           # find the one with the right JID and push it
-          hash = results.group_by { |message|
+          matched, nonmatched = results.partition { |message|
             if message.index(jid)
               msg = Sidekiq.load_json(message)
               msg["jid"] == jid
@@ -512,12 +521,12 @@
             end
           }
 
-          msg = hash.fetch(true, []).first
+          msg = matched.first
           yield msg if msg
 
           # push the rest back onto the sorted set
           conn.multi do
-            hash.fetch(false, []).each do |message|
+            nonmatched.each do |message|
               conn.zadd(parent.name, score.to_f.to_s, message)
             end
           end
@@ -540,9 +549,20 @@ module Sidekiq
       Sidekiq.redis { |c| c.zcard(name) }
     end
 
+    def scan(match, count = 100)
+      return to_enum(:scan, match, count) unless block_given?
+
+      match = "*#{match}*" unless match.include?("*")
+      Sidekiq.redis do |conn|
+        conn.zscan_each(name, match: match, count: count) do |entry, score|
+          yield SortedEntry.new(self, score, entry)
+        end
+      end
+    end
+
     def clear
       Sidekiq.redis do |conn|
-        conn.del(name)
+        conn.unlink(name)
       end
     end
     alias_method :💣, :clear
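A usage sketch of the newly promoted `scan` API (assumes a reachable Redis; `HardWorker` and the match strings are made up). Without a block it returns an Enumerator, and a bare term is wrapped in `*...*` before being handed to ZSCAN:

```ruby
require "sidekiq/api"

Sidekiq::RetrySet.new.scan("HardWorker") do |entry|
  puts entry.jid
end

first_match = Sidekiq::ScheduledSet.new.scan("*urgent*").first
```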
@@ -576,28 +596,40 @@ module Sidekiq
       end
     end
 
+    ##
+    # Fetch jobs that match a given time or Range. Job ID is an
+    # optional second argument.
     def fetch(score, jid = nil)
+      begin_score, end_score =
+        if score.is_a?(Range)
+          [score.first, score.last]
+        else
+          [score, score]
+        end
+
       elements = Sidekiq.redis { |conn|
-        conn.zrangebyscore(name, score, score)
+        conn.zrangebyscore(name, begin_score, end_score, with_scores: true)
       }
 
       elements.each_with_object([]) do |element, result|
-        entry = SortedEntry.new(self, score, element)
-        if jid
-          result << entry if entry.jid == jid
-        else
-          result << entry
-        end
+        data, job_score = element
+        entry = SortedEntry.new(self, job_score, data)
+        result << entry if jid.nil? || entry.jid == jid
       end
     end
 
     ##
     # Find the job with the given JID within this sorted set.
-    #
-    # This is a slow, inefficient operation. Do not use under
-    # normal conditions. Sidekiq Pro contains a faster version.
+    # This is a slower O(n) operation. Do not use for app logic.
     def find_job(jid)
-      detect { |j| j.jid == jid }
+      Sidekiq.redis do |conn|
+        conn.zscan_each(name, match: "*#{jid}*", count: 100) do |entry, score|
+          job = JSON.parse(entry)
+          matched = job["jid"] == jid
+          return SortedEntry.new(self, score, entry) if matched
+        end
+      end
+      nil
     end
 
     def delete_by_value(name, value)
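Usage sketch for the reworked `fetch` and `find_job` (assumes a reachable Redis; the JID and score values are made up):

```ruby
require "sidekiq/api"

ss = Sidekiq::ScheduledSet.new
ss.fetch(Time.now.to_f..(Time.now.to_f + 3600))          # jobs due within the next hour
ss.fetch(1571997888.123456, "2c4c17969bc72dcaa496f1e7")  # exact score, narrowed by JID
job = ss.find_job("2c4c17969bc72dcaa496f1e7")            # SortedEntry or nil
```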
@@ -612,11 +644,13 @@
       Sidekiq.redis do |conn|
         elements = conn.zrangebyscore(name, score, score)
         elements.each do |element|
-          message = Sidekiq.load_json(element)
-          if message["jid"] == jid
-            ret = conn.zrem(name, element)
-            @_size -= 1 if ret
-            break ret
+          if element.index(jid)
+            message = Sidekiq.load_json(element)
+            if message["jid"] == jid
+              ret = conn.zrem(name, element)
+              @_size -= 1 if ret
+              break ret
+            end
           end
         end
       end
@@ -720,7 +754,6 @@
   #
   class ProcessSet
     include Enumerable
-    include RedisScanner
 
     def initialize(clean_plz = true)
       cleanup if clean_plz
@@ -731,7 +764,7 @@
     def cleanup
       count = 0
       Sidekiq.redis do |conn|
-        procs = sscan(conn, "processes").sort
+        procs = conn.sscan_each("processes").to_a.sort
        heartbeats = conn.pipelined {
          procs.each do |key|
            conn.hget(key, "info")
@@ -741,40 +774,37 @@
        # the hash named key has an expiry of 60 seconds.
        # if it's not found, that means the process has not reported
        # in to Redis and probably died.
-        to_prune = []
-        heartbeats.each_with_index do |beat, i|
-          to_prune << procs[i] if beat.nil?
-        end
+        to_prune = procs.select.with_index { |proc, i|
+          heartbeats[i].nil?
+        }
        count = conn.srem("processes", to_prune) unless to_prune.empty?
      end
      count
    end
 
    def each
-     procs = Sidekiq.redis { |conn| sscan(conn, "processes") }.sort
+     result = Sidekiq.redis { |conn|
+       procs = conn.sscan_each("processes").to_a.sort
 
-     Sidekiq.redis do |conn|
        # We're making a tradeoff here between consuming more memory instead of
        # making more roundtrips to Redis, but if you have hundreds or thousands of workers,
        # you'll be happier this way
-       result = conn.pipelined {
+       conn.pipelined do
          procs.each do |key|
            conn.hmget(key, "info", "busy", "beat", "quiet")
          end
-       }
+       end
+     }
 
-       result.each do |info, busy, at_s, quiet|
-         # If a process is stopped between when we query Redis for `procs` and
-         # when we query for `result`, we will have an item in `result` that is
-         # composed of `nil` values.
-         next if info.nil?
+     result.each do |info, busy, at_s, quiet|
+       # If a process is stopped between when we query Redis for `procs` and
+       # when we query for `result`, we will have an item in `result` that is
+       # composed of `nil` values.
+       next if info.nil?
 
-         hash = Sidekiq.load_json(info)
-         yield Process.new(hash.merge("busy" => busy.to_i, "beat" => at_s.to_f, "quiet" => quiet))
-       end
+       hash = Sidekiq.load_json(info)
+       yield Process.new(hash.merge("busy" => busy.to_i, "beat" => at_s.to_f, "quiet" => quiet))
      end
-
-     nil
    end
 
    # This method is not guaranteed accurate since it does not prune the set
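The pipelining change is internal; iterating the set still yields one `Sidekiq::Process` per live process. A usage sketch (assumes a reachable Redis and at least one running Sidekiq process):

```ruby
require "sidekiq/api"

Sidekiq::ProcessSet.new.each do |process|
  puts "#{process["hostname"]} pid=#{process["pid"]} busy=#{process["busy"]} quiet=#{process["quiet"]}"
end
```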
@@ -885,11 +915,10 @@
   #
   class Workers
     include Enumerable
-    include RedisScanner
 
     def each
       Sidekiq.redis do |conn|
-        procs = sscan(conn, "processes")
+        procs = conn.sscan_each("processes").to_a
         procs.sort.each do |key|
           valid, workers = conn.pipelined {
             conn.exists(key)
@@ -897,7 +926,11 @@
           }
           next unless valid
           workers.each_pair do |tid, json|
-            yield key, tid, Sidekiq.load_json(json)
+            hsh = Sidekiq.load_json(json)
+            p = hsh["payload"]
+            # avoid breaking API, this is a side effect of the JSON optimization in #4316
+            hsh["payload"] = Sidekiq.load_json(p) if p.is_a?(String)
+            yield key, tid, hsh
           end
         end
       end
@@ -911,7 +944,7 @@
     # which can easily get out of sync with crashy processes.
     def size
       Sidekiq.redis do |conn|
-        procs = sscan(conn, "processes")
+        procs = conn.sscan_each("processes").to_a
         if procs.empty?
           0
         else
@@ -919,7 +952,7 @@
            procs.each do |key|
              conn.hget(key, "busy")
            end
-          }.map(&:to_i).inject(:+)
+          }.sum(&:to_i)
         end
       end
     end
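A closing sketch for `Sidekiq::Workers`: the block signature is unchanged, and `work["payload"]` is re-parsed into a Hash whenever the JSON optimization from #4316 left it stored as a String (assumes a reachable Redis with busy workers):

```ruby
require "sidekiq/api"

Sidekiq::Workers.new.each do |process_id, thread_id, work|
  puts "#{process_id} #{thread_id} #{work["queue"]} #{work["payload"]["class"]}"
end
```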