sidekiq-unique-jobs 7.0.0.beta14 → 7.0.0.beta15

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: ebb9d4c539ef0f829d56cf7c8f907044c4f9cee33fedb25c3a1ee56325db8376
- data.tar.gz: f2891752e702e8060d4907fa5764855545d81974d602bb922d734ba9ba252384
+ metadata.gz: fd354b14a7f7872a112b42ea75ab268b9781b21f3c116458a1eb38e6d91c4e02
+ data.tar.gz: '02469065fc7048993402dbb9670c0e50e583f5c63d59dd113ea6d3d3bb4f471f'
  SHA512:
- metadata.gz: f17fface6439ddf9daebc649ae8d3af8520a3e067c2fb7bba4157b0d591a2668eb90295f009e739f6496336725b4fb92ba2c78b9499c16b9aab2cf629794500f
- data.tar.gz: 9ba022debfd062ce58d9a33dfb793dace4f0ea3edebf1577f89d315c50787f86e9ef31c372e249111e419553ab67e9779183c7c12315e779ff5b802bd6a1443e
+ metadata.gz: 2c47febf8a69f7738131a3d0bfe1b8560be1b0e4332ce933f421a0314a6a92139c71ef8b18ff249de64c84421ebb1c9cd5ff6b275795545a0abdaacbf4d08320
+ data.tar.gz: fa2378be7428b1973a7c71ffb3abb40a4d76d3ef61ea2c291c4d39790ee623e4d9a5327055b5113149c9db4787106843a6d5c0f111b64eff0677a783e3701bc7
data/CHANGELOG.md CHANGED
@@ -1,5 +1,21 @@
  # Changelog

+ ## [v7.0.0.beta14](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v7.0.0.beta14) (2020-03-30)
+
+ [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v6.0.21...v7.0.0.beta14)
+
+ **Fixed bugs:**
+
+ - Use thread-safe digest creation mechanism [\#483](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/483) ([zormandi](https://github.com/zormandi))
+
+ ## [v6.0.21](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v6.0.21) (2020-03-30)
+
+ [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v7.0.0.beta13...v6.0.21)
+
+ **Fixed bugs:**
+
+ - Use thread-safe digest creation mechanism [\#484](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/484) ([mhenrixon](https://github.com/mhenrixon))
+
  ## [v7.0.0.beta13](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v7.0.0.beta13) (2020-03-26)

  [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v7.0.0.beta12...v7.0.0.beta13)
@@ -117,10 +133,6 @@

  [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v6.0.18...v7.0.0.beta6)

- **Implemented enhancements:**
-
- - Clarify usage with global\_id and sidekiq-status [\#455](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/455) ([mhenrixon](https://github.com/mhenrixon))
-
  **Merged pull requests:**

  - Fix that Sidekiq now sends instance of worker [\#459](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/459) ([mhenrixon](https://github.com/mhenrixon))
@@ -163,6 +175,10 @@

  [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v6.0.16...v6.0.17)

+ **Implemented enhancements:**
+
+ - Clarify usage with global\_id and sidekiq-status [\#455](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/455) ([mhenrixon](https://github.com/mhenrixon))
+
  **Fixed bugs:**

  - Allow redis namespace to work with deletion [\#451](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/451) ([mhenrixon](https://github.com/mhenrixon))
@@ -457,6 +473,7 @@
  **Fixed bugs:**

  - Enable replace strategy [\#315](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/315) ([mhenrixon](https://github.com/mhenrixon))
+ - Remove unused method [\#307](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/307) ([mhenrixon](https://github.com/mhenrixon))

  **Closed issues:**

@@ -474,7 +491,6 @@

  - Not unlocking automatically \(version 6.0.0rc5\) [\#293](https://github.com/mhenrixon/sidekiq-unique-jobs/issues/293)
  - Bug fixes [\#310](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/310) ([mhenrixon](https://github.com/mhenrixon))
- - Remove unused method [\#307](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/307) ([mhenrixon](https://github.com/mhenrixon))

  ## [v6.0.1](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v6.0.1) (2018-07-31)

@@ -824,7 +840,6 @@
  - missed space [\#188](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/188) ([TheBigSadowski](https://github.com/TheBigSadowski))
  - Convert unless if to just 1 if [\#179](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/179) ([otzy007](https://github.com/otzy007))
  - fix for \#168. Handle the NOSCRIPT by sending the script again [\#178](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/178) ([otzy007](https://github.com/otzy007))
- - Fixed gitter badge link [\#176](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/176) ([andrew](https://github.com/andrew))

  ## [v4.0.17](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v4.0.17) (2016-03-02)

@@ -840,6 +855,7 @@

  **Merged pull requests:**

+ - Fixed gitter badge link [\#176](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/176) ([andrew](https://github.com/andrew))
  - Fix for sidekiq delete failing for version 3.4.x [\#167](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/167) ([theprogrammerin](https://github.com/theprogrammerin))
  - Run lock timeout configurable [\#164](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/164) ([Slania](https://github.com/Slania))

@@ -30,6 +30,7 @@ module SidekiqUniqueJobs
  ON_CLIENT_CONFLICT ||= "on_client_conflict"
  ON_CONFLICT ||= "on_conflict"
  ON_SERVER_CONFLICT ||= "on_server_conflict"
+ PROCESSES ||= "processes"
  QUEUE ||= "queue"
  RETRY ||= "retry"
  SCHEDULE ||= "schedule"
@@ -8,6 +8,8 @@ module SidekiqUniqueJobs
  # @author Mikael Henriksson <mikael@zoolutions.se>
  #
  class Validator
+ #
+ # @return [Hash] a hash mapping of deprecated keys and their new value
  DEPRECATED_KEYS = {
  UNIQUE.to_sym => LOCK.to_sym,
  UNIQUE_ARGS.to_sym => LOCK_ARGS.to_sym,
@@ -61,6 +63,12 @@ module SidekiqUniqueJobs
  lock_config
  end

+ #
+ # Validate deprecated keys
+ # adds useful information about how to proceed with fixing handle_deprecations
+ #
+ # @return [void]
+ #
  def handle_deprecations
  DEPRECATED_KEYS.each do |old, new|
  next unless @options.key?(old)
@@ -111,6 +111,12 @@ module SidekiqUniqueJobs
  default_worker_options[UNIQUE_ARGS]
  end

+ #
+ # The globally default worker options configured from Sidekiq
+ #
+ #
+ # @return [Hash<String, Object>]
+ #
  def default_worker_options
  @default_worker_options ||= Sidekiq.default_worker_options.stringify_keys
  end
@@ -107,13 +107,13 @@ module SidekiqUniqueJobs

  # the strategy to use as conflict resolution from sidekiq client
  def on_client_conflict
- @on_client_conflict ||= on_conflict&.(:[], :client) if on_conflict.is_a?(Hash)
+ @on_client_conflict ||= on_conflict[:client] if on_conflict.is_a?(Hash)
  @on_client_conflict ||= on_conflict
  end

  # the strategy to use as conflict resolution from sidekiq server
  def on_server_conflict
- @on_client_conflict ||= on_conflict&.(:[], :server) if on_conflict.is_a?(Hash)
+ @on_server_conflict ||= on_conflict[:server] if on_conflict.is_a?(Hash)
  @on_server_conflict ||= on_conflict
  end
  end
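For context, the two changed lines above read the Hash form of `on_conflict`, where separate client- and server-side strategies are configured on a worker. A minimal sketch of such a worker follows; the worker name and the chosen strategies are illustrative assumptions, not taken from this diff.

```ruby
# Hypothetical worker showing the Hash form of on_conflict that the fix
# above resolves via on_conflict[:client] and on_conflict[:server].
class OrderSyncWorker
  include Sidekiq::Worker

  sidekiq_options lock: :until_executed,
                  on_conflict: { client: :log, server: :reject }

  def perform(order_id)
    # work that must not run concurrently for the same order_id
  end
end
```

As the removed lines show, the old `on_conflict&.(:[], :client)` form tried to call the hash as a callable, and `on_server_conflict` memoized into `@on_client_conflict`; both are corrected above.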
@@ -1,6 +1,6 @@
  # frozen_string_literal: true

- require 'openssl'
+ require "openssl"

  module SidekiqUniqueJobs
  # Handles uniqueness of sidekiq arguments
@@ -52,7 +52,7 @@ module SidekiqUniqueJobs
  # @param [Hash] item a Sidekiq job hash
  # @option item [Integer] :lock_ttl the configured expiration
  # @option item [String] :jid the sidekiq job id
- # @option item [String] :unique_digest the unique digest (See: {LockDigest#unique_digest})
+ # @option item [String] :unique_digest the unique digest (See: {LockDigest#lock_digest})
  # @param [Sidekiq::RedisConnection, ConnectionPool] redis_pool the redis connection
  #
  def initialize(item, redis_pool = nil)
@@ -23,6 +23,7 @@ local redisversion = ARGV[6]
  <%= include_partial "shared/_common.lua" %>
  <%= include_partial "shared/_find_digest_in_queues.lua" %>
  <%= include_partial "shared/_find_digest_in_sorted_set.lua" %>
+ <%= include_partial "shared/_find_digest_in_process_set.lua" %>
  ---------- END local functions ----------


@@ -61,6 +62,16 @@ repeat
  end
  end

+ -- TODO: Add check for jobs checked out by process
+ if found ~= true then
+ log_debug("Searching for digest:", digest, "in process sets")
+ local queue = find_digest_in_process_set(digest)
+ if queue then
+ log_debug("found digest:", digest, "in queue:", queue)
+ found = true
+ end
+ end
+
  if found ~= true then
  local queued = digest .. ":QUEUED"
  local primed = digest .. ":PRIMED"
@@ -0,0 +1,35 @@
+ local function find_digest_in_process_set(digest)
+ local process_cursor = 0
+ local job_cursor = 0
+ local pattern = "*" .. digest .. "*"
+ local found = false
+
+ log_debug("searching in list processes:",
+ "for digest:", digest,
+ "cursor:", process_cursor)
+
+ repeat
+ local process_paginator = redis.call("SSCAN", "processes", process_cursor, "MATCH", "*")
+ local next_process_cursor = process_paginator[1]
+ local processes = process_paginator[2]
+ log_debug("Found number of processes:", #processes, "next cursor:", next_process_cursor)
+
+ for _, process in ipairs(processes) do
+ log_debug("searching in process set:", process,
+ "for digest:", digest,
+ "cursor:", process_cursor)
+
+ local job = redis.call("HGET", process, "info")
+
+ if string.find(job, digest) then
+ log_debug("Found digest", digest, "in process:", process)
+ found = true
+ break
+ end
+ end
+
+ process_cursor = next_process_cursor
+ until found == true or process_cursor == "0"
+
+ return found
+ end
@@ -6,7 +6,7 @@ local function find_digest_in_queues(digest)

  repeat
  log_debug("searching all queues for a matching digest:", digest)
- local pagination = redis.call("SCAN", cursor, "MATCH", "*queue:*", "COUNT", count)
+ local pagination = redis.call("SCAN", cursor, "MATCH", "queue:*", "COUNT", count)
  local next_cursor = pagination[1]
  local queues = pagination[2]

@@ -34,12 +34,9 @@ local function find_digest_in_queues(digest)
  end
  index = index + per
  end
-
- cursor = next_cursor
- if cursor == "0" then
- log_debug("Looped through all queues, stopping iteration")
- end
  end
+
+ cursor = next_cursor
  until found == true or cursor == "0"

  return result
@@ -0,0 +1,29 @@
+ # frozen_string_literal: true
+
+ module SidekiqUniqueJobs
+ module Orphans
+ #
+ # Class DeleteOrphans provides deletion of orphaned digests
+ #
+ # @note this is a much slower version of the lua script but does not crash redis
+ #
+ # @author Mikael Henriksson <mikael@zoolutions.se>
+ #
+ class LuaReaper < Reaper
+ #
+ # Delete orphaned digests
+ #
+ #
+ # @return [Integer] the number of reaped locks
+ #
+ def call
+ call_script(
+ :reap_orphans,
+ conn,
+ keys: [DIGESTS, SCHEDULE, RETRY, PROCESSES],
+ argv: [reaper_count],
+ )
+ end
+ end
+ end
+ end
@@ -13,6 +13,17 @@ module SidekiqUniqueJobs
  include SidekiqUniqueJobs::Connection
  include SidekiqUniqueJobs::Script::Caller
  include SidekiqUniqueJobs::Logging
+ include SidekiqUniqueJobs::JSON
+
+ require_relative "lua_reaper"
+ require_relative "ruby_reaper"
+
+ #
+ # @return [Hash<Symbol, SidekiqUniqueJobs::Orphans::Reaper] the current implementation of reapers
+ REAPERS = {
+ lua: SidekiqUniqueJobs::Orphans::LuaReaper,
+ ruby: SidekiqUniqueJobs::Orphans::RubyReaper,
+ }.freeze

  #
  # Execute deletion of orphaned digests
@@ -27,7 +38,10 @@ module SidekiqUniqueJobs
  redis { |rcon| new(rcon).call }
  end

- attr_reader :conn, :digests, :scheduled, :retried
+ #
+ # @!attribute [r] conn
+ # @return [Redis] a redis connection
+ attr_reader :conn

  #
  # Initialize a new instance of DeleteOrphans
@@ -35,10 +49,7 @@ module SidekiqUniqueJobs
  # @param [Redis] conn a connection to redis
  #
  def initialize(conn)
- @conn = conn
- @digests = SidekiqUniqueJobs::Digests.new
- @scheduled = Redis::SortedSet.new(SCHEDULE)
- @retried = Redis::SortedSet.new(RETRY)
+ @conn = conn
  end

  #
@@ -78,160 +89,12 @@ module SidekiqUniqueJobs
  # @return [Integer] the number of reaped locks
  #
  def call
- case reaper
- when :ruby
- execute_ruby_reaper
- when :lua
- execute_lua_reaper
+ if (implementation = REAPERS[reaper])
+ implementation.new(conn).call
  else
  log_fatal(":#{reaper} is invalid for `SidekiqUnqiueJobs.config.reaper`")
  end
  end
-
- #
- # Executes the ruby reaper
- #
- #
- # @return [Integer] the number of deleted locks
- #
- def execute_ruby_reaper
- BatchDelete.call(orphans, conn)
- end
-
- #
- # Executes the lua reaper
- #
- #
- # @return [Integer] the number of deleted locks
- #
- def execute_lua_reaper
- call_script(
- :reap_orphans,
- conn,
- keys: [SidekiqUniqueJobs::DIGESTS, SidekiqUniqueJobs::SCHEDULE, SidekiqUniqueJobs::RETRY],
- argv: [reaper_count],
- )
- end
-
- #
- # Find orphaned digests
- #
- #
- # @return [Array<String>] an array of orphaned digests
- #
- def orphans
- conn.zrevrange(digests.key, 0, -1).each_with_object([]) do |digest, result|
- next if belongs_to_job?(digest)
-
- result << digest
- break if result.size >= reaper_count
- end
- end
-
- #
- # Checks if the digest has a matching job.
- # 1. It checks the scheduled set
- # 2. It checks the retry set
- # 3. It goes through all queues
- #
- #
- # @param [String] digest the digest to search for
- #
- # @return [true] when either of the checks return true
- # @return [false] when no job was found for this digest
- #
- def belongs_to_job?(digest)
- scheduled?(digest) || retried?(digest) || enqueued?(digest)
- end
-
- #
- # Checks if the digest exists in the Sidekiq::ScheduledSet
- #
- # @param [String] digest the current digest
- #
- # @return [true] when digest exists in scheduled set
- #
- def scheduled?(digest)
- in_sorted_set?(SCHEDULE, digest)
- end
-
- #
- # Checks if the digest exists in the Sidekiq::RetrySet
- #
- # @param [String] digest the current digest
- #
- # @return [true] when digest exists in retry set
- #
- def retried?(digest)
- in_sorted_set?(RETRY, digest)
- end
-
- #
- # Checks if the digest exists in a Sidekiq::Queue
- #
- # @param [String] digest the current digest
- #
- # @return [true] when digest exists in any queue
- #
- def enqueued?(digest)
- Sidekiq.redis do |conn|
- queues(conn) do |queue|
- entries(conn, queue) do |entry|
- return true if entry.include?(digest)
- end
- end
-
- false
- end
- end
-
- #
- # Loops through all the redis queues and yields them one by one
- #
- # @param [Redis] conn the connection to use for fetching queues
- #
- # @return [void]
- #
- # @yield queues one at a time
- #
- def queues(conn, &block)
- conn.sscan_each("queues", &block)
- end
-
- def entries(conn, queue) # rubocop:disable Metrics/MethodLength
- queue_key = "queue:#{queue}"
- initial_size = conn.llen(queue_key)
- deleted_size = 0
- page = 0
- page_size = 50
-
- loop do
- range_start = page * page_size - deleted_size
- range_end = range_start + page_size - 1
- entries = conn.lrange(queue_key, range_start, range_end)
- page += 1
-
- entries.each do |entry|
- yield entry
- end
-
- deleted_size = initial_size - conn.llen(queue_key)
- end
- end
-
- #
- # Checks a sorted set for the existance of this digest
- #
- #
- # @param [String] key the key for the sorted set
- # @param [String] digest the digest to scan for
- #
- # @return [true] when found
- # @return [false] when missing
- #
- def in_sorted_set?(key, digest)
- conn.zscan_each(key, match: "*#{digest}*", count: 1).to_a.any?
- end
  end
  end
  end
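The rewrite above replaces the `case reaper` dispatch with a lookup in the new `REAPERS` hash, so `Reaper#call` instantiates either `LuaReaper` or `RubyReaper` depending on `SidekiqUniqueJobs.config.reaper`. The sketch below shows how that selection might be driven from an initializer; the `configure` block shape and the example values are assumptions based on the option names visible in this diff (`reaper`, `reaper_count`), not an excerpt from the gem's documentation.

```ruby
# Hypothetical initializer (e.g. config/initializers/sidekiq_unique_jobs.rb):
# choose the implementation that REAPERS[reaper] resolves to and cap how many
# orphaned digests a single reaper run may collect.
SidekiqUniqueJobs.configure do |config|
  config.reaper       = :ruby  # or :lua => SidekiqUniqueJobs::Orphans::LuaReaper
  config.reaper_count = 1_000  # forwarded to the reaper as reaper_count
end
```

Note that an unrecognized value falls through to the `log_fatal` branch rather than raising, so a misconfigured reaper only shows up as a log entry.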
@@ -0,0 +1,183 @@
+ # frozen_string_literal: true
+
+ module SidekiqUniqueJobs
+ module Orphans
+ #
+ # Class DeleteOrphans provides deletion of orphaned digests
+ #
+ # @note this is a much slower version of the lua script but does not crash redis
+ #
+ # @author Mikael Henriksson <mikael@zoolutions.se>
+ #
+ class RubyReaper < Reaper
+ #
+ # @!attribute [r] digests
+ # @return [SidekiqUniqueJobs::Digests] digest collection
+ attr_reader :digests
+ #
+ # @!attribute [r] scheduled
+ # @return [Redis::SortedSet] the Sidekiq ScheduleSet
+ attr_reader :scheduled
+ #
+ # @!attribute [r] retried
+ # @return [Redis::SortedSet] the Sidekiq RetrySet
+ attr_reader :retried
+
+ #
+ # Initialize a new instance of DeleteOrphans
+ #
+ # @param [Redis] conn a connection to redis
+ #
+ def initialize(conn)
+ super(conn)
+ @digests = SidekiqUniqueJobs::Digests.new
+ @scheduled = Redis::SortedSet.new(SCHEDULE)
+ @retried = Redis::SortedSet.new(RETRY)
+ end
+
+ #
+ # Delete orphaned digests
+ #
+ #
+ # @return [Integer] the number of reaped locks
+ #
+ def call
+ BatchDelete.call(orphans, conn)
+ end
+
+ #
+ # Find orphaned digests
+ #
+ #
+ # @return [Array<String>] an array of orphaned digests
+ #
+ def orphans
+ conn.zrevrange(digests.key, 0, -1).each_with_object([]) do |digest, result|
+ next if belongs_to_job?(digest)
+
+ result << digest
+ break if result.size >= reaper_count
+ end
+ end
+
+ #
+ # Checks if the digest has a matching job.
+ # 1. It checks the scheduled set
+ # 2. It checks the retry set
+ # 3. It goes through all queues
+ #
+ #
+ # @param [String] digest the digest to search for
+ #
+ # @return [true] when either of the checks return true
+ # @return [false] when no job was found for this digest
+ #
+ def belongs_to_job?(digest)
+ scheduled?(digest) || retried?(digest) || enqueued?(digest) || active?(digest)
+ end
+
+ #
+ # Checks if the digest exists in the Sidekiq::ScheduledSet
+ #
+ # @param [String] digest the current digest
+ #
+ # @return [true] when digest exists in scheduled set
+ #
+ def scheduled?(digest)
+ in_sorted_set?(SCHEDULE, digest)
+ end
+
+ #
+ # Checks if the digest exists in the Sidekiq::RetrySet
+ #
+ # @param [String] digest the current digest
+ #
+ # @return [true] when digest exists in retry set
+ #
+ def retried?(digest)
+ in_sorted_set?(RETRY, digest)
+ end
+
+ #
+ # Checks if the digest exists in a Sidekiq::Queue
+ #
+ # @param [String] digest the current digest
+ #
+ # @return [true] when digest exists in any queue
+ #
+ def enqueued?(digest)
+ Sidekiq.redis do |conn|
+ queues(conn) do |queue|
+ entries(conn, queue) do |entry|
+ return true if entry.include?(digest)
+ end
+ end
+
+ false
+ end
+ end
+
+ def active?(digest)
+ Sidekiq.redis do |conn|
+ procs = conn.sscan_each("processes").to_a.sort
+
+ result = conn.pipelined do
+ procs.map do |key|
+ conn.hget(key, "info")
+ end
+ end
+
+ result.flatten.compact.any? { |job| load_json(job)[LOCK_DIGEST] == digest }
+ end
+ end
+
+ #
+ # Loops through all the redis queues and yields them one by one
+ #
+ # @param [Redis] conn the connection to use for fetching queues
+ #
+ # @return [void]
+ #
+ # @yield queues one at a time
+ #
+ def queues(conn, &block)
+ conn.sscan_each("queues", &block)
+ end
+
+ def entries(conn, queue) # rubocop:disable Metrics/MethodLength
+ queue_key = "queue:#{queue}"
+ initial_size = conn.llen(queue_key)
+ deleted_size = 0
+ page = 0
+ page_size = 50
+
+ loop do
+ range_start = page * page_size - deleted_size
+ range_end = range_start + page_size - 1
+ entries = conn.lrange(queue_key, range_start, range_end)
+ page += 1
+
+ entries.each do |entry|
+ yield entry
+ end
+
+ deleted_size = initial_size - conn.llen(queue_key)
+ end
+ end
+
+ #
+ # Checks a sorted set for the existance of this digest
+ #
+ #
+ # @param [String] key the key for the sorted set
+ # @param [String] digest the digest to scan for
+ #
+ # @return [true] when found
+ # @return [false] when missing
+ #
+ def in_sorted_set?(key, digest)
+ conn.zscan_each(key, match: "*#{digest}*", count: 1).to_a.any?
+ end
+ end
+ end
+ end
@@ -3,5 +3,5 @@
  module SidekiqUniqueJobs
  #
  # @return [String] the current SidekiqUniqueJobs version
- VERSION = "7.0.0.beta14"
+ VERSION = "7.0.0.beta15"
  end
@@ -2,7 +2,7 @@

  begin
  require "sidekiq/web"
- rescue LoadError # rubocop:disable Lint/SuppressedException
+ rescue LoadError
  # client-only usage
  end

metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: sidekiq-unique-jobs
  version: !ruby/object:Gem::Version
- version: 7.0.0.beta14
+ version: 7.0.0.beta15
  platform: ruby
  authors:
  - Mikael Henriksson
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2020-03-30 00:00:00.000000000 Z
+ date: 2020-04-10 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: brpoplpush-redis_script
@@ -294,7 +294,6 @@ files:
  - lib/sidekiq_unique_jobs/lua/delete_by_digest.lua
  - lib/sidekiq_unique_jobs/lua/delete_job_by_digest.lua
  - lib/sidekiq_unique_jobs/lua/find_digest_in_queues.lua
- - lib/sidekiq_unique_jobs/lua/find_digest_in_sorted_set.lua
  - lib/sidekiq_unique_jobs/lua/lock.lua
  - lib/sidekiq_unique_jobs/lua/locked.lua
  - lib/sidekiq_unique_jobs/lua/queue.lua
@@ -303,11 +302,11 @@ files:
  - lib/sidekiq_unique_jobs/lua/shared/_current_time.lua
  - lib/sidekiq_unique_jobs/lua/shared/_delete_from_queue.lua
  - lib/sidekiq_unique_jobs/lua/shared/_delete_from_sorted_set.lua
+ - lib/sidekiq_unique_jobs/lua/shared/_find_digest_in_process_set.lua
  - lib/sidekiq_unique_jobs/lua/shared/_find_digest_in_queues.lua
  - lib/sidekiq_unique_jobs/lua/shared/_find_digest_in_sorted_set.lua
  - lib/sidekiq_unique_jobs/lua/shared/_hgetall.lua
  - lib/sidekiq_unique_jobs/lua/shared/_upgrades.lua
- - lib/sidekiq_unique_jobs/lua/shared/find_digest_in_sorted_set.lua
  - lib/sidekiq_unique_jobs/lua/unlock.lua
  - lib/sidekiq_unique_jobs/lua/update_version.lua
  - lib/sidekiq_unique_jobs/lua/upgrade.lua
@@ -324,9 +323,11 @@ files:
  - lib/sidekiq_unique_jobs/on_conflict/reschedule.rb
  - lib/sidekiq_unique_jobs/on_conflict/strategy.rb
  - lib/sidekiq_unique_jobs/options_with_fallback.rb
+ - lib/sidekiq_unique_jobs/orphans/lua_reaper.rb
  - lib/sidekiq_unique_jobs/orphans/manager.rb
  - lib/sidekiq_unique_jobs/orphans/observer.rb
  - lib/sidekiq_unique_jobs/orphans/reaper.rb
+ - lib/sidekiq_unique_jobs/orphans/ruby_reaper.rb
  - lib/sidekiq_unique_jobs/profiler.rb
  - lib/sidekiq_unique_jobs/redis.rb
  - lib/sidekiq_unique_jobs/redis/entity.rb
@@ -1,24 +0,0 @@
- local function find_digest_in_sorted_set(name, digest)
- local cursor = 0
- local count = 5
- local pattern = "*" .. digest .. "*"
- local found = false
-
- log_debug("searching in:", name,
- "for digest:", digest,
- "cursor:", cursor)
- repeat
- local pagination = redis.call("ZSCAN", name, cursor, "MATCH", pattern, "COUNT", count)
- local next_cursor = pagination[1]
- local items = pagination[2]
-
- if #items > 0 then
- log_debug("Found digest", digest, "in zset:", name)
- found = true
- end
-
- cursor = next_cursor
- until found == true or cursor == "0"
-
- return found
- end
@@ -1,24 +0,0 @@
- local function find_digest_in_sorted_set(name, digest)
- local cursor = 0
- local count = 5
- local pattern = "*" .. digest .. "*"
- local found = false
-
- log_debug("searching in:", name,
- "for digest:", digest,
- "cursor:", cursor)
- repeat
- local pagination = redis.call("ZSCAN", name, cursor, "MATCH", pattern, "COUNT", count)
- local next_cursor = pagination[1]
- local items = pagination[2]
-
- if #items > 0 then
- log_debug("Found digest", digest, "in sorted set:", name)
- found = true
- end
-
- cursor = next_cursor
- until found == true or cursor == "0"
-
- return found
- end