sidekiq-unique-jobs 7.0.0.beta4 → 7.0.0.beta5


checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c73d16cc7119474898230003931d398ee6f0e6b925117cbbf763cbb9b740edf1
- data.tar.gz: 8371f1cf3bb123a5df3a603c3c67abb584d51cbb646b088d9c2e9593343f4672
+ metadata.gz: d70a897c3f1aa180a2ec8d966218836038210bdf294f3dc2be955f3e2f4915f9
+ data.tar.gz: d57e756148fe58b0b72c8d1e32208bb6db61f1121a54b80ecbd275ee79a46828
  SHA512:
- metadata.gz: 4671e078b75e70a2f427229b2d8dbcf259a32a3f8b13ab61ed9963b9d63327b31834082724a5abd4b46a79867c51f8578a36178027f690cac442550b8fed5bfe
- data.tar.gz: b0bda233fddf2ff36394702b6b074a09290690802a7dcea8fe4be3ff56d95f0a7641b75287ea71a3417e11c7df6578f2403d77a57a1d24187fc3c6cac0bc5890
+ metadata.gz: c5c94eb9bc3b4ce1dd5ebcd7f7af5ae4456ff6783a47783ef5fe96b6bdf5c045d8adfff3e5abfe4390f2ee01d20e8e47a4cc50132cb2f31e17947dccbc0caf82
+ data.tar.gz: e0d9b47ba7930b254ce7e6741577de17a02607f8d89a5ff3aa313ed978bfefdb518d61f62c4437f75a93d267e3ffd3b5cc58bfd003160fc552d0bb91df21ec76
data/CHANGELOG.md CHANGED
@@ -1,5 +1,35 @@
  # Changelog

+ ## [v7.0.0.beta4](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v7.0.0.beta4) (2019-11-25)
+
+ [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v7.0.0.beta3...v7.0.0.beta4)
+
+ **Fixed bugs:**
+
+ - Fix ruby reaper [\#444](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/444) ([mhenrixon](https://github.com/mhenrixon))
+
+ ## [v7.0.0.beta3](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v7.0.0.beta3) (2019-11-24)
+
+ [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v7.0.0.beta2...v7.0.0.beta3)
+
+ **Implemented enhancements:**
+
+ - Brpoplpush redis script [\#434](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/434) ([mhenrixon](https://github.com/mhenrixon))
+ - Drop support for almost EOL ruby 2.4 [\#433](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/433) ([mhenrixon](https://github.com/mhenrixon))
+
+ **Fixed bugs:**
+
+ - Redis is busy running script and script never terminates [\#441](https://github.com/mhenrixon/sidekiq-unique-jobs/issues/441)
+ - Make the ruby reaper plain ruby [\#443](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/443) ([mhenrixon](https://github.com/mhenrixon))
+
+ **Closed issues:**
+
+ - Some jobs seem to be treated as duplicate despite empty queue [\#440](https://github.com/mhenrixon/sidekiq-unique-jobs/issues/440)
+
+ **Merged pull requests:**
+
+ - Fix typo and some formatting issues in README [\#442](https://github.com/mhenrixon/sidekiq-unique-jobs/pull/442) ([ajkerr](https://github.com/ajkerr))
+
  ## [v7.0.0.beta2](https://github.com/mhenrixon/sidekiq-unique-jobs/tree/v7.0.0.beta2) (2019-10-08)

  [Full Changelog](https://github.com/mhenrixon/sidekiq-unique-jobs/compare/v7.0.0.beta1...v7.0.0.beta2)
data/README.md CHANGED
@@ -355,7 +355,7 @@ With this lock type it is possible to put any number of these jobs on the queue,

  **NOTE** Unless this job is configured with a `lock_timeout: nil` or `lock_timeout: > 0` then all jobs that are attempted to be executed will just be dropped without waiting.

- There is an example of this to try it out in the `my_app` application. Run `foreman start` in the root of the directory and open the url: `localhost:5000/work/duplicate_while_executing`.
+ There is an example of this to try it out in the `myapp` application. Run `foreman start` in the root of the directory and open the url: `localhost:5000/work/duplicate_while_executing`.

  In the console you should see something like:

@@ -601,7 +601,8 @@ For sidekiq versions before 5.1 a `sidekiq_retries_exhausted` block is required
  ```ruby
  class MyWorker
  sidekiq_retries_exhausted do |msg, _ex|
- SidekiqUniqueJobs::Digests.del(digest: msg['unique_digest']) if msg['unique_digest']
+ digest = msg['unique_digest']
+ SidekiqUniqueJobs::Digests.delete_by_digest(digest) if digest
  end
  end
  ```
@@ -612,7 +613,8 @@ Starting in v5.1, Sidekiq can also fire a global callback when a job dies:
  # this goes in your initializer
  Sidekiq.configure_server do |config|
  config.death_handlers << ->(job, _ex) do
- SidekiqUniqueJobs::Digests.del(digest: job['unique_digest']) if job['unique_digest']
+ digest = job['unique_digest']
+ SidekiqUniqueJobs::Digests.delete_by_digest(digest) if digest
  end
  end
  ```
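For context, a minimal initializer sketch of the new API, not taken from the release itself. It assumes the instantiated form used elsewhere in this diff (`SidekiqUniqueJobs::Digests.new`, as in cli.rb and the web extension below); the handler wiring follows Sidekiq's documented `death_handlers` hook:

```ruby
# Hedged sketch: clear the unique digest when a job dies.
Sidekiq.configure_server do |config|
  config.death_handlers << ->(job, _ex) do
    digest = job["unique_digest"]
    # Instantiated call, mirroring how this diff invokes delete_by_digest in cli.rb/web.rb.
    SidekiqUniqueJobs::Digests.new.delete_by_digest(digest) if digest
  end
end
```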
@@ -16,9 +16,9 @@ module SidekiqUniqueJobs
  desc "list PATTERN", "list all unique digests and their expiry time"
  option :count, aliases: :c, type: :numeric, default: 1000, desc: "The max number of digests to return"
  def list(pattern = "*")
- digests = SidekiqUniqueJobs::Digests.new.entries(pattern: pattern, count: options[:count])
- say "Found #{digests.size} digests matching '#{pattern}':"
- print_in_columns(digests.sort) if digests.any?
+ entries = digests.entries(pattern: pattern, count: options[:count])
+ say "Found #{entries.size} digests matching '#{pattern}':"
+ print_in_columns(entries.sort) if entries.any?
  end

  desc "del PATTERN", "deletes unique digests from redis by pattern"
@@ -27,10 +27,10 @@ module SidekiqUniqueJobs
  def del(pattern)
  max_count = options[:count]
  if options[:dry_run]
- digests = SidekiqUniqueJobs::Digests.new.entries(pattern: pattern, count: max_count)
- say "Would delete #{digests.size} digests matching '#{pattern}'"
+ result = digests.entries(pattern: pattern, count: max_count)
+ say "Would delete #{result.size} digests matching '#{pattern}'"
  else
- deleted_count = SidekiqUniqueJobs::Digests.new.del(pattern: pattern, count: max_count)
+ deleted_count = digests.delete_by_pattern(pattern, count: max_count)
  say "Deleted #{deleted_count} digests matching '#{pattern}'"
  end
  end
@@ -46,6 +46,10 @@ module SidekiqUniqueJobs
  end

  no_commands do
+ def digests
+ @digests ||= SidekiqUniqueJobs::Digests.new
+ end
+
  def console_class
  require "pry"
  Pry
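The new memoized `digests` helper simply wraps `SidekiqUniqueJobs::Digests.new`, so the same calls work outside Thor. A rough console sketch, assuming the gem is loaded and Redis is reachable (pattern and count values are illustrative):

```ruby
digests = SidekiqUniqueJobs::Digests.new

# Equivalent of `list "*"` above: entries(pattern:, count:) returns the matching digests.
entries = digests.entries(pattern: "*", count: 1000)
puts "Found #{entries.size} digests"

# Equivalent of `del "*"` without --dry-run: delete_by_pattern replaces the old del(pattern:).
digests.delete_by_pattern("*", count: 1000)
```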
@@ -41,5 +41,6 @@ module SidekiqUniqueJobs
  UNIQUE_ARGS ||= "unique_args"
  UNIQUE_DIGEST ||= "unique_digest"
  UNIQUE_PREFIX ||= "unique_prefix"
+ UNIQUE_REAPER ||= "uniquejobs:reaper"
  WORKER ||= "worker"
  end
@@ -27,26 +27,44 @@ module SidekiqUniqueJobs
  redis { |conn| conn.zadd(key, now_f, digest) }
  end

+ # Deletes unique digests by pattern
  #
- # Deletes unique digest either by a digest or pattern
- #
- # @overload call_script(digest: "abcdefab")
- # Call script with digest
- # @param [String] digest: a digest to delete
- # @overload call_script(pattern: "*", count: 1_000)
- # Call script with pattern
- # @param [String] pattern: "*" a pattern to match
- # @param [String] count: DEFAULT_COUNT the number of keys to delete
- #
- # @raise [ArgumentError] when given neither pattern nor digest
- #
+ # @param [String] pattern a key pattern to match with
+ # @param [Integer] count the maximum number
  # @return [Array<String>] with unique digests
+ def delete_by_pattern(pattern, count: DEFAULT_COUNT)
+ result, elapsed = timed do
+ digests = entries(pattern: pattern, count: count).keys
+ redis { |conn| BatchDelete.call(digests, conn) }
+ end
+
+ log_info("#{__method__}(#{pattern}, count: #{count}) completed in #{elapsed}ms")
+
+ result
+ end
+
+ # Delete unique digests by digest
+ # Also deletes the :AVAILABLE, :EXPIRED etc keys
  #
- def del(digest: nil, pattern: nil, count: DEFAULT_COUNT)
- return delete_by_pattern(pattern, count: count) if pattern
- return delete_by_digest(digest) if digest
+ # @param [String] digest a unique digest to delete
+ def delete_by_digest(digest) # rubocop:disable Metrics/MethodLength
+ result, elapsed = timed do
+ call_script(:delete_by_digest, [
+ digest,
+ "#{digest}:QUEUED",
+ "#{digest}:PRIMED",
+ "#{digest}:LOCKED",
+ "#{digest}:RUN",
+ "#{digest}:RUN:QUEUED",
+ "#{digest}:RUN:PRIMED",
+ "#{digest}:RUN:LOCKED",
+ key,
+ ])
+ end
+
+ log_info("#{__method__}(#{digest}) completed in #{elapsed}ms")

- raise ArgumentError, "##{__method__} requires either a :digest or a :pattern"
+ result
  end

  #
@@ -92,37 +110,5 @@ module SidekiqUniqueJobs
  ]
  end
  end
-
- private
-
- # Deletes unique digests by pattern
- #
- # @param [String] pattern a key pattern to match with
- # @param [Integer] count the maximum number
- # @return [Array<String>] with unique digests
- def delete_by_pattern(pattern, count: DEFAULT_COUNT)
- result, elapsed = timed do
- digests = entries(pattern: pattern, count: count).keys
- redis { |conn| BatchDelete.call(digests, conn) }
- end
-
- log_info("#{__method__}(#{pattern}, count: #{count}) completed in #{elapsed}ms")
-
- result
- end
-
- # Delete unique digests by digest
- # Also deletes the :AVAILABLE, :EXPIRED etc keys
- #
- # @param [String] digest a unique digest to delete
- def delete_by_digest(digest)
- result, elapsed = timed do
- call_script(:delete_by_digest, [digest, key])
- end
-
- log_info("#{__method__}(#{digest}) completed in #{elapsed}ms")
-
- result
- end
  end
  end
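In practical terms, deleting a single digest now cleans up the whole key family in one call. A hedged sketch (the digest value is purely illustrative):

```ruby
digests = SidekiqUniqueJobs::Digests.new
digest  = "uniquejobs:24c5b03e2d49d765e5dfb2d7c51c5929" # example value only

# Per the key list above, this removes the digest itself, its :QUEUED/:PRIMED/:LOCKED
# companions, the :RUN variants, and the entry in the digests sorted set.
digests.delete_by_digest(digest)
```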
@@ -17,18 +17,24 @@ module SidekiqUniqueJobs
  # @yield to the worker class perform method
  def execute
  if unlock
- runtime_lock.execute { return yield }
+ lock_on_failure do
+ runtime_lock.execute { return yield }
+ end
  else
  log_warn "couldn't unlock digest: #{item[UNIQUE_DIGEST]} #{item[JID]}"
  end
  end

- #
- # Lock only when the server is processing the job
- #
- #
- # @return [SidekiqUniqueJobs::Lock::WhileExecuting] an instance of a lock
- #
+ private
+
+ def lock_on_failure
+ yield
+ rescue Exception # rubocop:disable Lint/RescueException
+ log_error("Runtime lock failed to execute job, restoring server lock")
+ lock
+ raise
+ end
+
  def runtime_lock
  @runtime_lock ||= SidekiqUniqueJobs::Lock::WhileExecuting.new(item, callback, redis_pool)
  end
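The effect of `lock_on_failure` is that an exception raised while the runtime lock is held re-acquires the server lock before propagating, so a retried job is still deduplicated. A hedged worker-level sketch: the worker name, arguments and `Sync.call` are hypothetical, and the `lock:` option assumes the v7 naming:

```ruby
class SyncWorker
  include Sidekiq::Worker
  sidekiq_options lock: :until_and_while_executing

  def perform(user_id)
    # If this raises, the beta5 change restores the server lock before re-raising,
    # rather than leaving the digest unlocked.
    Sync.call(user_id)
  end
end
```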
@@ -1,6 +1,13 @@
  -------- BEGIN keys ---------
- local digest = KEYS[1]
- local digests = KEYS[2]
+ local digest = KEYS[1]
+ local queued = KEYS[2]
+ local primed = KEYS[3]
+ local locked = KEYS[4]
+ local run_digest = KEYS[5]
+ local run_queued = KEYS[6]
+ local run_primed = KEYS[7]
+ local run_locked = KEYS[8]
+ local digests = KEYS[9]
  -------- END keys ---------

  -------- BEGIN injected arguments --------
@@ -15,17 +22,6 @@ local redisversion = tostring(ARGV[5])
  <%= include_partial "shared/_common.lua" %>
  ---------- END local functions ----------

- -------- BEGIN Variables --------
- local queued = digest .. ":QUEUED"
- local primed = digest .. ":PRIMED"
- local locked = digest .. ":LOCKED"
- local run_digest = digest .. ":RUN"
- local run_queued = digest .. ":RUN:QUEUED"
- local run_primed = digest .. ":RUN:PRIMED"
- local run_locked = digest .. ":RUN:LOCKED"
- -------- END Variables --------
-
-
  -------- BEGIN delete_by_digest.lua --------
  local counter = 0
  local redis_version = toversion(redisversion)
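Instead of deriving the companion key names inside Lua, the script now receives all nine keys from the caller; the `delete_by_digest` method shown earlier passes them in exactly this order. A hypothetical helper sketch (name and signature are illustrative) showing the expected ordering:

```ruby
# Builds the nine keys in the order KEYS[1]..KEYS[9] of delete_by_digest.lua expects.
def delete_by_digest_keys(digest, digests_set_key)
  [
    digest,                 # KEYS[1] digest
    "#{digest}:QUEUED",     # KEYS[2] queued
    "#{digest}:PRIMED",     # KEYS[3] primed
    "#{digest}:LOCKED",     # KEYS[4] locked
    "#{digest}:RUN",        # KEYS[5] run_digest
    "#{digest}:RUN:QUEUED", # KEYS[6] run_queued
    "#{digest}:RUN:PRIMED", # KEYS[7] run_primed
    "#{digest}:RUN:LOCKED", # KEYS[8] run_locked
    digests_set_key,        # KEYS[9] digests sorted set
  ]
end
```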
@@ -64,7 +64,11 @@ module SidekiqUniqueJobs
  # @return [Integer] the number of keys deleted
  #
  def delete_lock
- call_script(:delete_by_digest, keys: [unique_digest, DIGESTS])
+ digests.delete_by_digest(unique_digest)
+ end
+
+ def digests
+ @digests ||= SidekiqUniqueJobs::Digests.new
  end
  end
  end
@@ -7,7 +7,9 @@ module SidekiqUniqueJobs
  #
  # @author Mikael Henriksson <mikael@zoolutions.se>
  #
- class Manager
+ module Manager
+ module_function
+
  include SidekiqUniqueJobs::Connection
  include SidekiqUniqueJobs::Logging

@@ -17,8 +19,11 @@ module SidekiqUniqueJobs
  #
  # @return [Concurrent::TimerTask] the task that was started
  #
- def self.start
+ def start # rubocop:disable
+ return if registered?
+
  with_logging_context do
+ register_reaper_process
  log_info("Starting Reaper")
  task.add_observer(Observer.new)
  task.execute
@@ -32,9 +37,10 @@ module SidekiqUniqueJobs
  #
  # @return [Boolean]
  #
- def self.stop
+ def stop
  with_logging_context do
  log_info("Stopping Reaper")
+ unregister_reaper_process
  task.shutdown
  end
  end
@@ -45,7 +51,7 @@ module SidekiqUniqueJobs
  #
  # @return [<type>] <description>
  #
- def self.task
+ def task
  @task ||= Concurrent::TimerTask.new(timer_task_options) do
  with_logging_context do
  redis do |conn|
@@ -61,7 +67,7 @@ module SidekiqUniqueJobs
  #
  # @return [Hash]
  #
- def self.timer_task_options
+ def timer_task_options
  { run_now: true,
  execution_interval: reaper_interval,
  timeout_interval: reaper_timeout }
@@ -70,14 +76,14 @@ module SidekiqUniqueJobs
  #
  # @see SidekiqUniqueJobs::Config#reaper_interval
  #
- def self.reaper_interval
+ def reaper_interval
  SidekiqUniqueJobs.config.reaper_interval
  end

  #
  # @see SidekiqUniqueJobs::Config#reaper_timeout
  #
- def self.reaper_timeout
+ def reaper_timeout
  SidekiqUniqueJobs.config.reaper_timeout
  end

@@ -88,13 +94,43 @@ module SidekiqUniqueJobs
  # @return [Hash] when logger responds to `:with_context`
  # @return [String] when logger does not responds to `:with_context`
  #
- def self.logging_context
+ def logging_context
  if logger_context_hash?
  { "uniquejobs" => "reaper" }
  else
  "uniquejobs=orphan-reaper"
  end
  end
+
+ #
+ # Checks if a reaper is already registered
+ #
+ #
+ # @return [true, false]
+ #
+ def registered?
+ redis { |conn| conn.get(UNIQUE_REAPER) }.to_i == 1
+ end
+
+ #
+ # Writes a mutex key to redis
+ #
+ #
+ # @return [void]
+ #
+ def register_reaper_process
+ redis { |conn| conn.set(UNIQUE_REAPER, 1) }
+ end
+
+ #
+ # Removes mutex key from redis
+ #
+ #
+ # @return [void]
+ #
+ def unregister_reaper_process
+ redis { |conn| conn.del(UNIQUE_REAPER) }
+ end
  end
  end
  end
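The registration guard is just a flag key in Redis: `start` bails out if it is already set, sets it before launching the timer task, and `stop` clears it. A minimal redis-rb sketch of that lifecycle, using the `uniquejobs:reaper` key added to the constants above (connection details are illustrative):

```ruby
require "redis"

redis = Redis.new # assumes a local Redis, purely as an example

# register_reaper_process / registered? / unregister_reaper_process, as plain calls:
redis.set("uniquejobs:reaper", 1)        # mark a reaper as running
redis.get("uniquejobs:reaper").to_i == 1 # => true, so a second Manager.start returns early
redis.del("uniquejobs:reaper")           # cleared again when the reaper stops
```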
@@ -3,5 +3,5 @@
  module SidekiqUniqueJobs
  #
  # @return [String] the current SidekiqUniqueJobs version
- VERSION = "7.0.0.beta4"
+ VERSION = "7.0.0.beta5"
  end
@@ -32,7 +32,7 @@ module SidekiqUniqueJobs
  end

  app.get "/locks/delete_all" do
- digests.del(pattern: "*", count: digests.count)
+ digests.delete_by_pattern("*", count: digests.count)
  redirect_to :locks
  end

@@ -44,7 +44,7 @@ module SidekiqUniqueJobs
  end

  app.get "/locks/:digest/delete" do
- digests.del(digest: params[:digest])
+ digests.delete_by_digest(params[:digest])
  redirect_to :locks
  end
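These `/locks` routes belong to the gem's Sidekiq::Web extension; a hedged mounting sketch for a Rails routes file, assuming the documented `sidekiq_unique_jobs/web` require:

```ruby
# config/routes.rb
require "sidekiq/web"
require "sidekiq_unique_jobs/web"

Rails.application.routes.draw do
  # The patched /locks pages (delete_all, per-digest delete) appear under this mount.
  mount Sidekiq::Web, at: "/sidekiq"
end
```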
 
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: sidekiq-unique-jobs
  version: !ruby/object:Gem::Version
- version: 7.0.0.beta4
+ version: 7.0.0.beta5
  platform: ruby
  authors:
  - Mikael Henriksson
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2019-11-25 00:00:00.000000000 Z
+ date: 2019-11-26 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: brpoplpush-redis_script
@@ -371,7 +371,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: 1.3.1
  requirements: []
- rubygems_version: 3.0.3
+ rubygems_version: 3.0.6
  signing_key:
  specification_version: 4
  summary: Sidekiq middleware that prevents duplicates jobs