resque-batched-logger 0.0.9

data/.gitignore ADDED
@@ -0,0 +1,5 @@
+ *.gem
+ .bundle
+ Gemfile.lock
+ pkg/*
+ coverage
data/Gemfile ADDED
@@ -0,0 +1,4 @@
+ source "http://rubygems.org"
+
+ # Specify your gem's dependencies in resque-batched-logger.gemspec
+ gemspec
data/README ADDED
@@ -0,0 +1,75 @@
+ Resque Batched Logger
+ =====================
+
+ ResqueBatchedLogger is an extension to the fantastic background job gem 'resque'. It is intended for when you process large volumes of work in logical groups, or batches, and want timing and logging information for the group as a whole. For example, we have a daily import process which spawns a large number of jobs of several types and runs for a couple of hours; this grouped logging lets us track when those jobs finish, and how long each type took.
+
+ Installation:
+ =============
+
+   gem install resque-batched-logger
+
+ Usage:
+ ======
+
+   # Include the module in your job classes
+   class MyJob
+     include Resque::Plugins::BatchedLogging
+     def self.perform(*args)
+       # your job implementation
+     end
+   end
+
+   # Enqueue your jobs by calling #enqueue from within a batched block on your job class.
+   # This will call any custom #self.enqueue or #self.create method defined on your job class,
+   # defaulting to the standard Resque.enqueue, after wrapping your job in its logging code.
+   # Jobs are grouped into batches by job class name, so only one batch per class can run at a
+   # time; to run multiple batches at once, see the subclassing approach below.
+   MyJob.batched do
+     enqueue(1,2,3,{:my => :options})
+   end
+
+ You'll need to run at least one resque worker on the queue named 'batched_logger' (only one is recommended, for consistent log ordering). This worker can be fairly low priority, depending on how quickly you need output written to the logfile.
+
+ The log for each batch of jobs is written to `log/batched_jobs.log` by the BatchedLogger resque job.
+
+ Sample Output:
+ ==============
+
+   ==== Batched jobs 'Daily MyJob run' : logged at Mon Mar 28 16:25:04 +1300 2011 ====
+     batch started processing at:  Mon Mar 28 16:23:00 +1300 2011
+     batch finished processing at: Mon Mar 28 16:25:02 +1300 2011
+     Total run time for batch: 122 seconds
+     Jobs Enqueued: 220
+     Jobs Processed: 220
+     Average time per job: 0.527 seconds
+     Total time spent processing jobs: 116 seconds
+   ==== Batched jobs 'Daily MyJob run' completed at Mon Mar 28 16:25:02 +1300 2011 took 122 seconds ====
+
+ Advanced Usage
+ ==============
+
+ If you want to batch jobs of the same type in multiple groups, or your application might enqueue jobs of the same type while a batch is running, it is recommended that you subclass your job to create a batch-only class, giving the batch an exclusive scope.
+
+ example:
+
+   class BackendLifting
+     @queue = :standard
+     def self.perform(user_id)
+       # do something
+     end
+   end
+
+   class BatchedBackendLifting < BackendLifting
+     # This subclass simply provides an exclusive scope
+   end
+
+ This allows you to batch your jobs via:
+
+   BatchedBackendLifting.batched do
+     user_ids.each do |id|
+       enqueue(id)
+     end
+   end
+
+ This prevents any `BackendLifting` jobs your application enqueues at the same time from being inadvertently logged as part of the batch. It is only necessary if you want to guarantee that no extra jobs are added to your batch's logging, or if you need to enqueue multiple batches at once.
+
+ How it works
+ ============
+ Timing information for each job is stored in redis until all jobs in the batch have been performed; the BatchedLogger job then pulls this information out of redis, aggregates it, and writes it to the logfile 'log/batched_jobs.log'.
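As a rough sketch of that redis layout (the batch name 'MyJob' and the values shown are illustrative, not taken from the gem):

  # While a batch named "MyJob" is in flight, two keys live in resque's redis connection:
  Resque.redis.get("MyJob:jobcount")       # => "220"  -- jobs enqueued via the batched block
  Resque.redis.llen("batch_stats:MyJob")   # => 118    -- one entry per job already performed
  # Each list entry is (roughly) a JSON array of [run_time, start_time, end_time]:
  Resque.redis.lrange("batch_stats:MyJob", 0, 0)
  # => ["[0.52,\"Mon Mar 28 16:23:00 +1300 2011\",\"Mon Mar 28 16:23:01 +1300 2011\"]"]
  # The BatchedLogger job deletes both keys once it has written the batch's log.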
data/Rakefile ADDED
@@ -0,0 +1,19 @@
+ require 'bundler'
+ Bundler::GemHelper.install_tasks
+
+ require 'rake/testtask'
+ Rake::TestTask.new(:test) do |test|
+   test.libs << 'lib' << 'test'
+   test.pattern = 'test/**/test_*.rb'
+   test.verbose = true
+ end
+
+ require 'rcov/rcovtask'
+ Rcov::RcovTask.new do |test|
+   test.rcov_opts << '--exclude /gems/,/Library/,/usr/,spec,lib/tasks' # exclude external gems/libraries
+   test.libs << 'test'
+   test.pattern = 'test/**/test_*.rb'
+   test.verbose = true
+ end
+
+ task :default => :test
data/lib/resque/plugins/batched_logger.rb ADDED
@@ -0,0 +1,111 @@
+ module Resque
+   module Plugins
+     class BatchedLogger < Resque::Job
+
+       require 'time'
+       require 'json'
+       require 'fileutils'
+
+       @queue = :batched_logger
+       LOG_FILE = "log/batched_jobs.log"
+       @@logger = nil
+
+       # Requeues all loggers
+       def self.requeue_all
+         Resque.redis.keys("*:jobcount").each do |key|
+           Resque.enqueue(self, key.gsub(":jobcount", ""))
+         end
+       end
+
+       def self.perform(batch_name)
+         begin
+           FileUtils.mkdir_p(File.dirname(LOG_FILE))
+           FileUtils.touch(LOG_FILE)
+
+           if !batch_to_log?(batch_name)
+             logger.puts("No jobs to log for '#{batch_name}'")
+           elsif jobs_pending?(batch_name)
+             sleep 5 # Wait 5 seconds
+             Resque.enqueue(self, batch_name) # Requeue, to check again
+           else
+             # Start processing the logs: pull in the info stored in redis
+             job_stats = {
+               :processing_time => 0, :longest_processing_time => 0,
+               :processed_job_count => 0, :enqueued_job_count => Resque.redis.get("#{batch_name}:jobcount"),
+               :start_time => nil, :finish_time => nil
+             }
+             Resque.redis.del("#{batch_name}:jobcount") # Stop any other logging process picking up the same batch
+             # lpop pops the first element off the list in redis
+             while job = Resque.redis.lpop("batch_stats:#{batch_name}")
+               job = decode_job(job) # Decode from string format
+               # job => [run_time, start_time, end_time]
+               job_stats[:processing_time] += job[0] # run_time
+               job_stats[:longest_processing_time] = job[0] if job[0] > job_stats[:longest_processing_time]
+               job_stats[:start_time] ||= job[1] # start_time
+               job_stats[:finish_time] ||= job[2]
+               job_stats[:finish_time] = job[2] if job[2] > job_stats[:finish_time] # end_time
+               job_stats[:processed_job_count] += 1
+             end
+             Resque.redis.del("batch_stats:#{batch_name}") # Clean up the list of stats we've just processed (it should be empty now)
+
+             if job_stats[:processed_job_count].zero?
+               log_empty_batch(batch_name)
+             else
+               log_stats(batch_name, job_stats)
+             end
+           end
+         ensure
+           logger.close
+           @@logger = nil
+         end
+       end
+
+       private
+
+       def self.logger
+         @@logger ||= File.open(LOG_FILE, "a")
+       end
+
+       def self.log_stats(batch_name, job_stats)
+         job_stats[:total_time] = job_stats[:finish_time] - job_stats[:start_time]
+
+         # Aggregate stats
+         job_stats[:average_time] = job_stats[:total_time] / job_stats[:processed_job_count].to_f
+
+         logger.puts "==== Batched jobs '#{batch_name}' : logged at #{Time.now.to_s} ===="
+         logger.puts "  batch started processing at:  #{job_stats[:start_time]}"
+         logger.puts "  batch finished processing at: #{job_stats[:finish_time]}"
+         logger.puts "  Total run time for batch: #{job_stats[:total_time]} seconds"
+
+         logger.puts "  Jobs Enqueued: #{job_stats[:enqueued_job_count]}"
+         logger.puts "  Jobs Processed: #{job_stats[:processed_job_count]}"
+         logger.puts "  Average time per job: #{job_stats[:average_time]} seconds"
+         logger.puts "  Total time spent processing jobs: #{job_stats[:processing_time]} seconds"
+
+         logger.puts "==== Batched jobs '#{batch_name}' completed at #{job_stats[:finish_time]} took #{job_stats[:total_time]} seconds ===="
+       end
+
+       def self.log_empty_batch(batch_name)
+         logger.puts "==== Batched jobs '#{batch_name}' : logged at #{Time.now.to_s} ===="
+         logger.puts "==== Batched jobs '#{batch_name}' completed, 0 jobs enqueued ===="
+       end
+
+       # It is possible for more jobs to be processed than the number recorded; in that case we process them all anyway
+       def self.jobs_pending?(batch_name)
+         jobs = Resque.redis.get("#{batch_name}:jobcount")
+         jobs && Resque.redis.llen("batch_stats:#{batch_name}") < jobs.to_i
+       end
+
+       def self.batch_to_log?(batch_name)
+         Resque.redis.get("#{batch_name}:jobcount")
+       end
+
+       def self.decode_job(job)
+         job = JSON.parse(job)
+         job[1] = Time.parse(job[1])
+         job[2] = Time.parse(job[2])
+         job
+       end
+
+     end
+   end
+ end
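If the logger worker dies, a batch's ':jobcount' key can be left behind in redis and the batch's log never gets written. A minimal recovery sketch for the `requeue_all` helper above, assuming a console with the gem loaded and Resque's redis connection already configured:

  require 'resque-batched-logger'

  # Re-enqueues a BatchedLogger job for every "<batch_name>:jobcount" key still present in redis;
  # each requeued logger either writes its batch's log, or requeues itself until the remaining
  # jobs in that batch have finished.
  Resque::Plugins::BatchedLogger.requeue_all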
data/lib/resque/plugins/batched_logging.rb ADDED
@@ -0,0 +1,85 @@
+ module Resque
+   module Plugins
+     module BatchedLogging
+
+       require 'benchmark'
+
+       class BatchExists < StandardError; end
+
+       def self.included(base)
+         base.class_eval do
+           class << self
+
+             # MyJob.batched do
+             #   enqueue(1,2,3,{:my => :options})
+             # end
+             def batched(&block)
+               batch_name = self.to_s # Group the batch of jobs by the job's class name
+               raise Resque::Plugins::BatchedLogging::BatchExists.new("Batch for '#{batch_name}' exists already") if Resque.redis.get("#{batch_name}:jobcount")
+
+               job_count = 0
+               Resque.redis.set("#{batch_name}:jobcount", job_count) # Set the job count right away, because the workers will check for its existence
+               if block_given?
+                 proxy_obj = Resque::Plugins::BatchedLogging::BatchLoggerProxy.new(self, batch_name)
+                 proxy_obj.run(&block)
+                 job_count = proxy_obj.job_count
+               else
+                 raise "Must pass a block through to a batched group of jobs"
+               end
+               Resque.redis.set("#{batch_name}:jobcount", job_count)
+               Resque.enqueue(Resque::Plugins::BatchedLogger, batch_name) # Queue a job to process the log information that is stored in redis
+             end
+
+             # Plugin around hook for wrapping the job in logging code
+             def around_perform_log_as_batched(*args)
+               batch_name = self.to_s
+               # Presence of the jobcount key means that batched logging is enabled for this job type
+               if Resque.redis.get("#{batch_name}:jobcount")
+                 # Perform our logging
+                 start_time = Time.now
+                 run_time = Benchmark.realtime do
+                   yield
+                 end
+                 end_time = Time.now
+                 # Store [run_time, start_time, end_time] as an entry in a redis list specific to this job type
+                 # rpush appends to the end of a list in redis
+                 Resque.redis.rpush("batch_stats:#{batch_name}", [run_time, start_time, end_time].to_json) # Push the values onto the end of the list (creating the list if it doesn't exist)
+                 # End of our logging
+               else
+                 yield # Just perform the standard job without benchmarking
+               end
+             end
+
+           end
+         end
+       end
+
+       # For enabling MyJob.batched with block syntax
+       class BatchLoggerProxy
+         attr_reader :job_type, :batch_name
+         attr_accessor :job_count
+         def initialize(job_type, batch_name)
+           @job_type = job_type
+           @batch_name = batch_name || @job_type.to_s
+           @job_count = 0
+         end
+         def run(&block)
+           instance_eval(&block) if block_given?
+         end
+         # Capture #create and #enqueue calls made within the scope of a MyJob.batched block, and enqueue the original job
+         def enqueue(*args)
+           if @job_type.respond_to?(:enqueue)
+             @job_type.enqueue(*args)
+           elsif @job_type.respond_to?(:create)
+             @job_type.create(*args)
+           else
+             Resque.enqueue(@job_type, *args)
+           end
+           @job_count += 1
+         end
+         alias :create :enqueue
+       end
+
+     end
+   end
+ end
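To illustrate the delegation in BatchLoggerProxy#enqueue above: if the job class defines its own `enqueue` (or `create`) class method, that method is used; otherwise the proxy falls back to `Resque.enqueue`. A small sketch (the `MyJob` class and its custom `enqueue` are hypothetical, not part of the gem):

  class MyJob
    include Resque::Plugins::BatchedLogging
    @queue = :standard

    # Custom class-level enqueue; BatchLoggerProxy will call this rather than Resque.enqueue
    def self.enqueue(*args)
      Resque.enqueue(self, *args)
    end

    def self.perform(*args)
      # work goes here
    end
  end

  MyJob.batched do
    enqueue(42)  # routed through MyJob.enqueue by the proxy, and counted toward the batch
  end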
data/lib/resque-batched-logger/version.rb ADDED
@@ -0,0 +1,7 @@
+ module Resque
+   module Batched
+     module Logger
+       VERSION = "0.0.9"
+     end
+   end
+ end
data/lib/resque-batched-logger.rb ADDED
@@ -0,0 +1,3 @@
+ lib_dir = File.dirname(__FILE__)
+ require File.join(lib_dir, 'resque/plugins/batched_logging')
+ require File.join(lib_dir, 'resque/plugins/batched_logger')
data/resque-batched-logger.gemspec ADDED
@@ -0,0 +1,27 @@
+ # -*- encoding: utf-8 -*-
+ $:.push File.expand_path("../lib", __FILE__)
+ require "resque-batched-logger/version"
+
+ Gem::Specification.new do |s|
+   s.name        = "resque-batched-logger"
+   s.version     = Resque::Batched::Logger::VERSION
+   s.platform    = Gem::Platform::RUBY
+   s.authors     = ["Jeremy Olliver"]
+   s.email       = ["jeremy.olliver@gmail.com"]
+   s.homepage    = "http://github.com/heaps/resque-batched-logger"
+   s.summary     = %q{Allows resque jobs to be run in batches with aggregate logging}
+   s.description = %q{Allows resque jobs to be run in batches with aggregate logging, timing and statistics}
+
+   s.files         = `git ls-files`.split("\n")
+   s.test_files    = `git ls-files -- {test,spec,features}/*`.split("\n")
+   s.executables   = `git ls-files -- bin/*`.split("\n").map{ |f| File.basename(f) }
+   s.require_paths = ["lib"]
+
+   s.add_dependency 'resque'
+   # redis and json are already dependencies of resque, but they are declared explicitly because this gem calls them directly
+   s.add_dependency 'redis'
+   s.add_dependency 'json'
+   s.add_development_dependency 'bundler'
+   s.add_development_dependency 'minitest'
+   s.add_development_dependency 'rcov'
+ end
data/test/helper.rb ADDED
@@ -0,0 +1,39 @@
+ require 'rubygems'
+ require 'bundler'
+ begin
+   Bundler.setup(:default, :development)
+ rescue Bundler::BundlerError => e
+   $stderr.puts e.message
+   $stderr.puts "Run `bundle install` to install missing gems"
+   exit e.status_code
+ end
+ require 'minitest/unit'
+ require 'resque'
+
+ # Namespace our tests
+ Resque.redis = Redis::Namespace.new('batched-logger/test:', :redis => Resque.redis)
+
+ $LOAD_PATH.unshift(File.dirname(__FILE__))
+ $LOAD_PATH.unshift(File.join(File.dirname(__FILE__), '..', 'lib'))
+ require 'resque-batched-logger'
+ require 'shared_utilities'
+
+ class MiniTest::Unit::TestCase
+   # def setup
+   #   global_teardown
+   # end
+ end
+
+ def global_teardown
+   # Don't do a flushdb on redis; that doesn't respect the namespace
+   Resque.redis.keys("*:jobcount").each { |k| Resque.redis.del(k.to_s) }    # Clear our job counts
+   Resque.redis.keys("batch_stats:*").each { |k| Resque.redis.del(k.to_s) } # Clear the lists of job stats
+   Resque.clear_test_jobs
+   SampleJob.clear_history
+   SampleModuleJob.clear_history
+   FileUtils.rm(Resque::Plugins::BatchedLogger::LOG_FILE) if File.exist?(Resque::Plugins::BatchedLogger::LOG_FILE)
+ end
+ global_teardown
+
+ MiniTest::Unit.autorun
data/test/shared_utilities.rb ADDED
@@ -0,0 +1,72 @@
+ # Test Helper classes
+ class SampleJob
+   @@job_history = []
+   @queue = 'sample_job_queue'
+   include Resque::Plugins::BatchedLogging
+
+   def self.perform(*args)
+     @@job_history << args
+     sleep(0.5)
+   end
+
+   def self.job_history
+     @@job_history
+   end
+   def self.clear_history
+     @@job_history = []
+   end
+ end
+
+ class BatchedSampleJob < SampleJob
+   # This subclass is simply for namespacing batched 'SampleJob's
+ end
+
+ module SampleModuleJob
+   @@job_history = []
+   include Resque::Plugins::BatchedLogging
+
+   def self.perform(*args)
+     @@job_history << args
+     sleep(0.5)
+   end
+   def self.job_history
+     @@job_history
+   end
+   def self.clear_history
+     @@job_history = []
+   end
+ end
+
+ # Test Overrides
+ module Resque
+   @@test_jobs = []
+   class TestJob
+     attr_accessor :args
+     def initialize(*args)
+       @args = args
+     end
+     def perform
+       local_payload = {'class' => @args.shift, 'args' => @args }
+       # Perform using Resque::Job, because that's what implements the hooks we need to test
+       Resque::Job.new('test_queue', local_payload).perform
+     end
+   end
+   def self.enqueue(*args)
+     @@test_jobs << Resque::TestJob.new(*args)
+   end
+   def self.test_jobs
+     @@test_jobs
+   end
+   def self.clear_test_jobs
+     @@test_jobs = []
+   end
+   def self.perform_test_jobs(options = {})
+     number_processed = 0
+     while job = @@test_jobs.shift # Pop from the front of the array of pending jobs
+       job.perform
+       number_processed += 1
+       break if number_processed == options[:limit]
+     end
+     number_processed
+   end
+ end
data/test/test_batched_logger.rb ADDED
@@ -0,0 +1,72 @@
+ require 'helper'
+
+ class TestBatchedLogger < MiniTest::Unit::TestCase
+
+   require 'parsedate'
+
+   def test_log_format
+     arguments = [[1,2,3], [5,6,{:custom => :options}]]
+     batch_name = "SampleJob"
+     SampleJob.batched do
+       arguments.each do |arg|
+         enqueue(*arg)
+       end
+     end
+     Resque.perform_test_jobs
+     logged_data = File.read(Resque::Plugins::BatchedLogger::LOG_FILE)
+     assert_message_logged_with_valid_times(logged_data, Regexp.new("==== Batched jobs \'#{batch_name}\' : logged at (.*) ===="))
+     assert_message_logged_with_valid_times(logged_data, /batch started processing at: (.*)/)
+     assert_message_logged_with_valid_times(logged_data, /batch finished processing at: (.*)/)
+     assert logged_data.match(/Total run time for batch: [\d\.]+ seconds/), "log should include total run time"
+     assert logged_data.match(/Jobs Enqueued: 2/), "log should include number of jobs enqueued"
+     assert logged_data.match(/Jobs Processed: 2/), "log should include number of jobs processed"
+     assert logged_data.match(/Average time per job: [\d\.]+ seconds/), "log should include average time per job"
+     assert logged_data.match(/Total time spent processing jobs: [\d\.]+ seconds/), "log should include total time spent processing"
+     assert_message_logged_with_valid_times(logged_data, Regexp.new("==== Batched jobs \'#{batch_name}\' completed at (.*) took [\\d\\.]+ seconds ===="))
+
+     assert total_run_time = logged_data.match(/Total run time for batch: ([\d\.]+) seconds/)[1]
+     assert final_completion_time = logged_data.match(Regexp.new("==== Batched jobs \'#{batch_name}\' completed at .* took ([\\d\\.]+) seconds ===="))[1]
+     assert_equal(total_run_time, final_completion_time, "Final completion length should match total time for batch")
+
+     assert_empty Resque.redis.keys("*:jobcount")
+     assert_empty Resque.redis.keys("batch_stats:*")
+   end
+
+   def test_logger_running_before_jobs_finished
+     Resque::Plugins::BatchedLogger.perform("SampleJob") # Should do no work
+     assert_empty Resque.test_jobs
+     arguments = [[1,2,3], [5,6,{:custom => :options}]]
+     SampleJob.batched do
+       arguments.each do |arg|
+         enqueue(*arg)
+       end
+     end
+     assert_equal 3, Resque.test_jobs.size
+     Resque.test_jobs.pop.perform # Try running the batch logger first; it should be requeued
+     assert_equal 3, Resque.test_jobs.size, "The batch logger should have been requeued"
+     Resque.perform_test_jobs
+     assert_empty Resque.test_jobs
+   end
+
+   def teardown
+     global_teardown
+   end
+
+   protected
+
+   # The line must match the overall Regexp 'format', and every captured group () in the regexp must be a valid time string
+   def assert_message_logged_with_valid_times(string, format)
+     matches = string.match(format)
+     assert matches, "input #{string} did not match the specified format: #{format.inspect}"
+     matches[1..-1].each do |m|
+       assert valid_date?(m), "Expected #{m} to be a valid date"
+     end
+   end
+
+   def valid_date?(string)
+     # Returns true if the string's Year, Month and Day can be correctly determined
+     # parsedate returns an array of values for a date/time, starting with year and decreasing in unit size
+     (d = ParseDate.parsedate(string)) && d[0..2].compact.size == 3
+   end
+
+ end
data/test/test_batched_logging.rb ADDED
@@ -0,0 +1,107 @@
+ require 'helper'
+
+ class TestBatchedLogging < MiniTest::Unit::TestCase
+
+   def test_class_extensions
+     defined_methods = [:batched, :around_perform_log_as_batched]
+     [SampleJob, SampleModuleJob, BatchedSampleJob].each do |klass|
+       defined_methods.each do |meth_name|
+         assert klass.respond_to?(meth_name), "##{meth_name} should be defined on #{klass.to_s}"
+       end
+     end
+   end
+
+   def test_enqueueing_a_batch
+     arguments = [[1,2,3], [5,6,{:custom => :options}]]
+     SampleJob.batched do
+       arguments.each do |arg|
+         enqueue(*arg)
+       end
+     end
+     expected_job_list = arguments.collect {|j| [SampleJob] + j } # Jobs should be enqueued with the correct job class, and batch group
+     expected_job_list << [Resque::Plugins::BatchedLogger, "SampleJob"] # We expect the BatchedLogger to have been enqueued as well
+     assert_equal expected_job_list, Resque.test_jobs.collect(&:args), "Enqueued arguments should have a batch name hash appended"
+
+     Resque.perform_test_jobs
+     assert_equal arguments, SampleJob.job_history, "The processing job should have received the arguments without the :batched_log_group options hash"
+     assert_empty Resque.test_jobs, "Queue should be empty"
+   end
+
+   def test_enqueueing_without_batched
+     arguments = [[1,2,3], [5,6,{:custom => :options}]]
+     arguments.each do |args|
+       Resque.enqueue(SampleJob, *args)
+     end
+     expected_job_list = arguments.collect {|j| [SampleJob] + j }
+     assert_equal expected_job_list, Resque.test_jobs.collect(&:args), "Enqueued arguments should be unmodified"
+
+     Resque.perform_test_jobs
+     assert_equal arguments, SampleJob.job_history, "The arguments received by the actual job should have been unmodified"
+     assert_empty Resque.test_jobs, "Queue should be empty"
+   end
+
+   def test_queuing_multiple_batches
+     arguments = [[1,2,3], [4,5], [5,6,{:custom => :options}]]
+     # Should be able to do this twice without raising an error
+     2.times do
+       SampleJob.batched do
+         arguments.each do |args|
+           enqueue(*args)
+         end
+       end
+       assert_equal 4, Resque.test_jobs.size, "3 jobs + the logger job should have been queued"
+       Resque.perform_test_jobs
+       assert_empty Resque.test_jobs
+     end
+   end
+
+   def test_queueing_nothing
+     SampleJob.batched do
+       [].each do |args|
+         enqueue(*args)
+       end
+     end
+     Resque.perform_test_jobs
+   end
+
+   def test_single_jobs_dont_interrupt_batch
+     arguments = [[1,2,3], [4,5], [5,6,{:custom => :options}]]
+     SampleJob.batched do
+       enqueue(*arguments[0]) # Enqueue the first job as part of the batch
+       Resque.enqueue(SampleJob, *arguments[1]) # Enqueue a non-batched job in the middle of the batch being queued up (should be processed independently)
+       enqueue(*arguments[2]) # Enqueue a second batched job
+     end
+     assert_equal 4, Resque.test_jobs.size, "3 jobs + the logger job should have been queued"
+     assert_equal 2, Resque.redis.get("SampleJob:jobcount").to_i, "should have listed 2 jobs on the queue"
+     Resque.perform_test_jobs(:limit => 3) # Process just the first 3 jobs (2 batched, 1 individual), don't process the logs yet
+     assert_equal arguments, SampleJob.job_history
+     assert_equal 3, Resque.redis.llen("batch_stats:SampleJob"), "All 3 jobs will have been processed as batched"
+     Resque.perform_test_jobs # Do the log processing
+     assert_empty Resque.test_jobs
+   end
+
+   # Same test as 'test_single_jobs_dont_interrupt_batch', except we batch with the subclass, so enqueueing the superclass won't affect our logged job count
+   def test_sub_classed_batch_jobs
+     assert BatchedSampleJob.respond_to?(:perform)
+     assert_equal SampleJob, BatchedSampleJob.superclass
+
+     arguments = [[1,2,3], [4,5], [5,6,{:custom => :options}]]
+     BatchedSampleJob.batched do
+       enqueue(*arguments[0]) # Enqueue the first job as part of the batch
+       Resque.enqueue(SampleJob, *arguments[1]) # Enqueue a non-batched job in the middle of the batch being queued up (should be processed independently)
+       enqueue(*arguments[2]) # Enqueue a second batched job
+     end
+     assert_equal 4, Resque.test_jobs.size, "3 jobs + the logger job should have been queued"
+     assert_equal 2, Resque.redis.get("BatchedSampleJob:jobcount").to_i, "should have listed 2 jobs on the queue"
+     Resque.perform_test_jobs(:limit => 3) # Process just the first 3 jobs (2 batched, 1 individual), don't process the logs yet
+     assert_equal arguments, SampleJob.job_history
+     assert_equal 2, Resque.redis.llen("batch_stats:BatchedSampleJob"), "Only the 2 batched jobs should have been processed as batched"
+     Resque.perform_test_jobs # Do the log processing
+     assert_empty Resque.test_jobs
+   end
+
+   def teardown
+     global_teardown
+   end
+
+ end
metadata ADDED
@@ -0,0 +1,166 @@
+ --- !ruby/object:Gem::Specification
+ name: resque-batched-logger
+ version: !ruby/object:Gem::Version
+   hash: 13
+   prerelease:
+   segments:
+   - 0
+   - 0
+   - 9
+   version: 0.0.9
+ platform: ruby
+ authors:
+ - Jeremy Olliver
+ autorequire:
+ bindir: bin
+ cert_chain: []
+
+ date: 2011-03-31 00:00:00 +13:00
+ default_executable:
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   name: resque
+   prerelease: false
+   requirement: &id001 !ruby/object:Gem::Requirement
+     none: false
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         hash: 3
+         segments:
+         - 0
+         version: "0"
+   type: :runtime
+   version_requirements: *id001
+ - !ruby/object:Gem::Dependency
+   name: redis
+   prerelease: false
+   requirement: &id002 !ruby/object:Gem::Requirement
+     none: false
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         hash: 3
+         segments:
+         - 0
+         version: "0"
+   type: :runtime
+   version_requirements: *id002
+ - !ruby/object:Gem::Dependency
+   name: json
+   prerelease: false
+   requirement: &id003 !ruby/object:Gem::Requirement
+     none: false
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         hash: 3
+         segments:
+         - 0
+         version: "0"
+   type: :runtime
+   version_requirements: *id003
+ - !ruby/object:Gem::Dependency
+   name: bundler
+   prerelease: false
+   requirement: &id004 !ruby/object:Gem::Requirement
+     none: false
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         hash: 3
+         segments:
+         - 0
+         version: "0"
+   type: :development
+   version_requirements: *id004
+ - !ruby/object:Gem::Dependency
+   name: minitest
+   prerelease: false
+   requirement: &id005 !ruby/object:Gem::Requirement
+     none: false
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         hash: 3
+         segments:
+         - 0
+         version: "0"
+   type: :development
+   version_requirements: *id005
+ - !ruby/object:Gem::Dependency
+   name: rcov
+   prerelease: false
+   requirement: &id006 !ruby/object:Gem::Requirement
+     none: false
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         hash: 3
+         segments:
+         - 0
+         version: "0"
+   type: :development
+   version_requirements: *id006
+ description: Allows resque jobs to be run in batches with aggregate logging, timing and statistics
+ email:
+ - jeremy.olliver@gmail.com
+ executables: []
+
+ extensions: []
+
+ extra_rdoc_files: []
+
+ files:
+ - .gitignore
+ - Gemfile
+ - README
+ - Rakefile
+ - lib/resque-batched-logger.rb
+ - lib/resque-batched-logger/version.rb
+ - lib/resque/plugins/batched_logger.rb
+ - lib/resque/plugins/batched_logging.rb
+ - resque-batched-logger.gemspec
+ - test/helper.rb
+ - test/shared_utilities.rb
+ - test/test_batched_logger.rb
+ - test/test_batched_logging.rb
+ has_rdoc: true
+ homepage: http://github.com/heaps/resque-batched-logger
+ licenses: []
+
+ post_install_message:
+ rdoc_options: []
+
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   none: false
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       hash: 3
+       segments:
+       - 0
+       version: "0"
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   none: false
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       hash: 3
+       segments:
+       - 0
+       version: "0"
+ requirements: []
+
+ rubyforge_project:
+ rubygems_version: 1.6.2
+ signing_key:
+ specification_version: 3
+ summary: Allows resque jobs to be run in batches with aggregate logging
+ test_files:
+ - test/helper.rb
+ - test/shared_utilities.rb
+ - test/test_batched_logger.rb
+ - test/test_batched_logging.rb