processor 1.0.0 → 2.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (41)
  1. data/.coveralls.yml +1 -0
  2. data/.gitignore +1 -0
  3. data/README.md +68 -41
  4. data/example/migrator.rb +2 -0
  5. data/lib/processor/data/array_processor.rb +4 -0
  6. data/lib/processor/data/null_processor.rb +7 -10
  7. data/lib/processor/environment.rb +3 -0
  8. data/lib/processor/event_processor.rb +25 -0
  9. data/lib/processor/logger_messages.rb +56 -0
  10. data/lib/processor/messenger.rb +16 -3
  11. data/lib/processor/observer/logger.rb +43 -34
  12. data/lib/processor/observer/null_observer.rb +8 -3
  13. data/lib/processor/process_runner/threads.rb +3 -8
  14. data/lib/processor/runner.rb +8 -23
  15. data/lib/processor/subroutine/counter.rb +25 -0
  16. data/lib/processor/subroutine/name.rb +28 -0
  17. data/lib/processor/subroutine/recursion.rb +29 -0
  18. data/lib/processor/thread.rb +5 -1
  19. data/lib/processor/version.rb +1 -1
  20. data/lib/processor.rb +2 -1
  21. data/processor.gemspec +6 -4
  22. data/spec/processor/data/array_processor_spec.rb +1 -2
  23. data/spec/processor/data/null_processor_spec.rb +27 -2
  24. data/spec/processor/event_processor_spec.rb +58 -0
  25. data/spec/processor/logger_messages_spec.rb +69 -0
  26. data/spec/processor/messenger_spec.rb +15 -0
  27. data/spec/processor/observer/logger_spec.rb +47 -16
  28. data/spec/processor/observer/null_observer_spec.rb +33 -0
  29. data/spec/processor/process_runner/specs.rb +12 -23
  30. data/spec/processor/process_runner/threads_spec.rb +1 -3
  31. data/spec/processor/runner_spec.rb +28 -54
  32. data/spec/processor/subroutine/counter_spec.rb +44 -0
  33. data/spec/processor/subroutine/name_spec.rb +23 -0
  34. data/spec/processor/subroutine/recursion_spec.rb +23 -0
  35. data/spec/processor/thread_spec.rb +1 -1
  36. data/spec/spec_helper_lite.rb +7 -2
  37. data/spec/support/dummy_file +0 -0
  38. metadata +47 -12
  39. data/lib/processor/data/solr_processor.rb +0 -23
  40. data/lib/processor/events_registrator.rb +0 -16
  41. data/spec/processor/events_registrator_spec.rb +0 -15
data/.coveralls.yml ADDED
@@ -0,0 +1 @@
+ service_name: travis-ci
data/.gitignore CHANGED
@@ -7,6 +7,7 @@ Gemfile.lock
  /tmp
  /log
  /tags
+ /coverage

  ## generic files to ignore
  *~
data/README.md CHANGED
@@ -1,15 +1,31 @@
  Processor
  ==========
  [![Build Status](https://travis-ci.org/AlexParamonov/processor.png?branch=master)](http://travis-ci.org/AlexParamonov/processor)
- [![Gemnasium Build Status](https://gemnasium.com/AlexParamonov/processor.png)](http://gemnasium.com/AlexParamonov/processor)
+ [![Gemnasium Build Status](https://gemnasium.com/AlexParamonov/processor.png)](http://gemnasium.com/AlexParamonov/processor)
+ [![Coverage Status](https://coveralls.io/repos/AlexParamonov/processor/badge.png?branch=master)](https://coveralls.io/r/AlexParamonov/processor?branch=master)

- Processor could execute any `DataProcessor` you specify and log entire
- process using any number of loggers you need.

- You may add own observers for monitoring background tasks on even send
- an email to bussiness with generated report.
+ Processor is a tool that helps you iterate over a collection and
+ perform complex actions on the result. It is extremely useful for data
+ migrations, report generation, etc.

- Processor provide customisation for almost every part of it.
+ The collection can be fetched iteratively in parts, but to the processor
+ it looks like one endless collection. There are a lot of small
+ conveniences like this that make using the processor pleasant. Need logging,
+ exception handling, or post/pre processing of a result? No problem, all
+ included and easily extended.
+
+ Use the processor to DRY up your migrations and reports, and stop messing with
+ logging and post processing.
+
+ Did I mention you can run in threads as easily as saying
+ `processor.run_in_threads 10`?
+
+ Processor can execute any `DataProcessor` you specify and log the
+ entire process using any number of loggers you need. You may add your own
+ observers for monitoring background tasks or even send an email to
+ the business with a generated report. Processor provides customisation for
+ almost every part of it.


  Contents
@@ -21,6 +37,7 @@ Contents
  1. Run modes
  1. Processor Thread
  1. Observers
+ 1. Contacts
  1. Compatibility
  1. Contributing
  1. Copyright
@@ -51,20 +68,29 @@ Usage
  ------------

  ### Data processors
- Actual processing is done by a Data Processor, provided by end user.
- This processor should implement in general 2 methods:
+ Working with the data is the responsibility of a `DataProcessor`.

- 1. `process(record)`
- 1. `records`
+ A `DataProcessor` should supply the records to process via its `records`
+ method and process a single record via its `process` method. If some post/pre
+ action is needed, it can be performed inside the `start` and `finish`
+ methods. For exceptions there are the `error(exception)` and
+ `record_error(record, exception)` methods: `error` is called if
+ unhandled errors happen during processing, and `record_error` if
+ processing the current record raised. The `finalize` method runs in any
+ case, allowing you to gracefully finalize processing.

- But it is recomended to implement a `name` method also, because it
- is required by several observers. Inherit your Data Processor from
- NullProcessor to get default behavior out of the box.
+ To add a new `DataProcessor` it is recommended to inherit from
+ `NullProcessor` and implement only the methods that are needed.

- See `Processor::Example::Migration` for example (`example/migration.rb`).
+ `Processor` provides several data processors:

- There are several predefined data processors you can reuse:
+ 1. NullProcessor [[code](https://github.com/AlexParamonov/processor/blob/master/lib/processor/data/null_processor.rb), [specs](https://github.com/AlexParamonov/processor/blob/master/spec/processor/data/null_processor_spec.rb)]
+ 1. ArrayProcessor [[code](https://github.com/AlexParamonov/processor/blob/master/lib/processor/data/array_processor.rb), [specs](https://github.com/AlexParamonov/processor/blob/master/spec/processor/data/array_processor_spec.rb)]
+ 1. BatchProcessor [[code](https://github.com/AlexParamonov/processor/blob/master/lib/processor/data/batch_processor.rb), [specs](https://github.com/AlexParamonov/processor/blob/master/spec/processor/data/batch_processor_spec.rb)]
+ 1. CsvProcessor
+ 1. SolrPagesProcessor

+ The last two are more of an example; you would probably change them.

  #### ArrayProcessor
  The simplest one: `process` and `records` methods should be implemented.
@@ -73,13 +99,12 @@ The simplest one: `process` and `records` methods should be implemented.
  #### BatchProcessor
  Allows to fetch records by batches of defined size.

- It is based on `query` method that suppose to run a query method on
- database.
+ It is based on a `query` method that is supposed to run a query against
+ the database.

- Recomended to override `fetch_batch` method to get real reason to
- use batch processing. `fetch_batch` could be `query.first(10)` or
- `query.page(next_page)`. See `data/solr_pages_processor.rb` and
- `data/solr_processor.rb` for example.
+ It is recommended to override the `fetch_batch` method to get a real benefit
+ from batch processing. `fetch_batch` could be `query.page(next_page)`, for
+ example. See `data/solr_pages_processor.rb`.


  #### Other
@@ -94,7 +119,7 @@ Currently 2 run modes are supported:
  It runs `process` one by one for each found record returned by
  `records` method.

- Recomended to call it using a `Processor::Thread`:
+ Call it using a `Processor::Thread`:
  ``` ruby
  Processor::Thread.new(migration).run_successive
  ```
@@ -109,19 +134,20 @@ constructor:
  Processor::ProcessRunner::Threads.new 5
  ```

- Recomended to call it using a Processor::Thread :
+ Call it using a `Processor::Thread`:
  ``` ruby
  Processor::Thread.new(migration).run_in_threads 5
  ```


  ### Observers
- Processor support unlimited number of observers, watching processing.
+ Processor supports an unlimited number of observers that are watching
+ processing.

- Thay could monitor running migrations and output to logs, console or
- file usefull information. Or thay can show a progress bar to your
- console. Or pack a generated report to archive and send by email to
- bussiness on success or notify developers on failure.
+ They could monitor `Data Processor`s and output to logs, console
+ or file. Or they can show a progress bar on the console. Or pack a
+ generated report into an archive and send it by email to the business on
+ success or notify developers on failure.


  This observers should respond to `update` method. But if you inherit
@@ -134,7 +160,7 @@ Read below section Processor Thread to see how to use observers in runner.

  ### Processor Thread
  `Processor::Thread` is a Facade pattern. It simplifies access to all
- Processor classes and provide __stable__ interface.
+ Processor classes and provides a __stable__ interface.

  Creating a new Thread:
  ``` ruby
@@ -149,20 +175,13 @@ Processor::Thread.new data_processor, observer1, observer2, ...
  Instance have a `run_as` method that accepts a block:
  ``` ruby
  thread = Processor::Thread.new @migration
- thread.run_as do |processor, *|
+ thread.run_as do |processor|
  processor.records.each do |record|
  processor.process record
  end
  end
  ```

- Block could accept next arguments: `processor`, `events`,
- `recursion_preventer` method. Last one could be called to prevent
- recurtion:
- ``` ruby
- recursion_preventer.call
- ```
-
  Instance have a `run_successive` method:
  ``` ruby
  data_processor = UserLocationMigration.new
@@ -180,7 +199,7 @@ thread.run_in_threads 10
  See `spec/processor/thread_spec.rb` and `spec/example_spec.rb` and
  `example` directory for other usage examples.

- It is recomended to wrap Processor::Thread by classes named like:
+ It is recommended to wrap Processor::Thread by classes named like:
  ``` ruby
  WeeklyReport
  TaxonomyMigration
@@ -191,11 +210,11 @@ The point is to hide configuration of observers and use (if you wish)
  your own API to run reports or migrations:
  ``` ruby
  weekly_report.create_and_deliver
- user_data_import.import_from_csv(file)
+ user_data_import.from_csv(file)
  etc.
  ```

- It is possible to use it raw, but please dont fear to add a wrapper
+ It is possible to use it raw, but please don't fear to add a wrapper
  class like `CsvUserImport` for this:
  ``` ruby
  csv_data_processor = Processor::Data::CsvProcessor.new file
@@ -214,6 +233,14 @@ More documentation could be found by running
  rspec
  ```

+ Find more examples under the [example directory](https://github.com/AlexParamonov/processor/tree/master/example)
+
+ Contacts
+ -------------
+ Have questions or recommendations? Contact me via `alexander.n.paramonov@gmail.com`
+ Found a bug or have an enhancement request? You are welcome at the [Github bugtracker](https://github.com/AlexParamonov/processor/issues)
+
+
  Compatibility
  -------------
  tested with Ruby
@@ -222,7 +249,7 @@ tested with Ruby
  * rbx-19mode
  * ruby-head

- see [build history](http://travis-ci.org/#!/AlexParamonov/processor/builds)
+ See [build history](http://travis-ci.org/#!/AlexParamonov/processor/builds)

  Contributing
  -------------
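Taken together, the new README text describes a small contract: a `DataProcessor` supplies `records`, handles each one in `process`, and may override the `start`/`finish`/`error`/`record_error`/`finalize` hooks it inherits from `NullProcessor`, while `Processor::Thread` runs it. A minimal sketch of that contract; the class name `UserLocationMigration` is the one the README's own examples use, but its internals and record format here are hypothetical:

```ruby
require 'processor'

# Hypothetical data processor following the contract described above:
# `records` supplies the collection, `process` handles one record,
# the remaining hooks stay as NullProcessor's no-ops.
class UserLocationMigration < Processor::Data::NullProcessor
  def initialize(users)
    @users = users
  end

  def records
    @users
  end

  def process(record)
    record[:location] ||= "unknown"  # the actual per-record work
  end

  def record_error(record, exception)
    warn "Failed to migrate #{record.inspect}: #{exception.message}"
  end
end

users  = [{ name: "Alex" }, { name: "Kate", location: "Berlin" }]
thread = Processor::Thread.new UserLocationMigration.new(users)
thread.run_successive        # or: thread.run_in_threads 5
```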
data/example/migrator.rb CHANGED
@@ -43,6 +43,7 @@ module Processor

  stdout_logger_debug = Processor::Observer::Logger.new(logger1, messenger: messenger)
  stdout_logger_info = Processor::Observer::Logger.new(logger2)
+ # file_logger_info = Processor::Observer::Logger.new
  your_custom_observer1 = Observer::NullObserver.new
  your_custom_observer2 = Observer::NullObserver.new

@@ -50,6 +51,7 @@ module Processor
  migration,
  stdout_logger_debug,
  stdout_logger_info,
+ # file_logger_info,
  your_custom_observer1,
  your_custom_observer2,
  )
data/lib/processor/data/array_processor.rb CHANGED
@@ -6,6 +6,10 @@ module Processor
  def records
  raise NotImplementedError
  end
+
+ def total_records
+ @total_records ||= records.count
+ end
  end
  end
  end
data/lib/processor/data/null_processor.rb CHANGED
@@ -1,6 +1,12 @@
  module Processor
  module Data
  class NullProcessor
+ def start; end
+ def finish; end
+ def finalize; end
+ def error(exception); end
+ def record_error(record, exception); end
+
  def process(record)
  # do nothing
  end
@@ -10,16 +16,7 @@ module Processor
  end

  def total_records
- @total_records ||= records.count
- end
-
- def name
- # underscore a class name
- self.class.name.to_s.
- gsub(/::/, '_').
- gsub(/([A-Z]+)([A-Z][a-z])/,'\1_\2').
- gsub(/([a-z\d])([A-Z])/,'\1_\2').
- downcase
+ 0
  end
  end
  end
data/lib/processor/environment.rb ADDED
@@ -0,0 +1,3 @@
+ module Processor
+ RUNNING_ON_CI = !!ENV['CI']
+ end
data/lib/processor/event_processor.rb ADDED
@@ -0,0 +1,25 @@
+ module Processor
+ class EventProcessor
+ def initialize(processor, observers = [])
+ @observers = observers
+ @processor = processor
+ end
+
+ def register(event, *data)
+ observers.each do |observer|
+ observer.update event.to_sym, processor, *data
+ end
+ end
+
+ def method_missing(method, *args)
+ register "before_#{method}", *args
+ result = processor.public_send method, *args
+ register "after_#{method}", result, *args
+
+ result
+ end
+
+ private
+ attr_reader :observers, :processor
+ end
+ end
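The new `EventProcessor` wraps a data processor and, through `method_missing`, publishes a `before_<method>` event, delegates the call, and then publishes `after_<method>` with the result prepended — the same naming scheme the reworked observer hooks further down (`before_process`, `after_process`, `after_start`, ...) respond to. A rough usage sketch; the observer and stand-in processor here are illustrative, not part of the gem:

```ruby
# Illustrative observer: only the update(event, processor, *data)
# signature comes from this diff (see NullObserver#update below).
class LoggingObserver
  def update(event, processor, *data)
    puts "#{event}: #{data.inspect}"
  end
end

# Stand-in data processor, not from the gem.
class Doubler
  def process(record)
    record * 2
  end
end

events = Processor::EventProcessor.new(Doubler.new, [LoggingObserver.new])
events.process(21)
# prints "before_process: [21]", then "after_process: [42, 21]",
# and returns 42 from the wrapped processor.
```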
data/lib/processor/logger_messages.rb ADDED
@@ -0,0 +1,56 @@
+ require 'logger'
+ require 'delegate'
+
+ module Processor
+ class LoggerMessages < SimpleDelegator
+ def initialize(logger)
+ log_device = fetch_log_device logger
+ messages = case log_device
+ when File::NULL then NullMessages.new
+ when String then FileMessages.new log_device
+ when File then FileMessages.new log_device.path
+ when IO then IoMessages.new
+ else NullMessages.new
+ end
+
+ super messages
+ end
+
+ class NullMessages
+ def initialize(filename = "")
+ @filename = filename
+ end
+ def initialized; "" end
+ def finished; "" end
+ private
+ attr_reader :filename
+ end
+
+ class IoMessages < NullMessages
+ def initialized
+ "Proggress will be streaming to provided IO object"
+ end
+ end
+
+ class FileMessages < NullMessages
+ def initialized
+ <<-MESSAGE
+ Proggress will be saved to the log file. Run
+ tail -f #{filename}
+ to see log in realtime
+ MESSAGE
+ end
+
+ def finished
+ "Log file saved to #{filename}"
+ end
+ end
+
+ private
+ def fetch_log_device logger
+ return unless logger.is_a? ::Logger
+ log_dev = logger.instance_variable_get "@logdev"
+ log_dev.filename or log_dev.dev
+ end
+ end
+ end
data/lib/processor/messenger.rb CHANGED
@@ -1,8 +1,10 @@
- require "logger"
+ require 'logger'
+ require 'delegate'

  module Processor
  class Messenger < SimpleDelegator
- def initialize(level = :info, file = STDOUT)
+ def initialize(level = :info, file = STDOUT, sender = nil)
+ @sender = sender
  if level == :null
  file = "/dev/null"
  level = :fatal
@@ -18,17 +20,28 @@ module Processor
  super logger
  end

+ %w[debug info warn error fatal unknown].each do |level|
+ define_method level do |message, &block|
+ return if message.nil? || message.empty?
+ add(Logger.const_get(level.upcase), nil, message, &block)
+ end
+ end
+
  def message(*args)
  self.info *args
  end

  private
+ attr_reader :sender
  def format_message(severity, datetime, progname, message)
  lines = message.split("\n").map do |line|
  "> %s" % line.gsub(/^\s+/, '')
  end.join("\n")

- "\n#{severity} message on #{datetime}:\n#{lines}\n\n"
+ message = "\n#{severity} message"
+ message << " from #{sender}" if sender
+ message << " on #{datetime}"
+ message << ":\n#{lines}\n"
  end
  end
  end
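The extended `Messenger` now takes an optional sender tag and redefines the per-level logging methods to silently drop blank messages, which lets the observers below pass along possibly-empty `LoggerMessages` strings without cluttering the output. A quick sketch of that behaviour; the sender string is an assumption about typical usage (NullObserver below passes `self.class.name` in the same position):

```ruby
require 'processor/messenger'

messenger = Processor::Messenger.new(:info, STDOUT, "Observer::Logger")

messenger.info ""                                   # dropped: empty messages are skipped
messenger.info "Log file saved to log/report.log"   # forwarded to the wrapped Logger,
                                                    # tagged with the sender in format_message
```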
data/lib/processor/observer/logger.rb CHANGED
@@ -1,65 +1,78 @@
- require_relative 'null_observer'
  require 'logger'
+ require 'ostruct'
+ require_relative 'null_observer'
+ require 'processor/subroutine/name'
+ require 'processor/logger_messages'

  module Processor
  module Observer
  class Logger < NullObserver
  def initialize(logger = nil, options = {})
  @logger_source = logger
+ @messages = options.fetch :messages, nil
+ @messages = OpenStruct.new @messages if @messages.is_a? Hash
+
  super options
  end

- def processing_started(processor)
- initialize_logger(processor)
- logger.info "Processing of #{processor.name} started."
-
- message = <<-MESSAGE
- Proggress will be saved to the log file. Run
- tail -f #{log_file_name}
- to see log in realtime
- MESSAGE
-
- messenger.info message if use_log_file?
+ def after_start(result)
+ logger.info "Processing of #{processor_name} started."
+ messenger.info messages.initialized
  end

- def before_record_processing(record)
+ def before_process(record)
  logger.debug "Record #{id_for record} is going to be processed"
  end

- def after_record_processing(record, result)
+ def after_process(result, record)
  logger.info "Processed #{id_for record}: #{result}"
  end

- def processing_finished(processor)
- logger.info "Processing of #{processor.name} finished."
- messenger.info "Log file saved to #{log_file_name}" if use_log_file?
+ def after_finalize(result)
+ logger.info "Processing of #{processor_name} finished."
+ messenger.message messages.finished
  end

- def record_processing_error(record, exception)
+ def after_record_error(result, record, exception)
  logger.error "Error processing #{id_for record}: #{exception}"
  end

- def processing_error(processor, exception)
- logger.fatal "Processing #{processor.name} FAILED: #{exception.backtrace}"
+ def after_error(result, exception)
+ logger.fatal "Processing #{processor_name} FAILED: #{exception.backtrace}"
  end

- private
- attr_reader :logger, :log_file_name
-
- def initialize_logger(processor)
- @logger =
- if @logger_source.is_a? Proc
- @logger_source.call processor.name
+ def logger
+ @logger ||= begin
+ if @logger_source.is_a? Proc
+ @logger_source.call processor_name
  else
- @logger_source or ::Logger.new(create_log_filename(processor.name)).tap do |logger|
+ @logger_source or ::Logger.new(create_log_filename(processor_name)).tap do |logger|
  logger.level = ::Logger::INFO
  end
  end
- messenger.debug "Observer initialized with logger #{@logger}"
+ end
+ end
+
+ private
+
+ def messages
+ @messages ||= LoggerMessages.new logger
+ end
+
+ def processor_name
+ @processor_name ||=
+ begin
+ @processor = Subroutine::Name.new processor unless processor.respond_to? :name
+ processor.name
+ end
  end

  def create_log_filename(processor_name)
- FileUtils.mkdir log_directory unless File.directory? log_directory
+ unless File.directory? log_directory
+ FileUtils.mkdir log_directory
+ messenger.warn "Created new directory for logs: #{File.absolute_path log_directory}"
+ end
+
  @log_file_name = "#{log_directory}/#{processor_name}_on_#{current_time_string}.log"
  end

@@ -67,10 +80,6 @@ module Processor
  "log"
  end

- def use_log_file?
- not log_file_name.nil?
- end
-
  def current_time_string
  Time.now.gmtime.strftime "%Y-%m-%d_%H%M%S_UTC"
  end
data/lib/processor/observer/null_observer.rb CHANGED
@@ -4,12 +4,17 @@ require 'processor/messenger'
  module Processor
  module Observer
  class NullObserver
+ attr_reader :processor
+
  def initialize(options = {})
- @messenger = options.fetch :messenger, Processor::Messenger.new(:info)
+ @messenger = options.fetch :messenger, Processor::Messenger.new(:info, STDOUT, self.class.name)
+ @processor = options.fetch :processor, nil
  end

- def method_missing(*); end
- alias_method :update, :send
+ def update(method_name, processor = nil, *args)
+ @processor ||= processor
+ send method_name, *args if respond_to? method_name
+ end

  private
  attr_reader :messenger
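With the reworked `NullObserver#update` above, a custom observer no longer needs `method_missing`: it implements only the `before_*`/`after_*` hooks it cares about and silently ignores the rest. A small sketch under that assumption; the `ProgressObserver` class is illustrative, only the hook signatures come from this diff:

```ruby
require 'processor/observer/null_observer'

# Illustrative observer; hook names and arguments mirror those used by
# Observer::Logger above (after_start, after_process, after_record_error).
class ProgressObserver < Processor::Observer::NullObserver
  def after_start(result)
    messenger.info "Processing started"
  end

  def after_process(result, record)
    @done = @done.to_i + 1
    messenger.info "#{@done} records processed" if (@done % 100).zero?
  end

  def after_record_error(result, record, exception)
    messenger.error "#{record.inspect} failed: #{exception.message}"
  end
end
```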
data/lib/processor/process_runner/threads.rb CHANGED
@@ -6,23 +6,18 @@ module Processor
  @threads = []
  end

- def call(processor, events, recursion_preventer)
+ def call(processor)
  join_threads

  begin
  processor.records.each do |record|
- recursion_preventer.call
  if threads_created >= number_of_threads then join_threads end

  new_thread(processor, record) do |thread_data_processor, thread_record|
  begin
- events.register :before_record_processing, thread_record
-
- result = thread_data_processor.process(thread_record)
-
- events.register :after_record_processing, thread_record, result
+ thread_data_processor.process(thread_record)
  rescue StandardError => exception
- events.register :record_processing_error, thread_record, exception
+ thread_data_processor.record_error thread_record, exception
  end
  end

data/lib/processor/runner.rb CHANGED
@@ -1,38 +1,23 @@
- require_relative "events_registrator"
-
  module Processor
  class Runner
- def initialize(processor, events_registrator)
+ def initialize(processor)
  @processor = processor
- @events = events_registrator
  end

  def run(process_runner)
- events.register :processing_started, processor
-
- process_runner.call processor, events, method(:recursion_preventer)
+ processor.start
+ process_runner.call processor
+ processor.finish

- events.register :processing_finished, processor
  rescue Exception => exception
- events.register :processing_error, processor, exception
+ processor.error exception
  raise exception
- end

- protected
- attr_writer :counter
- def counter
- @counter ||= 0
+ ensure
+ processor.finalize
  end

  private
- attr_reader :events, :processor
- def recursion_preventer
- self.counter += 1
- raise Exception, "Processing fall into recursion. Check logs." if self.counter > max_records_to_process
- end
-
- def max_records_to_process
- @max_records_to_process ||= (processor.total_records * 1.1).round + 10
- end
+ attr_reader :processor
  end
  end
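`Runner#run` now drives the lifecycle directly on the data processor — `start`, the process runner, `finish`, with `error` on failure and `finalize` in an ensure — and only requires the runner object to respond to `call(processor)`. For reference, a hypothetical single-threaded runner compatible with that contract, mirroring the per-record error handling of `Threads#call` above (the gem's own successive runner is not shown in this section):

```ruby
# Sketch only: a plain object responding to call(processor), as Runner#run expects.
class SequentialRunner
  def call(processor)
    processor.records.each do |record|
      begin
        processor.process(record)
      rescue StandardError => exception
        # report per-record failures the same way Threads#call does
        processor.record_error record, exception
      end
    end
  end
end

# Processor::Runner.new(data_processor).run SequentialRunner.new
```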
data/lib/processor/subroutine/counter.rb ADDED
@@ -0,0 +1,25 @@
+ require 'delegate'
+
+ module Processor
+ module Subroutine
+ class Counter < ::SimpleDelegator
+ def process(*)
+ super
+ record_processed
+ end
+
+ def remaining_records_count
+ [ total_records - processed_records_count, 0 ].max
+ end
+
+ def processed_records_count
+ @processed_records_count ||= 0
+ end
+
+ private
+ def record_processed
+ @processed_records_count = processed_records_count + 1
+ end
+ end
+ end
+ end
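`Subroutine::Counter` is a `SimpleDelegator`, so it can be layered over any data processor to keep a running count; `remaining_records_count` leans on the `total_records` method added to the data processors earlier in this diff. A small usage sketch (the wrapped `Numbers` processor is hypothetical):

```ruby
require 'processor/subroutine/counter'

# Hypothetical processor exposing the methods Counter relies on.
class Numbers
  def records; (1..5).to_a; end
  def total_records; records.count; end
  def process(record); record * record; end
end

counted = Processor::Subroutine::Counter.new(Numbers.new)
counted.process(counted.records.first)

counted.processed_records_count # => 1
counted.remaining_records_count # => 4
```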