burner 1.0.0.pre.alpha → 1.0.0.pre.alpha.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 920f3b0f0027e21b30da75f86335d75d2d351f49c8b68662de7226ade43474c6
- data.tar.gz: b7f4189e37023f1d627ea5222df0f397a24a69c7ce3721b13ce714d2e274ee94
+ metadata.gz: 76ba9eda7f18a68f4c24c11d4724e2021c6f1d92ca4878fafb118e6c897c018c
+ data.tar.gz: 95e35512ece9b5f8acb912305d0609c86354977d3c8bd6e125a18db0584778fc
  SHA512:
- metadata.gz: d488edf32a95e5da64190c512f365d76e2414d16712cee9b56a400bd62e11ed7824a316a95923af15f0ecac3390e4be359d2e31ffc33fb13129309c4efce9403
- data.tar.gz: 8640cfdb2fa4241fb2a493c2ac2aed62e08e1826446907f4b16d043276fffee656c787d2079545fe03d52b8f45756240a523ad878fc89063884868c3f94bfecd
+ metadata.gz: 47b53c96680edfd6ca08e637d2ab6fd738684e9c2e7e8d08ca6d52a6361b8cf3cfa55af072cfeeb41a44838e76a4474aa229fc23b86f54558603c9af718f5fa0
+ data.tar.gz: 804da88a604544877d7b4f34bafcf1079f3282ad0c34da69466b83bfbd37fb30589f1cb38b02aafa4ccfd17f9e24aabf28385d06a8b80c3d2f552d4e71d3fd7d
data/README.md CHANGED
@@ -2,7 +2,7 @@
 
  [![Gem Version](https://badge.fury.io/rb/burner.svg)](https://badge.fury.io/rb/burner) [![Build Status](https://travis-ci.org/bluemarblepayroll/burner.svg?branch=master)](https://travis-ci.org/bluemarblepayroll/burner) [![Maintainability](https://api.codeclimate.com/v1/badges/dbc3757929b67504f6ca/maintainability)](https://codeclimate.com/github/bluemarblepayroll/burner/maintainability) [![Test Coverage](https://api.codeclimate.com/v1/badges/dbc3757929b67504f6ca/test_coverage)](https://codeclimate.com/github/bluemarblepayroll/burner/test_coverage) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
 
- TODO
+ This library serves as the skeleton for a processing engine. It allows you to organize your code into jobs, then stitch those jobs together as steps.
 
  ## Installation
 
@@ -20,7 +20,297 @@ bundle add burner
 
  ## Examples
 
- TODO
+ The purpose of this library is to provide a framework for creating highly de-coupled functions (known as jobs), and then to allow them to be stitched back together in any arbitrary order (known as steps). Although our example will be somewhat specific and contrived, what the jobs do and the order they run in is limited only by your imagination.
+
+ ### JSON-to-YAML File Converter
+
+ All the jobs for this example are shipped with this library. In this example, we will write a pipeline that can read a JSON file and convert it to YAML. Pipelines are data-first, so we can represent a pipeline using a hash:
+
+ ````ruby
+ pipeline = {
+   jobs: [
+     {
+       name: :read,
+       type: 'io/read',
+       path: '{input_file}'
+     },
+     {
+       name: :output_id,
+       type: :echo,
+       message: 'The job id is: {__id}'
+     },
+     {
+       name: :output_value,
+       type: :echo,
+       message: 'The current value is: {__value}'
+     },
+     {
+       name: :parse,
+       type: 'deserialize/json'
+     },
+     {
+       name: :convert,
+       type: 'serialize/yaml'
+     },
+     {
+       name: :write,
+       type: 'io/write',
+       path: '{output_file}'
+     }
+   ],
+   steps: %i[
+     read
+     output_id
+     output_value
+     parse
+     convert
+     output_value
+     write
+   ]
+ }
+
+ params = {
+   input_file: 'input.json',
+   output_file: 'output.yaml'
+ }
+ ````
+
+ Assuming we are running this script from a directory where an `input.json` file exists, we can then programmatically process the pipeline:
+
+ ````ruby
+ Burner::Pipeline.make(pipeline).execute(params: params)
+ ````
+
+ We should now see an `output.yaml` file created.
+
+ Some notes:
+
+ * Some values can be string-interpolated using the provided params. This allows for passing runtime configuration/data into pipelines/jobs (see the sketch after these notes).
+ * The job's ID can be accessed using the `__id` key.
+ * The current job's payload value can be accessed using the `__value` key.
+ * Jobs can be re-used (just like the output_id and output_value jobs).
+
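+ As a small illustration of these notes, a single `echo` job can mix a user-supplied param with the built-in `__id` key in one interpolated message. The `announce` name below is purely hypothetical and only meant as a sketch:
+
+ ````ruby
+ {
+   name: :announce,
+   type: :echo,
+   message: 'Pipeline {__id} is processing {input_file}'
+ }
+ ````
+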
+ ### Capturing Feedback / Output
+
+ By default, output will be emitted to `$stdout`. You can add or change listeners by passing optional values into `Pipeline#execute`. For example, say we want to capture the output from our JSON-to-YAML example:
+
+ ````ruby
+ require 'stringio'
+
+ class StringOut
+   def initialize
+     @io = StringIO.new
+   end
+
+   def puts(msg)
+     tap { io.write("#{msg}\n") }
+   end
+
+   def read
+     io.rewind
+     io.read
+   end
+
+   private
+
+   attr_reader :io
+ end
+
+ string_out = StringOut.new
+ output = Burner::Output.new(outs: string_out)
+
+ Burner::Pipeline.make(pipeline).execute(output: output, params: params)
+
+ log = string_out.read
+ ````
+
+ The value of `log` should now look similar to:
+
+ ````bash
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] Pipeline started with 7 step(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] Parameters:
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - input_file: input.json
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - output_file: output.yaml
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] --------------------------------------------------------------------------------
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] [1] Burner::Jobs::IO::Read::read
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Reading: spec/fixtures/input.json
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Completed in: 0.0 second(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] [2] Burner::Jobs::Echo::output_id
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - The job id is:
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Completed in: 0.0 second(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] [3] Burner::Jobs::Echo::output_value
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - The current value is:
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Completed in: 0.0 second(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] [4] Burner::Jobs::Deserialize::Json::parse
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Completed in: 0.0 second(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] [5] Burner::Jobs::Serialize::Yaml::convert
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Completed in: 0.0 second(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] [6] Burner::Jobs::Echo::output_value
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - The current value is:
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Completed in: 0.0 second(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] [7] Burner::Jobs::IO::Write::write
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Writing: output.yaml
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] - Completed in: 0.0 second(s)
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] --------------------------------------------------------------------------------
+ [8bdc394e-7047-4a1a-87ed-6c54ed690ed5 | 2020-10-14 13:49:59 UTC] Pipeline ended, took 0.001 second(s) to complete
+ ````
+
+ Notes:
+
+ * The job ID is the leading UUID in each line.
+ * `outs` can be provided an array of listeners, as long as each listener responds to `puts(msg)` (see the sketch below).
+
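+ For example, a minimal sketch that fans output out to both the console and the `StringOut` capture above, assuming `Burner::Output` accepts an array for `outs` as described:
+
+ ````ruby
+ string_out = StringOut.new
+ output = Burner::Output.new(outs: [$stdout, string_out])
+
+ Burner::Pipeline.make(pipeline).execute(output: output, params: params)
+ ````
+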
+ ### Command Line Pipeline Processing
+
+ This library also ships with a built-in script `exe/burner` that illustrates using the `Burner::Cli` API. This class can take in an array of arguments (similar to a command line) and execute a pipeline. The first argument is the path to a YAML file with the pipeline's configuration, and each subsequent argument is a param in `key=value` form. Here is how the JSON-to-YAML example can use this interface:
+
+ #### Create YAML Pipeline Configuration File
+
+ Write the following `json_to_yaml_pipeline.yaml` file to disk:
+
+ ````yaml
+ jobs:
+   - name: read
+     type: io/read
+     path: '{input_file}'
+
+   - name: output_id
+     type: echo
+     message: 'The job id is: {__id}'
+
+   - name: output_value
+     type: echo
+     message: 'The current value is: {__value}'
+
+   - name: parse
+     type: deserialize/json
+
+   - name: convert
+     type: serialize/yaml
+
+   - name: write
+     type: io/write
+     path: '{output_file}'
+
+ steps:
+   - read
+   - output_id
+   - output_value
+   - parse
+   - convert
+   - output_value
+   - write
+ ````
+
+ #### Run Using Script
+
+ From the command-line, run:
+
+ ````bash
+ bundle exec burner json_to_yaml_pipeline.yaml input_file=input.json output_file=output.yaml
+ ````
+
+ The pipeline should be processed and `output.yaml` should be created.
+
+ #### Run Using Programmatic API
+
+ Instead of using the script, you can invoke it from code:
+
+ ````ruby
+ args = %w[
+   json_to_yaml_pipeline.yaml
+   input_file=input.json
+   output_file=output.yaml
+ ]
+
+ Burner::Cli.new(args).invoke
+ ````
+
+ ### Core Job Library
+
+ This library ships with only very basic, rudimentary jobs that are meant to serve as a baseline:
+
+ * **deserialize/json** []: Treat input as a string and de-serialize it to JSON.
+ * **deserialize/yaml** [safe]: Treat input as a string and de-serialize it to YAML. By default it will try and [safely de-serialize](https://ruby-doc.org/stdlib-2.6.1/libdoc/psych/rdoc/Psych.html#method-c-safe_load) it (only using core classes). If you wish to de-serialize it to any class type, pass in `safe: false`.
+ * **dummy** []: Do nothing.
+ * **echo** [message]: Write a message to the output. The message parameter can be interpolated using params.
+ * **io/exist** [path, short_circuit]: Check to see if a file exists. The path parameter can be interpolated using params. If short_circuit is set to true (defaults to false) and the file does not exist, then the pipeline will be short-circuited (see the sketch after this list).
+ * **io/read** [binary, path]: Read in a local file. The path parameter can be interpolated using params. If the contents are binary, pass in `binary: true` to open it up in binary+read mode.
+ * **io/write** [binary, path]: Write to a local file. The path parameter can be interpolated using params. If the contents are binary, pass in `binary: true` to open it up in binary+write mode.
+ * **serialize/json** []: Convert value to JSON.
+ * **serialize/yaml** []: Convert value to YAML.
+ * **set** [value]: Set the value to any arbitrary value.
+ * **sleep** [seconds]: Sleep the thread for X number of seconds.
+
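+ For example, here is a minimal sketch that guards the JSON-to-YAML pipeline with `io/exist` so it short-circuits when the input file is missing, re-using the params from the earlier example. The `check_input` job name is only illustrative:
+
+ ````ruby
+ pipeline = {
+   jobs: [
+     { name: :check_input, type: 'io/exist', path: '{input_file}', short_circuit: true },
+     { name: :read, type: 'io/read', path: '{input_file}' },
+     { name: :parse, type: 'deserialize/json' },
+     { name: :convert, type: 'serialize/yaml' },
+     { name: :write, type: 'io/write', path: '{output_file}' }
+   ],
+   steps: %i[check_input read parse convert write]
+ }
+
+ Burner::Pipeline.make(pipeline).execute(params: params)
+ ````
+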
+ ### Adding & Registering Jobs
+
+ Where this library shines is when additional jobs are plugged in. Burner uses its `Burner::Jobs` class as its class-level registry, built with [acts_as_hashable](https://github.com/bluemarblepayroll/acts_as_hashable)'s `acts_as_hashable_factory` directive.
+
+ Let's say we would like to register a job to parse a CSV:
+
+ ````ruby
+ require 'csv'
+
+ class ParseCsv < Burner::Job
+   def perform(output, payload, params)
+     payload.value = CSV.parse(payload.value, headers: true).map(&:to_h)
+
+     nil
+   end
+ end
+
+ Burner::Jobs.register('parse_csv', ParseCsv)
+ ````
+
+ `parse_csv` is now recognized as a valid job and we can use it:
+
+ ````ruby
+ pipeline = {
+   jobs: [
+     {
+       name: :read,
+       type: 'io/read',
+       path: '{input_file}'
+     },
+     {
+       name: :output_id,
+       type: :echo,
+       message: 'The job id is: {__id}'
+     },
+     {
+       name: :output_value,
+       type: :echo,
+       message: 'The current value is: {__value}'
+     },
+     {
+       name: :parse,
+       type: :parse_csv
+     },
+     {
+       name: :convert,
+       type: 'serialize/yaml'
+     },
+     {
+       name: :write,
+       type: 'io/write',
+       path: '{output_file}'
+     }
+   ],
+   steps: %i[
+     read
+     output_id
+     output_value
+     parse
+     convert
+     output_value
+     write
+   ]
+ }
+
+ params = {
+   input_file: File.join('spec', 'fixtures', 'cars.csv'),
+   output_file: File.join(TEMP_DIR, "#{SecureRandom.uuid}.yaml")
+ }
+
+ Burner::Pipeline.make(pipeline).execute(output: output, params: params)
+ ````
 
  ## Contributing
 
data/exe/burner CHANGED
@@ -18,4 +18,4 @@ if ARGV.empty?
  end
 
  # This should return exit code of 1 if it raises any hard errors.
- Burner::Cli.new(ARGV).execute
+ Burner::Cli.new(ARGV).invoke
@@ -22,7 +22,7 @@ module Burner
  @params = (config['params'] || {}).merge(cli_params)
  end
 
- def execute
+ def invoke
  pipeline.execute(params: params)
  end
 
@@ -12,6 +12,7 @@ require_relative 'jobs/deserialize/json'
  require_relative 'jobs/deserialize/yaml'
  require_relative 'jobs/dummy'
  require_relative 'jobs/echo'
+ require_relative 'jobs/io/exist'
  require_relative 'jobs/io/read'
  require_relative 'jobs/io/write'
  require_relative 'jobs/serialize/json'
@@ -30,6 +31,7 @@ module Burner
  register 'deserialize/yaml', Deserialize::Yaml
  register 'dummy', '', Dummy
  register 'echo', Echo
+ register 'io/exist', IO::Exist
  register 'io/read', IO::Read
  register 'io/write', IO::Write
  register 'serialize/json', Serialize::Json
@@ -12,17 +12,14 @@ module Burner
  module IO
  # Common configuration/code for all IO Job subclasses.
  class Base < Job
- attr_reader :binary, :path
+ attr_reader :path
 
- def initialize(name:, path:, binary: false)
+ def initialize(name:, path:)
  super(name: name)
 
  raise ArgumentError, 'path is required' if path.to_s.empty?
 
- @path = path.to_s
- @binary = binary || false
-
- freeze
+ @path = path.to_s
  end
 
  private
@@ -30,10 +27,6 @@ module Burner
  def compile_path(params)
  eval_string_template(path, params)
  end
-
- def mode
- binary ? 'wb' : 'w'
- end
  end
  end
  end
@@ -0,0 +1,43 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ require_relative 'base'
+
+ module Burner
+ class Jobs
+ module IO
+ # Check to see if a file exists. If short_circuit is set to true and the file
+ # does not exist then the job will return false and short circuit the pipeline.
+ class Exist < Base
+ attr_reader :short_circuit
+
+ def initialize(name:, path:, short_circuit: false)
+ super(name: name, path: path)
+
+ @short_circuit = short_circuit || false
+
+ freeze
+ end
+
+ def perform(output, _payload, params)
+ compiled_path = compile_path(params)
+
+ exists = File.exist?(compiled_path)
+ verb = exists ? 'does' : 'does not'
+
+ output.detail("The path: #{compiled_path} #{verb} exist")
+
+ # if anything but false is returned then the pipeline will not short circuit. So
+ # we need to make sure we explicitly return false.
+ short_circuit && !exists ? false : nil
+ end
+ end
+ end
+ end
+ end
@@ -14,6 +14,16 @@ module Burner
  module IO
  # Read value from disk.
  class Read < Base
+ attr_reader :binary
+
+ def initialize(name:, path:, binary: false)
+ super(name: name, path: path)
+
+ @binary = binary || false
+
+ freeze
+ end
+
  def perform(output, payload, params)
  compiled_path = compile_path(params)
 
@@ -14,6 +14,16 @@ module Burner
  module IO
  # Write value to disk.
  class Write < Base
+ attr_reader :binary
+
+ def initialize(name:, path:, binary: false)
+ super(name: name, path: path)
+
+ @binary = binary || false
+
+ freeze
+ end
+
  def perform(output, payload, params)
  compiled_path = compile_path(params)
 
@@ -21,7 +31,15 @@ module Burner
 
  output.detail("Writing: #{compiled_path}")
 
- File.open(compiled_path, mode) { |io| io.write(payload.value) }
+ time_in_seconds = Benchmark.measure do
+ File.open(compiled_path, mode) { |io| io.write(payload.value) }
+ end.real
+
+ payload.add_written_file(
+ logical_filename: compiled_path,
+ physical_filename: compiled_path,
+ time_in_seconds: time_in_seconds
+ )
 
  nil
  end
@@ -39,6 +57,10 @@ module Burner
 
  nil
  end
+
+ def mode
+ binary ? 'wb' : 'w'
+ end
  end
  end
  end
@@ -7,6 +7,8 @@
  # LICENSE file in the root directory of this source tree.
  #
 
+ require_relative 'written_file'
+
  module Burner
  # The input for all Job#perform methods. The main notion of this object is its "value"
  # attribute. This is dynamic and weak on purpose and is subject to whatever the Job#perform
@@ -16,11 +18,16 @@ module Burner
  class Payload
  attr_accessor :value
 
- attr_reader :context
+ attr_reader :context, :written_files
+
+ def initialize(context: {}, value: nil, written_files: [])
+ @context = context || {}
+ @value = value
+ @written_files = written_files || []
+ end
 
- def initialize(context: {}, value: nil)
- @context = context || {}
- @value = value
+ def add_written_file(written_file)
+ tap { written_files << WrittenFile.make(written_file) }
  end
  end
  end
@@ -42,7 +42,14 @@ module Burner
  output.ruler
 
  time_in_seconds = Benchmark.measure do
- steps.each { |step| step.perform(output, payload, params) }
+ steps.each do |step|
+ return_value = step.perform(output, payload, params)
+
+ if return_value.is_a?(FalseClass)
+ output.detail('Job returned false, ending pipeline.')
+ break
+ end
+ end
  end.real.round(3)
 
  output.ruler
@@ -29,15 +29,19 @@ module Burner
  end
 
  def perform(output, payload, params)
+ return_value = nil
+
  output.title("#{job.class.name}#{SEPARATOR}#{job.name}")
 
  time_in_seconds = Benchmark.measure do
- job.perform(output, payload, params)
+ job_params = (params || {}).merge(__id: output.id, __value: payload.value)
+
+ return_value = job.perform(output, payload, job_params)
  end.real.round(3)
 
  output.complete(time_in_seconds)
 
- nil
+ return_value
  end
  end
  end
@@ -8,5 +8,5 @@
  #
 
  module Burner
- VERSION = '1.0.0-alpha'
+ VERSION = '1.0.0-alpha.1'
  end
@@ -0,0 +1,28 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+ # Describes a file that was generated by a Job. If a Job emits a file, it should also add the
+ # file details to the Payload#written_files array using the Payload#add_written_file method.
+ class WrittenFile
+ acts_as_hashable
+
+ attr_reader :logical_filename,
+ :physical_filename,
+ :time_in_seconds
+
+ def initialize(logical_filename:, physical_filename:, time_in_seconds:)
+ @logical_filename = logical_filename.to_s
+ @physical_filename = physical_filename.to_s
+ @time_in_seconds = time_in_seconds.to_f
+
+ freeze
+ end
+ end
+ end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: burner
  version: !ruby/object:Gem::Version
- version: 1.0.0.pre.alpha
+ version: 1.0.0.pre.alpha.1
  platform: ruby
  authors:
  - Matthew Ruggio
@@ -184,6 +184,7 @@ files:
  - lib/burner/jobs/dummy.rb
  - lib/burner/jobs/echo.rb
  - lib/burner/jobs/io/base.rb
+ - lib/burner/jobs/io/exist.rb
  - lib/burner/jobs/io/read.rb
  - lib/burner/jobs/io/write.rb
  - lib/burner/jobs/serialize/json.rb
@@ -196,6 +197,7 @@ files:
  - lib/burner/step.rb
  - lib/burner/string_template.rb
  - lib/burner/version.rb
+ - lib/burner/written_file.rb
  homepage: https://github.com/bluemarblepayroll/burner
  licenses:
  - MIT