burner 1.0.0.pre.alpha.8 → 1.0.0

Files changed (41)
  1. checksums.yaml +4 -4
  2. data/CHANGELOG.md +4 -0
  3. data/README.md +9 -2
  4. data/burner.gemspec +1 -1
  5. data/lib/burner.rb +6 -0
  6. data/lib/burner/job.rb +12 -5
  7. data/lib/burner/job_with_register.rb +1 -1
  8. data/lib/burner/jobs.rb +8 -4
  9. data/lib/burner/library.rb +11 -4
  10. data/lib/burner/library/collection/arrays_to_objects.rb +9 -4
  11. data/lib/burner/library/collection/concatenate.rb +42 -0
  12. data/lib/burner/library/collection/graph.rb +7 -3
  13. data/lib/burner/library/collection/objects_to_arrays.rb +31 -29
  14. data/lib/burner/library/collection/shift.rb +3 -3
  15. data/lib/burner/library/collection/transform.rb +9 -3
  16. data/lib/burner/library/collection/unpivot.rb +7 -3
  17. data/lib/burner/library/collection/validate.rb +4 -3
  18. data/lib/burner/library/collection/values.rb +3 -3
  19. data/lib/burner/library/deserialize/csv.rb +2 -2
  20. data/lib/burner/library/deserialize/json.rb +2 -2
  21. data/lib/burner/library/deserialize/yaml.rb +3 -3
  22. data/lib/burner/library/echo.rb +1 -1
  23. data/lib/burner/library/io/base.rb +1 -1
  24. data/lib/burner/library/io/exist.rb +1 -1
  25. data/lib/burner/library/io/read.rb +3 -3
  26. data/lib/burner/library/io/write.rb +3 -3
  27. data/lib/burner/library/{dummy.rb → nothing.rb} +2 -2
  28. data/lib/burner/library/serialize/csv.rb +2 -2
  29. data/lib/burner/library/serialize/json.rb +2 -2
  30. data/lib/burner/library/serialize/yaml.rb +2 -2
  31. data/lib/burner/library/sleep.rb +1 -1
  32. data/lib/burner/library/value/copy.rb +39 -0
  33. data/lib/burner/library/value/static.rb +34 -0
  34. data/lib/burner/modeling/attribute.rb +3 -1
  35. data/lib/burner/modeling/validations/base.rb +1 -1
  36. data/lib/burner/modeling/validations/blank.rb +6 -2
  37. data/lib/burner/modeling/validations/present.rb +4 -4
  38. data/lib/burner/payload.rb +11 -5
  39. data/lib/burner/version.rb +1 -1
  40. metadata +10 -8
  41. data/lib/burner/library/set_value.rb +0 -32
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: ebd1332e7b4a01aace974be43aaa77e02b3bcb13e36b2360b08192bde2b99705
-  data.tar.gz: '09095dcd305279cf2e246cca6e36256a0f7943d12d335366aa6ff9dcc92c6198'
+  metadata.gz: d3d251463ade2de965d0dd1e0d635967571d3d1d4cab46a7d1271ab057d1c2e3
+  data.tar.gz: 2eac66d3f745888be07cf5d2d66f6e8429c61b0255ad0916f7c6f8c792194dcd
 SHA512:
-  metadata.gz: 31e45b0c0cc815d05f3424a04e77a8e36ed96e912ec0fed2dfcf8e171ecbe7f0ade1841991cdb97751bbd10d103b161d173120abd0a68689281a258de524697c
-  data.tar.gz: 751091b32afae9e386814cb16856c5b0d41393b51e4e98070579d6508484e61a27a21bddba4be08f7a8416a86b50de71eeac3d886c635c785572f1597cb9095c
+  metadata.gz: c57dabbc95d1b58b5f0fe15429f36682d0dec0771951bb01b6e8c6345ddd9e95b5805ad57f0e0166e8e4e87d0296885915c022895818e46c3389049207e0e719
+  data.tar.gz: f23ecf1374570ad2f5cc6dfb70c88c895e4d45e0c2eb7453878038dc507d1a808ca3894c50a320b4a259df09ad7f1368a4b8c7bd790624b27baac487df1b48cf
data/CHANGELOG.md CHANGED
@@ -1,3 +1,7 @@
+# 1.0.0 (November 5th, 2020)
+
+Initial version publication.
+
 # 0.0.1
 
 Shell
data/README.md CHANGED
@@ -234,6 +234,7 @@ This library only ships with very basic, rudimentary jobs that are meant to just
 #### Collection
 
 * **b/collection/arrays_to_objects** [mappings, register]: Convert an array of arrays to an array of objects.
+* **b/collection/concatenate** [from_registers, to_register]: Concatenate each from_register's value and place the newly concatenated array into the to_register. Note: this does not do any deep copying and should be assumed it is shallow copying all objects.
 * **b/collection/graph** [config, key, register]: Use [Hashematics](https://github.com/bluemarblepayroll/hashematics) to turn a flat array of objects into a deeply nested object tree.
 * **b/collection/objects_to_arrays** [mappings, register]: Convert an array of objects to an array of arrays.
 * **b/collection/shift** [amount, register]: Remove the first N number of elements from an array.
@@ -260,11 +261,15 @@ This library only ships with very basic, rudimentary jobs that are meant to just
 * **b/serialize/json** [register]: Convert value to JSON.
 * **b/serialize/yaml** [register]: Convert value to YAML.
 
+#### Value
+
+* **b/value/copy** [from_register, to_register]: Copy from_register's value into the to_register. Note: this does not do any deep copying and should be assumed it is shallow copying all objects.
+* **b/value/static** [register, value]: Set the value to any arbitrary value.
+
 #### General
 
-* **b/dummy** []: Do nothing
 * **b/echo** [message]: Write a message to the output. The message parameter can be interpolated using `Payload#params`.
-* **b/set** [register, value]: Set the value to any arbitrary value.
+* **b/nothing** []: Do nothing.
 * **b/sleep** [seconds]: Sleep the thread for X number of seconds.
 
 Notes:
@@ -273,6 +278,8 @@ Notes:
 
 ### Adding & Registering Jobs
 
+Note: Jobs have to be registered with a type in the Burner::Jobs factory. All jobs that ship with this library are prefixed with `b/` in their type in order to provide a namespace for 'burner-specific' jobs vs. externally provided jobs.
+
 Where this library shines is when additional jobs are plugged in. Burner uses its `Burner::Jobs` class as its class-level registry built with [acts_as_hashable](https://github.com/bluemarblepayroll/acts_as_hashable)'s acts_as_hashable_factory directive.
 
 Let's say we would like to register a job to parse a CSV:
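As a sketch of how the new Value jobs described above might compose in a configuration-first pipeline (the register names `source` and `backup` are illustrative, not part of the library; the hash is only constructed here, not executed):

```ruby
# Sketch: seed a register with a static value, copy it to a second register,
# then echo it using the "__<name>_register" template interpolation.
config = {
  jobs: [
    { name: 'seed', type: 'b/value/static', register: 'source', value: [1, 2, 3] },
    { name: 'dupe', type: 'b/value/copy', from_register: 'source', to_register: 'backup' },
    { name: 'say',  type: 'b/echo', message: 'copied: {__backup_register}' }
  ],
  steps: %w[seed dupe say]
}

# With the burner gem installed this would run as:
#   Burner::Pipeline.make(config).execute
puts config[:steps].join(' -> ')
```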
data/burner.gemspec CHANGED
@@ -32,7 +32,7 @@ Gem::Specification.new do |s|
   s.add_dependency('hashematics', '~>1.1')
   s.add_dependency('hash_math', '~>1.2')
   s.add_dependency('objectable', '~>1.0')
-  s.add_dependency('realize', '~>1.2')
+  s.add_dependency('realize', '~>1.3')
   s.add_dependency('stringento', '~>2.1')
 
   s.add_development_dependency('guard-rspec', '~>4.7')
data/lib/burner.rb CHANGED
@@ -29,3 +29,9 @@ require_relative 'burner/util'
 
 # Main Entrypoint(s)
 require_relative 'burner/cli'
+
+# Top-level namespace
+module Burner
+  # All jobs that need to reference the main register should use this constant.
+  DEFAULT_REGISTER = 'default'
+end
data/lib/burner/job.rb CHANGED
@@ -31,10 +31,11 @@ module Burner
   # The #perform method takes in two arguments: output (an instance of Burner::Output)
   # and payload (an instance of Burner::Payload). Jobs can leverage output to emit
   # information to the pipeline's log(s). The payload is utilized to pass data from job to job,
-  # with its most important attribute being #value. The value attribute is mutable
-  # per the individual job's context (meaning of it is unknown without understanding a job's
-  # input and output value of #value.). Therefore #value can mean anything and it is up to the
-  # engineers to clearly document the assumptions of its use.
+  # with its most important attribute being #registers. The registers attribute is a mutable,
+  # accessible hash whose meaning is local to each individual job's context (it cannot be
+  # understood without knowing a job's input and output use of #registers). Therefore register
+  # values can mean anything, and it is up to consumers to clearly document the assumptions
+  # of their use.
   #
   # Returning false will short-circuit the pipeline right after the job method exits.
   # Returning anything else besides false just means "continue".
@@ -47,9 +48,15 @@ module Burner
   protected
 
   def job_string_template(expression, output, payload)
-    templatable_params = payload.params.merge(__id: output.id, __value: payload[''])
+    templatable_params = payload.params
+                                .merge(__id: output.id)
+                                .merge(templatable_register_values(payload))
 
     Util::StringTemplate.instance.evaluate(expression, templatable_params)
   end
+
+  def templatable_register_values(payload)
+    payload.registers.transform_keys { |key| "__#{key}_register" }
+  end
   end
 end
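The register-templating change above can be illustrated in plain Ruby, with a bare hash standing in for a real `Burner::Payload` (register names here are illustrative):

```ruby
# Each register key is exposed to string templates as "__<name>_register",
# mirroring the templatable_register_values helper.
registers = { 'default' => 'apples', 'other' => 'pears' }

templatable = registers.transform_keys { |key| "__#{key}_register" }

puts templatable.keys.sort.inspect # => ["__default_register", "__other_register"]
```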
data/lib/burner/job_with_register.rb CHANGED
@@ -15,7 +15,7 @@ module Burner
   class JobWithRegister < Job
     attr_reader :register
 
-    def initialize(name:, register: '')
+    def initialize(name:, register: DEFAULT_REGISTER)
       super(name: name)
 
       @register = register.to_s
data/lib/burner/jobs.rb CHANGED
@@ -16,20 +16,21 @@ module Burner
   class Jobs
     acts_as_hashable_factory
 
-    # Dummy is the default as noted by the ''. This means if a type is omitted, nil, or blank
-    # string then the dummy job will be used.
-    register 'b/dummy', '', Library::Dummy
+    # Nothing is the default as noted by the ''. This means if a type is omitted, nil, or blank
+    # string then the nothing job will be used.
     register 'b/echo', Library::Echo
-    register 'b/set_value', Library::SetValue
+    register 'b/nothing', '', Library::Nothing
     register 'b/sleep', Library::Sleep
 
     register 'b/collection/arrays_to_objects', Library::Collection::ArraysToObjects
+    register 'b/collection/concatenate', Library::Collection::Concatenate
     register 'b/collection/graph', Library::Collection::Graph
     register 'b/collection/objects_to_arrays', Library::Collection::ObjectsToArrays
     register 'b/collection/shift', Library::Collection::Shift
     register 'b/collection/transform', Library::Collection::Transform
     register 'b/collection/unpivot', Library::Collection::Unpivot
     register 'b/collection/values', Library::Collection::Values
+    register 'b/collection/validate', Library::Collection::Validate
 
     register 'b/deserialize/csv', Library::Deserialize::Csv
     register 'b/deserialize/json', Library::Deserialize::Json
@@ -42,5 +43,8 @@ module Burner
     register 'b/serialize/csv', Library::Serialize::Csv
     register 'b/serialize/json', Library::Serialize::Json
     register 'b/serialize/yaml', Library::Serialize::Yaml
+
+    register 'b/value/copy', Library::Value::Copy
+    register 'b/value/static', Library::Value::Static
   end
 end
data/lib/burner/library.rb CHANGED
@@ -9,7 +9,12 @@
 
 require_relative 'job_with_register'
 
+require_relative 'library/echo'
+require_relative 'library/nothing'
+require_relative 'library/sleep'
+
 require_relative 'library/collection/arrays_to_objects'
+require_relative 'library/collection/concatenate'
 require_relative 'library/collection/graph'
 require_relative 'library/collection/objects_to_arrays'
 require_relative 'library/collection/shift'
@@ -17,16 +22,18 @@ require_relative 'library/collection/transform'
 require_relative 'library/collection/unpivot'
 require_relative 'library/collection/validate'
 require_relative 'library/collection/values'
+
 require_relative 'library/deserialize/csv'
 require_relative 'library/deserialize/json'
 require_relative 'library/deserialize/yaml'
-require_relative 'library/dummy'
-require_relative 'library/echo'
+
 require_relative 'library/io/exist'
 require_relative 'library/io/read'
 require_relative 'library/io/write'
+
 require_relative 'library/serialize/csv'
 require_relative 'library/serialize/json'
 require_relative 'library/serialize/yaml'
-require_relative 'library/set_value'
-require_relative 'library/sleep'
+
+require_relative 'library/value/copy'
+require_relative 'library/value/static'
data/lib/burner/library/collection/arrays_to_objects.rb CHANGED
@@ -14,8 +14,8 @@ module Burner
       # Burner::Modeling::KeyIndexMapping instances or hashable configurations which specifies
       # the index-to-key mappings to use.
       #
-      # Expected Payload#value input: array of arrays.
-      # Payload#value output: An array of hashes.
+      # Expected Payload[register] input: array of arrays.
+      # Payload[register] output: An array of hashes.
       #
       # An example using a configuration-first pipeline:
       #
@@ -23,7 +23,7 @@ module Burner
      #   jobs: [
      #     {
      #       name: 'set',
-     #       type: 'b/set_value',
+     #       type: 'b/value/static',
      #       value: [
      #         [1, 'funky']
      #       ]
@@ -47,10 +47,15 @@ module Burner
      #   }
      #
      #   Burner::Pipeline.make(config).execute
+     #
+     # Given the above example, the expected output would be:
+     #   [
+     #     { 'id' => 1, 'name' => 'funky' }
+     #   ]
      class ArraysToObjects < JobWithRegister
        attr_reader :mappings
 
-       def initialize(name:, mappings: [], register: '')
+       def initialize(name:, mappings: [], register: DEFAULT_REGISTER)
          super(name: name, register: register)
 
          @mappings = Modeling::KeyIndexMapping.array(mappings)
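The index-to-key mapping this job performs can be sketched in plain Ruby (a simplified stand-in for the job's use of `Modeling::KeyIndexMapping`):

```ruby
# Convert an array of arrays into an array of hashes using
# positional index-to-key mappings, as ArraysToObjects does.
mappings = [
  { index: 0, key: 'id' },
  { index: 1, key: 'name' }
]
rows = [[1, 'funky']]

objects = rows.map do |row|
  mappings.each_with_object({}) do |mapping, object|
    object[mapping[:key]] = row[mapping[:index]]
  end
end

puts objects.length # => 1
```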
data/lib/burner/library/collection/concatenate.rb ADDED
@@ -0,0 +1,42 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+module Burner
+  module Library
+    module Collection
+      # Take the list of from_registers and concatenate each of their values together.
+      # Each from_register's value will be coerced into an array if it is not an array.
+      #
+      # Expected Payload[from_register] input: array of objects.
+      # Payload[to_register] output: An array of objects.
+      class Concatenate < Job
+        attr_reader :from_registers, :to_register
+
+        def initialize(name:, from_registers: [], to_register: DEFAULT_REGISTER)
+          super(name: name)
+
+          @from_registers = Array(from_registers)
+          @to_register   = to_register.to_s
+
+          freeze
+        end
+
+        def perform(output, payload)
+          output.detail("Concatenating registers: '#{from_registers}' to: '#{to_register}'")
+
+          payload[to_register] = from_registers.each_with_object([]) do |from_register, memo|
+            from_register_value = array(payload[from_register])
+
+            memo.concat(from_register_value)
+          end
+        end
+      end
+    end
+  end
+end
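The coerce-then-concatenate behavior above can be sketched in plain Ruby. Here Kernel's `Array()` stands in for the job's own `array` coercion helper, and a bare hash stands in for the payload's registers (names are illustrative):

```ruby
# Coerce each register's value to an array, then concatenate in order.
# Array() turns nil into [] and wraps scalars, matching the coercion intent.
registers = { 'a' => [1, 2], 'b' => 3, 'c' => nil }

result = %w[a b c].each_with_object([]) do |name, memo|
  memo.concat(Array(registers[name]))
end

puts result.inspect # => [1, 2, 3]
```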
data/lib/burner/library/collection/graph.rb CHANGED
@@ -13,12 +13,16 @@ module Burner
       # Take an array of (denormalized) objects and create an object hierarchy from them.
       # Under the hood it uses Hashematics: https://github.com/bluemarblepayroll/hashematics.
       #
-      # Expected Payload#value input: array of objects.
-      # Payload#value output: An array of objects.
+      # Expected Payload[register] input: array of objects.
+      # Payload[register] output: An array of objects.
       class Graph < JobWithRegister
         attr_reader :key, :groups
 
-        def initialize(name:, key:, config: Hashematics::Configuration.new, register: '')
+        def initialize(
+          name:, key:,
+          config: Hashematics::Configuration.new,
+          register: DEFAULT_REGISTER
+        )
           super(name: name, register: register)
 
           raise ArgumentError, 'key is required' if key.to_s.empty?
data/lib/burner/library/collection/objects_to_arrays.rb CHANGED
@@ -15,39 +15,41 @@ module Burner
       # Burner::Modeling::KeyIndexMapping instances or hashable configurations which specifies
       # the key-to-index mappings to use.
       #
-      # Expected Payload#value input: array of hashes.
-      # Payload#value output: An array of arrays.
+      # Expected Payload[register] input: array of hashes.
+      # Payload[register] output: An array of arrays.
       #
       # An example using a configuration-first pipeline:
       #
-      #   config = {
-      #     jobs: [
-      #       {
-      #         name: 'set',
-      #         type: 'b/set_value',
-      #         value: [
-      #           [1, 'funky']
-      #         ]
-      #       },
-      #       {
-      #         name: 'map',
-      #         type: 'b/collection/objects_to_arrays',
-      #         mappings: [
-      #           { index: 0, key: 'id' },
-      #           { index: 1, key: 'name' }
-      #         ]
-      #       },
-      #       {
-      #         name: 'output',
-      #         type: 'b/echo',
-      #         message: 'value is currently: {__value}'
-      #       },
+      #   config = {
+      #     jobs: [
+      #       {
+      #         name: 'set',
+      #         type: 'b/value/static',
+      #         value: [
+      #           { 'id' => 1, 'name' => 'funky' }
+      #         ],
+      #         register: register
+      #       },
+      #       {
+      #         name: 'map',
+      #         type: 'b/collection/objects_to_arrays',
+      #         mappings: [
+      #           { index: 0, key: 'id' },
+      #           { index: 1, key: 'name' }
+      #         ],
+      #         register: register
+      #       },
+      #       {
+      #         name: 'output',
+      #         type: 'b/echo',
+      #         message: 'value is currently: {__value}'
+      #       },
       #
       #     ],
       #     steps: %w[set map output]
       #   }
       #
       #   Burner::Pipeline.make(config).execute
       class ObjectsToArrays < JobWithRegister
         attr_reader :mappings
 
@@ -56,7 +58,7 @@ module Burner
       # nested hashes then set separator to '.'. For more information, see the underlying
       # library that supports this dot-notation concept:
       # https://github.com/bluemarblepayroll/objectable
-      def initialize(name:, mappings: [], register: '', separator: '')
+      def initialize(name:, mappings: [], register: DEFAULT_REGISTER, separator: '')
         super(name: name, register: register)
 
         @mappings = Modeling::KeyIndexMapping.array(mappings)
data/lib/burner/library/collection/shift.rb CHANGED
@@ -14,8 +14,8 @@ module Burner
       # attribute. The initial use case for this was to remove "header" rows from arrays,
       # like you would expect when parsing CSV files.
       #
-      # Expected Payload#value input: nothing.
-      # Payload#value output: An array with N beginning elements removed.
+      # Expected Payload[register] input: nothing.
+      # Payload[register] output: An array with N beginning elements removed.
       class Shift < JobWithRegister
         DEFAULT_AMOUNT = 0
 
@@ -23,7 +23,7 @@ module Burner
 
         attr_reader :amount
 
-        def initialize(name:, amount: DEFAULT_AMOUNT, register: '')
+        def initialize(name:, amount: DEFAULT_AMOUNT, register: DEFAULT_REGISTER)
           super(name: name, register: register)
 
           @amount = amount.to_i
data/lib/burner/library/collection/transform.rb CHANGED
@@ -18,8 +18,8 @@ module Burner
       # For more information on the specific contract for attributes, see the
       # Burner::Modeling::Attribute class.
       #
-      # Expected Payload#value input: array of objects.
-      # Payload#value output: An array of objects.
+      # Expected Payload[register] input: array of objects.
+      # Payload[register] output: An array of objects.
       class Transform < JobWithRegister
         BLANK = ''
 
@@ -27,7 +27,13 @@ module Burner
                     :exclusive,
                     :resolver
 
-        def initialize(name:, attributes: [], exclusive: false, register: '', separator: BLANK)
+        def initialize(
+          name:,
+          attributes: [],
+          exclusive: false,
+          register: DEFAULT_REGISTER,
+          separator: BLANK
+        )
           super(name: name, register: register)
 
           @resolver = Objectable.resolver(separator: separator)
data/lib/burner/library/collection/unpivot.rb CHANGED
@@ -14,12 +14,16 @@ module Burner
       # Under the hood it uses HashMath's Unpivot class:
       # https://github.com/bluemarblepayroll/hash_math
       #
-      # Expected Payload#value input: array of objects.
-      # Payload#value output: An array of objects.
+      # Expected Payload[register] input: array of objects.
+      # Payload[register] output: An array of objects.
       class Unpivot < JobWithRegister
         attr_reader :unpivot
 
-        def initialize(name:, pivot_set: HashMath::Unpivot::PivotSet.new, register: '')
+        def initialize(
+          name:,
+          pivot_set: HashMath::Unpivot::PivotSet.new,
+          register: DEFAULT_REGISTER
+        )
           super(name: name, register: register)
 
           @unpivot = HashMath::Unpivot.new(pivot_set)
data/lib/burner/library/collection/validate.rb CHANGED
@@ -14,8 +14,9 @@ module Burner
       # of validations. The main register will include the valid objects and the invalid_register
       # will contain the invalid objects.
       #
-      # Expected Payload#value input: array of objects.
-      # Payload#value output: An array of objects.
+      # Expected Payload[register] input: array of objects.
+      # Payload[register] output: An array of objects that are valid.
+      # Payload[invalid_register] output: An array of objects that are invalid.
       class Validate < JobWithRegister
         DEFAULT_INVALID_REGISTER = 'invalid'
         DEFAULT_JOIN_CHAR = ', '
@@ -32,7 +33,7 @@ module Burner
           invalid_register: DEFAULT_INVALID_REGISTER,
           join_char: DEFAULT_JOIN_CHAR,
           message_key: DEFAULT_MESSAGE_KEY,
-          register: '',
+          register: DEFAULT_REGISTER,
           separator: '',
           validations: []
         )
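The valid/invalid split this job performs can be sketched with Ruby's `Enumerable#partition` (a simplified stand-in for the job's actual validation chain; the presence check is illustrative):

```ruby
# Route records into valid/invalid buckets, as Validate routes objects into
# the main register and the invalid_register.
records = [
  { 'id' => 1, 'name' => 'funky' },
  { 'id' => 2, 'name' => '' }
]

valid, invalid = records.partition { |record| !record['name'].to_s.strip.empty? }

puts valid.length   # => 1
puts invalid.length # => 1
```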
data/lib/burner/library/collection/values.rb CHANGED
@@ -14,12 +14,12 @@ module Burner
       # If include_keys is true (it is false by default), then call #keys on the first
       # object and inject that as a "header" object.
       #
-      # Expected Payload#value input: array of objects.
-      # Payload#value output: An array of arrays.
+      # Expected Payload[register] input: array of objects.
+      # Payload[register] output: An array of arrays.
       class Values < JobWithRegister
         attr_reader :include_keys
 
-        def initialize(name:, include_keys: false, register: '')
+        def initialize(name:, include_keys: false, register: DEFAULT_REGISTER)
           super(name: name, register: register)
 
           @include_keys = include_keys || false
data/lib/burner/library/deserialize/csv.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
     module Deserialize
       # Take a CSV string and de-serialize into object(s).
       #
-      # Expected Payload#value input: nothing.
-      # Payload#value output: an array of arrays. Each inner array represents one data row.
+      # Expected Payload[register] input: nothing.
+      # Payload[register] output: an array of arrays. Each inner array represents one data row.
       class Csv < JobWithRegister
         # This currently only supports returning an array of arrays, including the header row.
         # In the future this could be extended to offer more customizable options, such as
data/lib/burner/library/deserialize/json.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
     module Deserialize
       # Take a JSON string and deserialize into object(s).
      #
-      # Expected Payload#value input: string of JSON data.
-      # Payload#value output: anything, as specified by the JSON de-serializer.
+      # Expected Payload[register] input: string of JSON data.
+      # Payload[register] output: anything, as specified by the JSON de-serializer.
       class Json < JobWithRegister
         def perform(_output, payload)
           payload[register] = JSON.parse(payload[register])
data/lib/burner/library/deserialize/yaml.rb CHANGED
@@ -15,12 +15,12 @@ module Burner
       # YAML. If you wish to ease this restriction, for example if you have custom serialization
       # for custom classes, then you can pass in safe: false.
       #
-      # Expected Payload#value input: string of YAML data.
-      # Payload#value output: anything as specified by the YAML de-serializer.
+      # Expected Payload[register] input: string of YAML data.
+      # Payload[register] output: anything as specified by the YAML de-serializer.
       class Yaml < JobWithRegister
         attr_reader :safe
 
-        def initialize(name:, register: '', safe: true)
+        def initialize(name:, register: DEFAULT_REGISTER, safe: true)
           super(name: name, register: register)
 
           @safe = safe
data/lib/burner/library/echo.rb CHANGED
@@ -11,7 +11,7 @@ module Burner
   module Library
     # Output a simple message to the output.
     #
-    # Note: this does not use Payload#value.
+    # Note: this does not use Payload#registers.
     class Echo < Job
       attr_reader :message
 
data/lib/burner/library/io/base.rb CHANGED
@@ -14,7 +14,7 @@ module Burner
       class Base < JobWithRegister
         attr_reader :path
 
-        def initialize(name:, path:, register: '')
+        def initialize(name:, path:, register: DEFAULT_REGISTER)
           super(name: name, register: register)
 
           raise ArgumentError, 'path is required' if path.to_s.empty?
data/lib/burner/library/io/exist.rb CHANGED
@@ -15,7 +15,7 @@ module Burner
       # Check to see if a file exists. If short_circuit is set to true and the file
       # does not exist then the job will return false and short circuit the pipeline.
       #
-      # Note: this does not use Payload#value.
+      # Note: this does not use Payload#registers.
       class Exist < Job
         attr_reader :path, :short_circuit
 
data/lib/burner/library/io/read.rb CHANGED
@@ -14,12 +14,12 @@ module Burner
     module IO
       # Read value from disk.
       #
-      # Expected Payload#value input: nothing.
-      # Payload#value output: contents of the specified file.
+      # Expected Payload[register] input: nothing.
+      # Payload[register] output: contents of the specified file.
       class Read < Base
         attr_reader :binary
 
-        def initialize(name:, path:, binary: false, register: '')
+        def initialize(name:, path:, binary: false, register: DEFAULT_REGISTER)
           super(name: name, path: path, register: register)
 
           @binary = binary || false
data/lib/burner/library/io/write.rb CHANGED
@@ -14,12 +14,12 @@ module Burner
     module IO
       # Write value to disk.
       #
-      # Expected Payload#value input: anything.
-      # Payload#value output: whatever was passed in.
+      # Expected Payload[register] input: anything.
+      # Payload[register] output: whatever was passed in.
       class Write < Base
         attr_reader :binary
 
-        def initialize(name:, path:, binary: false, register: '')
+        def initialize(name:, path:, binary: false, register: DEFAULT_REGISTER)
           super(name: name, path: path, register: register)
 
           @binary = binary || false
data/lib/burner/library/{dummy.rb → nothing.rb} RENAMED
@@ -11,8 +11,8 @@ module Burner
   module Library
     # Do nothing.
     #
-    # Note: this does not use Payload#value.
-    class Dummy < Job
+    # Note: this does not use Payload#registers.
+    class Nothing < Job
       def perform(_output, _payload); end
     end
   end
data/lib/burner/library/serialize/csv.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
     module Serialize
       # Take an array of arrays and create a CSV.
       #
-      # Expected Payload#value input: array of arrays.
-      # Payload#value output: a serialized CSV string.
+      # Expected Payload[register] input: array of arrays.
+      # Payload[register] output: a serialized CSV string.
       class Csv < JobWithRegister
         def perform(_output, payload)
           payload[register] = CSV.generate(options) do |csv|
data/lib/burner/library/serialize/json.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
     module Serialize
       # Treat value like a Ruby object and serialize it using JSON.
       #
-      # Expected Payload#value input: anything.
-      # Payload#value output: string representing the output of the JSON serializer.
+      # Expected Payload[register] input: anything.
+      # Payload[register] output: string representing the output of the JSON serializer.
       class Json < JobWithRegister
         def perform(_output, payload)
           payload[register] = payload[register].to_json
data/lib/burner/library/serialize/yaml.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
     module Serialize
       # Treat value like a Ruby object and serialize it using YAML.
       #
-      # Expected Payload#value input: anything.
-      # Payload#value output: string representing the output of the YAML serializer.
+      # Expected Payload[register] input: anything.
+      # Payload[register] output: string representing the output of the YAML serializer.
       class Yaml < JobWithRegister
         def perform(_output, payload)
           payload[register] = payload[register].to_yaml
data/lib/burner/library/sleep.rb CHANGED
@@ -11,7 +11,7 @@ module Burner
   module Library
     # Arbitrarily put thread to sleep for X number of seconds
     #
-    # Payload#value output: whatever value was specified in this job.
+    # Note: this does not use Payload#registers.
     class Sleep < Job
       attr_reader :seconds
 
data/lib/burner/library/value/copy.rb ADDED
@@ -0,0 +1,39 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+module Burner
+  module Library
+    module Value
+      # Copy one value in a register to another. Note that this does *not* perform any type of
+      # deep copy, it simply points one register's value to another. If you decide to later
+      # mutate one register then you may mutate the other.
+      #
+      # Expected Payload[from_register] input: anything.
+      # Payload[to_register] output: whatever value was specified in the from_register.
+      class Copy < Job
+        attr_reader :from_register, :to_register
+
+        def initialize(name:, to_register: DEFAULT_REGISTER, from_register: DEFAULT_REGISTER)
+          super(name: name)
+
+          @from_register = from_register.to_s
+          @to_register   = to_register.to_s
+
+          freeze
+        end
+
+        def perform(output, payload)
+          output.detail("Copying register: '#{from_register}' to: '#{to_register}'")
+
+          payload[to_register] = payload[from_register]
+        end
+      end
+    end
+  end
+end
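The shallow-copy caveat in the Copy job's comment can be demonstrated directly in plain Ruby, with a bare hash standing in for the payload's registers:

```ruby
# Registers sketched as a plain hash (illustrative; not the gem's Payload).
registers = {}
registers['from'] = [{ 'id' => 1 }]

# "Copying" stores another reference to the same array object.
registers['to'] = registers['from']

# Mutating through one register is therefore visible through the other.
registers['to'] << { 'id' => 2 }

puts registers['from'].length # => 2
```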
data/lib/burner/library/value/static.rb ADDED
@@ -0,0 +1,34 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+module Burner
+  module Library
+    module Value
+      # Arbitrarily set the value of a register.
+      #
+      # Expected Payload[register] input: anything.
+      # Payload[register] output: whatever value was specified in this job.
+      class Static < JobWithRegister
+        attr_reader :value
+
+        def initialize(name:, register: DEFAULT_REGISTER, value: nil)
+          super(name: name, register: register)
+
+          @value = value
+
+          freeze
+        end
+
+        def perform(_output, payload)
+          payload[register] = value
+        end
+      end
+    end
+  end
+end
data/lib/burner/modeling/attribute.rb CHANGED
@@ -10,7 +10,9 @@
 module Burner
   module Modeling
     # Defines a top-level key and the associated transformers for deriving the final value
-    # to set the key to.
+    # to set the key to. The transformers that can be passed in can be any Realize::Transformers
+    # subclasses. For more information, see the Realize library at:
+    # https://github.com/bluemarblepayroll/realize
     class Attribute
       acts_as_hashable
 
data/lib/burner/modeling/validations/base.rb CHANGED
@@ -27,7 +27,7 @@ module Burner
       end
 
       def message
-        @message.to_s.empty? ? "#{key}#{default_message}" : @message.to_s
+        @message.to_s.empty? ? "#{key} #{default_message}" : @message.to_s
       end
     end
   end
data/lib/burner/modeling/validations/blank.rb CHANGED
@@ -16,14 +16,18 @@ module Burner
       class Blank < Base
         acts_as_hashable
 
+        BLANK_RE = /\A[[:space:]]*\z/.freeze
+
         def valid?(object, resolver)
-          resolver.get(object, key).to_s.empty?
+          value = resolver.get(object, key).to_s
+
+          value.empty? || BLANK_RE.match?(value)
         end
 
         private
 
         def default_message
-          ' must be blank'
+          'must be blank'
         end
       end
     end
data/lib/burner/modeling/validations/present.rb CHANGED
@@ -7,23 +7,23 @@
 # LICENSE file in the root directory of this source tree.
 #
 
-require_relative 'base'
+require_relative 'blank'
 
 module Burner
   module Modeling
     class Validations
       # Check if a value is present. If it is blank (null or empty) then it is invalid.
-      class Present < Base
+      class Present < Blank
         acts_as_hashable
 
         def valid?(object_value, resolver)
-          !resolver.get(object_value, key).to_s.empty?
+          !super(object_value, resolver)
         end
 
         private
 
         def default_message
-          ' is required'
+          'is required'
         end
       end
     end
@@ -8,16 +8,22 @@
  #

  module Burner
- # The input for all Job#perform methods. The main notion of this object is its "value"
- # attribute. This is dynamic and weak on purpose and is subject to whatever the Job#perform
- # methods decides it is. This definitely adds an order-of-magnitude complexity to this whole
- # library and lifecycle, but I am not sure there is any other way around it: trying to build
- # a generic, open-ended object pipeline to serve almost any use case.
+ # The input for all Job#perform methods. The main notion of this object is its 'registers'
+ # attribute: a key-indifferent hash, accessible on Payload using the bracket setter and
+ # getter methods. This is dynamic and weak on purpose and is subject to whatever the
+ # Job#perform methods decide it is. This definitely adds an order-of-magnitude complexity
+ # to this whole library and lifecycle, but I am not sure there is any other way around it:
+ # trying to build a generic, open-ended processing pipeline to serve almost any use case.
  #
  # The side_effects attribute can also be utilized as a way for jobs to emit any data in a more
  # structured/additive manner. The initial use case for this was for Burner's core IO jobs to
  # report back the files it has written in a more structured data way (as opposed to simply
  # writing some information to the output.)
+ #
+ # The 'time' attribute is important in that it allows for the replaying of pipelines and jobs.
+ # Instead of having jobs utilize Time.now, Date.today, etc., they should opt to use this
+ # value instead.
  class Payload
  attr_reader :params,
  :registers,
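The key-indifferent registers hash described in the comment can be sketched as follows. This is a simplified, hypothetical model of the bracket accessors, not the library's actual implementation:

```ruby
# Hypothetical sketch: registers are stored under string keys so that
# payload[:default] and payload['default'] address the same slot.
class Payload
  def initialize(registers: {})
    @registers = registers.map { |key, value| [key.to_s, value] }.to_h
  end

  def [](key)
    @registers[key.to_s]
  end

  def []=(key, value)
    @registers[key.to_s] = value
  end
end

payload = Payload.new(registers: { default: [1, 2, 3] })
payload['default'] # => [1, 2, 3]
payload[:default]  # => [1, 2, 3]
```

Normalizing every key with `to_s` on the way in and on every read/write is what makes the hash key-indifferent.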
@@ -8,5 +8,5 @@
  #

  module Burner
- VERSION = '1.0.0-alpha.8'
+ VERSION = '1.0.0'
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: burner
  version: !ruby/object:Gem::Version
- version: 1.0.0.pre.alpha.8
+ version: 1.0.0
  platform: ruby
  authors:
  - Matthew Ruggio
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2020-10-27 00:00:00.000000000 Z
+ date: 2020-11-05 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: acts_as_hashable
@@ -72,14 +72,14 @@ dependencies:
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '1.2'
+ version: '1.3'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '1.2'
+ version: '1.3'
  - !ruby/object:Gem::Dependency
  name: stringento
  requirement: !ruby/object:Gem::Requirement
@@ -225,6 +225,7 @@ files:
  - lib/burner/jobs.rb
  - lib/burner/library.rb
  - lib/burner/library/collection/arrays_to_objects.rb
+ - lib/burner/library/collection/concatenate.rb
  - lib/burner/library/collection/graph.rb
  - lib/burner/library/collection/objects_to_arrays.rb
  - lib/burner/library/collection/shift.rb
@@ -235,17 +236,18 @@ files:
  - lib/burner/library/deserialize/csv.rb
  - lib/burner/library/deserialize/json.rb
  - lib/burner/library/deserialize/yaml.rb
- - lib/burner/library/dummy.rb
  - lib/burner/library/echo.rb
  - lib/burner/library/io/base.rb
  - lib/burner/library/io/exist.rb
  - lib/burner/library/io/read.rb
  - lib/burner/library/io/write.rb
+ - lib/burner/library/nothing.rb
  - lib/burner/library/serialize/csv.rb
  - lib/burner/library/serialize/json.rb
  - lib/burner/library/serialize/yaml.rb
- - lib/burner/library/set_value.rb
  - lib/burner/library/sleep.rb
+ - lib/burner/library/value/copy.rb
+ - lib/burner/library/value/static.rb
  - lib/burner/modeling.rb
  - lib/burner/modeling/attribute.rb
  - lib/burner/modeling/attribute_renderer.rb
@@ -284,9 +286,9 @@ required_ruby_version: !ruby/object:Gem::Requirement
  version: '2.5'
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
- - - ">"
+ - - ">="
  - !ruby/object:Gem::Version
- version: 1.3.1
+ version: '0'
  requirements: []
  rubygems_version: 3.0.3
  signing_key:
@@ -1,32 +0,0 @@
- # frozen_string_literal: true
-
- #
- # Copyright (c) 2020-present, Blue Marble Payroll, LLC
- #
- # This source code is licensed under the MIT license found in the
- # LICENSE file in the root directory of this source tree.
- #
-
- module Burner
- module Library
- # Arbitrarily set value
- #
- # Expected Payload#value input: anything.
- # Payload#value output: whatever value was specified in this job.
- class SetValue < JobWithRegister
- attr_reader :value
-
- def initialize(name:, register: '', value: nil)
- super(name: name, register: register)
-
- @value = value
-
- freeze
- end
-
- def perform(_output, payload)
- payload[register] = value
- end
- end
- end
- end
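Per the file listing above, the removed SetValue job is superseded by lib/burner/library/value/static.rb. The core behavior it implemented, writing a preconfigured value into a payload register, can be sketched in isolation. Both classes below are illustrative stand-ins, not the library's code:

```ruby
# Minimal stand-in for Burner::Payload's key-indifferent register brackets.
class Payload
  def initialize
    @registers = {}
  end

  def [](key)
    @registers[key.to_s]
  end

  def []=(key, value)
    @registers[key.to_s] = value
  end
end

# Sketch of the static-value job pattern: stash a fixed value in a register.
class StaticValue
  attr_reader :register, :value

  def initialize(register:, value: nil)
    @register = register
    @value    = value

    freeze
  end

  def perform(payload)
    payload[register] = value
  end
end

payload = Payload.new
StaticValue.new(register: 'default', value: 42).perform(payload)
payload['default'] # => 42
```

Freezing the job at construction keeps it immutable, so the same configured instance can be reused safely across pipeline runs.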