burner 1.0.0.pre.alpha.9 → 1.1.0.pre.alpha

Files changed (45)
  1. checksums.yaml +4 -4
  2. data/CHANGELOG.md +11 -0
  3. data/README.md +11 -2
  4. data/burner.gemspec +1 -1
  5. data/lib/burner.rb +6 -0
  6. data/lib/burner/job.rb +12 -5
  7. data/lib/burner/job_with_register.rb +1 -1
  8. data/lib/burner/jobs.rb +10 -5
  9. data/lib/burner/library.rb +13 -4
  10. data/lib/burner/library/collection/arrays_to_objects.rb +9 -4
  11. data/lib/burner/library/collection/coalesce.rb +73 -0
  12. data/lib/burner/library/collection/concatenate.rb +42 -0
  13. data/lib/burner/library/collection/graph.rb +8 -3
  14. data/lib/burner/library/collection/group.rb +66 -0
  15. data/lib/burner/library/collection/objects_to_arrays.rb +31 -29
  16. data/lib/burner/library/collection/shift.rb +3 -3
  17. data/lib/burner/library/collection/transform.rb +9 -3
  18. data/lib/burner/library/collection/unpivot.rb +7 -3
  19. data/lib/burner/library/collection/validate.rb +4 -3
  20. data/lib/burner/library/collection/values.rb +3 -3
  21. data/lib/burner/library/deserialize/csv.rb +2 -2
  22. data/lib/burner/library/deserialize/json.rb +2 -2
  23. data/lib/burner/library/deserialize/yaml.rb +3 -3
  24. data/lib/burner/library/echo.rb +1 -1
  25. data/lib/burner/library/io/base.rb +1 -1
  26. data/lib/burner/library/io/exist.rb +1 -1
  27. data/lib/burner/library/io/read.rb +3 -3
  28. data/lib/burner/library/io/write.rb +3 -3
  29. data/lib/burner/library/{dummy.rb → nothing.rb} +2 -2
  30. data/lib/burner/library/serialize/csv.rb +2 -2
  31. data/lib/burner/library/serialize/json.rb +2 -2
  32. data/lib/burner/library/serialize/yaml.rb +2 -2
  33. data/lib/burner/library/sleep.rb +1 -1
  34. data/lib/burner/library/value/copy.rb +39 -0
  35. data/lib/burner/library/value/static.rb +34 -0
  36. data/lib/burner/modeling.rb +1 -0
  37. data/lib/burner/modeling/attribute.rb +3 -1
  38. data/lib/burner/modeling/key_mapping.rb +29 -0
  39. data/lib/burner/modeling/validations/base.rb +1 -1
  40. data/lib/burner/modeling/validations/blank.rb +6 -2
  41. data/lib/burner/modeling/validations/present.rb +4 -4
  42. data/lib/burner/payload.rb +11 -5
  43. data/lib/burner/version.rb +1 -1
  44. metadata +11 -6
  45. data/lib/burner/library/set_value.rb +0 -32
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: a2af01edaf5ab67512cb8d3db4dbf3bba75a6272af78d78d43f3ceaec0ead386
4
- data.tar.gz: 812095ecf0b4240f61128f1968380f35e6748a07486f268da358ab56425de013
3
+ metadata.gz: 9dea13fe31b6e7bc97eafb25201508afc0a05d2d4e2eb86aeab00004695674f1
4
+ data.tar.gz: 6569debdcc9ad51a0e180ee6b0bac192029b95d59ad767ddae7c6711dba9830d
5
5
  SHA512:
6
- metadata.gz: 68fcfbf7d2b600889e805691b4e8289c82eb6366391c533333d8873b1521e41d342af00bbf68b01caad954fd8eb136016906b484727020af9ff5716bcfed0822
7
- data.tar.gz: 33dc2ca22f71d2e6f54858bfa90e9f5c6bbbff969409fb1d6a8c05b459c105c77ff7e354780a93daca8241edd31fbb03c424091c3557bfb4facf8b43b9fe6264
6
+ metadata.gz: 20cb03d16e9495edd35a9323f9507780b006f0be2ef61b0423e198ada1bdcc672e7a7fc471d606bc5cd47d3f727237922285bf71dd9648381b92bc5b90327174
7
+ data.tar.gz: 977cd155d3a717af7a4fc4876694f27b665844780fadf23556cdf1d7fb120a124741c341dbf1d4725fad8a0dbe12260f7475e36d9cd7e1415618929b73fb9794
data/CHANGELOG.md CHANGED
@@ -1,3 +1,14 @@
1
+ # 1.1.0 (TBD)
2
+
3
+ Added Jobs:
4
+
5
+ * b/collection/coalesce
6
+ * b/collection/group
7
+
8
+ # 1.0.0 (November 5th, 2020)
9
+
10
+ Initial version publication.
11
+
1
12
  # 0.0.1
2
13
 
3
14
  Shell
data/README.md CHANGED
@@ -234,7 +234,10 @@ This library only ships with very basic, rudimentary jobs that are meant to just
234
234
  #### Collection
235
235
 
236
236
  * **b/collection/arrays_to_objects** [mappings, register]: Convert an array of arrays to an array of objects.
237
+ * **b/collection/coalesce** [register, grouped_register, key_mappings, keys, separator]: Merge two datasets together based on the key values of one dataset (array) with a grouped dataset (hash).
238
+ * **b/collection/concatenate** [from_registers, to_register]: Concatenate each from_register's value and place the newly concatenated array into the to_register. Note: this does not do any deep copying; it should be assumed that all objects are shallow copied.
237
239
  * **b/collection/graph** [config, key, register]: Use [Hashematics](https://github.com/bluemarblepayroll/hashematics) to turn a flat array of objects into a deeply nested object tree.
240
+ * **b/collection/group** [keys, register, separator]: Take a register's value (an array of objects) and group the objects by the specified keys.
238
241
  * **b/collection/objects_to_arrays** [mappings, register]: Convert an array of objects to an array of arrays.
239
242
  * **b/collection/shift** [amount, register]: Remove the first N number of elements from an array.
240
243
  * **b/collection/transform** [attributes, exclusive, separator, register]: Iterate over all objects and transform each key per the attribute transformers specifications. If exclusive is set to false then the current object will be overridden/merged. Separator can also be set for key path support. This job uses [Realize](https://github.com/bluemarblepayroll/realize), which provides its own extendable value-transformation pipeline.
@@ -260,11 +263,15 @@ This library only ships with very basic, rudimentary jobs that are meant to just
260
263
  * **b/serialize/json** [register]: Convert value to JSON.
261
264
  * **b/serialize/yaml** [register]: Convert value to YAML.
262
265
 
266
+ #### Value
267
+
268
+ * **b/value/copy** [from_register, to_register]: Copy from_register's value into the to_register. Note: this does not do any deep copying; it should be assumed that all objects are shallow copied.
269
+ * **b/value/static** [register, value]: Set the value to any arbitrary value.
270
+
263
271
  #### General
264
272
 
265
- * **b/dummy** []: Do nothing
266
273
  * **b/echo** [message]: Write a message to the output. The message parameter can be interpolated using `Payload#params`.
267
- * **b/set** [register, value]: Set the value to any arbitrary value.
274
+ * **b/nothing** []: Do nothing.
268
275
  * **b/sleep** [seconds]: Sleep the thread for X number of seconds.
269
276
 
270
277
  Notes:
@@ -273,6 +280,8 @@ Notes:
273
280
 
274
281
  ### Adding & Registering Jobs
275
282
 
283
+ Note: Jobs have to be registered with a type in the Burner::Jobs factory. All jobs that ship with this library are prefixed with `b/` in their type in order to provide a namespace for 'burner-specific' jobs vs. externally provided jobs.
284
+
276
285
  Where this library shines is when additional jobs are plugged in. Burner uses its `Burner::Jobs` class as its class-level registry built with [acts_as_hashable](https://github.com/bluemarblepayroll/acts_as_hashable)'s acts_as_hashable_factory directive.
277
286
 
278
287
  Let's say we would like to register a job to parse a CSV:
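The README's CSV-parsing example itself is not included in this hunk. The following is only an illustrative sketch of what such a registration could look like, assuming the class-level `register` method that `acts_as_hashable_factory` provides on `Burner::Jobs` (the `ParseCsv` class and the `parse_csv` type name are made up):

```ruby
require 'csv'
require 'burner'

# Hypothetical custom job: treat the register's value as a CSV string and
# replace it with an array of row hashes. ParseCsv is not part of burner.
class ParseCsv < Burner::JobWithRegister
  def perform(_output, payload)
    payload[register] = CSV.parse(payload[register], headers: true).map(&:to_h)
  end
end

# Register it under a custom (non-b/) type so configuration-first pipelines
# can reference it by that type.
Burner::Jobs.register('parse_csv', ParseCsv)
```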
data/burner.gemspec CHANGED
@@ -32,7 +32,7 @@ Gem::Specification.new do |s|
32
32
  s.add_dependency('hashematics', '~>1.1')
33
33
  s.add_dependency('hash_math', '~>1.2')
34
34
  s.add_dependency('objectable', '~>1.0')
35
- s.add_dependency('realize', '~>1.2')
35
+ s.add_dependency('realize', '~>1.3')
36
36
  s.add_dependency('stringento', '~>2.1')
37
37
 
38
38
  s.add_development_dependency('guard-rspec', '~>4.7')
data/lib/burner.rb CHANGED
@@ -29,3 +29,9 @@ require_relative 'burner/util'
29
29
 
30
30
  # Main Entrypoint(s)
31
31
  require_relative 'burner/cli'
32
+
33
+ # Top-level namespace
34
+ module Burner
35
+ # All jobs that need to reference the main register should use this constant.
36
+ DEFAULT_REGISTER = 'default'
37
+ end
data/lib/burner/job.rb CHANGED
@@ -31,10 +31,11 @@ module Burner
31
31
  # The #perform method takes in two arguments: output (an instance of Burner::Output)
32
32
  # and payload (an instance of Burner::Payload). Jobs can leverage output to emit
33
33
  # information to the pipeline's log(s). The payload is utilized to pass data from job to job,
34
- # with its most important attribute being #value. The value attribute is mutable
35
- # per the individual job's context (meaning of it is unknown without understanding a job's
36
- # input and output value of #value.). Therefore #value can mean anything and it is up to the
37
- # engineers to clearly document the assumptions of its use.
34
+ # with its most important attribute being #registers. The registers attribute is a mutable
35
+ # and accessible hash per the individual job's context
36
+ # (meaning of it is unknown without understanding a job's input and output value
37
+ # of #registers.). Therefore #register key values can mean anything
38
+ # and it is up to consumers to clearly document the assumptions of its use.
38
39
  #
39
40
  # Returning false will short-circuit the pipeline right after the job method exits.
40
41
  # Returning anything else besides false just means "continue".
@@ -47,9 +48,15 @@ module Burner
47
48
  protected
48
49
 
49
50
  def job_string_template(expression, output, payload)
50
- templatable_params = payload.params.merge(__id: output.id, __value: payload[''])
51
+ templatable_params = payload.params
52
+ .merge(__id: output.id)
53
+ .merge(templatable_register_values(payload))
51
54
 
52
55
  Util::StringTemplate.instance.evaluate(expression, templatable_params)
53
56
  end
57
+
58
+ def templatable_register_values(payload)
59
+ payload.registers.transform_keys { |key| "__#{key}_register" }
60
+ end
54
61
  end
55
62
  end
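With the `templatable_register_values` helper above, each register is exposed to `job_string_template` under a `__<name>_register` key. A hedged sketch of how that might be used from a pipeline, using the `{placeholder}` syntax already shown in the README's `{__value}` example (the config below is invented):

```ruby
require 'burner'

# The default register's value should be interpolated into the echo message
# via the {__default_register} placeholder derived from the code above.
config = {
  jobs: [
    { name: 'set',  type: 'b/value/static', value: [1, 2, 3] },
    { name: 'echo', type: 'b/echo', message: 'default register is: {__default_register}' }
  ],
  steps: %w[set echo]
}

Burner::Pipeline.make(config).execute
```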
data/lib/burner/job_with_register.rb CHANGED
@@ -15,7 +15,7 @@ module Burner
15
15
  class JobWithRegister < Job
16
16
  attr_reader :register
17
17
 
18
- def initialize(name:, register: '')
18
+ def initialize(name:, register: DEFAULT_REGISTER)
19
19
  super(name: name)
20
20
 
21
21
  @register = register.to_s
data/lib/burner/jobs.rb CHANGED
@@ -16,21 +16,23 @@ module Burner
16
16
  class Jobs
17
17
  acts_as_hashable_factory
18
18
 
19
- # Dummy is the default as noted by the ''. This means if a type is omitted, nil, or blank
20
- # string then the dummy job will be used.
21
- register 'b/dummy', '', Library::Dummy
19
+ # Nothing is the default as noted by the ''. This means if a type is omitted, nil, or blank
20
+ # string then the nothing job will be used.
22
21
  register 'b/echo', Library::Echo
23
- register 'b/set_value', Library::SetValue
22
+ register 'b/nothing', '', Library::Nothing
24
23
  register 'b/sleep', Library::Sleep
25
24
 
26
25
  register 'b/collection/arrays_to_objects', Library::Collection::ArraysToObjects
26
+ register 'b/collection/coalesce', Library::Collection::Coalesce
27
+ register 'b/collection/concatenate', Library::Collection::Concatenate
27
28
  register 'b/collection/graph', Library::Collection::Graph
29
+ register 'b/collection/group', Library::Collection::Group
28
30
  register 'b/collection/objects_to_arrays', Library::Collection::ObjectsToArrays
29
31
  register 'b/collection/shift', Library::Collection::Shift
30
32
  register 'b/collection/transform', Library::Collection::Transform
31
33
  register 'b/collection/unpivot', Library::Collection::Unpivot
32
34
  register 'b/collection/values', Library::Collection::Values
33
- register 'b/collection/validate', Library::Collection::Values
35
+ register 'b/collection/validate', Library::Collection::Validate
34
36
 
35
37
  register 'b/deserialize/csv', Library::Deserialize::Csv
36
38
  register 'b/deserialize/json', Library::Deserialize::Json
@@ -43,5 +45,8 @@ module Burner
43
45
  register 'b/serialize/csv', Library::Serialize::Csv
44
46
  register 'b/serialize/json', Library::Serialize::Json
45
47
  register 'b/serialize/yaml', Library::Serialize::Yaml
48
+
49
+ register 'b/value/copy', Library::Value::Copy
50
+ register 'b/value/static', Library::Value::Static
46
51
  end
47
52
  end
data/lib/burner/library.rb CHANGED
@@ -9,24 +9,33 @@
9
9
 
10
10
  require_relative 'job_with_register'
11
11
 
12
+ require_relative 'library/echo'
13
+ require_relative 'library/nothing'
14
+ require_relative 'library/sleep'
15
+
12
16
  require_relative 'library/collection/arrays_to_objects'
17
+ require_relative 'library/collection/coalesce'
18
+ require_relative 'library/collection/concatenate'
13
19
  require_relative 'library/collection/graph'
20
+ require_relative 'library/collection/group'
14
21
  require_relative 'library/collection/objects_to_arrays'
15
22
  require_relative 'library/collection/shift'
16
23
  require_relative 'library/collection/transform'
17
24
  require_relative 'library/collection/unpivot'
18
25
  require_relative 'library/collection/validate'
19
26
  require_relative 'library/collection/values'
27
+
20
28
  require_relative 'library/deserialize/csv'
21
29
  require_relative 'library/deserialize/json'
22
30
  require_relative 'library/deserialize/yaml'
23
- require_relative 'library/dummy'
24
- require_relative 'library/echo'
31
+
25
32
  require_relative 'library/io/exist'
26
33
  require_relative 'library/io/read'
27
34
  require_relative 'library/io/write'
35
+
28
36
  require_relative 'library/serialize/csv'
29
37
  require_relative 'library/serialize/json'
30
38
  require_relative 'library/serialize/yaml'
31
- require_relative 'library/set_value'
32
- require_relative 'library/sleep'
39
+
40
+ require_relative 'library/value/copy'
41
+ require_relative 'library/value/static'
data/lib/burner/library/collection/arrays_to_objects.rb CHANGED
@@ -14,8 +14,8 @@ module Burner
14
14
  # Burner::Modeling::KeyIndexMapping instances or hashable configurations which specifies
15
15
  # the index-to-key mappings to use.
16
16
  #
17
- # Expected Payload#value input: array of arrays.
18
- # Payload#value output: An array of hashes.
17
+ # Expected Payload[register] input: array of arrays.
18
+ # Payload[register] output: An array of hashes.
19
19
  #
20
20
  # An example using a configuration-first pipeline:
21
21
  #
@@ -23,7 +23,7 @@ module Burner
23
23
  # jobs: [
24
24
  # {
25
25
  # name: 'set',
26
- # type: 'b/set_value',
26
+ # type: 'b/value/static',
27
27
  # value: [
28
28
  # [1, 'funky']
29
29
  # ]
@@ -47,10 +47,15 @@ module Burner
47
47
  # }
48
48
  #
49
49
  # Burner::Pipeline.make(config).execute
50
+ #
51
+ # Given the above example, the expected output would be:
52
+ # [
53
+ # { 'id' => 1, 'name' => 'funky' }
54
+ # ]
50
55
  class ArraysToObjects < JobWithRegister
51
56
  attr_reader :mappings
52
57
 
53
- def initialize(name:, mappings: [], register: '')
58
+ def initialize(name:, mappings: [], register: DEFAULT_REGISTER)
54
59
  super(name: name, register: register)
55
60
 
56
61
  @mappings = Modeling::KeyIndexMapping.array(mappings)
data/lib/burner/library/collection/coalesce.rb ADDED
@@ -0,0 +1,73 @@
1
+ # frozen_string_literal: true
2
+
3
+ #
4
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
5
+ #
6
+ # This source code is licensed under the MIT license found in the
7
+ # LICENSE file in the root directory of this source tree.
8
+ #
9
+
10
+ module Burner
11
+ module Library
12
+ module Collection
13
+ # This is generally used right after the Group job has been executed on a separate
14
+ # dataset in a separate register. This job can match up specified values in its dataset
15
+ # with lookup values in another. If it finds a match then it will (shallow) copy over
16
+ # the values into the respective dataset.
17
+ #
18
+ # Expected Payload[register] input: array of objects.
19
+ # Payload[register] output: array of objects.
20
+ class Coalesce < JobWithRegister
21
+ attr_reader :grouped_register, :key_mappings, :keys, :resolver
22
+
23
+ def initialize(
24
+ name:,
25
+ grouped_register:,
26
+ key_mappings: [],
27
+ keys: [],
28
+ register: DEFAULT_REGISTER,
29
+ separator: ''
30
+ )
31
+ super(name: name, register: register)
32
+
33
+ @grouped_register = grouped_register.to_s
34
+ @key_mappings = Modeling::KeyMapping.array(key_mappings)
35
+ @keys = Array(keys)
36
+ @resolver = Objectable.resolver(separator: separator.to_s)
37
+
38
+ raise ArgumentError, 'at least one key is required' if @keys.empty?
39
+
40
+ freeze
41
+ end
42
+
43
+ def perform(output, payload)
44
+ payload[register] = array(payload[register])
45
+ count = payload[register].length
46
+
47
+ output.detail("Coalescing based on key(s): #{keys} for #{count} records(s)")
48
+
49
+ payload[register].each do |record|
50
+ key = make_key(record)
51
+ lookup = find_lookup(payload, key)
52
+
53
+ key_mappings.each do |key_mapping|
54
+ value = resolver.get(lookup, key_mapping.from)
55
+
56
+ resolver.set(record, key_mapping.to, value)
57
+ end
58
+ end
59
+ end
60
+
61
+ private
62
+
63
+ def find_lookup(payload, key)
64
+ (payload[grouped_register] || {})[key] || {}
65
+ end
66
+
67
+ def make_key(record)
68
+ keys.map { |key| resolver.get(record, key) }
69
+ end
70
+ end
71
+ end
72
+ end
73
+ end
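A hedged, end-to-end sketch of how Group and Coalesce could be combined in a configuration-first pipeline. The register names, keys, and data are invented, and the `payload:` keyword on `execute` is an assumption based on the library's other examples:

```ruby
require 'burner'

config = {
  jobs: [
    {
      name: 'patients',
      type: 'b/value/static',
      register: 'patients',
      value: [
        { 'id' => 1, 'name' => 'Bozo' },
        { 'id' => 2, 'name' => 'Bugs' }
      ]
    },
    {
      name: 'statuses',
      type: 'b/value/static',
      register: 'statuses',
      value: [
        { 'patient_id' => 1, 'code' => 'active' },
        { 'patient_id' => 2, 'code' => 'archived' }
      ]
    },
    {
      # Index the statuses by patient_id so Coalesce can do O(1) lookups.
      name: 'group_statuses',
      type: 'b/collection/group',
      register: 'statuses',
      keys: %w[patient_id]
    },
    {
      # Copy 'code' from the matching status onto each patient as 'status_code'.
      name: 'coalesce_statuses',
      type: 'b/collection/coalesce',
      register: 'patients',
      grouped_register: 'statuses',
      keys: %w[id],
      key_mappings: [{ from: 'code', to: 'status_code' }]
    }
  ],
  steps: %w[patients statuses group_statuses coalesce_statuses]
}

payload = Burner::Payload.new
Burner::Pipeline.make(config).execute(payload: payload)

# payload['patients'] would now (hypothetically) contain:
#   [{ 'id' => 1, 'name' => 'Bozo', 'status_code' => 'active' },
#    { 'id' => 2, 'name' => 'Bugs', 'status_code' => 'archived' }]
```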
data/lib/burner/library/collection/concatenate.rb ADDED
@@ -0,0 +1,42 @@
1
+ # frozen_string_literal: true
2
+
3
+ #
4
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
5
+ #
6
+ # This source code is licensed under the MIT license found in the
7
+ # LICENSE file in the root directory of this source tree.
8
+ #
9
+
10
+ module Burner
11
+ module Library
12
+ module Collection
13
+ # Take the list of from_registers and concatenate each of their values together.
14
+ # Each from_value will be coerced into an array if it not an array.
15
+ #
16
+ # Expected Payload[from_register] input: array of objects.
17
+ # Payload[to_register] output: An array of objects.
18
+ class Concatenate < Job
19
+ attr_reader :from_registers, :to_register
20
+
21
+ def initialize(name:, from_registers: [], to_register: DEFAULT_REGISTER)
22
+ super(name: name)
23
+
24
+ @from_registers = Array(from_registers)
25
+ @to_register = to_register.to_s
26
+
27
+ freeze
28
+ end
29
+
30
+ def perform(output, payload)
31
+ output.detail("Concatenating registers: '#{from_registers}' to: '#{to_register}'")
32
+
33
+ payload[to_register] = from_registers.each_with_object([]) do |from_register, memo|
34
+ from_register_value = array(payload[from_register])
35
+
36
+ memo.concat(from_register_value)
37
+ end
38
+ end
39
+ end
40
+ end
41
+ end
42
+ end
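A minimal configuration sketch for the Concatenate job (register names and values are made up):

```ruby
require 'burner'

config = {
  jobs: [
    { name: 'a', type: 'b/value/static', register: 'a', value: [1, 2] },
    { name: 'b', type: 'b/value/static', register: 'b', value: [3, 4] },
    {
      name: 'combine',
      type: 'b/collection/concatenate',
      from_registers: %w[a b],
      to_register: 'combined'
    }
  ],
  steps: %w[a b combine]
}

Burner::Pipeline.make(config).execute
# The 'combined' register would hold [1, 2, 3, 4]; the elements themselves are
# the same (shallow-copied) objects referenced by registers 'a' and 'b'.
```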
data/lib/burner/library/collection/graph.rb CHANGED
@@ -13,12 +13,17 @@ module Burner
13
13
  # Take an array of (denormalized) objects and create an object hierarchy from them.
14
14
  # Under the hood it uses Hashematics: https://github.com/bluemarblepayroll/hashematics.
15
15
  #
16
- # Expected Payload#value input: array of objects.
17
- # Payload#value output: An array of objects.
16
+ # Expected Payload[register] input: array of objects.
17
+ # Payload[register] output: An array of objects.
18
18
  class Graph < JobWithRegister
19
19
  attr_reader :key, :groups
20
20
 
21
- def initialize(name:, key:, config: Hashematics::Configuration.new, register: '')
21
+ def initialize(
22
+ name:,
23
+ key:,
24
+ config: Hashematics::Configuration.new,
25
+ register: DEFAULT_REGISTER
26
+ )
22
27
  super(name: name, register: register)
23
28
 
24
29
  raise ArgumentError, 'key is required' if key.to_s.empty?
data/lib/burner/library/collection/group.rb ADDED
@@ -0,0 +1,66 @@
1
+ # frozen_string_literal: true
2
+
3
+ #
4
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
5
+ #
6
+ # This source code is licensed under the MIT license found in the
7
+ # LICENSE file in the root directory of this source tree.
8
+ #
9
+
10
+ module Burner
11
+ module Library
12
+ module Collection
13
+ # Take a register's value (an array of objects) and group the objects by the specified keys.
14
+ # It essentially creates a hash from an array. This is useful for creating a O(1) lookup
15
+ # which can then be used in conjunction with the Coalesce Job for another array of data.
16
+ #
17
+ # An example of this specific job:
18
+ #
19
+ # input: [{ id: 1, code: 'a' }, { id: 2, code: 'b' }]
20
+ # keys: [:code]
21
+ # output: { ['a'] => { id: 1, code: 'a' }, ['b'] => { id: 2, code: 'b' } }
22
+ #
23
+ # Expected Payload[register] input: array of objects.
24
+ # Payload[register] output: hash.
25
+ class Group < JobWithRegister
26
+ attr_reader :keys, :resolver
27
+
28
+ def initialize(
29
+ name:,
30
+ keys: [],
31
+ register: DEFAULT_REGISTER,
32
+ separator: ''
33
+ )
34
+ super(name: name, register: register)
35
+
36
+ @keys = Array(keys)
37
+ @resolver = Objectable.resolver(separator: separator.to_s)
38
+
39
+ raise ArgumentError, 'at least one key is required' if @keys.empty?
40
+
41
+ freeze
42
+ end
43
+
44
+ def perform(output, payload)
45
+ payload[register] = array(payload[register])
46
+ count = payload[register].length
47
+
48
+ output.detail("Grouping based on key(s): #{keys} for #{count} records(s)")
49
+
50
+ grouped_records = payload[register].each_with_object({}) do |record, memo|
51
+ key = make_key(record)
52
+ memo[key] = record
53
+ end
54
+
55
+ payload[register] = grouped_records
56
+ end
57
+
58
+ private
59
+
60
+ def make_key(record)
61
+ keys.map { |key| resolver.get(record, key) }
62
+ end
63
+ end
64
+ end
65
+ end
66
+ end
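Roughly what Group#perform does to the register's value, sketched in plain Ruby. The real job resolves keys through Objectable (so nested key paths are supported); `values_at` here is a simplification:

```ruby
records = [{ 'id' => 1, 'code' => 'a' }, { 'id' => 2, 'code' => 'b' }]
keys    = %w[code]

grouped = records.each_with_object({}) do |record, memo|
  # Keys are arrays of the resolved values, e.g. ['a'], mirroring #make_key.
  memo[record.values_at(*keys)] = record
end

grouped # => { ['a'] => { 'id' => 1, 'code' => 'a' }, ['b'] => { 'id' => 2, 'code' => 'b' } }
```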
data/lib/burner/library/collection/objects_to_arrays.rb CHANGED
@@ -15,39 +15,41 @@ module Burner
15
15
  # Burner::Modeling::KeyIndexMapping instances or hashable configurations which specifies
16
16
  # the key-to-index mappings to use.
17
17
  #
18
- # Expected Payload#value input: array of hashes.
19
- # Payload#value output: An array of arrays.
18
+ # Expected Payload[register] input: array of hashes.
19
+ # Payload[register] output: An array of arrays.
20
20
  #
21
21
  # An example using a configuration-first pipeline:
22
22
  #
23
- # config = {
24
- # jobs: [
25
- # {
26
- # name: 'set',
27
- # type: 'b/set_value',
28
- # value: [
29
- # [1, 'funky']
30
- # ]
31
- # },
32
- # {
33
- # name: 'map',
34
- # type: 'b/collection/objects_to_arrays',
35
- # mappings: [
36
- # { index: 0, key: 'id' },
37
- # { index: 1, key: 'name' }
38
- # ]
39
- # },
40
- # {
41
- # name: 'output',
42
- # type: 'b/echo',
43
- # message: 'value is currently: {__value}'
44
- # },
23
+ # config = {
24
+ # jobs: [
25
+ # {
26
+ # name: 'set',
27
+ # type: 'b/value/static',
28
+ # value: [
29
+ # { 'id' => 1, 'name' => 'funky' }
30
+ # ],
31
+ # register: register
32
+ # },
33
+ # {
34
+ # name: 'map',
35
+ # type: 'b/collection/objects_to_arrays',
36
+ # mappings: [
37
+ # { index: 0, key: 'id' },
38
+ # { index: 1, key: 'name' }
39
+ # ],
40
+ # register: register
41
+ # },
42
+ # {
43
+ # name: 'output',
44
+ # type: 'b/echo',
45
+ # message: 'value is currently: {__value}'
46
+ # },
45
47
  #
46
- # ],
47
- # steps: %w[set map output]
48
- # }
48
+ # ],
49
+ # steps: %w[set map output]
50
+ # }
49
51
  #
50
- # Burner::Pipeline.make(config).execute
52
+ # Burner::Pipeline.make(config).execute
51
53
  class ObjectsToArrays < JobWithRegister
52
54
  attr_reader :mappings
53
55
 
@@ -56,7 +58,7 @@ module Burner
56
58
  # nested hashes then set separator to '.'. For more information, see the underlying
57
59
  # library that supports this dot-notation concept:
58
60
  # https://github.com/bluemarblepayroll/objectable
59
- def initialize(name:, mappings: [], register: '', separator: '')
61
+ def initialize(name:, mappings: [], register: DEFAULT_REGISTER, separator: '')
60
62
  super(name: name, register: register)
61
63
 
62
64
  @mappings = Modeling::KeyIndexMapping.array(mappings)
data/lib/burner/library/collection/shift.rb CHANGED
@@ -14,8 +14,8 @@ module Burner
14
14
  # attribute. The initial use case for this was to remove "header" rows from arrays,
15
15
  # like you would expect when parsing CSV files.
16
16
  #
17
- # Expected Payload#value input: nothing.
18
- # Payload#value output: An array with N beginning elements removed.
17
+ # Expected Payload[register] input: nothing.
18
+ # Payload[register] output: An array with N beginning elements removed.
19
19
  class Shift < JobWithRegister
20
20
  DEFAULT_AMOUNT = 0
21
21
 
@@ -23,7 +23,7 @@ module Burner
23
23
 
24
24
  attr_reader :amount
25
25
 
26
- def initialize(name:, amount: DEFAULT_AMOUNT, register: '')
26
+ def initialize(name:, amount: DEFAULT_AMOUNT, register: DEFAULT_REGISTER)
27
27
  super(name: name, register: register)
28
28
 
29
29
  @amount = amount.to_i
data/lib/burner/library/collection/transform.rb CHANGED
@@ -18,8 +18,8 @@ module Burner
18
18
  # For more information on the specific contract for attributes, see the
19
19
  # Burner::Modeling::Attribute class.
20
20
  #
21
- # Expected Payload#value input: array of objects.
22
- # Payload#value output: An array of objects.
21
+ # Expected Payload[register] input: array of objects.
22
+ # Payload[register] output: An array of objects.
23
23
  class Transform < JobWithRegister
24
24
  BLANK = ''
25
25
 
@@ -27,7 +27,13 @@ module Burner
27
27
  :exclusive,
28
28
  :resolver
29
29
 
30
- def initialize(name:, attributes: [], exclusive: false, register: '', separator: BLANK)
30
+ def initialize(
31
+ name:,
32
+ attributes: [],
33
+ exclusive: false,
34
+ register: DEFAULT_REGISTER,
35
+ separator: BLANK
36
+ )
31
37
  super(name: name, register: register)
32
38
 
33
39
  @resolver = Objectable.resolver(separator: separator)
data/lib/burner/library/collection/unpivot.rb CHANGED
@@ -14,12 +14,16 @@ module Burner
14
14
  # Under the hood it uses HashMath's Unpivot class:
15
15
  # https://github.com/bluemarblepayroll/hash_math
16
16
  #
17
- # Expected Payload#value input: array of objects.
18
- # Payload#value output: An array of objects.
17
+ # Expected Payload[register] input: array of objects.
18
+ # Payload[register] output: An array of objects.
19
19
  class Unpivot < JobWithRegister
20
20
  attr_reader :unpivot
21
21
 
22
- def initialize(name:, pivot_set: HashMath::Unpivot::PivotSet.new, register: '')
22
+ def initialize(
23
+ name:,
24
+ pivot_set: HashMath::Unpivot::PivotSet.new,
25
+ register: DEFAULT_REGISTER
26
+ )
23
27
  super(name: name, register: register)
24
28
 
25
29
  @unpivot = HashMath::Unpivot.new(pivot_set)
data/lib/burner/library/collection/validate.rb CHANGED
@@ -14,8 +14,9 @@ module Burner
14
14
  # of validations. The main register will include the valid objects and the invalid_register
15
15
  # will contain the invalid objects.
16
16
  #
17
- # Expected Payload#value input: array of objects.
18
- # Payload#value output: An array of objects.
17
+ # Expected Payload[register] input: array of objects.
18
+ # Payload[register] output: An array of objects that are valid.
19
+ # Payload[invalid_register] output: An array of objects that are invalid.
19
20
  class Validate < JobWithRegister
20
21
  DEFAULT_INVALID_REGISTER = 'invalid'
21
22
  DEFAULT_JOIN_CHAR = ', '
@@ -32,7 +33,7 @@ module Burner
32
33
  invalid_register: DEFAULT_INVALID_REGISTER,
33
34
  join_char: DEFAULT_JOIN_CHAR,
34
35
  message_key: DEFAULT_MESSAGE_KEY,
35
- register: '',
36
+ register: DEFAULT_REGISTER,
36
37
  separator: '',
37
38
  validations: []
38
39
  )
data/lib/burner/library/collection/values.rb CHANGED
@@ -14,12 +14,12 @@ module Burner
14
14
  # If include_keys is true (it is false by default), then call #keys on the first
15
15
  # object and inject that as a "header" object.
16
16
  #
17
- # Expected Payload#value input: array of objects.
18
- # Payload#value output: An array of arrays.
17
+ # Expected Payload[register] input: array of objects.
18
+ # Payload[register] output: An array of arrays.
19
19
  class Values < JobWithRegister
20
20
  attr_reader :include_keys
21
21
 
22
- def initialize(name:, include_keys: false, register: '')
22
+ def initialize(name:, include_keys: false, register: DEFAULT_REGISTER)
23
23
  super(name: name, register: register)
24
24
 
25
25
  @include_keys = include_keys || false
data/lib/burner/library/deserialize/csv.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
12
12
  module Deserialize
13
13
  # Take a CSV string and de-serialize into object(s).
14
14
  #
15
- # Expected Payload#value input: nothing.
16
- # Payload#value output: an array of arrays. Each inner array represents one data row.
15
+ # Expected Payload[register] input: nothing.
16
+ # Payload[register] output: an array of arrays. Each inner array represents one data row.
17
17
  class Csv < JobWithRegister
18
18
  # This currently only supports returning an array of arrays, including the header row.
19
19
  # In the future this could be extended to offer more customizable options, such as
data/lib/burner/library/deserialize/json.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
12
12
  module Deserialize
13
13
  # Take a JSON string and deserialize into object(s).
14
14
  #
15
- # Expected Payload#value input: string of JSON data.
16
- # Payload#value output: anything, as specified by the JSON de-serializer.
15
+ # Expected Payload[register] input: string of JSON data.
16
+ # Payload[register] output: anything, as specified by the JSON de-serializer.
17
17
  class Json < JobWithRegister
18
18
  def perform(_output, payload)
19
19
  payload[register] = JSON.parse(payload[register])
data/lib/burner/library/deserialize/yaml.rb CHANGED
@@ -15,12 +15,12 @@ module Burner
15
15
  # YAML. If you wish to ease this restriction, for example if you have custom serialization
16
16
  # for custom classes, then you can pass in safe: false.
17
17
  #
18
- # Expected Payload#value input: string of YAML data.
19
- # Payload#value output: anything as specified by the YAML de-serializer.
18
+ # Expected Payload[register] input: string of YAML data.
19
+ # Payload[register] output: anything as specified by the YAML de-serializer.
20
20
  class Yaml < JobWithRegister
21
21
  attr_reader :safe
22
22
 
23
- def initialize(name:, register: '', safe: true)
23
+ def initialize(name:, register: DEFAULT_REGISTER, safe: true)
24
24
  super(name: name, register: register)
25
25
 
26
26
  @safe = safe
data/lib/burner/library/echo.rb CHANGED
@@ -11,7 +11,7 @@ module Burner
11
11
  module Library
12
12
  # Output a simple message to the output.
13
13
  #
14
- # Note: this does not use Payload#value.
14
+ # Note: this does not use Payload#registers.
15
15
  class Echo < Job
16
16
  attr_reader :message
17
17
 
data/lib/burner/library/io/base.rb CHANGED
@@ -14,7 +14,7 @@ module Burner
14
14
  class Base < JobWithRegister
15
15
  attr_reader :path
16
16
 
17
- def initialize(name:, path:, register: '')
17
+ def initialize(name:, path:, register: DEFAULT_REGISTER)
18
18
  super(name: name, register: register)
19
19
 
20
20
  raise ArgumentError, 'path is required' if path.to_s.empty?
data/lib/burner/library/io/exist.rb CHANGED
@@ -15,7 +15,7 @@ module Burner
15
15
  # Check to see if a file exists. If short_circuit is set to true and the file
16
16
  # does not exist then the job will return false and short circuit the pipeline.
17
17
  #
18
- # Note: this does not use Payload#value.
18
+ # Note: this does not use Payload#registers.
19
19
  class Exist < Job
20
20
  attr_reader :path, :short_circuit
21
21
 
data/lib/burner/library/io/read.rb CHANGED
@@ -14,12 +14,12 @@ module Burner
14
14
  module IO
15
15
  # Read value from disk.
16
16
  #
17
- # Expected Payload#value input: nothing.
18
- # Payload#value output: contents of the specified file.
17
+ # Expected Payload[register] input: nothing.
18
+ # Payload[register] output: contents of the specified file.
19
19
  class Read < Base
20
20
  attr_reader :binary
21
21
 
22
- def initialize(name:, path:, binary: false, register: '')
22
+ def initialize(name:, path:, binary: false, register: DEFAULT_REGISTER)
23
23
  super(name: name, path: path, register: register)
24
24
 
25
25
  @binary = binary || false
data/lib/burner/library/io/write.rb CHANGED
@@ -14,12 +14,12 @@ module Burner
14
14
  module IO
15
15
  # Write value to disk.
16
16
  #
17
- # Expected Payload#value input: anything.
18
- # Payload#value output: whatever was passed in.
17
+ # Expected Payload[register] input: anything.
18
+ # Payload[register] output: whatever was passed in.
19
19
  class Write < Base
20
20
  attr_reader :binary
21
21
 
22
- def initialize(name:, path:, binary: false, register: '')
22
+ def initialize(name:, path:, binary: false, register: DEFAULT_REGISTER)
23
23
  super(name: name, path: path, register: register)
24
24
 
25
25
  @binary = binary || false
data/lib/burner/library/{dummy.rb → nothing.rb} RENAMED
@@ -11,8 +11,8 @@ module Burner
11
11
  module Library
12
12
  # Do nothing.
13
13
  #
14
- # Note: this does not use Payload#value.
15
- class Dummy < Job
14
+ # Note: this does not use Payload#registers.
15
+ class Nothing < Job
16
16
  def perform(_output, _payload); end
17
17
  end
18
18
  end
data/lib/burner/library/serialize/csv.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
12
12
  module Serialize
13
13
  # Take an array of arrays and create a CSV.
14
14
  #
15
- # Expected Payload#value input: array of arrays.
16
- # Payload#value output: a serialized CSV string.
15
+ # Expected Payload[register] input: array of arrays.
16
+ # Payload[register] output: a serialized CSV string.
17
17
  class Csv < JobWithRegister
18
18
  def perform(_output, payload)
19
19
  payload[register] = CSV.generate(options) do |csv|
data/lib/burner/library/serialize/json.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
12
12
  module Serialize
13
13
  # Treat value like a Ruby object and serialize it using JSON.
14
14
  #
15
- # Expected Payload#value input: anything.
16
- # Payload#value output: string representing the output of the JSON serializer.
15
+ # Expected Payload[register] input: anything.
16
+ # Payload[register] output: string representing the output of the JSON serializer.
17
17
  class Json < JobWithRegister
18
18
  def perform(_output, payload)
19
19
  payload[register] = payload[register].to_json
data/lib/burner/library/serialize/yaml.rb CHANGED
@@ -12,8 +12,8 @@ module Burner
12
12
  module Serialize
13
13
  # Treat value like a Ruby object and serialize it using YAML.
14
14
  #
15
- # Expected Payload#value input: anything.
16
- # Payload#value output: string representing the output of the YAML serializer.
15
+ # Expected Payload[register] input: anything.
16
+ # Payload[register] output: string representing the output of the YAML serializer.
17
17
  class Yaml < JobWithRegister
18
18
  def perform(_output, payload)
19
19
  payload[register] = payload[register].to_yaml
data/lib/burner/library/sleep.rb CHANGED
@@ -11,7 +11,7 @@ module Burner
11
11
  module Library
12
12
  # Arbitrarily put thread to sleep for X number of seconds
13
13
  #
14
- # Payload#value output: whatever value was specified in this job.
14
+ # Note: this does not use Payload#registers.
15
15
  class Sleep < Job
16
16
  attr_reader :seconds
17
17
 
data/lib/burner/library/value/copy.rb ADDED
@@ -0,0 +1,39 @@
1
+ # frozen_string_literal: true
2
+
3
+ #
4
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
5
+ #
6
+ # This source code is licensed under the MIT license found in the
7
+ # LICENSE file in the root directory of this source tree.
8
+ #
9
+
10
+ module Burner
11
+ module Library
12
+ module Value
13
+ # Copy one value in a register to another. Note that this does *not* perform any type of
14
+ # deep copy, it simply points one register's value to another. If you decide to later mutate
15
+ # one register then you may mutate the other.
16
+ #
17
+ # Expected Payload[from_register] input: anything.
18
+ # Payload[to_register] output: whatever value was specified in the from_register.
19
+ class Copy < Job
20
+ attr_reader :from_register, :to_register
21
+
22
+ def initialize(name:, to_register: DEFAULT_REGISTER, from_register: DEFAULT_REGISTER)
23
+ super(name: name)
24
+
25
+ @from_register = from_register.to_s
26
+ @to_register = to_register.to_s
27
+
28
+ freeze
29
+ end
30
+
31
+ def perform(output, payload)
32
+ output.detail("Copying register: '#{from_register}' to: '#{to_register}'")
33
+
34
+ payload[to_register] = payload[from_register]
35
+ end
36
+ end
37
+ end
38
+ end
39
+ end
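A minimal sketch of b/value/copy in a configuration-first pipeline (register names and values are invented):

```ruby
require 'burner'

config = {
  jobs: [
    { name: 'seed', type: 'b/value/static', register: 'source', value: %w[x y z] },
    { name: 'copy', type: 'b/value/copy', from_register: 'source', to_register: 'backup' }
  ],
  steps: %w[seed copy]
}

Burner::Pipeline.make(config).execute
# After execution both registers reference the same array object, since the
# job intentionally performs only a shallow copy.
```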
data/lib/burner/library/value/static.rb ADDED
@@ -0,0 +1,34 @@
1
+ # frozen_string_literal: true
2
+
3
+ #
4
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
5
+ #
6
+ # This source code is licensed under the MIT license found in the
7
+ # LICENSE file in the root directory of this source tree.
8
+ #
9
+
10
+ module Burner
11
+ module Library
12
+ module Value
13
+ # Arbitrarily set the value of a register.
14
+ #
15
+ # Expected Payload[register] input: anything.
16
+ # Payload[register] output: whatever value was specified in this job.
17
+ class Static < JobWithRegister
18
+ attr_reader :value
19
+
20
+ def initialize(name:, register: DEFAULT_REGISTER, value: nil)
21
+ super(name: name, register: register)
22
+
23
+ @value = value
24
+
25
+ freeze
26
+ end
27
+
28
+ def perform(_output, payload)
29
+ payload[register] = value
30
+ end
31
+ end
32
+ end
33
+ end
34
+ end
data/lib/burner/modeling.rb CHANGED
@@ -10,4 +10,5 @@
10
10
  require_relative 'modeling/attribute'
11
11
  require_relative 'modeling/attribute_renderer'
12
12
  require_relative 'modeling/key_index_mapping'
13
+ require_relative 'modeling/key_mapping'
13
14
  require_relative 'modeling/validations'
data/lib/burner/modeling/attribute.rb CHANGED
@@ -10,7 +10,9 @@
10
10
  module Burner
11
11
  module Modeling
12
12
  # Defines a top-level key and the associated transformers for deriving the final value
13
- # to set the key to.
13
+ # to set the key to. The transformers that can be passed in can be any Realize::Transformers
14
+ # subclasses. For more information, see the Realize library at:
15
+ # https://github.com/bluemarblepayroll/realize
14
16
  class Attribute
15
17
  acts_as_hashable
16
18
 
data/lib/burner/modeling/key_mapping.rb ADDED
@@ -0,0 +1,29 @@
1
+ # frozen_string_literal: true
2
+
3
+ #
4
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
5
+ #
6
+ # This source code is licensed under the MIT license found in the
7
+ # LICENSE file in the root directory of this source tree.
8
+ #
9
+
10
+ module Burner
11
+ module Modeling
12
+ # Generic mapping from a key to another key.
13
+ class KeyMapping
14
+ acts_as_hashable
15
+
16
+ attr_reader :from, :to
17
+
18
+ def initialize(from:, to:)
19
+ raise ArgumentError, 'from is required' if from.to_s.empty?
20
+ raise ArgumentError, 'to is required' if to.to_s.empty?
21
+
22
+ @from = from.to_s
23
+ @to = to.to_s
24
+
25
+ freeze
26
+ end
27
+ end
28
+ end
29
+ end
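KeyMapping instances are hydrated through acts_as_hashable, which is how the Coalesce job above consumes its key_mappings option. An illustrative sketch (the mapping names are made up):

```ruby
require 'burner'

mappings = Burner::Modeling::KeyMapping.array(
  [
    { from: 'code',   to: 'status_code' },
    { from: 'status', to: 'status_name' }
  ]
)

mappings.first.from # => "code"
mappings.first.to   # => "status_code"
```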
data/lib/burner/modeling/validations/base.rb CHANGED
@@ -27,7 +27,7 @@ module Burner
27
27
  end
28
28
 
29
29
  def message
30
- @message.to_s.empty? ? "#{key}#{default_message}" : @message.to_s
30
+ @message.to_s.empty? ? "#{key} #{default_message}" : @message.to_s
31
31
  end
32
32
  end
33
33
  end
data/lib/burner/modeling/validations/blank.rb CHANGED
@@ -16,14 +16,18 @@ module Burner
16
16
  class Blank < Base
17
17
  acts_as_hashable
18
18
 
19
+ BLANK_RE = /\A[[:space:]]*\z/.freeze
20
+
19
21
  def valid?(object, resolver)
20
- resolver.get(object, key).to_s.empty?
22
+ value = resolver.get(object, key).to_s
23
+
24
+ value.empty? || BLANK_RE.match?(value)
21
25
  end
22
26
 
23
27
  private
24
28
 
25
29
  def default_message
26
- ' must be blank'
30
+ 'must be blank'
27
31
  end
28
32
  end
29
33
  end
data/lib/burner/modeling/validations/present.rb CHANGED
@@ -7,23 +7,23 @@
7
7
  # LICENSE file in the root directory of this source tree.
8
8
  #
9
9
 
10
- require_relative 'base'
10
+ require_relative 'blank'
11
11
 
12
12
  module Burner
13
13
  module Modeling
14
14
  class Validations
15
15
  # Check if a value is present. If it is blank (null or empty) then it is invalid.
16
- class Present < Base
16
+ class Present < Blank
17
17
  acts_as_hashable
18
18
 
19
19
  def valid?(object_value, resolver)
20
- !resolver.get(object_value, key).to_s.empty?
20
+ !super(object_value, resolver)
21
21
  end
22
22
 
23
23
  private
24
24
 
25
25
  def default_message
26
- ' is required'
26
+ 'is required'
27
27
  end
28
28
  end
29
29
  end
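Present now simply negates Blank, so whitespace-only strings fail the presence check. A hedged sketch, assuming the Base validation accepts a `key:` option (as suggested by the #key and #message usage above):

```ruby
require 'burner'

resolver = Objectable.resolver

blank   = Burner::Modeling::Validations::Blank.make(key: :notes)
present = Burner::Modeling::Validations::Present.make(key: :notes)

blank.valid?({ notes: '   ' }, resolver)   # => true, whitespace-only counts as blank
present.valid?({ notes: '   ' }, resolver) # => false, the inverse of Blank
present.valid?({ notes: 'hi' }, resolver)  # => true
```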
data/lib/burner/payload.rb CHANGED
@@ -8,16 +8,22 @@
8
8
  #
9
9
 
10
10
  module Burner
11
- # The input for all Job#perform methods. The main notion of this object is its "value"
12
- # attribute. This is dynamic and weak on purpose and is subject to whatever the Job#perform
13
- # methods decides it is. This definitely adds an order-of-magnitude complexity to this whole
14
- # library and lifecycle, but I am not sure there is any other way around it: trying to build
15
- # a generic, open-ended object pipeline to serve almost any use case.
11
+ # The input for all Job#perform methods. The main notion of this object is its 'registers'
12
+ # attribute. This registers attribute is a key-indifferent hash, accessible on Payload using
13
+ # the brackets setter and getter methods. This is dynamic and weak on purpose and is subject
14
+ # to whatever the Job#perform methods decides it is. This definitely adds an order-of-magnitude
15
+ # complexity to this whole library and lifecycle, but I am not sure there is any other way
16
+ # around it: trying to build a generic, open-ended processing pipeline to serve almost
17
+ # any use case.
16
18
  #
17
19
  # The side_effects attribute can also be utilized as a way for jobs to emit any data in a more
18
20
  # structured/additive manner. The initial use case for this was for Burner's core IO jobs to
19
21
  # report back the files it has written in a more structured data way (as opposed to simply
20
22
  # writing some information to the output.)
23
+ #
24
+ # The 'time' attribute is important in that it should for the replaying of pipelines and jobs.
25
+ # Instead of having job's utilizing Time.now, Date.today, etc... they should rather opt to
26
+ # use this value instead.
21
27
  class Payload
22
28
  attr_reader :params,
23
29
  :registers,
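Register access on the payload goes through the bracket methods; a brief hedged sketch ('customers' is a made-up register name):

```ruby
require 'burner'

payload = Burner::Payload.new

payload['customers'] = [{ 'id' => 1 }]
payload['customers'] # => [{ 'id' => 1 }]
payload['missing']   # => nil for an unset register (hence the `|| {}` guard in Coalesce above)
```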
data/lib/burner/version.rb CHANGED
@@ -8,5 +8,5 @@
8
8
  #
9
9
 
10
10
  module Burner
11
- VERSION = '1.0.0-alpha.9'
11
+ VERSION = '1.1.0-alpha'
12
12
  end
metadata CHANGED
@@ -1,14 +1,14 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: burner
3
3
  version: !ruby/object:Gem::Version
4
- version: 1.0.0.pre.alpha.9
4
+ version: 1.1.0.pre.alpha
5
5
  platform: ruby
6
6
  authors:
7
7
  - Matthew Ruggio
8
8
  autorequire:
9
9
  bindir: exe
10
10
  cert_chain: []
11
- date: 2020-10-27 00:00:00.000000000 Z
11
+ date: 2020-11-15 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
14
  name: acts_as_hashable
@@ -72,14 +72,14 @@ dependencies:
72
72
  requirements:
73
73
  - - "~>"
74
74
  - !ruby/object:Gem::Version
75
- version: '1.2'
75
+ version: '1.3'
76
76
  type: :runtime
77
77
  prerelease: false
78
78
  version_requirements: !ruby/object:Gem::Requirement
79
79
  requirements:
80
80
  - - "~>"
81
81
  - !ruby/object:Gem::Version
82
- version: '1.2'
82
+ version: '1.3'
83
83
  - !ruby/object:Gem::Dependency
84
84
  name: stringento
85
85
  requirement: !ruby/object:Gem::Requirement
@@ -225,7 +225,10 @@ files:
225
225
  - lib/burner/jobs.rb
226
226
  - lib/burner/library.rb
227
227
  - lib/burner/library/collection/arrays_to_objects.rb
228
+ - lib/burner/library/collection/coalesce.rb
229
+ - lib/burner/library/collection/concatenate.rb
228
230
  - lib/burner/library/collection/graph.rb
231
+ - lib/burner/library/collection/group.rb
229
232
  - lib/burner/library/collection/objects_to_arrays.rb
230
233
  - lib/burner/library/collection/shift.rb
231
234
  - lib/burner/library/collection/transform.rb
@@ -235,21 +238,23 @@ files:
235
238
  - lib/burner/library/deserialize/csv.rb
236
239
  - lib/burner/library/deserialize/json.rb
237
240
  - lib/burner/library/deserialize/yaml.rb
238
- - lib/burner/library/dummy.rb
239
241
  - lib/burner/library/echo.rb
240
242
  - lib/burner/library/io/base.rb
241
243
  - lib/burner/library/io/exist.rb
242
244
  - lib/burner/library/io/read.rb
243
245
  - lib/burner/library/io/write.rb
246
+ - lib/burner/library/nothing.rb
244
247
  - lib/burner/library/serialize/csv.rb
245
248
  - lib/burner/library/serialize/json.rb
246
249
  - lib/burner/library/serialize/yaml.rb
247
- - lib/burner/library/set_value.rb
248
250
  - lib/burner/library/sleep.rb
251
+ - lib/burner/library/value/copy.rb
252
+ - lib/burner/library/value/static.rb
249
253
  - lib/burner/modeling.rb
250
254
  - lib/burner/modeling/attribute.rb
251
255
  - lib/burner/modeling/attribute_renderer.rb
252
256
  - lib/burner/modeling/key_index_mapping.rb
257
+ - lib/burner/modeling/key_mapping.rb
253
258
  - lib/burner/modeling/validations.rb
254
259
  - lib/burner/modeling/validations/base.rb
255
260
  - lib/burner/modeling/validations/blank.rb
data/lib/burner/library/set_value.rb DELETED
@@ -1,32 +0,0 @@
1
- # frozen_string_literal: true
2
-
3
- #
4
- # Copyright (c) 2020-present, Blue Marble Payroll, LLC
5
- #
6
- # This source code is licensed under the MIT license found in the
7
- # LICENSE file in the root directory of this source tree.
8
- #
9
-
10
- module Burner
11
- module Library
12
- # Arbitrarily set value
13
- #
14
- # Expected Payload#value input: anything.
15
- # Payload#value output: whatever value was specified in this job.
16
- class SetValue < JobWithRegister
17
- attr_reader :value
18
-
19
- def initialize(name:, register: '', value: nil)
20
- super(name: name, register: register)
21
-
22
- @value = value
23
-
24
- freeze
25
- end
26
-
27
- def perform(_output, payload)
28
- payload[register] = value
29
- end
30
- end
31
- end
32
- end