burner 1.0.0.pre.alpha.11 → 1.2.0

Files changed (36)
  1. checksums.yaml +4 -4
  2. data/CHANGELOG.md +21 -0
  3. data/README.md +17 -25
  4. data/burner.gemspec +1 -1
  5. data/lib/burner.rb +6 -0
  6. data/lib/burner/job_with_register.rb +1 -1
  7. data/lib/burner/jobs.rb +7 -1
  8. data/lib/burner/library.rb +7 -1
  9. data/lib/burner/library/collection/arrays_to_objects.rb +2 -2
  10. data/lib/burner/library/collection/coalesce.rb +73 -0
  11. data/lib/burner/library/collection/concatenate.rb +42 -0
  12. data/lib/burner/library/collection/graph.rb +6 -1
  13. data/lib/burner/library/collection/group.rb +68 -0
  14. data/lib/burner/library/collection/nested_aggregate.rb +67 -0
  15. data/lib/burner/library/collection/objects_to_arrays.rb +2 -2
  16. data/lib/burner/library/collection/shift.rb +1 -1
  17. data/lib/burner/library/collection/transform.rb +7 -1
  18. data/lib/burner/library/collection/unpivot.rb +5 -1
  19. data/lib/burner/library/collection/validate.rb +1 -1
  20. data/lib/burner/library/collection/values.rb +1 -1
  21. data/lib/burner/library/deserialize/yaml.rb +1 -1
  22. data/lib/burner/library/io/base.rb +1 -1
  23. data/lib/burner/library/io/read.rb +1 -1
  24. data/lib/burner/library/io/write.rb +1 -1
  25. data/lib/burner/library/value/copy.rb +39 -0
  26. data/lib/burner/library/value/static.rb +34 -0
  27. data/lib/burner/modeling.rb +1 -0
  28. data/lib/burner/modeling/attribute.rb +20 -2
  29. data/lib/burner/modeling/key_mapping.rb +29 -0
  30. data/lib/burner/modeling/validations/base.rb +1 -1
  31. data/lib/burner/modeling/validations/blank.rb +6 -2
  32. data/lib/burner/modeling/validations/present.rb +4 -4
  33. data/lib/burner/pipeline.rb +6 -3
  34. data/lib/burner/version.rb +1 -1
  35. metadata +13 -7
  36. data/lib/burner/library/set_value.rb +0 -32
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: adf7e5e1ea0c19d59b6edcbb1c5073e25f533bf32076df4ec7d9122edc852958
- data.tar.gz: 40009afdb93c5ee3971513971dd47a8f416acd04b33eb0c8e9720806cce4a515
+ metadata.gz: a2c3e9af2bbfbf80cd5c5884e05b1282f52871566103baf4ca1b599fc76af8ca
+ data.tar.gz: e7c2eb8a00086e1937a4d9e39383d38a60c5ebaa5efa1d67da4e6af9741261cd
  SHA512:
- metadata.gz: cf255e3021e975451354a3d9537b464ffc61843c59b780ceace9dc6be04e4e2499f5d874aa905608ed3c4b60c4989e6bb3c8cccd2116e8286ca045ea4202bea7
- data.tar.gz: 569a44eb9a5915d946b4de3038ead60094b170481583ea6974391eedaba209162df03883757b917b2e9c3ed3be6d8c2f80f64f43e6420d5fbc2d711eedc81a8c
+ metadata.gz: e885c7bf710f613323bbc3fc49162815632d7785aa6e753f3cd53bb04d2b53e74f92be549f75b6997d4dedce5c2dabe01fd032796a6d6e2fadfd6e84c633b168
+ data.tar.gz: e09d491b3fb7ef1b79932ac6094adedd74f806170a2a11aa1ff1eda838b6bf76cec1684dbfaba12d07a6e071062d432bcd5f3939f275830fc13fb71cabb39f8c
data/CHANGELOG.md CHANGED
@@ -1,3 +1,24 @@
+ # 1.2.0 (November 25th, 2020)
+
+ #### Enhancements:
+
+ * Allow for a pipeline to be configured with null steps. When null, just execute all jobs in positional order.
+ * Allow Collection::Transform job attributes to implicitly start from a resolve transformer. `explicit: true` can be passed in as an option in case the desire is to begin from the record and not a specific value.
+
+ #### Added Jobs:
+
+ * b/collection/nested_aggregate
+ # 1.1.0 (November 16, 2020)
+
+ Added Jobs:
+
+ * b/collection/coalesce
+ * b/collection/group
+
+ # 1.0.0 (November 5th, 2020)
+
+ Initial version publication.
+
  # 0.0.1
 
  Shell
data/README.md CHANGED
@@ -91,38 +91,20 @@ Some notes:
  * The job's ID can be accessed using the `__id` key.
  * The current job's payload value can be accessed using the `__value` key.
  * Jobs can be re-used (just like the output_id and output_value jobs).
+ * If steps is nil then all jobs will execute in their declared order.
 
  ### Capturing Feedback / Output
 
  By default, output will be emitted to `$stdout`. You can add or change listeners by passing in optional values into Pipeline#execute. For example, say we wanted to capture the output from our json-to-yaml example:
 
  ````ruby
- class StringOut
-   def initialize
-     @io = StringIO.new
-   end
-
-   def puts(msg)
-     tap { io.write("#{msg}\n") }
-   end
-
-   def read
-     io.rewind
-     io.read
-   end
-
-   private
-
-   attr_reader :io
- end
-
- string_out = StringOut.new
- output = Burner::Output.new(outs: string_out)
- payload = Burner::Payload.new(params: params)
+ io = StringIO.new
+ output = Burner::Output.new(outs: io)
+ payload = Burner::Payload.new(params: params)
 
  Burner::Pipeline.make(pipeline).execute(output: output, payload: payload)
 
- log = string_out.read
+ log = io.string
  ````
 
  The value of `log` should now look similar to:
@@ -234,10 +216,14 @@ This library only ships with very basic, rudimentary jobs that are meant to just
  #### Collection
 
  * **b/collection/arrays_to_objects** [mappings, register]: Convert an array of arrays to an array of objects.
+ * **b/collection/coalesce** [register, grouped_register, key_mappings, keys, separator]: Merge two datasets together based on the key values of one dataset (array) with a grouped dataset (hash).
+ * **b/collection/concatenate** [from_registers, to_register]: Concatenate each from_register's value and place the newly concatenated array into the to_register. Note: this does not do any deep copying and should be assumed it is shallow copying all objects.
  * **b/collection/graph** [config, key, register]: Use [Hashematics](https://github.com/bluemarblepayroll/hashematics) to turn a flat array of objects into a deeply nested object tree.
+ * **b/collection/group** [keys, register, separator]: Take a register's value (an array of objects) and group the objects by the specified keys.
+ * **b/collection/nested_aggregate** [register, key_mappings, key, separator]: Traverse a set of objects, resolving key's value for each object, optionally copying down key_mappings to the child records, then merging all the inner records together.
  * **b/collection/objects_to_arrays** [mappings, register]: Convert an array of objects to an array of arrays.
  * **b/collection/shift** [amount, register]: Remove the first N number of elements from an array.
- * **b/collection/transform** [attributes, exclusive, separator, register]: Iterate over all objects and transform each key per the attribute transformers specifications. If exclusive is set to false then the current object will be overridden/merged. Separator can also be set for key path support. This job uses [Realize](https://github.com/bluemarblepayroll/realize), which provides its own extendable value-transformation pipeline.
+ * **b/collection/transform** [attributes, exclusive, separator, register]: Iterate over all objects and transform each key per the attribute transformers specifications. If exclusive is set to false then the current object will be overridden/merged. Separator can also be set for key path support. This job uses [Realize](https://github.com/bluemarblepayroll/realize), which provides its own extendable value-transformation pipeline. If an attribute is not set with `explicit: true` then it will automatically start from the key's value from the record. If `explicit: true` is set, then it will start from the record itself.
  * **b/collection/unpivot** [pivot_set, register]: Take an array of objects and unpivot specific sets of keys into rows. Under the hood it uses [HashMath's Unpivot class](https://github.com/bluemarblepayroll/hash_math#unpivot-hash-key-coalescence-and-row-extrapolation).
  * **b/collection/validate** [invalid_register, join_char, message_key, register, separator, validations]: Take an array of objects, run it through each declared validator, and split the objects into two registers. The valid objects will be split into the current register while the invalid ones will go into the invalid_register as declared. Optional arguments, join_char and message_key, help determine the compiled error messages. The separator option can be utilized to use dot-notation for validating keys. See each validation's options by viewing their classes within the `lib/modeling/validations` directory.
  * **b/collection/values** [include_keys, register]: Take an array of objects and call `#values` on each object. If include_keys is true (it is false by default), then call `#keys` on the first object and inject that as a "header" object.
@@ -260,11 +246,15 @@ This library only ships with very basic, rudimentary jobs that are meant to just
  * **b/serialize/json** [register]: Convert value to JSON.
  * **b/serialize/yaml** [register]: Convert value to YAML.
 
+ #### Value
+
+ * **b/value/copy** [from_register, to_register]: Copy from_register's value into the to_register. Note: this does not do any deep copying and should be assumed it is shallow copying all objects.
+ * **b/value/static** [register, value]: Set the value to any arbitrary value.
+
  #### General
 
  * **b/echo** [message]: Write a message to the output. The message parameter can be interpolated using `Payload#params`.
  * **b/nothing** []: Do nothing.
- * **b/set** [register, value]: Set the value to any arbitrary value.
  * **b/sleep** [seconds]: Sleep the thread for X number of seconds.
 
  Notes:
@@ -273,6 +263,8 @@ Notes:
 
  ### Adding & Registering Jobs
 
+ Note: Jobs have to be registered with a type in the Burner::Jobs factory. All jobs that ship with this library are prefixed with `b/` in their type in order to provide a namespace for 'burner-specific' jobs vs. externally provided jobs.
+
  Where this library shines is when additional jobs are plugged in. Burner uses its `Burner::Jobs` class as its class-level registry built with [acts_as_hashable](https://github.com/bluemarblepayroll/acts_as_hashable)'s acts_as_hashable_factory directive.
 
  Let's say we would like to register a job to parse a CSV:
data/burner.gemspec CHANGED
@@ -32,7 +32,7 @@ Gem::Specification.new do |s|
  s.add_dependency('hashematics', '~>1.1')
  s.add_dependency('hash_math', '~>1.2')
  s.add_dependency('objectable', '~>1.0')
- s.add_dependency('realize', '~>1.2')
+ s.add_dependency('realize', '~>1.3')
  s.add_dependency('stringento', '~>2.1')
 
  s.add_development_dependency('guard-rspec', '~>4.7')
data/lib/burner.rb CHANGED
@@ -29,3 +29,9 @@ require_relative 'burner/util'
 
  # Main Entrypoint(s)
  require_relative 'burner/cli'
+
+ # Top-level namespace
+ module Burner
+   # All jobs that need to reference the main register should use this constant.
+   DEFAULT_REGISTER = 'default'
+ end
data/lib/burner/job_with_register.rb CHANGED
@@ -15,7 +15,7 @@ module Burner
  class JobWithRegister < Job
    attr_reader :register
 
-   def initialize(name:, register: '')
+   def initialize(name:, register: DEFAULT_REGISTER)
      super(name: name)
 
      @register = register.to_s
data/lib/burner/jobs.rb CHANGED
@@ -20,11 +20,14 @@ module Burner
    # string then the nothing job will be used.
    register 'b/echo', Library::Echo
    register 'b/nothing', '', Library::Nothing
-   register 'b/set_value', Library::SetValue
    register 'b/sleep', Library::Sleep
 
    register 'b/collection/arrays_to_objects', Library::Collection::ArraysToObjects
+   register 'b/collection/coalesce', Library::Collection::Coalesce
+   register 'b/collection/concatenate', Library::Collection::Concatenate
    register 'b/collection/graph', Library::Collection::Graph
+   register 'b/collection/group', Library::Collection::Group
+   register 'b/collection/nested_aggregate', Library::Collection::NestedAggregate
    register 'b/collection/objects_to_arrays', Library::Collection::ObjectsToArrays
    register 'b/collection/shift', Library::Collection::Shift
    register 'b/collection/transform', Library::Collection::Transform
@@ -43,5 +46,8 @@ module Burner
    register 'b/serialize/csv', Library::Serialize::Csv
    register 'b/serialize/json', Library::Serialize::Json
    register 'b/serialize/yaml', Library::Serialize::Yaml
+
+   register 'b/value/copy', Library::Value::Copy
+   register 'b/value/static', Library::Value::Static
  end
end
data/lib/burner/library.rb CHANGED
@@ -11,11 +11,14 @@ require_relative 'job_with_register'
 
  require_relative 'library/echo'
  require_relative 'library/nothing'
- require_relative 'library/set_value'
  require_relative 'library/sleep'
 
  require_relative 'library/collection/arrays_to_objects'
+ require_relative 'library/collection/coalesce'
+ require_relative 'library/collection/concatenate'
  require_relative 'library/collection/graph'
+ require_relative 'library/collection/group'
+ require_relative 'library/collection/nested_aggregate'
  require_relative 'library/collection/objects_to_arrays'
  require_relative 'library/collection/shift'
  require_relative 'library/collection/transform'
@@ -34,3 +37,6 @@ require_relative 'library/io/write'
  require_relative 'library/serialize/csv'
  require_relative 'library/serialize/json'
  require_relative 'library/serialize/yaml'
+
+ require_relative 'library/value/copy'
+ require_relative 'library/value/static'
data/lib/burner/library/collection/arrays_to_objects.rb CHANGED
@@ -23,7 +23,7 @@ module Burner
  #   jobs: [
  #     {
  #       name: 'set',
- #       type: 'b/set_value',
+ #       type: 'b/value/static',
  #       value: [
  #         [1, 'funky']
  #       ]
@@ -55,7 +55,7 @@ module Burner
  class ArraysToObjects < JobWithRegister
    attr_reader :mappings
 
-   def initialize(name:, mappings: [], register: '')
+   def initialize(name:, mappings: [], register: DEFAULT_REGISTER)
      super(name: name, register: register)
 
      @mappings = Modeling::KeyIndexMapping.array(mappings)
data/lib/burner/library/collection/coalesce.rb ADDED
@@ -0,0 +1,73 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Collection
+       # This is generally used right after the Group job has been executed on a separate
+       # dataset in a separate register. This job can match up specified values in its dataset
+       # with lookup values in another. If it finds a match then it will (shallow) copy over
+       # the values into the respective dataset.
+       #
+       # Expected Payload[register] input: array of objects.
+       # Payload[register] output: array of objects.
+       class Coalesce < JobWithRegister
+         attr_reader :grouped_register, :key_mappings, :keys, :resolver
+
+         def initialize(
+           name:,
+           grouped_register:,
+           key_mappings: [],
+           keys: [],
+           register: DEFAULT_REGISTER,
+           separator: ''
+         )
+           super(name: name, register: register)
+
+           @grouped_register = grouped_register.to_s
+           @key_mappings = Modeling::KeyMapping.array(key_mappings)
+           @keys = Array(keys)
+           @resolver = Objectable.resolver(separator: separator.to_s)
+
+           raise ArgumentError, 'at least one key is required' if @keys.empty?
+
+           freeze
+         end
+
+         def perform(output, payload)
+           payload[register] = array(payload[register])
+           count = payload[register].length
+
+           output.detail("Coalescing based on key(s): #{keys} for #{count} records(s)")
+
+           payload[register].each do |record|
+             key = make_key(record)
+             lookup = find_lookup(payload, key)
+
+             key_mappings.each do |key_mapping|
+               value = resolver.get(lookup, key_mapping.from)
+
+               resolver.set(record, key_mapping.to, value)
+             end
+           end
+         end
+
+         private
+
+         def find_lookup(payload, key)
+           (payload[grouped_register] || {})[key] || {}
+         end
+
+         def make_key(record)
+           keys.map { |key| resolver.get(record, key) }
+         end
+       end
+     end
+   end
+ end
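The lookup flow above can be sketched in plain Ruby, with literal hashes standing in for the payload registers and for the Objectable resolver and Modeling::KeyMapping objects the real job uses (the data and key names here are hypothetical):

```ruby
# Hypothetical data: one register already grouped by the Group job
# ([key] => record), another register holding the records to enrich.
grouped = { [1] => { 'first' => 'Bozo' }, [2] => { 'first' => 'Frank' } }
records = [{ 'patient_id' => 2 }, { 'patient_id' => 1 }]

keys         = ['patient_id']
key_mappings = [{ from: 'first', to: 'first_name' }] # plain hashes, not KeyMapping

records.each do |record|
  key    = keys.map { |k| record[k] } # same array shape Group used as its hash key
  lookup = grouped[key] || {}         # a miss falls back to an empty object

  key_mappings.each { |m| record[m[:to]] = lookup[m[:from]] }
end
```

Because the grouped hash is keyed by arrays of values, the lookup stays O(1) per record regardless of how many keys participate.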
data/lib/burner/library/collection/concatenate.rb ADDED
@@ -0,0 +1,42 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Collection
+       # Take the list of from_registers and concatenate each of their values together.
+       # Each from_value will be coerced into an array if it not an array.
+       #
+       # Expected Payload[from_register] input: array of objects.
+       # Payload[to_register] output: An array of objects.
+       class Concatenate < Job
+         attr_reader :from_registers, :to_register
+
+         def initialize(name:, from_registers: [], to_register: DEFAULT_REGISTER)
+           super(name: name)
+
+           @from_registers = Array(from_registers)
+           @to_register = to_register.to_s
+
+           freeze
+         end
+
+         def perform(output, payload)
+           output.detail("Concatenating registers: '#{from_registers}' to: '#{to_register}'")
+
+           payload[to_register] = from_registers.each_with_object([]) do |from_register, memo|
+             from_register_value = array(payload[from_register])
+
+             memo.concat(from_register_value)
+           end
+         end
+       end
+     end
+   end
+ end
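The register concatenation above reduces to a few lines of plain Ruby. Here a literal hash stands in for Burner::Payload and Kernel#Array stands in for the job's array helper (a simplification; the real helper handles hashes differently):

```ruby
payload        = { 'a' => [1, 2], 'b' => 3, 'c' => nil } # hypothetical registers
from_registers = %w[a b c]

# Non-array values are coerced into arrays and nil contributes nothing.
# The copy is shallow: the original objects are reused in the new array.
combined = from_registers.each_with_object([]) do |from_register, memo|
  memo.concat(Array(payload[from_register]))
end
```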
data/lib/burner/library/collection/graph.rb CHANGED
@@ -18,7 +18,12 @@ module Burner
  class Graph < JobWithRegister
    attr_reader :key, :groups
 
-   def initialize(name:, key:, config: Hashematics::Configuration.new, register: '')
+   def initialize(
+     name:,
+     key:,
+     config: Hashematics::Configuration.new,
+     register: DEFAULT_REGISTER
+   )
      super(name: name, register: register)
 
      raise ArgumentError, 'key is required' if key.to_s.empty?
data/lib/burner/library/collection/group.rb ADDED
@@ -0,0 +1,68 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Collection
+       # Take a register's value (an array of objects) and group the objects by the specified keys.
+       # It essentially creates a hash from an array. This is useful for creating a O(1) lookup
+       # which can then be used in conjunction with the Coalesce Job for another array of data.
+       # It is worth noting that the resulting hashes values are singular objects and not an array
+       # like Ruby's Enumerable#group_by method.
+       #
+       # An example of this specific job:
+       #
+       #   input: [{ id: 1, code: 'a' }, { id: 2, code: 'b' }]
+       #   keys: [:code]
+       #   output: { ['a'] => { id: 1, code: 'a' }, ['b'] => { id: 2, code: 'b' } }
+       #
+       # Expected Payload[register] input: array of objects.
+       # Payload[register] output: hash.
+       class Group < JobWithRegister
+         attr_reader :keys, :resolver
+
+         def initialize(
+           name:,
+           keys: [],
+           register: DEFAULT_REGISTER,
+           separator: ''
+         )
+           super(name: name, register: register)
+
+           @keys = Array(keys)
+           @resolver = Objectable.resolver(separator: separator.to_s)
+
+           raise ArgumentError, 'at least one key is required' if @keys.empty?
+
+           freeze
+         end
+
+         def perform(output, payload)
+           payload[register] = array(payload[register])
+           count = payload[register].length
+
+           output.detail("Grouping based on key(s): #{keys} for #{count} records(s)")
+
+           grouped_records = payload[register].each_with_object({}) do |record, memo|
+             key = make_key(record)
+             memo[key] = record
+           end
+
+           payload[register] = grouped_records
+         end
+
+         private
+
+         def make_key(record)
+           keys.map { |key| resolver.get(record, key) }
+         end
+       end
+     end
+   end
+ end
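The hash-building at the heart of the Group job can be sketched in plain Ruby (hypothetical records; the real job resolves keys through Objectable). Note how, unlike Enumerable#group_by, each hash value is a single record rather than an array:

```ruby
records = [{ id: 1, code: 'a' }, { id: 2, code: 'b' }] # hypothetical input
keys    = [:code]

grouped = records.each_with_object({}) do |record, memo|
  # Key is an array of the resolved key values; the last record wins
  # if two records share the same key.
  memo[keys.map { |k| record[k] }] = record
end
```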
data/lib/burner/library/collection/nested_aggregate.rb ADDED
@@ -0,0 +1,67 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Collection
+       # Iterate over a collection of objects, calling key on each object, then aggregating the
+       # returns of key together into one array. This new derived array will be set as the value
+       # for the payload's register. Leverage the key_mappings option to optionally copy down
+       # keys and values from outer to inner records. This job is particularly useful
+       # if you have nested arrays but wish to deal with each level/depth in the aggregate.
+       #
+       # Expected Payload[register] input: array of objects.
+       # Payload[register] output: array of objects.
+       class NestedAggregate < JobWithRegister
+         attr_reader :key, :key_mappings, :resolver
+
+         def initialize(name:, key:, key_mappings: [], register: DEFAULT_REGISTER, separator: '')
+           super(name: name, register: register)
+
+           raise ArgumentError, 'key is required' if key.to_s.empty?
+
+           @key = key.to_s
+           @key_mappings = Modeling::KeyMapping.array(key_mappings)
+           @resolver = Objectable.resolver(separator: separator.to_s)
+
+           freeze
+         end
+
+         def perform(output, payload)
+           records = array(payload[register])
+           count = records.length
+
+           output.detail("Aggregating on key: #{key} for #{count} records(s)")
+
+           # Outer loop on parent records
+           payload[register] = records.each_with_object([]) do |record, memo|
+             inner_records = resolver.get(record, key)
+
+             # Inner loop on child records
+             array(inner_records).each do |inner_record|
+               memo << copy_key_mappings(record, inner_record)
+             end
+           end
+         end
+
+         private
+
+         def copy_key_mappings(source_record, destination_record)
+           key_mappings.each do |key_mapping|
+             value = resolver.get(source_record, key_mapping.from)
+
+             resolver.set(destination_record, key_mapping.to, value)
+           end
+
+           destination_record
+         end
+       end
+     end
+   end
+ end
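The flatten-while-copying-down behavior can be illustrated with plain Ruby and hypothetical nested data (hashes stand in for KeyMapping objects and hash access stands in for the Objectable resolver):

```ruby
# Hypothetical nested data: orders each carrying an array of line items.
orders = [
  { 'order_id' => 1, 'lines' => [{ 'sku' => 'a' }, { 'sku' => 'b' }] },
  { 'order_id' => 2, 'lines' => [{ 'sku' => 'c' }] }
]
key          = 'lines'
key_mappings = [{ from: 'order_id', to: 'order_id' }]

flattened = orders.each_with_object([]) do |record, memo|
  Array(record[key]).each do |inner_record|
    # Copy parent values down onto each child before aggregating.
    key_mappings.each { |m| inner_record[m[:to]] = record[m[:from]] }
    memo << inner_record
  end
end
```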
data/lib/burner/library/collection/objects_to_arrays.rb CHANGED
@@ -24,7 +24,7 @@ module Burner
  #   jobs: [
  #     {
  #       name: 'set',
- #       type: 'b/set_value',
+ #       type: 'b/value/static',
  #       value: [
  #         { 'id' => 1, 'name' => 'funky' }
  #       ],
@@ -58,7 +58,7 @@ module Burner
  # nested hashes then set separator to '.'. For more information, see the underlying
  # library that supports this dot-notation concept:
  # https://github.com/bluemarblepayroll/objectable
- def initialize(name:, mappings: [], register: '', separator: '')
+ def initialize(name:, mappings: [], register: DEFAULT_REGISTER, separator: '')
    super(name: name, register: register)
 
    @mappings = Modeling::KeyIndexMapping.array(mappings)
data/lib/burner/library/collection/shift.rb CHANGED
@@ -23,7 +23,7 @@ module Burner
 
    attr_reader :amount
 
-   def initialize(name:, amount: DEFAULT_AMOUNT, register: '')
+   def initialize(name:, amount: DEFAULT_AMOUNT, register: DEFAULT_REGISTER)
      super(name: name, register: register)
 
      @amount = amount.to_i
data/lib/burner/library/collection/transform.rb CHANGED
@@ -27,7 +27,13 @@ module Burner
                :exclusive,
                :resolver
 
-   def initialize(name:, attributes: [], exclusive: false, register: '', separator: BLANK)
+   def initialize(
+     name:,
+     attributes: [],
+     exclusive: false,
+     register: DEFAULT_REGISTER,
+     separator: BLANK
+   )
      super(name: name, register: register)
 
      @resolver = Objectable.resolver(separator: separator)
data/lib/burner/library/collection/unpivot.rb CHANGED
@@ -19,7 +19,11 @@ module Burner
  class Unpivot < JobWithRegister
    attr_reader :unpivot
 
-   def initialize(name:, pivot_set: HashMath::Unpivot::PivotSet.new, register: '')
+   def initialize(
+     name:,
+     pivot_set: HashMath::Unpivot::PivotSet.new,
+     register: DEFAULT_REGISTER
+   )
      super(name: name, register: register)
 
      @unpivot = HashMath::Unpivot.new(pivot_set)
data/lib/burner/library/collection/validate.rb CHANGED
@@ -33,7 +33,7 @@ module Burner
    invalid_register: DEFAULT_INVALID_REGISTER,
    join_char: DEFAULT_JOIN_CHAR,
    message_key: DEFAULT_MESSAGE_KEY,
-   register: '',
+   register: DEFAULT_REGISTER,
    separator: '',
    validations: []
  )
data/lib/burner/library/collection/values.rb CHANGED
@@ -19,7 +19,7 @@ module Burner
  class Values < JobWithRegister
    attr_reader :include_keys
 
-   def initialize(name:, include_keys: false, register: '')
+   def initialize(name:, include_keys: false, register: DEFAULT_REGISTER)
      super(name: name, register: register)
 
      @include_keys = include_keys || false
data/lib/burner/library/deserialize/yaml.rb CHANGED
@@ -20,7 +20,7 @@ module Burner
  class Yaml < JobWithRegister
    attr_reader :safe
 
-   def initialize(name:, register: '', safe: true)
+   def initialize(name:, register: DEFAULT_REGISTER, safe: true)
      super(name: name, register: register)
 
      @safe = safe
data/lib/burner/library/io/base.rb CHANGED
@@ -14,7 +14,7 @@ module Burner
  class Base < JobWithRegister
    attr_reader :path
 
-   def initialize(name:, path:, register: '')
+   def initialize(name:, path:, register: DEFAULT_REGISTER)
      super(name: name, register: register)
 
      raise ArgumentError, 'path is required' if path.to_s.empty?
data/lib/burner/library/io/read.rb CHANGED
@@ -19,7 +19,7 @@ module Burner
  class Read < Base
    attr_reader :binary
 
-   def initialize(name:, path:, binary: false, register: '')
+   def initialize(name:, path:, binary: false, register: DEFAULT_REGISTER)
      super(name: name, path: path, register: register)
 
      @binary = binary || false
data/lib/burner/library/io/write.rb CHANGED
@@ -19,7 +19,7 @@ module Burner
  class Write < Base
    attr_reader :binary
 
-   def initialize(name:, path:, binary: false, register: '')
+   def initialize(name:, path:, binary: false, register: DEFAULT_REGISTER)
      super(name: name, path: path, register: register)
 
      @binary = binary || false
data/lib/burner/library/value/copy.rb ADDED
@@ -0,0 +1,39 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Value
+       # Copy one value in a register to another. Note that this does *not* perform any type of
+       # deep copy, it simply points one register's value to another. If you decide to later mutate
+       # one register then you may mutate the other.
+       #
+       # Expected Payload[from_register] input: anything.
+       # Payload[to_register] output: whatever value was specified in the from_register.
+       class Copy < Job
+         attr_reader :from_register, :to_register
+
+         def initialize(name:, to_register: DEFAULT_REGISTER, from_register: DEFAULT_REGISTER)
+           super(name: name)
+
+           @from_register = from_register.to_s
+           @to_register = to_register.to_s
+
+           freeze
+         end
+
+         def perform(output, payload)
+           output.detail("Copying register: '#{from_register}' to: '#{to_register}'")
+
+           payload[to_register] = payload[from_register]
+         end
+       end
+     end
+   end
+ end
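The shallow-copy caveat in the comment is worth seeing concretely; with a plain hash standing in for the payload registers, the copy is just a second reference to the same object:

```ruby
payload = { 'from' => [{ 'k' => 1 }] } # hypothetical registers as a plain hash

payload['to'] = payload['from'] # what Copy#perform effectively does

payload['to'][0]['k'] = 2 # mutation is visible through both registers
```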
data/lib/burner/library/value/static.rb ADDED
@@ -0,0 +1,34 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Value
+       # Arbitrarily set the value of a register.
+       #
+       # Expected Payload[register] input: anything.
+       # Payload[register] output: whatever value was specified in this job.
+       class Static < JobWithRegister
+         attr_reader :value
+
+         def initialize(name:, register: DEFAULT_REGISTER, value: nil)
+           super(name: name, register: register)
+
+           @value = value
+
+           freeze
+         end
+
+         def perform(_output, payload)
+           payload[register] = value
+         end
+       end
+     end
+   end
+ end
data/lib/burner/modeling.rb CHANGED
@@ -10,4 +10,5 @@
  require_relative 'modeling/attribute'
  require_relative 'modeling/attribute_renderer'
  require_relative 'modeling/key_index_mapping'
+ require_relative 'modeling/key_mapping'
  require_relative 'modeling/validations'
data/lib/burner/modeling/attribute.rb CHANGED
@@ -13,19 +13,37 @@ module Burner
    # to set the key to. The transformers that can be passed in can be any Realize::Transformers
    # subclasses. For more information, see the Realize library at:
    # https://github.com/bluemarblepayroll/realize
+   #
+   # Note that if explicit: true is set then no transformers will be automatically injected.
+   # If explicit is not true (default) then it will have a resolve job automatically injected
+   # in the beginning of the pipeline. This is the observed default behavior, with the
+   # exception having to be initially cross-mapped using a custom resolve transformation.
    class Attribute
      acts_as_hashable
 
+     RESOLVE_TYPE = 'r/value/resolve'
+
      attr_reader :key, :transformers
 
-     def initialize(key:, transformers: [])
+     def initialize(key:, explicit: false, transformers: [])
        raise ArgumentError, 'key is required' if key.to_s.empty?
 
        @key = key.to_s
-       @transformers = Realize::Transformers.array(transformers)
+       @transformers = base_transformers(explicit) + Realize::Transformers.array(transformers)
 
        freeze
      end
+
+     private
+
+     # When explicit, this will return an empty array.
+     # When not explicit, this will return an array with a basic transformer that simply
+     # gets the key's value. This establishes a good majority base case.
+     def base_transformers(explicit)
+       return [] if explicit
+
+       [Realize::Transformers.make(type: RESOLVE_TYPE, key: key)]
+     end
    end
  end
end
data/lib/burner/modeling/key_mapping.rb ADDED
@@ -0,0 +1,29 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Modeling
+     # Generic mapping from a key to another key. The argument 'to' is optional
+     # and if it is blank then the 'from' value will be used for the 'to' as well.
+     class KeyMapping
+       acts_as_hashable
+
+       attr_reader :from, :to
+
+       def initialize(from:, to: '')
+         raise ArgumentError, 'from is required' if from.to_s.empty?
+
+         @from = from.to_s
+         @to = to.to_s.empty? ? @from : to.to_s
+
+         freeze
+       end
+     end
+   end
+ end
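The 'to defaults to from' rule can be exercised with a standalone re-implementation (a sketch for illustration only; it omits the real class's acts_as_hashable and freeze behavior):

```ruby
# Minimal stand-in for Burner::Modeling::KeyMapping's constructor logic.
class MappingSketch
  attr_reader :from, :to

  def initialize(from:, to: '')
    raise ArgumentError, 'from is required' if from.to_s.empty?

    @from = from.to_s
    @to   = to.to_s.empty? ? @from : to.to_s # blank 'to' mirrors 'from'
  end
end
```

So `MappingSketch.new(from: 'id')` maps `id` onto itself, while `MappingSketch.new(from: 'id', to: 'patient_id')` renames it.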
data/lib/burner/modeling/validations/base.rb CHANGED
@@ -27,7 +27,7 @@ module Burner
          end
 
          def message
-           @message.to_s.empty? ? "#{key}#{default_message}" : @message.to_s
+           @message.to_s.empty? ? "#{key} #{default_message}" : @message.to_s
          end
        end
      end
data/lib/burner/modeling/validations/blank.rb CHANGED
@@ -16,14 +16,18 @@ module Burner
        class Blank < Base
          acts_as_hashable
 
+         BLANK_RE = /\A[[:space:]]*\z/.freeze
+
          def valid?(object, resolver)
-           resolver.get(object, key).to_s.empty?
+           value = resolver.get(object, key).to_s
+
+           value.empty? || BLANK_RE.match?(value)
          end
 
          private
 
          def default_message
-           ' must be blank'
+           'must be blank'
          end
        end
      end
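The widened blank check can be exercised standalone. Where the old `to_s.empty?` test only caught nil and empty strings, the new regex also treats whitespace-only values as blank:

```ruby
BLANK_RE = /\A[[:space:]]*\z/

# Mirrors the updated valid? logic with a lambda in place of the resolver.
blank = ->(value) { value.to_s.empty? || BLANK_RE.match?(value.to_s) }
```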
@@ -7,23 +7,23 @@
 # LICENSE file in the root directory of this source tree.
 #
 
-require_relative 'base'
+require_relative 'blank'
 
 module Burner
   module Modeling
     class Validations
       # Check if a value is present. If it is blank (null or empty) then it is invalid.
-      class Present < Base
+      class Present < Blank
         acts_as_hashable
 
         def valid?(object_value, resolver)
-          !resolver.get(object_value, key).to_s.empty?
+          !super(object_value, resolver)
         end
 
         private
 
         def default_message
-          ' is required'
+          'is required'
         end
       end
     end
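With these two changes, `Blank` now treats whitespace-only strings as blank (via `BLANK_RE`), and `Present` is simply its negation. A sketch of just the check, with the resolver/key plumbing replaced by plain values for illustration:

```ruby
# Sketch of the blank/present check introduced above. BLANK_RE is copied
# from the diff; blank? and present? stand in for the validators' valid?
# methods (resolver and key lookup omitted).
BLANK_RE = /\A[[:space:]]*\z/.freeze

def blank?(value)
  value = value.to_s

  value.empty? || BLANK_RE.match?(value)
end

def present?(value)
  !blank?(value)
end
```

Note that whitespace-only values such as `"   \t\n"` were previously considered present; after this change they are blank.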
@@ -14,7 +14,8 @@ require_relative 'step'
 
 module Burner
   # The root package. A Pipeline contains the job configurations along with the steps. The steps
-  # referens jobs and tell you the order of the jobs to run.
+  # reference jobs and tell you the order of the jobs to run. If steps is nil then all jobs
+  # will execute in their declared order.
   class Pipeline
     acts_as_hashable
 
@@ -23,14 +24,16 @@ module Burner
 
     attr_reader :steps
 
-    def initialize(jobs: [], steps: [])
+    def initialize(jobs: [], steps: nil)
       jobs = Jobs.array(jobs)
 
       assert_unique_job_names(jobs)
 
       jobs_by_name = jobs.map { |job| [job.name, job] }.to_h
 
-      @steps = Array(steps).map do |step_name|
+      step_names = steps ? Array(steps) : jobs_by_name.keys
+
+      @steps = step_names.map do |step_name|
         job = jobs_by_name[step_name.to_s]
 
         raise JobNotFoundError, "#{step_name} was not declared as a job" unless job
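The key behavioral change here: passing `steps: nil` now runs every declared job in declaration order, while an explicit array still selects and orders jobs by name. A sketch of just the step-resolution logic, with plain strings standing in for job instances and `ArgumentError` standing in for `JobNotFoundError`:

```ruby
# Sketch of the step-resolution logic above: when steps is nil, every
# declared job name is used in declaration order; otherwise each named
# step must match a declared job.
def resolve_steps(job_names, steps = nil)
  jobs_by_name = job_names.map { |n| [n.to_s, n.to_s] }.to_h

  step_names = steps ? Array(steps) : jobs_by_name.keys

  step_names.map do |step_name|
    job = jobs_by_name[step_name.to_s]

    raise ArgumentError, "#{step_name} was not declared as a job" unless job

    job
  end
end
```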
@@ -8,5 +8,5 @@
 #
 
 module Burner
-  VERSION = '1.0.0-alpha.11'
+  VERSION = '1.2.0'
 end
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: burner
 version: !ruby/object:Gem::Version
-  version: 1.0.0.pre.alpha.11
+  version: 1.2.0
 platform: ruby
 authors:
 - Matthew Ruggio
 autorequire:
 bindir: exe
 cert_chain: []
-date: 2020-10-27 00:00:00.000000000 Z
+date: 2020-11-25 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: acts_as_hashable
@@ -72,14 +72,14 @@ dependencies:
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '1.2'
+        version: '1.3'
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '1.2'
+        version: '1.3'
 - !ruby/object:Gem::Dependency
   name: stringento
   requirement: !ruby/object:Gem::Requirement
@@ -225,7 +225,11 @@ files:
 - lib/burner/jobs.rb
 - lib/burner/library.rb
 - lib/burner/library/collection/arrays_to_objects.rb
+- lib/burner/library/collection/coalesce.rb
+- lib/burner/library/collection/concatenate.rb
 - lib/burner/library/collection/graph.rb
+- lib/burner/library/collection/group.rb
+- lib/burner/library/collection/nested_aggregate.rb
 - lib/burner/library/collection/objects_to_arrays.rb
 - lib/burner/library/collection/shift.rb
 - lib/burner/library/collection/transform.rb
@@ -244,12 +248,14 @@ files:
 - lib/burner/library/serialize/csv.rb
 - lib/burner/library/serialize/json.rb
 - lib/burner/library/serialize/yaml.rb
-- lib/burner/library/set_value.rb
 - lib/burner/library/sleep.rb
+- lib/burner/library/value/copy.rb
+- lib/burner/library/value/static.rb
 - lib/burner/modeling.rb
 - lib/burner/modeling/attribute.rb
 - lib/burner/modeling/attribute_renderer.rb
 - lib/burner/modeling/key_index_mapping.rb
+- lib/burner/modeling/key_mapping.rb
 - lib/burner/modeling/validations.rb
 - lib/burner/modeling/validations/base.rb
 - lib/burner/modeling/validations/blank.rb
@@ -284,9 +290,9 @@ required_ruby_version: !ruby/object:Gem::Requirement
       version: '2.5'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - ">"
+  - - ">="
     - !ruby/object:Gem::Version
-      version: 1.3.1
+      version: '0'
 requirements: []
 rubygems_version: 3.0.3
 signing_key:
@@ -1,32 +0,0 @@
-# frozen_string_literal: true
-
-#
-# Copyright (c) 2020-present, Blue Marble Payroll, LLC
-#
-# This source code is licensed under the MIT license found in the
-# LICENSE file in the root directory of this source tree.
-#
-
-module Burner
-  module Library
-    # Arbitrarily set value
-    #
-    # Expected Payload[register] input: anything.
-    # Payload[register] output: whatever value was specified in this job.
-    class SetValue < JobWithRegister
-      attr_reader :value
-
-      def initialize(name:, register: '', value: nil)
-        super(name: name, register: register)
-
-        @value = value
-
-        freeze
-      end
-
-      def perform(_output, payload)
-        payload[register] = value
-      end
-    end
-  end
-end
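The removed `SetValue` job did nothing more than write a fixed value into a payload register (the files list above shows `value/copy.rb` and `value/static.rb` arriving in its place). Its core behavior can be sketched with the payload reduced to a bare Hash of registers; the helper name `set_register` is hypothetical, for illustration only:

```ruby
# Hypothetical sketch of the register write performed by the removed
# SetValue job: payload[register] = value, with Payload reduced to a Hash.
def set_register(payload, register, value)
  payload[register.to_s] = value
  payload
end

payload = {}
set_register(payload, :greeting, 'hello')
```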