burner 1.7.0 → 1.11.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 59c9f86d7e38af602d55db6f13473ef3943e7bb860bc531a49025c1cc42a4686
- data.tar.gz: 732a8127ec38ea3e4160331fa05046d9fec8e4057a48ecefb309c12821f67254
+ metadata.gz: f687d9818e7a090c0999f144096eef52c5ba7093a069a8d2ef784f5abdb859d7
+ data.tar.gz: cae3bb23c2671ab69263fe7359c93a47777c72fc387ab93a03006c38a2bc83cd
  SHA512:
- metadata.gz: 96c17f4c33e6aa21d9f6b5b075cee2d0fd456bef4b7b02c9401b9439925e2ae8b91c44ebae1e8bc1185d22215516b1f6d27ed6362760e14d8ddc069cceb8e22a
- data.tar.gz: 71ddb54a2f7c286adf6c457aeffbb897989fe2d0d3a1c95cfbb0e86bb69db4dcae5bc06190485e43a6655ba664d02d2fce90977d9d391597261ccf4944986aeb
+ metadata.gz: 3d1740b55ec51e3459092d99165f75197dcad75d2dcce28ae285ed9dc1e1fcbb2a85002f8abeb868e0bc388424e8e85056bab1b41a54046023d06fbd836f0598
+ data.tar.gz: bffbcf8567d03252bc04fb8e207674595b271635da2ef4cec963cd6ac329e4e58194167f0b474dc522785cbaef10a239b504e12825c56fb9d55042f4733b1840
data/.tool-versions ADDED
@@ -0,0 +1 @@
+ ruby 2.6.6
data/CHANGELOG.md CHANGED
@@ -1,3 +1,34 @@
+ # 1.11.0 (May 17th, 2021)
+
+ Added Jobs:
+
+ * b/collection/flat_file_parse
+
+ # 1.10.0 (May 17th, 2021)
+
+ Added Jobs:
+
+ * b/collection/only_keys
+ # 1.9.0 (April 13th, 2021)
+
+ Added Jobs:
+
+ * b/collection/pivot
+ # 1.8.0 (March 31st, 2021)
+
+ Added Jobs:
+
+ * b/param/from_register
+ * b/param/to_register
+
+ Other:
+
+ * Payload#param was added to access a param key's value.
+ * Payload#update_param was added to update a param key's value.
+
+ Internal Notes:
+
+ Payload#registers and Payload#params data stores have been internally consolidated while still maintaining the same public API surface area.

  # 1.7.0 (January 22nd, 2021)

data/README.md CHANGED
@@ -266,11 +266,15 @@ This library only ships with very basic, rudimentary jobs that are meant to just
  * **b/collection/arrays_to_objects** [mappings, register]: Convert an array of arrays to an array of objects.
  * **b/collection/coalesce** [grouped_register, insensitive, key_mappings, keys, register, separator]: Merge two datasets together based on the key values of one dataset (array) with a grouped dataset (hash). If insensitive (false by default) is true then each key's value will be converted/coerced to a lowercase string.
  * **b/collection/concatenate** [from_registers, to_register]: Concatenate each from_register's value and place the newly concatenated array into the to_register. Note: this does not do any deep copying and should be assumed it is shallow copying all objects.
+ * **b/collection/flat_file_parse** [keys_register, register, separator]: Map an array of arrays to an array of hashes. These keys can be realized at run-time as they are pulled from the first entry in the array. The `keys_register` will also be set to the keys used for mapping.
  * **b/collection/graph** [config, key, register]: Use [Hashematics](https://github.com/bluemarblepayroll/hashematics) to turn a flat array of objects into a deeply nested object tree.
  * **b/collection/group** [insensitive, keys, register, separator]: Take a register's value (an array of objects) and group the objects by the specified keys. If insensitive (false by default) is true then each key's value will be converted/coerced to a lowercase string.
  * **b/collection/nested_aggregate** [register, key_mappings, key, separator]: Traverse a set of objects, resolving key's value for each object, optionally copying down key_mappings to the child records, then merging all the inner records together.
  * **b/collection/number** [key, register, separator, start_at]: This job can iterate over a set of records and sequence them (set the specified key to a sequential index value.)
  * **b/collection/objects_to_arrays** [mappings, register]: Convert an array of objects to an array of arrays.
+ * **b/collection/only_keys** [keys_register, register, separator]: Limit an array of objects' keys to a specified set of keys. These keys can be realized at run-time as they are pulled from another register (`keys_register`) thus making it dynamic.
+ * **b/collection/pivot** [unique_keys, insensitive, other_keys, pivot_key, pivot_value_key, register, separator]:
+ Take an array of objects and pivot a key into multiple keys. It essentially takes all the values for a key and creates N number of keys (one per value.) Under the hood it uses HashMath's [Record](https://github.com/bluemarblepayroll/hash_math#record-the-hash-prototype) and [Table](https://github.com/bluemarblepayroll/hash_math#table-the-double-hash-hash-of-hashes) classes.
  * **b/collection/shift** [amount, register]: Remove the first N number of elements from an array.
  * **b/collection/transform** [attributes, exclusive, separator, register]: Iterate over all objects and transform each key per the attribute transformers specifications. If exclusive is set to false then the current object will be overridden/merged. Separator can also be set for key path support. This job uses [Realize](https://github.com/bluemarblepayroll/realize), which provides its own extendable value-transformation pipeline. If an attribute is not set with `explicit: true` then it will automatically start from the key's value from the record. If `explicit: true` is set, then it will start from the record itself.
  * **b/collection/unpivot** [pivot_set, register]: Take an array of objects and unpivot specific sets of keys into rows. Under the hood it uses [HashMath's Unpivot class](https://github.com/bluemarblepayroll/hash_math#unpivot-hash-key-coalescence-and-row-extrapolation).
@@ -297,6 +301,11 @@ By default all jobs will use the `Burner::Disks::Local` disk for its persistence
  * **b/io/row_reader** [data_key, disk, ignore_blank_path, ignore_file_not_found, path_key, register, separator]: Iterates over an array of objects, extracts a filepath from a key in each object, and attempts to load the file's content for each record. The file's content will be stored at the specified data_key. By default missing paths or files will be treated as hard errors. If you wish to ignore these then pass in true for ignore_blank_path and/or ignore_file_not_found.
  * **b/io/write** [binary, disk, path, register, supress_side_effect]: Write to a local file. The path parameter can be interpolated using `Payload#params`. If the contents are binary, pass in `binary: true` to open it up in binary+write mode. By default, written files are also logged as WrittenFile instances to the Payload#side_effects array. You can pass in supress_side_effect: true to disable this behavior.

+ #### Parameters
+
+ * **b/param/from_register** [param_key, register]: Copy the value of a register to a param key.
+ * **b/param/to_register** [param_key, register]: Copy the value of a param key to a register.
+
  #### Serialization

  * **b/serialize/csv** [byte_order_mark, register]: Take an array of arrays and create a CSV. You can optionally pre-pend a byte order mark, see Burner::Modeling::ByteOrderMark for acceptable options.
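The param and dynamic-key jobs above compose like any other Burner job. Below is a sketch of a pipeline configuration chaining `b/collection/flat_file_parse` into `b/collection/only_keys`; the register names `csv_rows` and `keys` are illustrative only, and the `jobs`/`steps` hash shape follows the pipeline convention documented earlier in this README:

```ruby
# Hypothetical pipeline configuration (data only, not executed here).
# flat_file_parse writes the discovered keys into the "keys" register,
# which only_keys then reads back to limit each object.
pipeline_config = {
  jobs: [
    {
      name: :parse,
      type: 'b/collection/flat_file_parse',
      register: 'csv_rows',
      keys_register: 'keys'
    },
    {
      name: :limit,
      type: 'b/collection/only_keys',
      register: 'csv_rows',
      keys_register: 'keys'
    }
  ],
  steps: %i[parse limit]
}
```

With the gem available, a configuration like this would be handed to `Burner::Pipeline.make(pipeline_config)` and executed against a payload.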
data/lib/burner/data.rb ADDED
@@ -0,0 +1,46 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   # Defines a key value pair data store per our library. It is basically a composite
+   # object around a hash with indifferent key typing.
+   class Data
+     extend Forwardable
+
+     def_delegators :internal_hash, :transform_keys
+
+     def initialize(hash = {})
+       @internal_hash = {}
+
+       (hash || {}).each { |k, v| self[k] = v }
+     end
+
+     def []=(key, value)
+       internal_hash[key.to_s] = value
+     end
+
+     def [](key)
+       internal_hash[key.to_s]
+     end
+
+     def to_h
+       internal_hash
+     end
+
+     def ==(other)
+       other.instance_of?(self.class) &&
+         to_h == other.to_h
+     end
+     alias eql? ==
+
+     private
+
+     attr_reader :internal_hash
+   end
+ end
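The indifferent-key behavior of the Data store above can be illustrated without the gem. This is a minimal, standalone re-implementation of just the string-coercing reader/writer shown in the diff (the real class also mixes in Forwardable and defines equality):

```ruby
# Minimal sketch of Burner::Data's indifferent key typing: every key is
# coerced to a string on both read and write.
class IndifferentData
  def initialize(hash = {})
    @internal_hash = {}

    (hash || {}).each { |k, v| self[k] = v }
  end

  def []=(key, value)
    @internal_hash[key.to_s] = value
  end

  def [](key)
    @internal_hash[key.to_s]
  end

  def to_h
    @internal_hash
  end
end

data = IndifferentData.new(id: 1)
data['name'] = 'frank'

# Symbol and string keys resolve to the same entry:
data[:id]    # => 1
data[:name]  # => "frank"
```

This is why the consolidated Payload can keep exposing plain hashes (`params`, `registers`) while accepting either symbol or string keys from callers.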
data/lib/burner/job.rb CHANGED
@@ -46,15 +46,9 @@ module Burner
      protected

      def job_string_template(expression, output, payload)
-       templatable_params = payload.params
-                                   .merge(__id: output.id)
-                                   .merge(templatable_register_values(payload))
+       templatable_params = payload.params_and_registers_hash.merge(__id: output.id)

        Util::StringTemplate.instance.evaluate(expression, templatable_params)
      end
-
-     def templatable_register_values(payload)
-       payload.registers.transform_keys { |key| "__#{key}_register" }
-     end
    end
  end
data/lib/burner/job_with_dynamic_keys.rb ADDED
@@ -0,0 +1,33 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ require_relative 'job_with_register'
+
+ module Burner
+   # Add on a keys_register attribute to the configuration for a job. This indicates
+   # that a job resolves the keys it operates on dynamically, at run-time, from
+   # another register.
+   class JobWithDynamicKeys < JobWithRegister
+     attr_reader :keys_register,
+                 :resolver
+
+     def initialize(
+       keys_register:,
+       name: '',
+       register: DEFAULT_REGISTER,
+       separator: BLANK
+     )
+       super(name: name, register: register)
+
+       @keys_register = keys_register.to_s
+       @resolver = Objectable.resolver(separator: separator)
+
+       freeze
+     end
+   end
+ end
data/lib/burner/job_with_register.rb CHANGED
@@ -13,6 +13,8 @@ module Burner
    # Add on a register attribute to the configuration for a job. This indicates that a job
    # either accesses and/or mutates the payload's registers.
    class JobWithRegister < Job
+     BLANK = ''
+
      attr_reader :register

      def initialize(name: '', register: DEFAULT_REGISTER)
data/lib/burner/jobs.rb CHANGED
@@ -25,11 +25,14 @@ module Burner
    register 'b/collection/arrays_to_objects', Library::Collection::ArraysToObjects
    register 'b/collection/coalesce', Library::Collection::Coalesce
    register 'b/collection/concatenate', Library::Collection::Concatenate
-   register 'b/collection/graph', Library::Collection::Graph
    register 'b/collection/group', Library::Collection::Group
+   register 'b/collection/flat_file_parse', Library::Collection::FlatFileParse
+   register 'b/collection/graph', Library::Collection::Graph
    register 'b/collection/nested_aggregate', Library::Collection::NestedAggregate
    register 'b/collection/number', Library::Collection::Number
    register 'b/collection/objects_to_arrays', Library::Collection::ObjectsToArrays
+   register 'b/collection/only_keys', Library::Collection::OnlyKeys
+   register 'b/collection/pivot', Library::Collection::Pivot
    register 'b/collection/shift', Library::Collection::Shift
    register 'b/collection/transform', Library::Collection::Transform
    register 'b/collection/unpivot', Library::Collection::Unpivot
@@ -48,6 +51,9 @@ module Burner
    register 'b/io/row_reader', Library::IO::RowReader
    register 'b/io/write', Library::IO::Write

+   register 'b/param/from_register', Library::Param::FromRegister
+   register 'b/param/to_register', Library::Param::ToRegister
+
    register 'b/serialize/csv', Library::Serialize::Csv
    register 'b/serialize/json', Library::Serialize::Json
    register 'b/serialize/yaml', Library::Serialize::Yaml
data/lib/burner/library.rb CHANGED
@@ -7,7 +7,7 @@
  # LICENSE file in the root directory of this source tree.
  #

- require_relative 'job_with_register'
+ require_relative 'job_with_dynamic_keys'

  require_relative 'library/echo'
  require_relative 'library/nothing'
@@ -16,11 +16,14 @@ require_relative 'library/sleep'
  require_relative 'library/collection/arrays_to_objects'
  require_relative 'library/collection/coalesce'
  require_relative 'library/collection/concatenate'
+ require_relative 'library/collection/flat_file_parse'
  require_relative 'library/collection/graph'
  require_relative 'library/collection/group'
  require_relative 'library/collection/nested_aggregate'
  require_relative 'library/collection/number'
  require_relative 'library/collection/objects_to_arrays'
+ require_relative 'library/collection/only_keys'
+ require_relative 'library/collection/pivot'
  require_relative 'library/collection/shift'
  require_relative 'library/collection/transform'
  require_relative 'library/collection/unpivot'
@@ -39,6 +42,9 @@ require_relative 'library/io/read'
  require_relative 'library/io/row_reader'
  require_relative 'library/io/write'

+ require_relative 'library/param/from_register'
+ require_relative 'library/param/to_register'
+
  require_relative 'library/serialize/csv'
  require_relative 'library/serialize/json'
  require_relative 'library/serialize/yaml'
data/lib/burner/library/collection/flat_file_parse.rb ADDED
@@ -0,0 +1,57 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Collection
+       # Convert an array of arrays to an array of objects. The difference between this
+       # job and ArraysToObjects is that this one does not take in mappings and instead
+       # will use the first entry as the array of keys to positionally map to.
+       #
+       # For example, if a register had this:
+       #
+       #   [['id', 'first', 'last'], [1, 'frank', 'rizzo']]
+       #
+       # Then executing this job would result in this:
+       #
+       #   [{ 'id' => 1, 'first' => 'frank', 'last' => 'rizzo' }]
+       #
+       # As a side-effect, the keys_register would result in this:
+       #
+       #   ['id', 'first', 'last']
+       #
+       # Expected Payload[register] input: array of arrays.
+       # Payload[register] output: An array of hashes.
+       class FlatFileParse < JobWithDynamicKeys
+         def perform(output, payload)
+           objects = array(payload[register])
+           keys = array(objects.shift)
+           count = objects.length
+
+           output.detail("Mapping #{count} array(s) to key(s): #{keys.join(', ')}")
+
+           payload[register] = objects.map { |object| transform(object, keys) }
+           payload[keys_register] = keys
+         end
+
+         private
+
+         def transform(object, keys)
+           object.each_with_object({}).with_index do |(value, memo), index|
+             next if index >= keys.length
+
+             key = keys[index]
+
+             resolver.set(memo, key, value)
+           end
+         end
+       end
+     end
+   end
+ end
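The positional mapping this job performs can be sketched in plain Ruby, without the gem's Objectable resolver (so nested key paths are ignored here): the first row supplies the keys, and each remaining row is zipped against them.

```ruby
# Sketch of FlatFileParse's mapping: first row = keys, remaining rows = values.
rows = [
  ['id', 'first', 'last'],
  [1, 'frank', 'rizzo'],
  [2, 'bozo', 'clown']
]

keys = rows.first
objects = rows.drop(1).map { |row| keys.zip(row).to_h }

objects
# => [{ 'id' => 1, 'first' => 'frank', 'last' => 'rizzo' },
#     { 'id' => 2, 'first' => 'bozo', 'last' => 'clown' }]
```

In the job itself, `keys` is additionally written to `keys_register` so downstream jobs (such as OnlyKeys) can reuse it.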
data/lib/burner/library/collection/only_keys.rb ADDED
@@ -0,0 +1,42 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Collection
+       # This job knows how to take an array of objects and limit it to a specific set of keys.
+       # The keys are pulled from another register which helps make it dynamic (you can load
+       # up this other register with a dynamic list of keys at run-time.)
+       #
+       # Expected Payload[register] input: array of objects.
+       # Payload[register] output: An array of objects.
+       class OnlyKeys < JobWithDynamicKeys
+         def perform(output, payload)
+           objects = array(payload[register])
+           count = objects.length
+           keys = array(payload[keys_register])
+
+           output.detail("Dynamically limiting #{count} object(s) with key(s): #{keys.join(', ')}")
+
+           payload[register] = objects.map { |object| transform(object, keys) }
+         end
+
+         private
+
+         def transform(object, keys)
+           keys.each_with_object({}) do |key, memo|
+             value = resolver.get(object, key)
+
+             resolver.set(memo, key, value)
+           end
+         end
+       end
+     end
+   end
+ end
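The transform above reduces each object to the supplied key list. A plain-Ruby sketch (again ignoring nested key paths, which the resolver would handle) shows the behavior, including that keys missing from an object come through as nil:

```ruby
# Sketch of OnlyKeys: build a new hash per object containing only the
# dynamically supplied keys; absent keys map to nil.
objects = [
  { 'id' => 1, 'first' => 'frank', 'ssn' => '123-45-6789' },
  { 'id' => 2, 'first' => 'bozo' }
]
keys = %w[id first ssn] # in the job, these come from keys_register at run-time

limited = objects.map do |object|
  keys.each_with_object({}) { |key, memo| memo[key] = object[key] }
end

limited
# => [{ 'id' => 1, 'first' => 'frank', 'ssn' => '123-45-6789' },
#     { 'id' => 2, 'first' => 'bozo', 'ssn' => nil }]
```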
data/lib/burner/library/collection/pivot.rb ADDED
@@ -0,0 +1,150 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Collection
+       # Take an array of objects and pivot a key into multiple keys. It essentially takes all
+       # the values for a key and creates N number of keys (one per value.)
+       # Under the hood it uses HashMath's Record and Table classes:
+       # https://github.com/bluemarblepayroll/hash_math
+       #
+       # An example of a normalized dataset that could be pivoted:
+       #
+       #   records = [
+       #     { patient_id: 1, key: :first_name, value: 'bozo' },
+       #     { patient_id: 1, key: :last_name, value: 'clown' },
+       #     { patient_id: 2, key: :first_name, value: 'frank' },
+       #     { patient_id: 2, key: :last_name, value: 'rizzo' },
+       #   ]
+       #
+       # Using the following job configuration:
+       #
+       #   config = {
+       #     unique_keys: :patient_id
+       #   }
+       #
+       # Once run through this job, it would set the register to:
+       #
+       #   records = [
+       #     { patient_id: 1, first_name: 'bozo', last_name: 'clown' },
+       #     { patient_id: 2, first_name: 'frank', last_name: 'rizzo' },
+       #   ]
+       #
+       # Expected Payload[register] input: array of objects.
+       # Payload[register] output: An array of objects.
+       class Pivot < JobWithRegister
+         DEFAULT_PIVOT_KEY = :key
+         DEFAULT_PIVOT_VALUE_KEY = :value
+
+         attr_reader :insensitive,
+                     :other_keys,
+                     :non_pivoted_keys,
+                     :pivot_key,
+                     :pivot_value_key,
+                     :resolver,
+                     :unique_keys
+
+         def initialize(
+           unique_keys:,
+           insensitive: false,
+           name: '',
+           other_keys: [],
+           pivot_key: DEFAULT_PIVOT_KEY,
+           pivot_value_key: DEFAULT_PIVOT_VALUE_KEY,
+           register: DEFAULT_REGISTER,
+           separator: ''
+         )
+           super(name: name, register: register)
+
+           @insensitive = insensitive || false
+           @pivot_key = pivot_key.to_s
+           @pivot_value_key = pivot_value_key.to_s
+           @resolver = Objectable.resolver(separator: separator)
+           @unique_keys = Array(unique_keys)
+           @other_keys = Array(other_keys)
+           @non_pivoted_keys = @unique_keys + @other_keys
+
+           freeze
+         end
+
+         def perform(output, payload)
+           objects = array(payload[register])
+           table = make_table(objects)
+
+           output.detail("Pivoting #{objects.length} object(s)")
+           output.detail("By key: #{pivot_key} and value: #{pivot_value_key}")
+
+           objects.each { |object| object_to_table(object, table) }
+
+           pivoted_objects = table.to_a.map(&:fields)
+
+           output.detail("Resulting dataset has #{pivoted_objects.length} object(s)")
+
+           payload[register] = pivoted_objects
+         end
+
+         private
+
+         def resolve_key(object)
+           key_to_use = resolver.get(object, pivot_key)
+
+           make_key(key_to_use)
+         end
+
+         def make_key(value)
+           insensitive ? value.to_s.downcase : value
+         end
+
+         def make_row_id(object)
+           unique_keys.map { |k| make_key(resolver.get(object, k)) }
+         end
+
+         def make_key_map(objects)
+           objects.each_with_object({}) do |object, key_map|
+             key = resolver.get(object, pivot_key)
+             unique_key = make_key(key)
+
+             key_map[unique_key] ||= Set.new
+
+             key_map[unique_key] << key
+           end
+         end
+
+         def make_record(objects)
+           key_map = make_key_map(objects)
+           keys = non_pivoted_keys + key_map.values.map(&:first)
+
+           HashMath::Record.new(keys)
+         end
+
+         def make_table(objects)
+           HashMath::Table.new(make_record(objects))
+         end
+
+         def object_to_table(object, table)
+           row_id = make_row_id(object)
+
+           non_pivoted_keys.each do |key|
+             value = resolver.get(object, key)
+
+             table.add(row_id, key, value)
+           end
+
+           key_to_use = resolve_key(object)
+           value_to_use = resolver.get(object, pivot_value_key)
+
+           table.add(row_id, key_to_use, value_to_use)
+
+           self
+         end
+       end
+     end
+   end
+ end
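The pivot in the class comment above can be sketched with plain Ruby to show the core idea, while the real job relies on HashMath's Record/Table for a stable key set and on the Objectable resolver for key paths:

```ruby
# Sketch of the pivot: group normalized rows by the unique key, then fold
# each group's key/value pairs back into a single wide object.
records = [
  { patient_id: 1, key: :first_name, value: 'bozo' },
  { patient_id: 1, key: :last_name, value: 'clown' },
  { patient_id: 2, key: :first_name, value: 'frank' },
  { patient_id: 2, key: :last_name, value: 'rizzo' }
]

pivoted = records.group_by { |r| r[:patient_id] }.map do |patient_id, rows|
  rows.each_with_object(patient_id: patient_id) do |row, memo|
    memo[row[:key]] = row[:value]
  end
end

pivoted
# => [{ patient_id: 1, first_name: 'bozo', last_name: 'clown' },
#     { patient_id: 2, first_name: 'frank', last_name: 'rizzo' }]
```

Unlike this sketch, the job also guarantees every pivoted key exists on every row (via the Record prototype), which matters when values are unevenly distributed across groups.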
data/lib/burner/library/collection/transform.rb CHANGED
@@ -21,8 +21,6 @@ module Burner
        # Expected Payload[register] input: array of objects.
        # Payload[register] output: An array of objects.
        class Transform < JobWithRegister
-         BLANK = ''
-
          attr_reader :attribute_renderers,
                      :exclusive,
                      :resolver
data/lib/burner/library/param/base.rb ADDED
@@ -0,0 +1,27 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ module Burner
+   module Library
+     module Param
+       # Common logic shared across Param job subclasses.
+       class Base < JobWithRegister
+         attr_reader :param_key
+
+         def initialize(name: BLANK, param_key: BLANK, register: DEFAULT_REGISTER)
+           super(name: name, register: register)
+
+           @param_key = param_key.to_s
+
+           freeze
+         end
+       end
+     end
+   end
+ end
data/lib/burner/library/param/from_register.rb ADDED
@@ -0,0 +1,30 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ require_relative 'base'
+
+ module Burner
+   module Library
+     module Param
+       # Copy a register's value into a param key. Generally speaking you should only be
+       # mutating registers, that way the params stay true to the passed in params for the
+       # pipeline. But this job is available in case a param needs to be updated.
+       #
+       # Expected Payload[register] input: anything.
+       # Payload.param(param_key) output: whatever value was specified in the register.
+       class FromRegister < Base
+         def perform(output, payload)
+           output.detail("Pushing value from register: #{register} to param: #{param_key}")
+
+           payload.update_param(param_key, payload[register])
+         end
+       end
+     end
+   end
+ end
data/lib/burner/library/param/to_register.rb ADDED
@@ -0,0 +1,28 @@
+ # frozen_string_literal: true
+
+ #
+ # Copyright (c) 2020-present, Blue Marble Payroll, LLC
+ #
+ # This source code is licensed under the MIT license found in the
+ # LICENSE file in the root directory of this source tree.
+ #
+
+ require_relative 'base'
+
+ module Burner
+   module Library
+     module Param
+       # Copy a param key's value into a register.
+       #
+       # Expected Payload.param(param_key) input: anything.
+       # Payload[register] output: whatever value was specified as the param_key's value.
+       class ToRegister < Base
+         def perform(output, payload)
+           output.detail("Pushing value to register: #{register} from param: #{param_key}")
+
+           payload[register] = payload.param(param_key)
+         end
+       end
+     end
+   end
+ end
data/lib/burner/payload.rb CHANGED
@@ -7,6 +7,8 @@
  # LICENSE file in the root directory of this source tree.
  #

+ require_relative 'data'
+
  module Burner
    # The input for all Job#perform methods. The main notion of this object is its 'registers'
    # attribute. This registers attribute is a key-indifferent hash, accessible on Payload using
@@ -24,10 +26,15 @@ module Burner
    # The 'time' attribute is important in that it should allow for the replaying of pipelines and jobs.
    # Instead of having jobs utilizing Time.now, Date.today, etc... they should rather opt to
    # use this value instead.
+   #
+   # The notion of params is somewhat conflated with registers here. Both are hashes and both
+   # serve as data stores. The difference is registers are really meant to be the shared-data
+   # repository across jobs, while params are, more or less, the input into the _pipeline_.
+   # It is debatable if mutation of the params should be allowed but the design decision was made
+   # early on to allow for params to be mutable albeit with registers being the preferred mutable
+   # store.
    class Payload
-     attr_reader :params,
-                 :registers,
-                 :side_effects,
+     attr_reader :side_effects,
                  :time

      def initialize(
@@ -36,12 +43,32 @@ module Burner
        side_effects: [],
        time: Time.now.utc
      )
-       @params = params || {}
-       @registers = {}
+       @params = Data.new(params)
+       @registers = Data.new(registers)
        @side_effects = side_effects || []
        @time = time || Time.now.utc
+     end

-       add_registers(registers)
+     # Backwards compatibility. This allows for control over the underlying data structure
+     # while still maintaining the same public API as before.
+     def params
+       @params.to_h
+     end
+
+     # Backwards compatibility. This allows for control over the underlying data structure
+     # while still maintaining the same public API as before.
+     def registers
+       @registers.to_h
+     end
+
+     # Law of Demeter: While params is an accessible hash, it is preferred that the
+     # public class methods are used.
+     def param(key)
+       @params[key]
+     end
+
+     def update_param(key, value)
+       tap { @params[key] = value }
      end

      # Add a side effect of a job. This helps to keep track of things jobs do outside of its
@@ -52,12 +79,12 @@ module Burner

      # Set a register's value.
      def []=(key, value)
-       set(key, value)
+       @registers[key] = value
      end

      # Retrieve a register's value.
      def [](key)
-       registers[key.to_s]
+       @registers[key]
      end

      # Set halt_pipeline to true. This will indicate to the pipeline to stop all
@@ -71,14 +98,11 @@ module Burner
        @halt_pipeline || false
      end

-     private
-
-     def set(key, value)
-       registers[key.to_s] = value
-     end
+     def params_and_registers_hash
+       registers_hash = @registers.transform_keys { |key| "__#{key}_register" }
+       params_hash = @params.to_h

-     def add_registers(registers)
-       (registers || {}).each { |k, v| set(k, v) }
+       params_hash.merge(registers_hash)
      end
    end
  end
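The `params_and_registers_hash` merge above (consumed by `Job#job_string_template`) can be illustrated with plain hashes. Each register is namespaced as `__<name>_register` before being merged over the params, so register names cannot collide with param keys in string templates:

```ruby
# Sketch of how params and registers are flattened for string templating.
params    = { 'client' => 'acme' }
registers = { 'default' => [1, 2, 3] }

registers_hash = registers.transform_keys { |key| "__#{key}_register" }
templatable = params.merge(registers_hash)

templatable
# => { 'client' => 'acme', '__default_register' => [1, 2, 3] }
```

This is also why the removed `templatable_register_values` helper in job.rb could be dropped: the same transformation now lives on Payload itself.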
data/lib/burner/pipeline.rb CHANGED
@@ -55,7 +55,7 @@ module Burner
      private

      def output_params(params, output)
-       if params.keys.any?
+       if params.any?
          output.write('Parameters:')
        else
          output.write('No parameters passed in.')
data/lib/burner/version.rb CHANGED
@@ -8,5 +8,5 @@
  #

  module Burner
-   VERSION = '1.7.0'
+   VERSION = '1.11.0'
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: burner
  version: !ruby/object:Gem::Version
-   version: 1.7.0
+   version: 1.11.0
  platform: ruby
  authors:
  - Matthew Ruggio
- autorequire:
+ autorequire:
  bindir: exe
  cert_chain: []
- date: 2021-01-22 00:00:00.000000000 Z
+ date: 2021-05-18 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: acts_as_hashable
@@ -220,6 +220,7 @@ files:
  - ".gitignore"
  - ".rubocop.yml"
  - ".ruby-version"
+ - ".tool-versions"
  - ".travis.yml"
  - CHANGELOG.md
  - CODE_OF_CONDUCT.md
@@ -234,21 +235,26 @@ files:
  - exe/burner
  - lib/burner.rb
  - lib/burner/cli.rb
+ - lib/burner/data.rb
  - lib/burner/disks.rb
  - lib/burner/disks/local.rb
  - lib/burner/job.rb
  - lib/burner/job_set.rb
+ - lib/burner/job_with_dynamic_keys.rb
  - lib/burner/job_with_register.rb
  - lib/burner/jobs.rb
  - lib/burner/library.rb
  - lib/burner/library/collection/arrays_to_objects.rb
  - lib/burner/library/collection/coalesce.rb
  - lib/burner/library/collection/concatenate.rb
+ - lib/burner/library/collection/flat_file_parse.rb
  - lib/burner/library/collection/graph.rb
  - lib/burner/library/collection/group.rb
  - lib/burner/library/collection/nested_aggregate.rb
  - lib/burner/library/collection/number.rb
  - lib/burner/library/collection/objects_to_arrays.rb
+ - lib/burner/library/collection/only_keys.rb
+ - lib/burner/library/collection/pivot.rb
  - lib/burner/library/collection/shift.rb
  - lib/burner/library/collection/transform.rb
  - lib/burner/library/collection/unpivot.rb
@@ -266,6 +272,9 @@ files:
  - lib/burner/library/io/row_reader.rb
  - lib/burner/library/io/write.rb
  - lib/burner/library/nothing.rb
+ - lib/burner/library/param/base.rb
+ - lib/burner/library/param/from_register.rb
+ - lib/burner/library/param/to_register.rb
  - lib/burner/library/serialize/csv.rb
  - lib/burner/library/serialize/json.rb
  - lib/burner/library/serialize/yaml.rb
@@ -304,7 +313,7 @@ metadata:
  documentation_uri: https://www.rubydoc.info/gems/burner
  homepage_uri: https://github.com/bluemarblepayroll/burner
  source_code_uri: https://github.com/bluemarblepayroll/burner
- post_install_message:
+ post_install_message:
  rdoc_options: []
  require_paths:
  - lib
@@ -320,7 +329,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubygems_version: 3.0.3
- signing_key:
+ signing_key:
  specification_version: 4
  summary: Declarative and extendable processing pipeline
  test_files: []