burner 1.7.0.pre.alpha → 1.10.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/.tool-versions +1 -0
- data/CHANGELOG.md +30 -1
- data/README.md +8 -0
- data/lib/burner/data.rb +46 -0
- data/lib/burner/job.rb +1 -7
- data/lib/burner/jobs.rb +5 -0
- data/lib/burner/library.rb +5 -0
- data/lib/burner/library/collection/only_keys.rb +61 -0
- data/lib/burner/library/collection/pivot.rb +150 -0
- data/lib/burner/library/param/base.rb +29 -0
- data/lib/burner/library/param/from_register.rb +30 -0
- data/lib/burner/library/param/to_register.rb +28 -0
- data/lib/burner/payload.rb +39 -15
- data/lib/burner/pipeline.rb +1 -1
- data/lib/burner/version.rb +1 -1
- metadata +14 -7
checksums.yaml
CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: b775833cacea64e30e78943c176e3317e3f72510cb3653066a3ce16c70dc1f3d
+  data.tar.gz: 51da8ebaac4ea928c441898e9c7b46052efa9e5c95f454e6933e8a036093709b
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 9276607a43419b630a7804ce5c83b53844dc0a42abddfd218e4a3f9d62aaaf608c24d821a980f3c618d058de0fbb2fb3abb6b8d1195f165d5f773121727ca1b1
+  data.tar.gz: 3c6ff8a7428940aaff2bb1a46f1bb7dfb4ac077e700ffbae7b006de8e0458c0d55ff719b846fd4daad6f5aa62f8b826c6abd96e185ba7d6dc6f4799b60d59291
data/.tool-versions
ADDED

@@ -0,0 +1 @@
+ruby 2.6.6
data/CHANGELOG.md
CHANGED

@@ -1,5 +1,30 @@
+# 1.10.0 (May 17th, 2021)
 
-
+Added Jobs:
+
+* b/collection/only_keys
+# 1.9.0 (April 13th, 2021)
+
+Added Jobs:
+
+* b/collection/pivot
+# 1.8.0 (March 31st, 2021)
+
+Added Jobs:
+
+* b/param/from_register
+* b/param/to_register
+
+Other:
+
+* Payload#param was added to access a param key's value.
+* Payload#update_param was added to update a param key's value.
+
+Internal Notes:
+
+Payload#register and Payload#params data stores have been internally consolidated while still maintaining the same public API surface area.
+
+# 1.7.0 (January 22nd, 2021)
 
 Added Jobs:
 

@@ -11,6 +36,10 @@ Enhanced Jobs:
 
 * b/collection/coalesce and b/collection/group now support the notion of case and type-insensitivity (insensitive option).
 
+Changes:
+
+* Job names derived from Burner::Job are now optional. Pipelines themselves now can handle jobs without names.
+
 # 1.6.0 (December 22nd, 2020)
 
 Additions:
data/README.md
CHANGED

@@ -271,6 +271,9 @@ This library only ships with very basic, rudimentary jobs that are meant to just
 * **b/collection/nested_aggregate** [register, key_mappings, key, separator]: Traverse a set of objects, resolving key's value for each object, optionally copying down key_mappings to the child records, then merging all the inner records together.
 * **b/collection/number** [key, register, separator, start_at]: This job can iterate over a set of records and sequence them (set the specified key to a sequential index value.)
 * **b/collection/objects_to_arrays** [mappings, register]: Convert an array of objects to an array of arrays.
+* **b/collection/only_keys** [keys_register, register, separator]: Limit an array of objects' keys to a specified set of keys. These keys can be realized at run-time as they are pulled from another register (keys_register) thus making it dynamic.
+* **b/collection/pivot** [unique_keys, insensitive, other_keys, pivot_key, pivot_value_key, register, separator]:
+  Take an array of objects and pivot a key into multiple keys. It essentially takes all the values for a key and creates N number of keys (one per value.) Under the hood it uses HashMath's [Record](https://github.com/bluemarblepayroll/hash_math#record-the-hash-prototype) and [Table](https://github.com/bluemarblepayroll/hash_math#table-the-double-hash-hash-of-hashes) classes.
 * **b/collection/shift** [amount, register]: Remove the first N number of elements from an array.
 * **b/collection/transform** [attributes, exclusive, separator, register]: Iterate over all objects and transform each key per the attribute transformers specifications. If exclusive is set to false then the current object will be overridden/merged. Separator can also be set for key path support. This job uses [Realize](https://github.com/bluemarblepayroll/realize), which provides its own extendable value-transformation pipeline. If an attribute is not set with `explicit: true` then it will automatically start from the key's value from the record. If `explicit: true` is started, then it will start from the record itself.
 * **b/collection/unpivot** [pivot_set, register]: Take an array of objects and unpivot specific sets of keys into rows. Under the hood it uses [HashMath's Unpivot class](https://github.com/bluemarblepayroll/hash_math#unpivot-hash-key-coalescence-and-row-extrapolation).

@@ -297,6 +300,11 @@ By default all jobs will use the `Burner::Disks::Local` disk for its persistence
 * **b/io/row_reader** [data_key, disk, ignore_blank_path, ignore_file_not_found, path_key, register, separator]: Iterates over an array of objects, extracts a filepath from a key in each object, and attempts to load the file's content for each record. The file's content will be stored at the specified data_key. By default missing paths or files will be treated as hard errors. If you wish to ignore these then pass in true for ignore_blank_path and/or ignore_file_not_found.
 * **b/io/write** [binary, disk, path, register, supress_side_effect]: Write to a local file. The path parameter can be interpolated using `Payload#params`. If the contents are binary, pass in `binary: true` to open it up in binary+write mode. By default, written files are also logged as WrittenFile instances to the Payload#side_effects array. You can pass in supress_side_effect: true to disable this behavior.
 
+#### Parameters
+
+* **b/param/from_register** [param_key, register]: Copy the value of a register to a param key.
+* **b/param/to_register** [param_key, register]: Copy the value of a param key to a register.
+
 #### Serialization
 
 * **b/serialize/csv** [byte_order_mark, register]: Take an array of arrays and create a CSV. You can optionally pre-pend a byte order mark, see Burner::Modeling::ByteOrderMark for acceptable options.
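Taken together, the README additions describe a param-to-register hand-off feeding the new dynamic-keys job. The following is a hypothetical pipeline hash wiring them up; the job types and their option keys come from the README entries above, while the register and param names (:patients, :wanted_keys, :keys) are invented for the sketch:

```ruby
# Hypothetical wiring of the two new jobs described above. Only the hash is
# built here; handing it to the gem is sketched in the note below.
pipeline = {
  jobs: [
    # Pull the pipeline-level param :keys into the :wanted_keys register...
    { name: :load_keys, type: 'b/param/to_register',
      param_key: :keys, register: :wanted_keys },
    # ...then trim every object in :patients down to just those keys.
    { name: :trim, type: 'b/collection/only_keys',
      keys_register: :wanted_keys, register: :patients }
  ],
  steps: %i[load_keys trim]
}
```

Per the gem's README, such a hash would be handed to `Burner::Pipeline.make(pipeline)` and executed with a payload; because the key list is read from `keys_register` at run time, a caller can vary it per execution.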
data/lib/burner/data.rb
ADDED

@@ -0,0 +1,46 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+module Burner
+  # Defines a key value pair data store per our library. It is basically a composite
+  # object around a hash with indifferent key typing.
+  class Data
+    extend Forwardable
+
+    def_delegators :internal_hash, :transform_keys
+
+    def initialize(hash = {})
+      @internal_hash = {}
+
+      (hash || {}).each { |k, v| self[k] = v }
+    end
+
+    def []=(key, value)
+      internal_hash[key.to_s] = value
+    end
+
+    def [](key)
+      internal_hash[key.to_s]
+    end
+
+    def to_h
+      internal_hash
+    end
+
+    def ==(other)
+      other.instance_of?(self.class) &&
+        to_h == other.to_h
+    end
+    alias eql? ==
+
+    private
+
+    attr_reader :internal_hash
+  end
+end
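The indifferent key typing this class promises (every key stored as a string) can be exercised with a trimmed-down, standalone copy of it; it is renamed MiniData here so the sketch runs without the gem, and the equality methods are omitted:

```ruby
require 'forwardable'

# Trimmed-down standalone copy of the Burner::Data class added above, keeping
# just enough to show the indifferent key typing and the delegated transform_keys.
class MiniData
  extend Forwardable

  def_delegators :internal_hash, :transform_keys

  def initialize(hash = {})
    @internal_hash = {}

    (hash || {}).each { |k, v| self[k] = v }
  end

  def []=(key, value)
    internal_hash[key.to_s] = value
  end

  def [](key)
    internal_hash[key.to_s]
  end

  def to_h
    internal_hash
  end

  private

  attr_reader :internal_hash
end

data = MiniData.new(id: 1)
data['name'] = 'bozo'
# data[:name] and data['name'] now resolve to the same entry
```

The delegated `transform_keys` is what Payload later uses to expose registers under templated names.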
data/lib/burner/job.rb
CHANGED

@@ -46,15 +46,9 @@ module Burner
     protected
 
     def job_string_template(expression, output, payload)
-      templatable_params = payload.params
-                                  .merge(__id: output.id)
-                                  .merge(templatable_register_values(payload))
+      templatable_params = payload.params_and_registers_hash.merge(__id: output.id)
 
       Util::StringTemplate.instance.evaluate(expression, templatable_params)
     end
-
-    def templatable_register_values(payload)
-      payload.registers.transform_keys { |key| "__#{key}_register" }
-    end
   end
 end
data/lib/burner/jobs.rb
CHANGED

@@ -30,6 +30,8 @@ module Burner
     register 'b/collection/nested_aggregate', Library::Collection::NestedAggregate
     register 'b/collection/number', Library::Collection::Number
     register 'b/collection/objects_to_arrays', Library::Collection::ObjectsToArrays
+    register 'b/collection/only_keys', Library::Collection::OnlyKeys
+    register 'b/collection/pivot', Library::Collection::Pivot
     register 'b/collection/shift', Library::Collection::Shift
     register 'b/collection/transform', Library::Collection::Transform
     register 'b/collection/unpivot', Library::Collection::Unpivot

@@ -48,6 +50,9 @@ module Burner
     register 'b/io/row_reader', Library::IO::RowReader
     register 'b/io/write', Library::IO::Write
 
+    register 'b/param/from_register', Library::Param::FromRegister
+    register 'b/param/to_register', Library::Param::ToRegister
+
     register 'b/serialize/csv', Library::Serialize::Csv
     register 'b/serialize/json', Library::Serialize::Json
     register 'b/serialize/yaml', Library::Serialize::Yaml
data/lib/burner/library.rb
CHANGED

@@ -21,6 +21,8 @@ require_relative 'library/collection/group'
 require_relative 'library/collection/nested_aggregate'
 require_relative 'library/collection/number'
 require_relative 'library/collection/objects_to_arrays'
+require_relative 'library/collection/only_keys'
+require_relative 'library/collection/pivot'
 require_relative 'library/collection/shift'
 require_relative 'library/collection/transform'
 require_relative 'library/collection/unpivot'

@@ -39,6 +41,9 @@ require_relative 'library/io/read'
 require_relative 'library/io/row_reader'
 require_relative 'library/io/write'
 
+require_relative 'library/param/from_register'
+require_relative 'library/param/to_register'
+
 require_relative 'library/serialize/csv'
 require_relative 'library/serialize/json'
 require_relative 'library/serialize/yaml'
data/lib/burner/library/collection/only_keys.rb
ADDED

@@ -0,0 +1,61 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+module Burner
+  module Library
+    module Collection
+      # This job knows how to take an array of objects and limit it to a specific set of keys.
+      # The keys are pulled from another register which helps make it dynamic (you can load
+      # up this other register with a dynamic list of keys at run-time.)
+      #
+      # Expected Payload[register] input: array of objects.
+      # Payload[register] output: An array of objects.
+      class OnlyKeys < JobWithRegister
+        BLANK = ''
+
+        attr_reader :keys_register,
+                    :resolver
+
+        def initialize(
+          keys_register:,
+          name: '',
+          register: DEFAULT_REGISTER,
+          separator: BLANK
+        )
+          super(name: name, register: register)
+
+          @keys_register = keys_register.to_s
+          @resolver = Objectable.resolver(separator: separator)
+
+          freeze
+        end
+
+        def perform(output, payload)
+          objects = array(payload[register])
+          count = objects.length
+          keys = array(payload[keys_register])
+
+          output.detail("Dynamically limiting #{count} object(s) with key(s): #{keys.join(', ')}")
+
+          payload[register] = objects.map { |object| transform(object, keys) }
+        end
+
+        private
+
+        def transform(object, keys)
+          keys.each_with_object({}) do |key, memo|
+            value = resolver.get(object, key)
+
+            resolver.set(memo, key, value)
+          end
+        end
+      end
+    end
+  end
+end
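Stripped of the gem plumbing (registers, output logging, Objectable path resolution), the transform step above reduces to a few lines. In this sketch plain hash access stands in for the resolver, which assumes flat, string-keyed objects:

```ruby
# Plain-Ruby sketch of what OnlyKeys#perform does to each record: build a new
# object containing only the requested keys, in the requested order.
def only_keys(objects, keys)
  objects.map do |object|
    keys.each_with_object({}) { |key, memo| memo[key] = object[key] }
  end
end

patients = [
  { 'id' => 1, 'first' => 'bozo', 'ssn' => 'x' },
  { 'id' => 2, 'first' => 'frank', 'ssn' => 'y' }
]

trimmed = only_keys(patients, %w[id first])
# trimmed => [{ 'id' => 1, 'first' => 'bozo' }, { 'id' => 2, 'first' => 'frank' }]
```

In the real job the `keys` array comes out of `keys_register` at perform time, which is what makes the limiting dynamic.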
data/lib/burner/library/collection/pivot.rb
ADDED

@@ -0,0 +1,150 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+module Burner
+  module Library
+    module Collection
+      # Take an array of objects and pivot a key into multiple keys. It essentially takes all
+      # the values for a key and creates N number of keys (one per value.)
+      # Under the hood it uses HashMath's Record and Table classes:
+      # https://github.com/bluemarblepayroll/hash_math
+      #
+      # An example of a normalized dataset that could be pivoted:
+      #
+      # records = [
+      #   { patient_id: 1, key: :first_name, value: 'bozo' },
+      #   { patient_id: 1, key: :last_name, value: 'clown' },
+      #   { patient_id: 2, key: :first_name, value: 'frank' },
+      #   { patient_id: 2, key: :last_name, value: 'rizzo' },
+      # ]
+      #
+      # Using the following job configuration:
+      #
+      # config = {
+      #   unique_key: :patient_id
+      # }
+      #
+      # Once ran through this job, it would set the register to:
+      #
+      # records = [
+      #   { patient_id: 1, first_name: 'bozo', last_name: 'clown' },
+      #   { patient_id: 2, first_name: 'frank', last_name: 'rizzo' },
+      # ]
+      #
+      # Expected Payload[register] input: array of objects.
+      # Payload[register] output: An array of objects.
+      class Pivot < JobWithRegister
+        DEFAULT_PIVOT_KEY = :key
+        DEFAULT_PIVOT_VALUE_KEY = :value
+
+        attr_reader :insensitive,
+                    :other_keys,
+                    :non_pivoted_keys,
+                    :pivot_key,
+                    :pivot_value_key,
+                    :resolver,
+                    :unique_keys
+
+        def initialize(
+          unique_keys:,
+          insensitive: false,
+          name: '',
+          other_keys: [],
+          pivot_key: DEFAULT_PIVOT_KEY,
+          pivot_value_key: DEFAULT_PIVOT_VALUE_KEY,
+          register: DEFAULT_REGISTER,
+          separator: ''
+        )
+          super(name: name, register: register)
+
+          @insensitive = insensitive || false
+          @pivot_key = pivot_key.to_s
+          @pivot_value_key = pivot_value_key.to_s
+          @resolver = Objectable.resolver(separator: separator)
+          @unique_keys = Array(unique_keys)
+          @other_keys = Array(other_keys)
+          @non_pivoted_keys = @unique_keys + @other_keys
+
+          freeze
+        end
+
+        def perform(output, payload)
+          objects = array(payload[register])
+          table = make_table(objects)
+
+          output.detail("Pivoting #{objects.length} object(s)")
+          output.detail("By key: #{pivot_key} and value: #{pivot_value_key}")
+
+          objects.each { |object| object_to_table(object, table) }
+
+          pivoted_objects = table.to_a.map(&:fields)
+
+          output.detail("Resulting dataset has #{pivoted_objects.length} object(s)")
+
+          payload[register] = pivoted_objects
+        end
+
+        private
+
+        def resolve_key(object)
+          key_to_use = resolver.get(object, pivot_key)
+
+          make_key(key_to_use)
+        end
+
+        def make_key(value)
+          insensitive ? value.to_s.downcase : value
+        end
+
+        def make_row_id(object)
+          unique_keys.map { |k| make_key(resolver.get(object, k)) }
+        end
+
+        def make_key_map(objects)
+          objects.each_with_object({}) do |object, key_map|
+            key = resolver.get(object, pivot_key)
+            unique_key = make_key(key)
+
+            key_map[unique_key] ||= Set.new
+
+            key_map[unique_key] << key
+          end
+        end
+
+        def make_record(objects)
+          key_map = make_key_map(objects)
+          keys = non_pivoted_keys + key_map.values.map(&:first)
+
+          HashMath::Record.new(keys)
+        end
+
+        def make_table(objects)
+          HashMath::Table.new(make_record(objects))
+        end
+
+        def object_to_table(object, table)
+          row_id = make_row_id(object)
+
+          non_pivoted_keys.each do |key|
+            value = resolver.get(object, key)
+
+            table.add(row_id, key, value)
+          end
+
+          key_to_use = resolve_key(object)
+          value_to_use = resolver.get(object, pivot_value_key)
+
+          table.add(row_id, key_to_use, value_to_use)
+
+          self
+        end
+      end
+    end
+  end
+end
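The patient example from the class comment can be reproduced without HashMath. This rough functional equivalent uses group_by in place of the Record/Table machinery (and so skips the insensitive and other_keys options the real job supports):

```ruby
# Rough plain-Ruby equivalent of the pivot job: group rows by the unique key,
# then spread each row's key/value pair into the grouped record.
def pivot(records, unique_key:, pivot_key: :key, pivot_value_key: :value)
  records.group_by { |r| r[unique_key] }.map do |id, rows|
    rows.each_with_object(unique_key => id) do |row, memo|
      memo[row[pivot_key]] = row[pivot_value_key]
    end
  end
end

records = [
  { patient_id: 1, key: :first_name, value: 'bozo' },
  { patient_id: 1, key: :last_name, value: 'clown' },
  { patient_id: 2, key: :first_name, value: 'frank' },
  { patient_id: 2, key: :last_name, value: 'rizzo' }
]

pivoted = pivot(records, unique_key: :patient_id)
# pivoted => [
#   { patient_id: 1, first_name: 'bozo', last_name: 'clown' },
#   { patient_id: 2, first_name: 'frank', last_name: 'rizzo' }
# ]
```

One design note visible in the real implementation: it first scans all objects to build the full key set (make_key_map/make_record) before filling rows, so every pivoted object ends up with the same keys even when some values are missing; the group_by sketch does not give that guarantee.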
data/lib/burner/library/param/base.rb
ADDED

@@ -0,0 +1,29 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+module Burner
+  module Library
+    module Param
+      # Common logic shared across Param job subclasses.
+      class Base < JobWithRegister
+        BLANK = ''
+
+        attr_reader :param_key
+
+        def initialize(name: BLANK, param_key: BLANK, register: DEFAULT_REGISTER)
+          super(name: name, register: register)
+
+          @param_key = param_key.to_s
+
+          freeze
+        end
+      end
+    end
+  end
+end
data/lib/burner/library/param/from_register.rb
ADDED

@@ -0,0 +1,30 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+require_relative 'base'
+
+module Burner
+  module Library
+    module Param
+      # Copy a register's value into a param key. Generally speaking you should only be
+      # mutating registers, that way the params stay true to the passed in params for the
+      # pipeline. But this job is available in case a param needs to be updated.
+      #
+      # Expected Payload[register] input: anything.
+      # Payload.params(param_key) output: whatever value was specified in the register.
+      class FromRegister < Base
+        def perform(output, payload)
+          output.detail("Pushing value from register: #{register} to param: #{param_key}")
+
+          payload.update_param(param_key, payload[register])
+        end
+      end
+    end
+  end
+end
data/lib/burner/library/param/to_register.rb
ADDED

@@ -0,0 +1,28 @@
+# frozen_string_literal: true
+
+#
+# Copyright (c) 2020-present, Blue Marble Payroll, LLC
+#
+# This source code is licensed under the MIT license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+require_relative 'base'
+
+module Burner
+  module Library
+    module Param
+      # Copy a param key's value into a register.
+      #
+      # Expected Payload.param(param_key) input: anything.
+      # Payload[register] output: whatever value was specified as the param_key's value.
+      class ToRegister < Base
+        def perform(output, payload)
+          output.detail("Pushing value to register: #{register} from param: #{param_key}")
+
+          payload[register] = payload.param(param_key)
+        end
+      end
+    end
+  end
+end
data/lib/burner/payload.rb
CHANGED

@@ -7,6 +7,8 @@
 # LICENSE file in the root directory of this source tree.
 #
 
+require_relative 'data'
+
 module Burner
   # The input for all Job#perform methods. The main notion of this object is its 'registers'
   # attribute. This registers attribute is a key-indifferent hash, accessible on Payload using

@@ -24,10 +26,15 @@ module Burner
   # The 'time' attribute is important in that it should be used for the replaying of pipelines and jobs.
   # Instead of having jobs utilizing Time.now, Date.today, etc... they should rather opt to
   # use this value instead.
+  #
+  # The notion of params are somewhat conflated with registers here. Both are hashes and both
+  # serve as data stores. The difference is registers are really meant to be the shared-data
+  # repository across jobs, while params are, more or less, the input into the _pipeline_.
+  # It is debatable if mutation of the params should be allowed but the design decision was made
+  # early on to allow for params to be mutable albeit with registers being the preferred mutable
+  # store.
   class Payload
-    attr_reader :params,
-                :registers,
-                :side_effects,
+    attr_reader :side_effects,
                 :time
 
     def initialize(

@@ -36,12 +43,32 @@ module Burner
       side_effects: [],
       time: Time.now.utc
     )
-      @params = params
-      @registers =
+      @params       = Data.new(params)
+      @registers    = Data.new(registers)
       @side_effects = side_effects || []
      @time         = time || Time.now.utc
+    end
 
-
+    # Backwards compatibility. This allows for control over the underlying data structure
+    # while still maintaining the same public API as before.
+    def params
+      @params.to_h
+    end
+
+    # Backwards compatibility. This allows for control over the underlying data structure
+    # while still maintaining the same public API as before.
+    def registers
+      @registers.to_h
+    end
+
+    # Law of Demeter: While params is an accessible hash, it is preferred that the
+    # public class methods are used.
+    def param(key)
+      @params[key]
+    end
+
+    def update_param(key, value)
+      tap { @params[key] = value }
     end
 
     # Add a side effect of a job. This helps to keep track of things jobs do outside of its

@@ -52,12 +79,12 @@ module Burner
 
     # Set a register's value.
     def []=(key, value)
-      set(key, value)
+      @registers[key] = value
     end
 
     # Retrieve a register's value.
    def [](key)
-      registers[key.to_s]
+      @registers[key]
     end
 
     # Set halt_pipeline to true. This will indicate to the pipeline to stop all

@@ -71,14 +98,11 @@ module Burner
       @halt_pipeline || false
     end
 
-    private
-
-    def set(key, value)
-      registers[key.to_s] = value
-    end
+    def params_and_registers_hash
+      registers_hash = @registers.transform_keys { |key| "__#{key}_register" }
+      params_hash    = @params.to_h
 
-
-      (registers || {}).each { |k, v| set(k, v) }
+      params_hash.merge(registers_hash)
     end
   end
 end
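The new params_and_registers_hash is what job_string_template in the job.rb diff now consumes: params pass through as-is while each register is exposed under a "__&lt;name&gt;_register" key. A standalone sketch of the merge semantics, using plain string-keyed hashes in place of Burner::Data:

```ruby
# Standalone sketch of Payload#params_and_registers_hash from the diff above.
# Plain hashes stand in for the Burner::Data stores (which are string-keyed).
def params_and_registers_hash(params, registers)
  registers_hash = registers.transform_keys { |key| "__#{key}_register" }

  params.merge(registers_hash)
end

templatable = params_and_registers_hash(
  { 'code' => 'abc' },
  { 'data' => [1, 2, 3] }
)
# templatable => { 'code' => 'abc', '__data_register' => [1, 2, 3] }
```

Because the merge happens last, a register named "code" would shadow nothing (it becomes "__code_register"), so params and registers cannot collide in the templated hash.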
data/lib/burner/pipeline.rb
CHANGED
data/lib/burner/version.rb
CHANGED
metadata
CHANGED

@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: burner
 version: !ruby/object:Gem::Version
-  version: 1.7.0.pre.alpha
+  version: 1.10.0
 platform: ruby
 authors:
 - Matthew Ruggio
-autorequire:
+autorequire:
 bindir: exe
 cert_chain: []
-date: 2021-
+date: 2021-05-17 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: acts_as_hashable

@@ -220,6 +220,7 @@ files:
 - ".gitignore"
 - ".rubocop.yml"
 - ".ruby-version"
+- ".tool-versions"
 - ".travis.yml"
 - CHANGELOG.md
 - CODE_OF_CONDUCT.md

@@ -234,6 +235,7 @@ files:
 - exe/burner
 - lib/burner.rb
 - lib/burner/cli.rb
+- lib/burner/data.rb
 - lib/burner/disks.rb
 - lib/burner/disks/local.rb
 - lib/burner/job.rb

@@ -249,6 +251,8 @@ files:
 - lib/burner/library/collection/nested_aggregate.rb
 - lib/burner/library/collection/number.rb
 - lib/burner/library/collection/objects_to_arrays.rb
+- lib/burner/library/collection/only_keys.rb
+- lib/burner/library/collection/pivot.rb
 - lib/burner/library/collection/shift.rb
 - lib/burner/library/collection/transform.rb
 - lib/burner/library/collection/unpivot.rb

@@ -266,6 +270,9 @@ files:
 - lib/burner/library/io/row_reader.rb
 - lib/burner/library/io/write.rb
 - lib/burner/library/nothing.rb
+- lib/burner/library/param/base.rb
+- lib/burner/library/param/from_register.rb
+- lib/burner/library/param/to_register.rb
 - lib/burner/library/serialize/csv.rb
 - lib/burner/library/serialize/json.rb
 - lib/burner/library/serialize/yaml.rb

@@ -304,7 +311,7 @@ metadata:
   documentation_uri: https://www.rubydoc.info/gems/burner
   homepage_uri: https://github.com/bluemarblepayroll/burner
   source_code_uri: https://github.com/bluemarblepayroll/burner
-post_install_message:
+post_install_message:
 rdoc_options: []
 require_paths:
 - lib

@@ -315,12 +322,12 @@ required_ruby_version: !ruby/object:Gem::Requirement
     version: '2.5'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - ">"
+  - - ">="
     - !ruby/object:Gem::Version
-      version: 1.3.1
+      version: '0'
 requirements: []
 rubygems_version: 3.0.3
-signing_key:
+signing_key:
 specification_version: 4
 summary: Declarative and extendable processing pipeline
 test_files: []