datacaster 4.2.1 → 6.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 351d202f9afcb51c391c23e852f4d9d289a60df1a4f890473f532aac47ca668c
- data.tar.gz: '031040811b8f5a2c319c5c48295c44d347f76dcb52fbfaeadd37f586f0d06582'
+ metadata.gz: 0072ba53600fb794462c13dba4fe31c9134aac67e5a7687f0e91e17483cf10b1
+ data.tar.gz: b35c2f8b667d884bcf641be2d79747c65a68b97a7da20abc718776b3fe532af6
  SHA512:
- metadata.gz: c53e86295d1f47bc80d0d5e1805b5076e85f8ded528f14039ebcbd88a810bd4e8e4897b16a41b30422c8b18556049a54256ce60fa5fecc04c176cda3de363890
- data.tar.gz: 67111b7dfa26fcfc844ca33336f884738a961c9629e01486ad4cef5423406f33ec030d92746e32c66739c624a0d0b0c6b9a89c8609dc40d0509cdf85298767ec
+ metadata.gz: 16903213c77a60b38d8a034ac88b39deaddf926474fcc9eeb575c7633d4742a3aab51c1a6e56b094bcbbc6dff5ae5ed5911a9c4e17bd12cc14df51f1f230e790
+ data.tar.gz: 5f3da501cb763c39f330cc8406bbd9a4e6db76db255612884c92fbb89a5fca0f8d8f81f72295941bc078b969bf5a3f1e2cfd1552eccda9cff30930c244398961
@@ -6,7 +6,7 @@ jobs:
  runs-on: ubuntu-latest
  strategy:
  matrix:
- ruby-version: ['3.1', '3.2', '3.3']
+ ruby-version: ['3.2', '3.3', '3.4']
  steps:
  - uses: actions/checkout@v3
  - name: Set up Ruby
data/README.md CHANGED
@@ -332,7 +332,7 @@ Notice that OR operator, if left-hand validation fails, passes the original valu
 
  #### *IF... THEN... ELSE operator*
 
- Let's support we want to run different validations depending on some value, e.g.:
+ Let's suppose we want to run different validations depending on some value, e.g.:
 
  * if 'salary' is more than 100_000, check for the additional key, 'passport'
  * otherwise, ensure 'passport' key is absent
@@ -373,7 +373,7 @@ Formally, with `a.then(b).else(c)`:
 
  Note: this construct is *not* an equivalent of `a & b | c`.
 
- With `a.then(b).else(c)` if `a` and `b` fails, then `b`'s error is returned. With `a & b | c`, instead, `c`'s result would be returned.
+ With `a.then(b).else(c)` if `a` passes and `b` fails, then `b`'s error is returned. With `a & b | c`, instead, `c`'s result would be returned.
 
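The contrast between the two compositions can be modeled in a few lines of plain Ruby. This is a sketch of the semantics only — `then_else`, `and_or` and the `[:ok, value]`/`[:error, message]` tuples are hypothetical stand-ins, not the datacaster API:

```ruby
# Each "caster" takes a value and returns [:ok, value] or [:error, message].
then_else = ->(a, b, c) {
  ->(x) {
    status, v = a.(x)
    status == :ok ? b.(v) : c.(x)  # once `a` passes, `b`'s result is final
  }
}

and_or = ->(a, b, c) {
  ->(x) {
    status, v = a.(x)
    r = status == :ok ? b.(v) : [status, v]
    r[0] == :ok ? r : c.(x)        # any failure on the left falls through to `c`
  }
}

is_int   = ->(x) { x.is_a?(Integer) ? [:ok, x] : [:error, "is not an integer"] }
is_big   = ->(x) { x > 100_000 ? [:ok, x] : [:error, "is too small"] }
fallback = ->(x) { [:ok, :default] }

then_else.(is_int, is_big, fallback).(5)  # => [:error, "is too small"]
and_or.(is_int, is_big, fallback).(5)     # => [:ok, :default]
```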
  #### *SWITCH... ON... ELSE operator*
 
@@ -1128,13 +1128,13 @@ To define compound data type, array of 'something', use `array_schema(something)
  salaries = Datacaster.schema { array_of(integer) }
 
  salaries.([1000, 2000, 3000]) # Datacaster::ValidResult([1000, 2000, 3000])
+ salaries.([]) # Datacaster::ValidResult([])
 
  salaries.(["one thousand"]) # Datacaster::ErrorResult({0=>["is not an integer"]})
  salaries.(:not_an_array) # Datacaster::ErrorResult(["should be an array"])
- salaries.([]) # Datacaster::ErrorResult(["should not be empty"])
  ```
 
- To allow empty array use the following construct: `compare([]) | array_of(...)`.
+ To disallow empty array use the following construct: `array_of(..., allow_empty: false)`.
 
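A minimal sketch of the new default in plain Ruby (`check_array` is a hypothetical stand-in for `ArraySchema#cast`, not part of the gem):

```ruby
# Empty arrays are now valid by default and rejected only with allow_empty: false.
check_array = ->(value, allow_empty: true) {
  return [:error, "should be an array"] unless value.respond_to?(:map)
  return [:error, "should not be empty"] if value.empty? && !allow_empty
  [:ok, value]
}

check_array.([])                     # => [:ok, []]
check_array.([], allow_empty: false) # => [:error, "should not be empty"]
check_array.(:not_an_array)          # => [:error, "should be an array"]
```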
  If you want to define an array of hashes, [shortcut definition](#shortcut-nested-definitions) could be used: instead of `array_of(hash_schema({...}))` use `array_of({...})`:
 
@@ -1371,11 +1371,11 @@ Had we used `schema` everywhere, `CommonFieldsValidator` would return failure fo
 
  As a rule of thumb, use `partial_schema` in any "intermediary" validators (extracted for the sake of clarity of code and reusability) and use `schema` in any "end" validators (ones which receive full record as input and use intermediary validators behind the scenes).
 
- Lastly, if you want to just delete extra unvalidated keys without returning a error, use `choosy_schema`.
+ Lastly, if you want to just delete extra unvalidated keys without returning an error, use `choosy_schema`.
 
  #### AND with error aggregation (`*`)
 
- Often it is useful to run validator which are "further down the conveyor" (i.e. placed at the right-hand side of AND operator `&`) even if current (i.e. left-hand side) validator has failed.
+ Often it is useful to run validators which are "further down the conveyor" (i.e. placed at the right-hand side of AND operator `&`) even if current (i.e. left-hand side) validator has failed.
 
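The idea can be sketched in plain Ruby (hypothetical `plain_and`/`agg_and` lambdas modeling `&` and `*`, not the datacaster API — the real `*` also threads checked values through, which is omitted here):

```ruby
# `&` stops at the first failure; `*` runs both sides and aggregates errors.
plain_and = ->(a, b) {
  ->(x) {
    ra = a.(x)
    ra[0] == :error ? ra : b.(x)
  }
}

agg_and = ->(a, b) {
  ->(x) {
    errors = [a.(x), b.(x)].select { |s, _| s == :error }.map { |_, e| e }
    errors.empty? ? [:ok, x] : [:error, errors]
  }
}

has_name  = ->(h) { h.key?(:name)  ? [:ok, h] : [:error, "name is missing"] }
has_email = ->(h) { h.key?(:email) ? [:ok, h] : [:error, "email is missing"] }

plain_and.(has_name, has_email).({}) # => [:error, "name is missing"]
agg_and.(has_name, has_email).({})   # => [:error, ["name is missing", "email is missing"]]
```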
  Let's say we have extracted some "common validations" and have some concrete validators, which utilize these reusable common validations (more or less repeating the motif of the previous example, shortening non-essential for this section parts for clarity):
 
@@ -1793,7 +1793,7 @@ All keyword arguments of `#i18n_key`, `#i18n_scope` and designed for that sole p
 
  It is possible to add i18n variables at the runtime (e.g. inside `check { ... }` block) by calling `i18n_vars!(variable: 'value')` or `i18n_var!(:variable, 'value')`.
 
- Outer calls of `#i18n_key` (`#i18n_scope`, `#i18n_vars`) have presedence before the inner if variable names collide. However, runtime calls of `#i18n_vars!` and `#i18n_var!` overwrite compile-time variables from the next nearest key, scope or vars on collision.
+ Outer calls of `#i18n_key` (`#i18n_scope`, `#i18n_vars`) have precedence before the inner if variable names collide. However, runtime calls of `#i18n_vars!` and `#i18n_var!` overwrite compile-time variables from the next nearest key, scope or vars on collision.
 
  ## Registering custom 'predefined' types
 
@@ -27,6 +27,7 @@ en:
  relate: "%{left} should be %{op} %{right}"
  responds_to: "does not respond to %{reference}"
  string: is not a string
+ boolean: is not a boolean
  to_boolean: does not look like a boolean
  to_float: does not look like a float
  to_integer: does not look like an integer
@@ -14,6 +14,33 @@ module Datacaster
  )
  end
 
+ def to_json_schema
+ result =
+ @casters.reduce(JsonSchemaResult.new) do |result, caster|
+ result.apply(caster.to_json_schema, caster.to_json_schema_attributes)
+ end
+
+ mapping =
+ @casters.reduce({}) do |result, caster|
+ result.merge(caster.to_json_schema_attributes[:remaped])
+ end
+
+ result.remap(mapping)
+ end
+
+ def to_json_schema_attributes
+ super.merge(
+ required:
+ @casters.any? { |caster| caster.to_json_schema_attributes[:required] },
+ picked:
+ @casters.flat_map { |caster| caster.to_json_schema_attributes[:picked] },
+ remaped:
+ @casters.reduce({}) do |result, caster|
+ result.merge(caster.to_json_schema_attributes[:remaped])
+ end
+ )
+ end
+
  def inspect
  "#<Datacaster::AndNode casters: #{@casters.inspect}>"
  end
@@ -22,6 +22,20 @@ module Datacaster
  end
  end
 
+
+ def to_json_schema
+ [@left, @right].reduce(JsonSchemaResult.new) do |result, caster|
+ result.apply(caster.to_json_schema)
+ end
+ end
+
+ def to_json_schema_attributes
+ super.merge(
+ required:
+ [@left, @right].any? { |caster| caster.to_json_schema_attributes[:required] }
+ )
+ end
+
  def inspect
  "#<Datacaster::AndWithErrorAggregationNode L: #{@left.inspect} R: #{@right.inspect}>"
  end
@@ -1,7 +1,8 @@
  module Datacaster
  class ArraySchema < Base
- def initialize(element_caster, error_keys = {})
+ def initialize(element_caster, error_keys = {}, allow_empty: true)
  @element_caster = element_caster
+ @allow_empty = allow_empty
 
  @not_array_error_keys = ['.array', 'datacaster.errors.array']
  @not_array_error_keys.unshift(error_keys[:array]) if error_keys[:array]
@@ -12,7 +13,7 @@ module Datacaster
 
  def cast(array, runtime:)
  return Datacaster.ErrorResult(I18nValues::Key.new(@not_array_error_keys, value: array)) if !array.respond_to?(:map) || !array.respond_to?(:zip)
- return Datacaster.ErrorResult(I18nValues::Key.new(@empty_error_keys, value: array)) if array.empty?
+ return Datacaster.ErrorResult(I18nValues::Key.new(@empty_error_keys, value: array)) if array.empty? && !@allow_empty
 
  runtime.will_check!
 
@@ -30,6 +31,13 @@ module Datacaster
  end
  end
 
+ def to_json_schema
+ JsonSchemaResult.new({
+ 'type' => 'array',
+ 'items' => @element_caster.to_json_schema
+ })
+ end
+
  def inspect
  "#<Datacaster::ArraySchema [#{@element_caster.inspect}]>"
  end
@@ -10,6 +10,14 @@ module Datacaster
  transform_result(result)
  end
 
+ def to_json_schema
+ @base.to_json_schema
+ end
+
+ def to_json_schema_attributes
+ @base.to_json_schema_attributes
+ end
+
  def inspect
  "#<#{self.class.name} base: #{@base.inspect}>"
  end
@@ -72,6 +72,28 @@ module Datacaster
  end
  end
 
+ def to_json_schema
+ result = @fields.values.reduce(JsonSchemaResult.new) do |result, caster|
+ result.apply(caster.to_json_schema)
+ end
+
+ result.without_focus
+ end
+
+ def to_json_schema_attributes
+ super.merge(
+ remaped: @fields.flat_map do |key, caster|
+ picked = caster.to_json_schema_attributes[:picked]
+
+ if picked.any?
+ picked.map { |picked| { picked.to_s => key.to_s } }
+ else
+ { nil => key.to_s }
+ end
+ end.reduce(&:merge)
+ )
+ end
+
  def inspect
  field_descriptions =
  @fields.map do |k, v|
@@ -42,6 +42,16 @@ module Datacaster
  end
  end
 
+ def to_json_schema
+ not_hidden_fields = @fields.reject { |_k, v| v.to_json_schema_attributes[:hidden] }
+
+ JsonSchemaResult.new({
+ "type" => "object",
+ "properties" => not_hidden_fields.map { |k, v| [k.to_s, v.to_json_schema] }.to_h,
+ "required" => not_hidden_fields.select { |_k, v| v.to_json_schema_attributes[:required] }.keys.map(&:to_s),
+ })
+ end
+
  def inspect
  field_descriptions =
  @fields.map do |k, v|
@@ -0,0 +1,28 @@
+ module Datacaster
+ class JsonSchemaAttributes < Base
+ def initialize(base, schema_attributes = {}, &block)
+ @base = base
+ @schema_attributes = schema_attributes
+ @block = block
+ end
+
+ def cast(object, runtime:)
+ @base.cast(object, runtime: runtime)
+ end
+
+ def to_json_schema_attributes
+ result = @base.to_json_schema_attributes
+ result = result.merge(@schema_attributes)
+ result = @block.(result) if @block
+ result
+ end
+
+ def to_json_schema
+ @base.to_json_schema
+ end
+
+ def inspect
+ "#<#{self.class.name} base: #{@base.inspect}>"
+ end
+ end
+ end
@@ -0,0 +1,28 @@
+ module Datacaster
+ class JsonSchemaNode < Base
+ def initialize(base, schema_attributes = {}, &block)
+ @base = base
+ @schema_attributes = schema_attributes.transform_keys(&:to_s)
+ @block = block
+ end
+
+ def cast(object, runtime:)
+ @base.cast(object, runtime: runtime)
+ end
+
+ def to_json_schema
+ result = @base.to_json_schema
+ result = result.apply(@schema_attributes)
+ result = @block.(result) if @block
+ result
+ end
+
+ def to_json_schema_attributes
+ @base.to_json_schema_attributes
+ end
+
+ def inspect
+ "#<#{self.class.name} base: #{@base.inspect}>"
+ end
+ end
+ end
@@ -0,0 +1,239 @@
+ module Datacaster
+ class JsonSchemaResult < Hash
+ def initialize(from = {}, focus = nil)
+ merge!(from)
+
+ if from.is_a?(self.class)
+ @focus = from.focus
+ else
+ @focus = []
+ end
+
+ if focus == false || @focus == false
+ @focus = false
+ return
+ end
+
+ @focus << focus if focus
+ @target = self
+ @focus.each { |k| @target = @target['properties'][k] }
+ end
+
+ def with_focus_key(key)
+ result = apply(
+ "type" => "object",
+ "properties" => key ? { key => {} } : {}
+ )
+ self.class.new(result, key)
+ end
+
+ def without_focus
+ self.class.new(self).reset_focus
+ end
+
+ def remap(mapping)
+ return self if mapping.empty?
+
+ if self['oneOf'] || self['anyOf']
+ type = self.keys.first
+
+ self[type] = self[type].map { |props| object_remap(props, mapping) }
+ else
+ object_remap(self, mapping)
+ end
+
+ self
+ end
+
+ def object_remap(value, mapping)
+ return value unless value['type'] == 'object'
+
+ mapping.each do |from, to|
+ from_props = value['properties'][from] || {}
+ to_props = value['properties'][to] || {}
+
+ one_to_one_remap = mapping.values.count { _1 == to } == 1
+
+ properties_from = value['properties'].delete(from)
+ properties_to = value['properties'].delete(to)
+
+ if from && (properties_to || properties_from)
+ value['properties'][from] =
+ if one_to_one_remap
+ Datacaster::Utils.deep_merge(to_props, from_props)
+ else
+ self.class.new(properties_from || {})
+ end
+ end
+
+ required_from = value['required']&.delete(from)
+ required_to = value['required']&.delete(to)
+
+ if from && one_to_one_remap && (required_from || required_to)
+ value['required'] << from
+ end
+ end
+
+ value
+ end
+
+ def apply(other, schema_attributes = {})
+ return self if other.nil? || other.empty?
+ return JsonSchemaResult.new(other) if empty?
+
+ if @focus && !@focus.empty?
+ return with_updated_target(JsonSchemaResult.new(@target).apply(other))
+ end
+
+ # validations after pick(a, b) & transform
+ self_type = self['type']
+ other_type = other['type']
+
+ if (self_type == 'object' || self_type == 'array') && (other_type != 'object' && other_type != 'array')
+ return JsonSchemaResult.new(self)
+ end
+
+ result = self.class.new({})
+
+ if self['required'] || other['required']
+ result['required'] = (
+ (self['required'] || []).to_set | (other['required'] || []).to_set
+ ).to_a
+ end
+
+ nested =
+ if self['properties'] && (other['items'] || self['items']) ||
+ self['items'] && (self['properties'] || other['properties']) ||
+ other['items'] && other['properties']
+ raise RuntimeError, "can't merge json schemas due to wrong items/properties combination " \
+ "for #{self.inspect} and #{other.inspect}", caller
+ elsif self['properties'] || other['properties']
+ 'properties'
+ elsif self['items'] || other['items']
+ 'items'
+ else
+ nil
+ end
+
+ if nested
+ result[nested] = {}
+
+ keys = (self[nested] || {}).keys + (other[nested] || {}).keys
+ keys = keys.to_set
+
+ keys.each do |k|
+ one_k = self[nested] && self[nested][k] || {}
+ two_k = other[nested] && other[nested][k] || {}
+
+ if !one_k.is_a?(Hash) || !two_k.is_a?(Hash)
+ if one_k.empty? && !two_k.is_a?(Hash)
+ result[nested][k] = two_k
+ elsif two_k.empty? && !one_k.is_a?(Hash)
+ result[nested][k] = one_k
+ elsif one_k == two_k
+ result[nested][k] = one_k
+ else
+ raise RuntimeError, "can't merge json schemas due to wrong items/properties combination " \
+ "for #{self.inspect} and #{other.inspect}", caller
+ end
+ elsif one_k.is_a?(Hash) && two_k.is_a?(Hash)
+ result[nested][k] = self.class.new(one_k).apply(two_k)
+ else
+ raise RuntimeError, "can't merge json schemas due to wrong items/properties combination " \
+ "for #{self.inspect} and #{other.inspect}", caller
+ end
+
+ end
+ end
+
+ if self['description'] || other['description']
+ result['description'] = other['description'] || self['description']
+ end
+
+ (self.keys + other.keys - %w(required properties items description)).to_set.each do |k|
+ # used to merge switch schemas
+ # TODO: think about how to do this in the opposite direction
+ # FULL_DETAILS_SCHEMA = Datacaster.partial_schema do
+ # LoanTransferMethods::InitiatorTransferDetailsStruct.schema & switch(
+ # :kind,
+ # product: hash_schema(
+ # currency: string,
+ # us_only: boolean,
+ # name: string,
+ # values: array_of(integer),
+ # official_provider_name: string,
+ # )
+ # ).else(pass)
+ # end
+
+ if schema_attributes[:extendable]
+ case k
+ in 'oneOf'
+ self_one_of = self[k]
+ other_one_of = other[k]
+
+ result_objects = other_one_of.map do |other_obj|
+ other_obj_properties = other_obj['properties'].to_a
+
+ max_same = -1
+
+ # basically guessing here, but should be OK in most cases
+ merge_candidate = self_one_of.max_by do |self_obj|
+ next -1 if self_obj.empty?
+
+ self_obj_properties = self_obj['properties'].to_a
+
+ max_same = (self_obj_properties & other_obj_properties).size
+
+ max_same
+ end
+
+ next other_obj if max_same < 1
+
+ Datacaster::Utils.deep_merge(other_obj, merge_candidate)
+ end
+
+ next result[k] = result_objects
+ else
+ raise RuntimeError, "can't merge json schemas due to conflicting field #{k} for " \
+ "#{inspect} and #{other.inspect}", caller
+ end
+ else
+ if self[k] && other[k] && self[k] != other[k]
+ raise RuntimeError, "can't merge json schemas due to conflicting field #{k} for " \
+ "#{inspect} and #{other.inspect}", caller
+ end
+ end
+
+ result[k] = other[k] || self[k]
+ end
+
+ result
+ end
+
+ protected
+
+ def focus
+ @focus
+ end
+
+ def reset_focus
+ @focus = []
+ @target = self
+ self
+ end
+
+ private
+
+ def with_updated_target(target)
+ result = self.class.new(self)
+ nested =
+ @focus[0..-2].reduce(result) do |result, k|
+ result['properties'][k] = result['properties'][k].dup
+ result['properties'][k]
+ end
+ nested['properties'][@focus[-1]] = target
+ result
+ end
+ end
+ end
@@ -81,5 +81,27 @@ module Datacaster
  def inspect
  "#<Datacaster::Base>"
  end
+
+ def json_schema(schema_attributes = {}, &block)
+ JsonSchemaNode.new(self, schema_attributes, &block)
+ end
+
+ def json_schema_attributes(schema_attributes = {}, &block)
+ JsonSchemaAttributes.new(self, schema_attributes, &block)
+ end
+
+ def to_json_schema_attributes
+ {
+ required: true,
+ extendable: false,
+ remaped: {},
+ picked: [],
+ hidden: false
+ }
+ end
+
+ def to_json_schema
+ JsonSchemaResult.new
+ end
  end
  end
@@ -13,6 +13,20 @@ module Datacaster
  @right.with_runtime(runtime).(object)
  end
 
+ def to_json_schema
+ JsonSchemaResult.new({
+ "anyOf" => [@left, @right].map(&:to_json_schema)
+ })
+ end
+
+ def to_json_schema_attributes
+ super.merge(
+ required:
+ @left.to_json_schema_attributes[:required] &&
+ @right.to_json_schema_attributes[:required]
+ )
+ end
+
  def inspect
  "#<Datacaster::OrNode L: #{@left.inspect} R: #{@right.inspect}>"
  end
@@ -13,7 +13,13 @@ module Datacaster
  end
 
  def compare(value, error_key = nil)
- Comparator.new(value, error_key)
+ comparator = Comparator.new(value, error_key)
+
+ if value.nil?
+ comparator.json_schema(type: 'null')
+ else
+ comparator.json_schema(enum: [value])
+ end
  end
 
  def run(&block)
@@ -34,8 +40,8 @@ module Datacaster
  Trier.new(catched_exception, error_key, &block)
  end
 
- def array_schema(element_caster, error_keys = {})
- ArraySchema.new(DefinitionDSL.expand(element_caster), error_keys)
+ def array_schema(element_caster, error_keys = {}, allow_empty: true)
+ ArraySchema.new(DefinitionDSL.expand(element_caster), error_keys, allow_empty:)
  end
  alias_method :array_of, :array_schema
 
@@ -98,7 +104,7 @@ module Datacaster
  I18nValues::Key.new(error_keys, value: x)
  )
  end
- end
+ end.json_schema_attributes(required: false)
  end
 
  def any(error_key = nil)
@@ -135,7 +141,7 @@ module Datacaster
  else
  x
  end
- end
+ end.json_schema_attributes(required: false)
  end
 
  def merge_message_keys(*keys)
@@ -149,8 +155,11 @@ module Datacaster
  end
 
  def optional(base, on: nil)
- return absent | base if on == nil
- cast do |x|
+ if on == nil
+ return (absent | base).json_schema { base.to_json_schema }.json_schema_attributes(required: false)
+ end
+
+ caster = cast do |x|
  if x == Datacaster.absent ||
  (!on.nil? && x.respond_to?(on) && x.public_send(on))
  Datacaster.ValidResult(Datacaster.absent)
@@ -158,10 +167,15 @@ module Datacaster
  base.(x)
  end
  end
+
+ caster
+ .json_schema(base.to_json_schema)
+ .json_schema_attributes(required: false)
  end
 
  def pass
  cast { |v| Datacaster::ValidResult(v) }
+ .json_schema_attributes(required: false)
  end
 
  def pass_if(base)
@@ -189,6 +203,19 @@ module Datacaster
  end
  end
 
+ json_schema = -> (previous) do
+ previous = previous.apply({
+ 'type' => 'object',
+ 'properties' => keys.map { |k, v| [k.to_s, JsonSchemaResult.new] }.to_h
+ })
+
+ if keys.length == 1
+ previous.with_focus_key(keys[0].to_s)
+ else
+ previous.with_focus_key(false)
+ end
+ end
+
  must_be(Enumerable) & cast { |input|
  result =
  keys.map do |key|
@@ -200,7 +227,7 @@ module Datacaster
  end
  result = keys.length == 1 ? result.first : result
  Datacaster::ValidResult(result)
- }
+ }.json_schema(&json_schema).json_schema_attributes(picked: keys)
  end
 
  def relate(left, op, right, error_key: nil)
@@ -277,7 +304,8 @@ module Datacaster
  end
 
  def transform_to_value(value)
- transform { Datacaster::Utils.deep_freeze(value) }
+ value = Datacaster::Utils.deep_freeze(value)
+ transform { value }
  end
 
  def with(keys, caster)
@@ -299,7 +327,9 @@ module Datacaster
  def numeric(error_key = nil)
  error_keys = ['.numeric', 'datacaster.errors.numeric']
  error_keys.unshift(error_key) if error_key
- check { |x| x.is_a?(Numeric) }.i18n_key(*error_keys)
+ check { |x| x.is_a?(Numeric) }.
+ i18n_key(*error_keys).
+ json_schema(oneOf: [{ 'type' => 'string' }, { 'type' => 'number' }])
  end
 
  def decimal(digits = 8, error_key = nil)
@@ -311,32 +341,39 @@ module Datacaster
  Float(x)
 
  BigDecimal(x, digits)
- end.i18n_key(*error_keys)
+ end.i18n_key(*error_keys).json_schema(type: 'string')
  end
 
  def array(error_key = nil)
  error_keys = ['.array', 'datacaster.errors.array']
  error_keys.unshift(error_key) if error_key
- check { |x| x.is_a?(Array) }.i18n_key(*error_keys)
+
+ check { |x| x.is_a?(Array) }.
+ i18n_key(*error_keys).
+ json_schema(type: 'array')
  end
 
  def float(error_key = nil)
  error_keys = ['.float', 'datacaster.errors.float']
  error_keys.unshift(error_key) if error_key
- check { |x| x.is_a?(Float) }.i18n_key(*error_keys)
+ check { |x| x.is_a?(Float) }.i18n_key(*error_keys).
+ json_schema(type: 'number', format: 'float')
  end
 
  def pattern(regexp, error_key = nil)
  error_keys = ['.pattern', 'datacaster.errors.pattern']
  error_keys.unshift(error_key) if error_key
- string(error_key) & check { |x| x.match?(regexp) }.i18n_key(*error_keys, reference: regexp.inspect)
+ string(error_key) & check { |x| x.match?(regexp) }.i18n_key(*error_keys, reference: regexp.inspect).
+ json_schema(pattern: regexp.inspect)
  end
 
  # 'hash' would be a bad method name, because it would override built in Object#hash
  def hash_value(error_key = nil)
  error_keys = ['.hash_value', 'datacaster.errors.hash_value']
  error_keys.unshift(error_key) if error_key
- check { |x| x.is_a?(Hash) }.i18n_key(*error_keys)
+ check { |x| x.is_a?(Hash) }.
+ i18n_key(*error_keys).
+ json_schema(type: 'object', additionalProperties: true)
  end
 
  def hash_with_symbolized_keys(error_key = nil)
@@ -346,19 +383,23 @@ module Datacaster
  def included_in(values, error_key: nil)
  error_keys = ['.included_in', 'datacaster.errors.included_in']
  error_keys.unshift(error_key) if error_key
- check { |x| values.include?(x) }.i18n_key(*error_keys, reference: values.map(&:to_s).join(', '))
+ check { |x| values.include?(x) }.
+ i18n_key(*error_keys, reference: values.map(&:to_s).join(', ')).
+ json_schema(enum: values)
  end
 
  def integer(error_key = nil)
  error_keys = ['.integer', 'datacaster.errors.integer']
  error_keys.unshift(error_key) if error_key
- check { |x| x.is_a?(Integer) }.i18n_key(*error_keys)
+ check { |x| x.is_a?(Integer) }.i18n_key(*error_keys).
+ json_schema(type: 'integer')
  end
 
  def integer32(error_key = nil)
  error_keys = ['.integer32', 'datacaster.errors.integer32']
  error_keys.unshift(error_key) if error_key
- integer(error_key) & check { |x| x.abs <= 2_147_483_647 }.i18n_key(*error_keys)
+ integer(error_key) & check { |x| x.abs <= 2_147_483_647 }.i18n_key(*error_keys).
+ json_schema(format: 'int32')
  end
 
  def maximum(max, error_key = nil, inclusive: true)
@@ -400,7 +441,8 @@ module Datacaster
  def string(error_key = nil)
  error_keys = ['.string', 'datacaster.errors.string']
  error_keys.unshift(error_key) if error_key
- check { |x| x.is_a?(String) }.i18n_key(*error_keys)
+ check { |x| x.is_a?(String) }.i18n_key(*error_keys).
+ json_schema(type: 'string')
  end
 
  def non_empty_string(error_key = nil)
@@ -412,7 +454,7 @@ module Datacaster
  def uuid(error_key = nil)
  error_keys = ['.uuid', 'datacaster.errors.uuid']
  error_keys.unshift(error_key) if error_key
- string(error_key) & pattern(/\A\h{8}-\h{4}-\h{4}-\h{4}-\h{12}\z/).i18n_key(*error_keys)
+ pattern(/\A\h{8}-\h{4}-\h{4}-\h{4}-\h{12}\z/, error_key).i18n_key(*error_keys)
  end
 
  # Form request types
@@ -423,7 +465,8 @@ module Datacaster
 
  string(error_key) &
  try(catched_exception: [ArgumentError, TypeError]) { |x| DateTime.iso8601(x) }.
- i18n_key(*error_keys)
+ i18n_key(*error_keys).
+ json_schema(type: 'string', format: 'date-time')
  end
 
  def to_boolean(error_key = nil)
@@ -438,7 +481,23 @@ module Datacaster
  else
  Datacaster.ErrorResult(I18nValues::Key.new(error_keys, value: x))
  end
- end
+ end.json_schema(oneOf: [
+ { 'type' => 'string', 'enum' => ['true', 'false', '1', '0'] },
+ { 'type' => 'boolean' },
+ ])
+ end
+
+ def boolean(error_key = nil)
+ error_keys = ['.boolean', 'datacaster.errors.boolean']
+ error_keys.unshift(error_key) if error_key
+
+ cast do |x|
+ if [false, true].include?(x)
+ Datacaster.ValidResult(x)
+ else
+ Datacaster.ErrorResult(I18nValues::Key.new(error_keys, value: x))
+ end
+ end.json_schema(type: 'boolean')
  end
 
  def to_float(error_key = nil)
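The difference between the two casters can be sketched without the gem (plain-Ruby stand-ins returning `[:ok, value]`/`[:error, message]` tuples, not the datacaster API):

```ruby
# boolean: strict -- accepts only true/false.
boolean = ->(x) {
  [true, false].include?(x) ? [:ok, x] : [:error, "is not a boolean"]
}

# to_boolean: form-friendly -- also coerces the string forms listed above.
to_boolean = ->(x) {
  case x
  when true, 'true', '1'   then [:ok, true]
  when false, 'false', '0' then [:ok, false]
  else [:error, "does not look like a boolean"]
  end
}

boolean.(true)    # => [:ok, true]
boolean.('true')  # => [:error, "is not a boolean"]
to_boolean.('1')  # => [:ok, true]
```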
@@ -447,7 +506,7 @@ module Datacaster
 
  Trier.new([ArgumentError, TypeError]) do |x|
  Float(x)
- end.i18n_key(*error_keys)
+ end.i18n_key(*error_keys).json_schema(type: 'number', format: 'float')
  end
 
  def to_integer(error_key = nil)
@@ -456,7 +515,7 @@ module Datacaster
 
  Trier.new([ArgumentError, TypeError]) do |x|
  Integer(x)
- end.i18n_key(*error_keys)
+ end.i18n_key(*error_keys).json_schema(oneOf: [{ 'type' => 'string' }, { 'type' => 'number' }])
  end
 
  def optional_param(base)
@@ -27,10 +27,12 @@ module Datacaster
  caster_or_value
  when String, Symbol
  if strict
- Datacaster::Predefined.compare(caster_or_value)
+ Datacaster::Predefined.compare(caster_or_value).json_schema { {"type" => "string", "enum" => [caster_or_value.to_s]} }
  else
- Datacaster::Predefined.compare(caster_or_value.to_s) |
- Datacaster::Predefined.compare(caster_or_value.to_sym)
+ (
+ Datacaster::Predefined.compare(caster_or_value.to_s) |
+ Datacaster::Predefined.compare(caster_or_value.to_sym)
+ ).json_schema { {"type" => "string", "enum" => [caster_or_value.to_s]} }
  end
  else
  Datacaster::Predefined.compare(caster_or_value)
@@ -43,6 +45,7 @@ module Datacaster
 
  def else(else_caster)
  raise ArgumentError, "Datacaster: double else clause is not permitted", caller if @else
+ else_caster = DefinitionDSL.expand(else_caster)
  self.class.new(@base, on_casters: @ons, else_caster: else_caster, pick_key: @pick_key)
  end
 
@@ -75,6 +78,34 @@ module Datacaster
  )
  end
 
+ def to_json_schema
+ if @ons.empty?
+ raise RuntimeError, "switch caster requires at least one 'on' statement: switch(...).on(condition, cast)", caller
+ end
+
+ base = @base.to_json_schema
+
+ schema_result = @ons.map { |on|
+ base.apply(on[0].to_json_schema).without_focus.apply(on[1].to_json_schema)
+ }
+
+ if @else
+ schema_result << @else.to_json_schema
+ end
+
+ JsonSchemaResult.new( "oneOf" => schema_result )
+ end
+
+ def to_json_schema_attributes
+ super.merge(
+ extendable: true,
+ remaped:
+ [@base, @else, *@ons.map(&:last)].compact.reduce({}) do |result, caster|
+ result.merge(caster.to_json_schema_attributes[:remaped])
+ end
+ )
+ end
+
  def inspect
  "#<Datacaster::SwitchNode base: #{@base.inspect} on: #{@ons.inspect} else: #{@else.inspect} pick_key: #{@pick_key.inspect}>"
  end
@@ -26,6 +26,29 @@ module Datacaster
  end
  end
 
+ def to_json_schema
+ unless @else
+ raise ArgumentError.new('Datacaster: use "a & b" instead of "a.then(b)" when there is no else-clause')
+ end
+
+ left = @left.to_json_schema
+
+ JsonSchemaResult.new(
+ "oneOf" => [
+ (@left & @then).to_json_schema,
+ JsonSchemaResult.new("not" => left).apply(@else.to_json_schema)
+ ]
+ )
+ end
+
+ def to_json_schema_attributes
+ super.merge(
+ required:
+ @left.to_json_schema_attributes[:required] &&
+ @else.to_json_schema_attributes[:required]
+ )
+ end
+
  def inspect
  "#<Datacaster::ThenNode Then: #{@then.inspect} Else: #{@else.inspect}>"
  end
@@ -14,6 +14,10 @@ module Datacaster
  Datacaster::ValidResult(result)
  end
 
+ def to_json_schema_attributes
+ super.merge(required: false)
+ end
+
  def inspect
  "#<Datacaster::Transformer>"
  end
@@ -1,7 +1,14 @@
+ require 'set'
+
  module Datacaster
  module Utils
  extend self
 
+ def deep_merge(first, second)
+ merger = proc { |_, v1, v2| Hash === v1 && Hash === v2 ? v1.merge(v2, &merger) : Array === v1 && Array === v2 ? v1 | v2 : [:undefined, nil, :nil].include?(v2) ? v1 : v2 }
+ first.merge(second.to_h, &merger)
+ end
+
  def deep_freeze(value, copy: true)
  Ractor.make_shareable(value, copy:)
  end
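The one-line `merger` proc above is dense; unpacked, it recurses into nested hashes, set-unions arrays, and keeps the left-hand value when the right-hand one is `nil`/`:undefined`/`:nil`. The same logic, reformatted:

```ruby
merger = proc do |_key, v1, v2|
  if Hash === v1 && Hash === v2
    v1.merge(v2, &merger)              # recurse into nested hashes
  elsif Array === v1 && Array === v2
    v1 | v2                            # union arrays, preserving order
  elsif [:undefined, nil, :nil].include?(v2)
    v1                                 # right side "missing" -> keep left
  else
    v2                                 # otherwise right side wins
  end
end
deep_merge = ->(first, second) { first.merge(second.to_h, &merger) }

deep_merge.({ a: { x: 1 }, r: [1, 2], k: 1 }, { a: { y: 2 }, r: [2, 3], k: nil })
# => { a: { x: 1, y: 2 }, r: [1, 2, 3], k: 1 }
```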
@@ -1,3 +1,3 @@
  module Datacaster
- VERSION = "4.2.1"
+ VERSION = "6.0.1"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: datacaster
  version: !ruby/object:Gem::Version
- version: 4.2.1
+ version: 6.0.1
  platform: ruby
  authors:
  - Eugene Zolotarev
- autorequire:
+ autorequire:
  bindir: exe
  cert_chain: []
- date: 2025-04-02 00:00:00.000000000 Z
+ date: 2025-10-14 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: activemodel
@@ -106,7 +106,7 @@ dependencies:
  - - "<"
  - !ruby/object:Gem::Version
  version: '3'
- description:
+ description:
  email:
  - eugzol@gmail.com
  executables: []
@@ -152,6 +152,9 @@ files:
  - lib/datacaster/i18n_values/base.rb
  - lib/datacaster/i18n_values/key.rb
  - lib/datacaster/i18n_values/scope.rb
+ - lib/datacaster/json_schema_attributes.rb
+ - lib/datacaster/json_schema_node.rb
+ - lib/datacaster/json_schema_result.rb
  - lib/datacaster/message_keys_merger.rb
  - lib/datacaster/mixin.rb
  - lib/datacaster/or_node.rb
@@ -178,7 +181,7 @@ licenses:
  - MIT
  metadata:
  source_code_uri: https://github.com/EugZol/datacaster
- post_install_message:
+ post_install_message:
  rdoc_options: []
  require_paths:
  - lib
@@ -193,8 +196,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.5.20
- signing_key:
+ rubygems_version: 3.1.6
+ signing_key:
  specification_version: 4
  summary: Run-time type checker and transformer for Ruby
  test_files: []