sparkql 1.2.1 → 1.2.2

checksums.yaml CHANGED
@@ -1,15 +1,15 @@
  ---
  !binary "U0hBMQ==":
  metadata.gz: !binary |-
- MGYwODJjZmY5MTNkOWFjYjRlZTBkOGZjNjhmNGI1ZjU5MzhjNGM1OA==
+ OGNlZDVlNWY0ZDgwNDY0YTY1ZWI0ZGY5MzQ1ZmY0NTIxMGY4ZTI2OQ==
  data.tar.gz: !binary |-
- YTViYjY5OWVjYzJkYzYxMmQzZTQwZjg1MWFmN2MzNDM4ZmI3MjlmNw==
+ ZjhiZmI1Mjc4ZDM0YmE5M2Q4YzU5NTMwMmRhODY3MDY5YWE3ODQ5YQ==
  SHA512:
  metadata.gz: !binary |-
- ZDc5MjYxMmZiOTY5ZTIxNDg4YmFkYTUwNDRiNGE4YTMwNzgxYTBmNGMyNDUx
- ZWViZmNkNTI3ODk1ZWRkOWVkZDAyMDk2YWE5MDMxZWQ2MzNlNzI0OTE5ZDEy
- ZjUyZGQ1ZTIxNzVmODM0OGVmNGQzOWU1MzEzMGZhZWFlMWIzNDU=
+ Yzk5NzlkOTMxY2VkZTUyN2JmMmFkYWU0NmM4MWY4ODUyNzQ0NzZkYzM5ZTU5
+ YTg5NjJhNmFlMDVlNzkxNWM2MDgyYTQwZjI5ZTc1OGZjODI1MzIxMTE2ZGQ5
+ ZjBmYmQ2ZjdiNzNlODIwOWUzMjI5ODEzMThiMTcyZTJkYWUwOTk=
  data.tar.gz: !binary |-
- OWFmNWE1ZTYwZjY5ZTQxMmZhOTFiMjQ5YjM0YjAwMGUxNjE5YTQ2Y2Q5ODRi
- YmQ1NTg5OGQ3M2UzOTJiOTNjNWQzMDhjNWQyYzJlM2I1NmMyMWVlMDU4N2Fl
- MTQxODY1ODc3ZmQzOGJmY2FhMGQ3MTMwYjEzMGJjZmJkOWJiMDk=
+ YmUyZjU5ZWI3NWNhOGVhYmU5MDQxZDNhZjhhZWEwNzJmMzNmZjkzNTUxODQ5
+ YzIwODQwMjgzMThlODc4ZWJiYjdkMzM0NDU4ZjBiZjlhMGMwODQ3YzdkNDRh
+ MTRiYzZkMmVmMTM0NGNhZWY1ZWJkZDcwYjEzZDIyMDU0ZjY4YmM=
data/CHANGELOG.md CHANGED
@@ -1,3 +1,7 @@
+ v1.2.2, 2018-11-28
+ -------------------
+ * [IMPROVEMENT] Support Arithmetic: Add, Sub, Mul, Div, Mod
+
  v1.2.1, 2018-10-09
  -------------------
  * [BUGFIX] Check deepest function when type checking field arguments.
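The new arithmetic operators can appear on either side of a comparison, as the GRAMMAR.md changes below spell out. As an illustration only, with hypothetical field names not taken from this diff, filters like these become expressible:

```
ListPrice Div LivingArea Gt 100
ListPrice Gt 100000 Add 50000
```

The first computes on the field side; the second is pure literal arithmetic on the condition side, which per the grammar notes below can be folded to a constant.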
data/GRAMMAR.md CHANGED
@@ -1,9 +1,9 @@
  ## SparkQL BNF Grammar
  This document explains the rules for the Spark API filter language syntax and
- is a living document generated from the reference implementation at
+ is a living document generated from the reference implementation at
  https://github.com/sparkapi/sparkql.
  ### Precedence Rules
- Unless otherwise specified, SparkQL follows SQL precendence conventions for
+ Unless otherwise specified, SparkQL follows SQL precendence conventions for
  operators and conjunctions.
  Unary minus is always tied to value, such as for negative numbers.
 
@@ -11,6 +11,8 @@ Unary minus is always tied to value, such as for negative numbers.
  ```
  prechigh
  nonassoc UMINUS
+ left MUL DIV MOD
+ left ADD SUB
  preclow
  ```
 
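The two added `left` lines give MUL/DIV/MOD higher precedence than ADD/SUB, matching the SQL conventions cited above, with equal-precedence operators associating left. A minimal precedence-climbing sketch (not the gem's implementation; the gem uses a racc-generated parser) of what that table means when evaluating an already-lexed stream of literals and operator keywords:

```ruby
# Precedence levels from the racc table above: MUL/DIV/MOD (2) bind
# tighter than ADD/SUB (1); same-level operators associate left.
PRECEDENCE = { 'Add' => 1, 'Sub' => 1, 'Mul' => 2, 'Div' => 2, 'Mod' => 2 }.freeze
APPLY = {
  'Add' => ->(a, b) { a + b }, 'Sub' => ->(a, b) { a - b },
  'Mul' => ->(a, b) { a * b }, 'Div' => ->(a, b) { a / b },
  'Mod' => ->(a, b) { a % b }
}.freeze

# tokens: flat array like [2, 'Add', 3, 'Mul', 4]; mutated as it is consumed.
def evaluate(tokens, min_prec = 1)
  lhs = tokens.shift # leading operand is a numeric literal
  while tokens.any? && PRECEDENCE.fetch(tokens.first, 0) >= min_prec
    op = tokens.shift
    # Left associativity: the recursive call only claims strictly
    # tighter-binding operators than op.
    rhs = evaluate(tokens, PRECEDENCE[op] + 1)
    lhs = APPLY[op].call(lhs, rhs)
  end
  lhs
end
```

So `2 Add 3 Mul 4` evaluates as `2 Add (3 Mul 4)`, and `10 Sub 4 Sub 3` as `(10 Sub 4) Sub 3`.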
@@ -39,28 +41,31 @@ One or more expressions
  ```
 
  #### Expression
- The core of the filtering system, the expression requires a field, a condition
- and criteria for comparing the value of the field to the value(s) of the
- condition. The result of evaluating the expression on a resource is a true of
- false for matching the criteria.
+ The core of the filtering system, the expression requires a field, a condition
+ and criteria for comparing the value of the field to the value(s) of the
+ condition. The result of evaluating the expression on a resource is a true of
+ false for matching the criteria. We are separating functions and arithmetic
+ based on if we are acting on the field side or the literal side. This is to
+ allow literal folding on the literal side and to prevent unnecessary checks
+ to see if a field is in the expression.
 
 
  ```
  expression
- : field OPERATOR condition
- | field RANGE_OPERATOR range
+ : field_expression OPERATOR condition
+ | field_expression RANGE_OPERATOR range
  | group
  ;
  ```
 
  #### Unary Conjunction
- Some conjunctions don't need to expression at all times (e.g. 'NOT').
+ Some conjunctions don't need to expression at all times (e.g. 'NOT').
 
 
  ```
  unary_conjunction
  : UNARY_CONJUNCTION expression
- ;
+ ;
  ```
 
  #### Conjunction
@@ -80,42 +85,50 @@ One or more expressions encased in parenthesis. There are limitations on nesting
 
  ```
  group
- : LPAREN expressions RPAREN
- ;
- ```
-
- #### Field
- Keyword for searching on, these fields should be discovered using the metadata
- rules. In general, Keywords that cannot be found will be dropped from the
- filter.
-
-
- ```
- field
- : STANDARD_FIELD
- | CUSTOM_FIELD
- | function
- ;
+ : LPAREN expressions RPAREN
+ ;
+ field_expression
+ : field_arithmetic_expression
+ ;
+ field_arithmetic_expression
+ : field_arithmetic_expression ADD field_arithmetic_expression
+ | field_arithmetic_expression SUB field_arithmetic_expression
+ | field_arithmetic_expression MUL field_arithmetic_expression
+ | field_arithmetic_expression DIV field_arithmetic_expression
+ | field_arithmetic_expression MOD field_arithmetic_expression
+ | literals
+ | field_function_expression
+ ;
+ field_function_expression
+ : field
+ | function
+ ;
  ```
 
  #### Condition
- The determinant of the filter, this is typically a value or set of values of
- a type that the field supports (review the field meta data for support).
+ The determinant of the filter, this is typically a value or set of values of
+ a type that the field supports (review the field meta data for support).
  Functions are also supported on some field types, and provide more flexibility
  on filtering values
 
 
  ```
  condition
- : literal
- | literal_function
+ : arithmetic_condition
  | literal_list
+ | literal
  ;
+ arithmetic_condition
+ : condition ADD condition
+ | condition SUB condition
+ | condition MUL condition
+ | condition DIV condition
+ | condition MOD condition
  ```
 
  #### Function
- Functions may replace static values for conditions with supported field
- types. Functions may have parameters that match types supported by
+ Functions may replace static values for conditions with supported field
+ types. Functions may have parameters that match types supported by
  fields.
 
 
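The Expression commentary above says the field/literal split exists so the literal side can be folded to a constant before execution. A hedged sketch of that folding step, assuming a nested-array AST of the form `[op, lhs, rhs]` with numeric leaves (the gem's real node shape may well differ):

```ruby
# Fold an arithmetic_condition tree whose leaves are numeric literals
# into a single constant. Leaves are numbers; interior nodes are
# [operator_keyword, lhs, rhs] triples (a hypothetical node shape).
FOLD_OPS = {
  'Add' => :+, 'Sub' => :-, 'Mul' => :*, 'Div' => :/, 'Mod' => :%
}.freeze

def fold_literals(node)
  return node unless node.is_a?(Array) # numeric leaf: already constant
  op, lhs, rhs = node
  fold_literals(lhs).public_send(FOLD_OPS.fetch(op), fold_literals(rhs))
end
```

Because only literals may appear on the condition side, this fold always bottoms out in a number; field-side arithmetic cannot be folded this way, which is the "unnecessary checks" the commentary avoids.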
@@ -143,9 +156,9 @@ Functions may optionally have a comma delimited list of parameters.
  | function_args COMMA function_arg
  ;
  function_arg
- : literal
+ : field_function_expression
+ | literal
  | literals
- | field
  ;
  literal_function_args
  : literal_function_arg
@@ -154,7 +167,6 @@ Functions may optionally have a comma delimited list of parameters.
  literal_function_arg
  : literal
  | literals
- | literal_function
  ;
  ```
 
@@ -172,12 +184,12 @@ A comma delimited list of functions and values.
  ```
 
  #### Range List
- A comma delimited list of values that support ranges for the Between operator
+ A comma delimited list of values that support ranges for the Between operator
  (see rangeable).
 
 
  ```
- range
+ range
  : rangeable COMMA rangeable
  ;
  ```
@@ -211,7 +223,7 @@ Literals only support a single value in a condition
  ```
 
  #### Range List
- Functions, and literals that can be used in a range
+ Functions, and literals that can be used in a range
 
 
  ```
@@ -225,3 +237,16 @@ Functions, and literals that can be used in a range
  ;
  ```
 
+ #### Field
+ Keyword for searching on, these fields should be discovered using the metadata
+ rules. In general, Keywords that cannot be found will be dropped from the
+ filter.
+
+
+ ```
+ field
+ : STANDARD_FIELD
+ | CUSTOM_FIELD
+ ;
+ ```
+
data/VERSION CHANGED
@@ -1 +1 @@
- 1.2.1
+ 1.2.2
data/lib/sparkql/lexer.rb CHANGED
@@ -62,7 +62,7 @@ class Sparkql::Lexer < StringScanner
 
  token.freeze
  end
-
+
  def check_reserved_words(value)
  u_value = value.capitalize
  if OPERATORS.include?(u_value)
@@ -73,6 +73,16 @@ class Sparkql::Lexer < StringScanner
  [:CONJUNCTION,u_value]
  elsif UNARY_CONJUNCTIONS.include?(u_value)
  [:UNARY_CONJUNCTION,u_value]
+ elsif ADD == u_value
+ [:ADD, u_value]
+ elsif SUB == u_value
+ [:SUB, u_value]
+ elsif MUL == u_value
+ [:MUL, u_value]
+ elsif DIV == u_value
+ [:DIV, u_value]
+ elsif MOD == u_value
+ [:MOD, u_value]
+ else
  [:UNKNOWN, "ERROR: '#{self.string}'"]
  end
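The new lexer branches map capitalized reserved words straight to operator tokens, comparing against constants (ADD, SUB, etc.) defined elsewhere in the gem. As a standalone sketch, assuming those constants hold the capitalized keyword strings such as `'Add'` (an assumption, since their definitions are not in this diff), the dispatch reduces to a single lookup:

```ruby
# Standalone sketch of the reserved-word dispatch added above.
# The keyword strings 'Add'..'Mod' are assumed; the real lexer
# reads them from constants defined elsewhere in the gem.
ARITHMETIC = {
  'Add' => :ADD, 'Sub' => :SUB, 'Mul' => :MUL,
  'Div' => :DIV, 'Mod' => :MOD
}.freeze

def classify(value)
  u_value = value.capitalize # mirrors check_reserved_words
  if (type = ARITHMETIC[u_value])
    [type, u_value]
  else
    [:UNKNOWN, "ERROR: '#{value}'"]
  end
end
```

`String#capitalize` makes the match case-insensitive, so `add`, `ADD`, and `Add` all yield the `:ADD` token, consistent with how the surrounding branches treat operators and conjunctions.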