sparkql 0.3.24 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,15 +1,15 @@
  ---
  !binary "U0hBMQ==":
  metadata.gz: !binary |-
- NDA4ZDRjM2ExNzI2ZmMyMTViODM4NDNlZTkzOTFkNGU3NDRiZmY5MA==
+ MDMyYjhlMDJkY2ZkZjJmN2VlYzVmNTdkODNjNTRjZjFkMjEwZDJiMA==
  data.tar.gz: !binary |-
- ODA1NDYzMTM3OTlhYzQyMDY0Y2FhZjhhMTE5OGJjOWQ0Nzg1Mzg0NQ==
+ N2YwNzMyMTJjMGFjZmQ5MDM0OWYyZDYyM2Y2MmQwMWEyMWZhZTg0MA==
  SHA512:
  metadata.gz: !binary |-
- ZTFkYzJlNWRlY2EwZjdhOGZhZjdjY2JmMTQ5ODI0N2QxYWUxOGIwZjYyZWQ4
- MDUzNmJlYTU1OGNjZGFjNWI5MjgyYzAyYmQ2ZjVkNGU1OWFiOWNhNTAzZTIy
- MjJjNDNjNjAwMDI1ZDkzN2I2NjM4ZmRhZTJmOGE1YTk1NmU3YTI=
+ MWQxYWQwZmZjZTRlNGMwYzBkNGJkYWRjZTQxZTRhMTA1ZTZmYzVkNDQ0OTI0
+ OGY5ZmRlMjRlNTUzMGIwOWJlNGY4ODcxZjEyMDg3YTRkMDVhMzNkNDUwYTYy
+ NTFmYThlMGRkMzE2YjcwNzkxZDgwOGQ3MDBkMDMwNjU4ZGYwZTM=
  data.tar.gz: !binary |-
- MTljODQwOGI1NGY4YWI5NWExM2M5MzZiNmZjMmY1YjJhZDdmNjcyOTg4YjJh
- MTQ3OGJkZTdmNjgzMDFlOGZiYThlMjI0OGI0NTQxNmM4NWE3NzFkODQxNGMx
- ZTk2ODY4ZWRlMWJiY2FjZTY4ZDY0NDZjOTM0ZDU4NThiNmRlYTQ=
+ YjM1YWE1MGJlNTlhNWZiZmY5YjZkN2ZhMGIyYTUxMDQ5NzY3N2U4N2RmMTFj
+ YmFmODY2ZTEwNzA2NTFlMzkxNzJhM2RiYTM2Mjc1YTJhMzhjZTFmNWYxMDUy
+ YzZlMmQ0N2RkMWQ2ZGE2M2Y5MjZjMDBjYWRkNzc4NzM2NWVlNzA=
data/CHANGELOG.md CHANGED
@@ -1,3 +1,11 @@
+ v1.0.0, 2016-02-11 ([changes](https://github.com/sparkapi/sparkql/compare/v0.3.20...v1.0.0))
+ -------------------
+ * [IMPROVEMENT] function support for fields (delayed resolution). Backing systems must
+ implement necessary function behaviour.
+ * Drop support for ruby 1.8.7. Georuby dropped support several years back and
+ this drop allows us to pick up newer update allows us to stay in sync with
+ that gems development
+
  v0.3.24, 2016-01-05 ([changes](https://github.com/sparkapi/sparkql/compare/v0.3.23...v0.3.24))
  -------------------
 
data/GRAMMAR.md CHANGED
@@ -94,6 +94,7 @@ filter.
  field
  : STANDARD_FIELD
  | CUSTOM_FIELD
+ | function
  ;
  ```
 
@@ -140,6 +141,7 @@ Functions may optionally have a comma delimited list of parameters.
  function_arg
  : literal
  | literals
+ | field
  ;
  ```
 
@@ -187,6 +189,7 @@ Literals only support a single value in a condition
  literal
  : DATE
  | DATETIME
+ | TIME
  | BOOLEAN
  | NULL
  ;
@@ -202,6 +205,7 @@ Functions, and literals that can be used in a range
  | DECIMAL
  | DATE
  | DATETIME
+ | TIME
  | function
  ;
  ```
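Taken together, these grammar additions mean a function call can stand in for a field, a field can be passed as a function argument, and TIME joins the literal set. A minimal sketch of what that buys syntactically (plain Ruby, no gem required; the filter strings are lifted from the tests later in this diff, and the regex is only a rough shape check, not the real parser):

```ruby
# Example filters accepted by the extended grammar. The field names
# (OnMarketDate, OriginalEntryTimestamp) come from this diff's own tests.
filters = [
  "OnMarketDate Eq date(OriginalEntryTimestamp)",                       # field passed to a function
  "date(OriginalEntryTimestamp) Eq date(2013-07-26T10:22:15.111-0100)", # function used as the field
  "DatetimeField Eq 10:22:15.1111",                                     # TIME literal
]

# A function-call field begins the filter with a lowercase function name
# and an opening paren; only the second example above has that shape.
FUNCTION_FIELD = /\A[a-z]+\(/
puts filters.count { |f| f =~ FUNCTION_FIELD }  # 1
```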
data/Gemfile CHANGED
@@ -1,4 +1,3 @@
- source "http://gems.flexmls.com/"
  source "http://rubygems.org"
 
  # Specify your gem's dependencies in sparkapi_parser.gemspec
data/README.md CHANGED
@@ -1,6 +1,6 @@
  SparkQL query language parser
  =====================
- This gem contains the syntax parser for processing spark api filter queries into manageable 
+ This gem contains the syntax parser for processing spark api filter queries into manageable
  expressions. To get an overview of the language syntax-wise, refer to the following files:
 
  * lib/sparkql/parser.y # BNF Grammar
@@ -12,14 +12,16 @@ Installation
  Add the gem to your gemfile:
 
  Gemfile
- gem 'sparkql', '~> 0.0.1' 
+ gem 'sparkql', '~> 0.0.1'
 
  When completed, run 'bundle install'.
 
 
  Usage
  -------------
- See test/unit/parser_test.rb for generic parsing examples. In most cases an extended parser is 
+ Ruby 1.9 or greater is required.
+
+ See test/unit/parser_test.rb for generic parsing examples. In most cases an extended parser is
  needed to do anything of significance, such as the postgres and db2 search implementations in the
  API.
 
@@ -40,12 +42,12 @@ The return value will be an array with one expression element containing the que
 
  Development
  -------------
- The parser is based on racc, a yacc like LR parser that is a part of the ruby runtime. The grammar 
- is located at lib/sparkql/parser.y and is compiled as part of the test process. Refer to the 
- Rakefile for details. When modifying the grammar, please checkin BOTH the parser.y and parser.rb 
+ The parser is based on racc, a yacc like LR parser that is a part of the ruby runtime. The grammar
+ is located at lib/sparkql/parser.y and is compiled as part of the test process. Refer to the
+ Rakefile for details. When modifying the grammar, please checkin BOTH the parser.y and parser.rb
  files.
 
- Debugging grammar issues can be done by hand using the "racc" command. For example, a dump of the 
+ Debugging grammar issues can be done by hand using the "racc" command. For example, a dump of the
  parser states (and conflicts) can be generated via
 
  racc -o lib/sparkql/parser.rb lib/sparkql/parser.y -v # see lib/sparkql/parser.output
data/VERSION CHANGED
@@ -1 +1 @@
- 0.3.24
+ 1.0.0
data/lib/sparkql/function_resolver.rb CHANGED
@@ -10,8 +10,8 @@ require 'sparkql/geo'
  # SUPPORTED_FUNCTIONS which will run validation on the function syntax prior to execution.
  class Sparkql::FunctionResolver
  SECONDS_IN_DAY = 60 * 60 * 24
- STRFTIME_FORMAT = '%Y-%m-%d'
-
+ STRFTIME_DATE_FORMAT = '%Y-%m-%d'
+ STRFTIME_TIME_FORMAT = '%H:%M:%S.%N'
  VALID_REGEX_FLAGS = ["", "i"]
  SUPPORTED_FUNCTIONS = {
  :polygon => {
@@ -53,6 +53,16 @@ class Sparkql::FunctionResolver
  :now => {
  :args => [],
  :return_type => :datetime
+ },
+ :date => {
+ :args => [[:field,:datetime]],
+ :resolve_for_type => true,
+ :return_type => :date
+ },
+ :time => {
+ :args => [[:field,:datetime]],
+ :resolve_for_type => true,
+ :return_type => :time
  }
  }
 
@@ -89,7 +99,7 @@ class Sparkql::FunctionResolver
 
  count = 0
  @args.each do |arg|
- unless arg[:type] == total_args[count]
+ unless Array(total_args[count]).include?(arg[:type])
  @errors << Sparkql::ParserError.new(:token => @name,
  :message => "Function call '#{@name}' has an invalid argument at #{arg[:value]}",
  :status => :fatal )
@@ -99,7 +109,7 @@ class Sparkql::FunctionResolver
  end
 
  def return_type
- supported[@name.to_sym][:return_type]
+ support[@name.to_sym][:return_type]
  end
 
  def errors
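The `Array(...)` change above is what lets a single argument slot declare alternatives: `Array()` wraps a bare symbol and leaves an array alone, so the old single-type specs and the new `[:field, :datetime]` specs both go through one `include?` call. A standalone sketch of that check (names here are illustrative, not the gem's API):

```ruby
# expected is either one type (e.g. :decimal) or a list of acceptable types
# (e.g. [:field, :datetime]), mirroring the SUPPORTED_FUNCTIONS declarations.
def valid_arg?(expected, actual_type)
  # Array(:decimal) => [:decimal]; Array([:field, :datetime]) is unchanged.
  Array(expected).include?(actual_type)
end

puts valid_arg?(:decimal, :decimal)             # true  (old-style spec)
puts valid_arg?([:field, :datetime], :field)    # true  (new-style spec)
puts valid_arg?([:field, :datetime], :integer)  # false
```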
@@ -126,8 +136,12 @@ class Sparkql::FunctionResolver
  fill_in_optional_args.each do |default|
  real_vals << default
  end
-
- v = self.send(name, *real_vals)
+ method = name
+ if support[name][:resolve_for_type]
+ method_type = @args.first[:type]
+ method = "#{method}_#{method_type}"
+ end
+ v = self.send(method, *real_vals)
 
  unless v.nil?
  v[:function_name] = @name
@@ -172,7 +186,7 @@ class Sparkql::FunctionResolver
  d = Date.today + num
  {
  :type => :date,
- :value => d.strftime(STRFTIME_FORMAT)
+ :value => d.strftime(STRFTIME_DATE_FORMAT)
  }
  end
 
@@ -183,12 +197,42 @@ class Sparkql::FunctionResolver
  :value => Time.now.iso8601
  }
  end
+
+ def date_field(arg)
+ {
+ :type => :function,
+ :value => "date",
+ :args => [arg]
+ }
+ end
+
+ def time_field(arg)
+ {
+ :type => :function,
+ :value => "time",
+ :args => [arg]
+ }
+ end
+
+ def date_datetime(dt)
+ {
+ :type => :date,
+ :value => dt.strftime(STRFTIME_DATE_FORMAT)
+ }
+ end
+
+ def time_datetime(dt)
+ {
+ :type => :time,
+ :value => dt.strftime(STRFTIME_TIME_FORMAT)
+ }
+ end
 
  def months num_months
  d = DateTime.now >> num_months
  {
  :type => :date,
- :value => d.strftime(STRFTIME_FORMAT)
+ :value => d.strftime(STRFTIME_DATE_FORMAT)
  }
  end
 
@@ -196,7 +240,7 @@ class Sparkql::FunctionResolver
  d = DateTime.now >> (num_years * 12)
  {
  :type => :date,
- :value => d.strftime(STRFTIME_FORMAT)
+ :value => d.strftime(STRFTIME_DATE_FORMAT)
  }
  end
 
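The `:resolve_for_type` dispatch added in `call` picks an implementation by suffixing the function name with the first argument's type, which is how `time()` routes to `time_datetime` for a literal and `time_field` for a delayed field. A self-contained sketch of the datetime branch, using the two strftime formats defined at the top of the file (the sample timestamp is the one used in this diff's tests):

```ruby
require 'date'

STRFTIME_DATE_FORMAT = '%Y-%m-%d'
STRFTIME_TIME_FORMAT = '%H:%M:%S.%N'

# Simplified stand-ins for the type-suffixed methods above.
def time_datetime(dt)
  { :type => :time, :value => dt.strftime(STRFTIME_TIME_FORMAT) }
end

def date_datetime(dt)
  { :type => :date, :value => dt.strftime(STRFTIME_DATE_FORMAT) }
end

dt = DateTime.parse("2013-07-26T10:22:15.422804")
name, arg_type = "time", :datetime
method = "#{name}_#{arg_type}"           # the resolve_for_type dispatch
puts send(method, dt)[:value]            # "10:22:15.422804000" (%N pads to nanoseconds)
puts send("date_#{arg_type}", dt)[:value]  # "2013-07-26"
```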
data/lib/sparkql/lexer.rb CHANGED
@@ -39,6 +39,8 @@ class Sparkql::Lexer < StringScanner
  literal :DATETIME, @current_token_value
  when @current_token_value = scan(DATE)
  literal :DATE, @current_token_value
+ when @current_token_value = scan(TIME)
+ literal :TIME, @current_token_value
  when @current_token_value = scan(DECIMAL)
  literal :DECIMAL, @current_token_value
  when @current_token_value = scan(INTEGER)
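Branch order in that `case` matters: the anchored patterns are tried top to bottom, so DATETIME is tested before DATE, and the new TIME branch can safely sit after DATE because a bare time never begins with a four-digit year. A sketch with StringScanner (which the lexer subclasses), using the patterns from token.rb later in this diff:

```ruby
require 'strscan'

DATE = /^[0-9]{4}\-[0-9]{2}\-[0-9]{2}/
TIME = /^[0-9]{2}\:[0-9]{2}((\:[0-9]{2})(\.[0-9]{1,50})?)?/

s = StringScanner.new("10:22:15")
puts s.scan(DATE).inspect  # nil -- no year prefix, the scanner does not advance
puts s.scan(TIME)          # "10:22:15"
```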
data/lib/sparkql/parser.rb CHANGED
@@ -1,6 +1,6 @@
  #
  # DO NOT MODIFY!!!!
- # This file is automatically generated by Racc 1.4.8
+ # This file is automatically generated by Racc 1.4.12
  # from Racc grammer file "".
  #
 
@@ -16,108 +16,120 @@ module Sparkql
  ##### State transition tables begin ###
 
  racc_action_table = [
- 49, 14, 13, 45, 43, 29, 30, 31, 32, 33,
- 34, 35, 27, 46, 29, 30, 31, 32, 33, 34,
- 35, 27, 19, 38, 39, 12, 40, 41, 29, 30,
- 31, 32, 33, 34, 35, 27, -26, 38, 39, 55,
- 40, 41, 8, 56, 9, 44, 10, 11, 8, nil,
- 9, nil, 10, 11, 27, nil, 29, 30, 31, 9,
- nil, 10, 11, 9, nil, 10, 11, 9, nil, 10,
- 11, 14, 13, 15, 16 ]
+ 48, 10, 11, 14, 54, 31, 32, 33, 34, 35,
+ 36, 37, 38, 10, 11, 14, 55, 31, 32, 33,
+ 34, 35, 36, 37, 38, 14, -28, 31, 32, 33,
+ 34, 35, 36, 37, 38, 14, 23, 41, 42, 22,
+ 43, 44, 45, 14, 15, 41, 42, nil, 43, 44,
+ 45, 8, nil, 9, nil, 10, 11, 14, 8, nil,
+ 9, nil, 10, 11, 14, 9, nil, 10, 11, 14,
+ 9, nil, 10, 11, 14, 9, 56, 10, 11, 14,
+ 57, 14, nil, 31, 32, 33, 17, 16, nil, 47,
+ 18, 19, 17, 16 ]
 
  racc_action_check = [
- 45, 18, 18, 26, 18, 45, 45, 45, 45, 45,
- 45, 45, 15, 37, 15, 15, 15, 15, 15, 15,
- 15, 46, 12, 46, 46, 1, 46, 46, 56, 56,
- 56, 56, 56, 56, 56, 16, 24, 16, 16, 50,
- 16, 16, 9, 42, 9, 44, 9, 9, 0, nil,
- 9, nil, 10, 11, 27, nil, 29, 30, 31, 9,
- nil, 10, 11, 9, nil, 10, 11, 9, nil, 10,
- 11, 2, 2, 6, 6 ]
+ 22, 22, 22, 22, 29, 22, 22, 22, 22, 22,
+ 22, 22, 22, 57, 57, 57, 40, 57, 57, 57,
+ 57, 57, 57, 57, 57, 18, 28, 18, 18, 18,
+ 18, 18, 18, 18, 18, 19, 15, 19, 19, 13,
+ 19, 19, 19, 55, 1, 55, 55, nil, 55, 55,
+ 55, 9, nil, 9, nil, 9, 9, 9, 0, nil,
+ 0, nil, 0, 0, 0, 8, nil, 8, 8, 8,
+ 16, nil, 16, 16, 16, 17, 49, 17, 17, 17,
+ 49, 54, nil, 54, 54, 54, 21, 21, nil, 21,
+ 6, 6, 2, 2 ]
 
  racc_action_pointer = [
- 43, 25, 66, nil, nil, nil, 70, nil, 56, 37,
- nil, nil, 22, 52, 60, 1, 24, nil, -4, nil,
- nil, nil, nil, nil, 24, 33, -4, nil, nil, nil,
- nil, nil, nil, nil, nil, nil, nil, 1, nil, nil,
- nil, nil, nil, nil, 43, -8, 10, nil, nil, nil,
- 31, nil, nil, nil, nil, nil, 15, nil ]
+ 53, 44, 87, nil, nil, nil, 87, nil, 58, 46,
+ nil, nil, nil, 32, nil, 36, 63, 68, 14, 24,
+ nil, 81, -8, nil, nil, nil, nil, nil, 14, -8,
+ nil, nil, nil, nil, nil, nil, nil, nil, nil, nil,
+ 4, nil, nil, nil, nil, nil, nil, nil, nil, 68,
+ nil, nil, nil, nil, 70, 32, nil, 4, nil, nil,
+ nil, nil ]
 
  racc_action_default = [
- -2, -42, -1, -3, -4, -5, -42, -8, -42, -42,
- -13, -14, -42, -42, -42, -42, -42, -9, -42, 58,
- -10, -11, -6, -15, -16, -17, -42, -20, -25, -30,
- -31, -32, -33, -34, -35, -36, -7, -42, -37, -38,
- -39, -40, -41, -12, -42, -42, -42, -27, -28, -18,
- -42, -21, -23, -24, -29, -19, -42, -22 ]
+ -2, -46, -1, -3, -4, -5, -46, -8, -46, -46,
+ -13, -14, -15, -46, -21, -46, -46, -46, -46, -46,
+ -9, -46, -46, 62, -10, -11, -6, -16, -17, -18,
+ -27, -32, -33, -34, -35, -36, -37, -38, -39, -7,
+ -46, -40, -41, -42, -43, -44, -45, -12, -19, -46,
+ -22, -24, -25, -26, -46, -46, -20, -46, -29, -30,
+ -31, -23 ]
 
  racc_goto_table = [
- 37, 51, 28, 24, 2, 17, 25, 22, 50, 23,
- 20, 21, 57, 18, 36, 1, nil, nil, nil, nil,
+ 28, 46, 30, 40, 2, 53, 50, 20, 29, 27,
+ 49, 39, 26, 21, 1, 24, 25, nil, nil, nil,
  nil, nil, nil, nil, nil, nil, nil, nil, nil, nil,
- 54, 47, 48 ]
+ nil, nil, nil, nil, nil, nil, 59, 46, 58, 60,
+ 53, 61 ]
 
  racc_goto_check = [
- 17, 15, 16, 11, 2, 3, 12, 7, 14, 10,
- 3, 3, 15, 2, 8, 1, nil, nil, nil, nil,
+ 10, 10, 16, 17, 2, 6, 15, 3, 12, 11,
+ 14, 8, 7, 2, 1, 3, 3, nil, nil, nil,
  nil, nil, nil, nil, nil, nil, nil, nil, nil, nil,
- 17, 16, 11 ]
+ nil, nil, nil, nil, nil, nil, 10, 10, 16, 17,
+ 6, 15 ]
 
  racc_goto_pointer = [
- nil, 15, 4, -3, nil, nil, nil, -8, -2, nil,
- -6, -12, -9, nil, -37, -44, -13, -16 ]
+ nil, 14, 4, -1, nil, nil, -17, -6, -8, nil,
+ -18, -9, -10, nil, -12, -16, -16, -16 ]
 
  racc_goto_default = [
  nil, nil, nil, 3, 4, 5, 6, nil, nil, 7,
- 52, 42, nil, 26, nil, nil, 53, nil ]
+ 12, 51, nil, 13, nil, nil, 52, nil ]
 
  racc_reduce_table = [
  0, 0, :racc_error,
- 1, 21, :_reduce_none,
- 0, 21, :_reduce_2,
- 1, 22, :_reduce_none,
- 1, 22, :_reduce_none,
  1, 22, :_reduce_none,
- 3, 23, :_reduce_6,
- 3, 23, :_reduce_7,
+ 0, 22, :_reduce_2,
  1, 23, :_reduce_none,
- 2, 25, :_reduce_9,
- 3, 24, :_reduce_10,
- 3, 24, :_reduce_11,
- 3, 29, :_reduce_12,
- 1, 26, :_reduce_none,
- 1, 26, :_reduce_none,
+ 1, 23, :_reduce_none,
+ 1, 23, :_reduce_none,
+ 3, 24, :_reduce_6,
+ 3, 24, :_reduce_7,
+ 1, 24, :_reduce_none,
+ 2, 26, :_reduce_9,
+ 3, 25, :_reduce_10,
+ 3, 25, :_reduce_11,
+ 3, 30, :_reduce_12,
  1, 27, :_reduce_none,
  1, 27, :_reduce_none,
- 1, 27, :_reduce_17,
- 3, 31, :_reduce_18,
- 4, 31, :_reduce_19,
- 1, 33, :_reduce_none,
+ 1, 27, :_reduce_none,
+ 1, 28, :_reduce_none,
+ 1, 28, :_reduce_none,
+ 1, 28, :_reduce_18,
+ 3, 31, :_reduce_19,
+ 4, 31, :_reduce_20,
  1, 34, :_reduce_none,
- 3, 34, :_reduce_22,
- 1, 35, :_reduce_none,
  1, 35, :_reduce_none,
- 1, 32, :_reduce_none,
- 1, 32, :_reduce_none,
- 3, 32, :_reduce_27,
- 3, 32, :_reduce_28,
- 3, 28, :_reduce_29,
- 1, 36, :_reduce_none,
+ 3, 35, :_reduce_23,
  1, 36, :_reduce_none,
  1, 36, :_reduce_none,
- 1, 30, :_reduce_none,
- 1, 30, :_reduce_none,
- 1, 30, :_reduce_none,
- 1, 30, :_reduce_none,
- 1, 37, :_reduce_none,
+ 1, 36, :_reduce_26,
+ 1, 33, :_reduce_none,
+ 1, 33, :_reduce_none,
+ 3, 33, :_reduce_29,
+ 3, 33, :_reduce_30,
+ 3, 29, :_reduce_31,
  1, 37, :_reduce_none,
  1, 37, :_reduce_none,
  1, 37, :_reduce_none,
- 1, 37, :_reduce_none ]
+ 1, 32, :_reduce_none,
+ 1, 32, :_reduce_none,
+ 1, 32, :_reduce_none,
+ 1, 32, :_reduce_none,
+ 1, 32, :_reduce_none,
+ 1, 38, :_reduce_none,
+ 1, 38, :_reduce_none,
+ 1, 38, :_reduce_none,
+ 1, 38, :_reduce_none,
+ 1, 38, :_reduce_none,
+ 1, 38, :_reduce_none ]
 
- racc_reduce_n = 42
+ racc_reduce_n = 46
 
- racc_shift_n = 58
+ racc_shift_n = 62
 
  racc_token_table = {
  false => 0,
@@ -138,10 +150,11 @@ racc_token_table = {
  :CHARACTER => 15,
  :DATE => 16,
  :DATETIME => 17,
- :BOOLEAN => 18,
- :NULL => 19 }
+ :TIME => 18,
+ :BOOLEAN => 19,
+ :NULL => 20 }
 
- racc_nt_base = 20
+ racc_nt_base = 21
 
  racc_use_result_var = true
 
@@ -180,6 +193,7 @@ Racc_token_to_s_table = [
  "CHARACTER",
  "DATE",
  "DATETIME",
+ "TIME",
  "BOOLEAN",
  "NULL",
  "$start",
@@ -192,8 +206,8 @@ Racc_token_to_s_table = [
  "condition",
  "range",
  "group",
- "literal",
  "function",
+ "literal",
  "literal_list",
  "function_name",
  "function_args",
@@ -260,57 +274,60 @@ end
 
  # reduce 16 omitted
 
- def _reduce_17(val, _values, result)
+ # reduce 17 omitted
+
+ def _reduce_18(val, _values, result)
  result = tokenize_list(val[0])
  result
  end
 
- def _reduce_18(val, _values, result)
+ def _reduce_19(val, _values, result)
  result = tokenize_function(val[0], [])
  result
  end
 
- def _reduce_19(val, _values, result)
+ def _reduce_20(val, _values, result)
  result = tokenize_function(val[0], val[2])
  result
  end
 
- # reduce 20 omitted
-
  # reduce 21 omitted
 
- def _reduce_22(val, _values, result)
+ # reduce 22 omitted
+
+ def _reduce_23(val, _values, result)
  result = tokenize_function_args(val[0], val[2])
  result
  end
 
- # reduce 23 omitted
-
  # reduce 24 omitted
 
  # reduce 25 omitted
 
- # reduce 26 omitted
+ def _reduce_26(val, _values, result)
+ result = tokenize_field_arg(val[0])
+ result
+ end
+
+ # reduce 27 omitted
+
+ # reduce 28 omitted
 
- def _reduce_27(val, _values, result)
+ def _reduce_29(val, _values, result)
  result = tokenize_multiple(val[0], val[2])
  result
  end
 
- def _reduce_28(val, _values, result)
+ def _reduce_30(val, _values, result)
  result = tokenize_multiple(val[0], val[2])
  result
  end
 
- def _reduce_29(val, _values, result)
+ def _reduce_31(val, _values, result)
  result = tokenize_multiple(val[0], val[2])
  result
  end
 
- # reduce 30 omitted
-
- # reduce 31 omitted
-
  # reduce 32 omitted
 
  # reduce 33 omitted
@@ -331,6 +348,14 @@ end
 
  # reduce 41 omitted
 
+ # reduce 42 omitted
+
+ # reduce 43 omitted
+
+ # reduce 44 omitted
+
+ # reduce 45 omitted
+
  def _reduce_none(val, _values, result)
  val[0]
  end
data/lib/sparkql/parser.y CHANGED
@@ -91,6 +91,7 @@ rule
  field
  : STANDARD_FIELD
  | CUSTOM_FIELD
+ | function
  ;
 
  ##### Condition
@@ -130,6 +131,7 @@ rule
  function_arg
  : literal
  | literals
+ | field { result = tokenize_field_arg(val[0]) }
  ;
 
  ##### Literal List
@@ -165,6 +167,7 @@ rule
  literal
  : DATE
  | DATETIME
+ | TIME
  | BOOLEAN
  | NULL
  ;
@@ -177,6 +180,7 @@ rule
  | DECIMAL
  | DATE
  | DATETIME
+ | TIME
  | function
  ;
 
data/lib/sparkql/parser_compatibility.rb CHANGED
@@ -18,6 +18,11 @@ module Sparkql::ParserCompatibility
  :regex => /^[0-9]{4}\-[0-9]{2}\-[0-9]{2}$/,
  :operators => Sparkql::Token::OPERATORS + [Sparkql::Token::RANGE_OPERATOR]
  },
+ {
+ :type => :time,
+ :regex => /^[0-9]{2}\:[0-9]{2}(\:[0-9]{2})?(\.[0-9]{6)$/,
+ :operators => Sparkql::Token::OPERATORS + [Sparkql::Token::RANGE_OPERATOR]
+ },
  {
  :type => :character,
  :regex => /^'([^'\\]*(\\.[^'\\]*)*)'$/, # Strings must be single quoted. Any inside single quotes must be escaped.
@@ -50,6 +55,11 @@ module Sparkql::ParserCompatibility
  :type => :null,
  :regex => /^NULL|Null|null$/,
  :operators => Sparkql::Token::EQUALITY_OPERATORS
+ },
+ {
+ :type => :function,
+ # This type is not parseable, so no regex
+ :operators => Sparkql::Token::OPERATORS + [Sparkql::Token::RANGE_OPERATOR]
  }
  ]
 
@@ -130,6 +140,8 @@ module Sparkql::ParserCompatibility
  return date_escape(expression[:value])
  when :datetime
  return datetime_escape(expression[:value])
+ when :time
+ return time_escape(expression[:value])
  when :boolean
  return boolean_escape(expression[:value])
  when :null
@@ -159,6 +171,10 @@ module Sparkql::ParserCompatibility
  def datetime_escape(string)
  DateTime.parse(string)
  end
+
+ def time_escape(string)
+ DateTime.parse(string)
+ end
 
  def boolean_escape(string)
  "true" == string
@@ -204,7 +220,8 @@ module Sparkql::ParserCompatibility
 
  # Checks the type of an expression with what is expected.
  def check_type!(expression, expected, supports_nulls = true)
- if expected == expression[:type] || (supports_nulls && expression[:type] == :null)
+ if expected == expression[:type] || check_function_type?(expression, expected) ||
+ (supports_nulls && expression[:type] == :null)
  return true
  elsif expected == :datetime && expression[:type] == :date
  expression[:type] = :datetime
@@ -233,6 +250,17 @@ module Sparkql::ParserCompatibility
  :message => "expected #{expected} but found #{expression[:type]}",
  :status => :fatal )
  end
+
+ # If a function is being applied to a field, we check that the return type of
+ # the function matches what is expected, and that the function supports the
+ # field type as the first argument.
+ def check_function_type?(expression, expected)
+ return false unless expression[:field_function_type] == expression[:type]
+ # Lookup the function arguments
+ function = Sparkql::FunctionResolver::SUPPORTED_FUNCTIONS[expression[:field_function].to_sym]
+ return false if function.nil?
+ Array(function[:args].first).include?(expected)
+ end
 
  # Builds the correct operator based on the type and the value.
  # default should be the operator provided in the actual filter string
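The `check_function_type?` addition can be exercised in isolation: the expression carries the field function's name and declared return type, and the check passes only when the expected field type is among the function's accepted first-argument types. A sketch with the relevant function table inlined (the two entries mirror the `:date` and `:time` declarations in function_resolver.rb earlier in this diff; everything else is illustrative):

```ruby
# Inlined stand-in for Sparkql::FunctionResolver::SUPPORTED_FUNCTIONS.
SUPPORTED_FUNCTIONS = {
  :date => { :args => [[:field, :datetime]], :return_type => :date },
  :time => { :args => [[:field, :datetime]], :return_type => :time },
}

def check_function_type?(expression, expected)
  # The expression's type must be the function's return type...
  return false unless expression[:field_function_type] == expression[:type]
  function = SUPPORTED_FUNCTIONS[expression[:field_function].to_sym]
  return false if function.nil?
  # ...and the field's type must be a valid first argument.
  Array(function[:args].first).include?(expected)
end

# e.g. date(OriginalEntryTimestamp) Eq 2013-07-26, where the field is :datetime.
expr = { :type => :date, :field_function => "date", :field_function_type => :date }
puts check_function_type?(expr, :datetime)  # true
puts check_function_type?(expr, :integer)   # false
```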
data/lib/sparkql/parser_tools.rb CHANGED
@@ -24,10 +24,24 @@ module Sparkql::ParserTools
 
  def tokenize_expression(field, op, val)
  operator = get_operator(val,op) unless val.nil?
+ field_args = {}
+ # Function support for fields is stapled in here. The function information
+ # is remapped to the expression
+ if field.is_a?(Hash) && field[:type] == :function
+ function = Sparkql::FunctionResolver::SUPPORTED_FUNCTIONS[field[:value].to_sym]
+ if !function.nil?
+ field_args[:field_function] = field[:value]
+ field_args[:field_function_type] = function[:return_type]
+ else
+ tokenizer_error(:token => field[:value],
+ :message => "Unsupported function type", :status => :fatal )
+ end
+ field = field[:args].first
+ end
  custom_field = field.start_with?('"')
  block_group = (@lexer.level == 0) ? 0 : @lexer.block_group_identifier
  expression = {:field => field, :operator => operator, :conjunction => 'And',
- :level => @lexer.level, :block_group => block_group, :custom_field => custom_field}
+ :level => @lexer.level, :block_group => block_group, :custom_field => custom_field}.merge!(field_args)
  expression = val.merge(expression) unless val.nil?
  validate_level_depth expression
  if operator.nil?
@@ -98,6 +112,13 @@ module Sparkql::ParserTools
  array
  end
 
+ def tokenize_field_arg(field)
+ {
+ :type => :field,
+ :value => field,
+ }
+ end
+
  def tokenize_function(name, f_args)
  @lexer.leveldown
  @lexer.block_group_identifier -= 1
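The remapping in `tokenize_expression` above can be traced by hand: when the parsed field arrives as a function hash, its name and return type move onto the expression and the wrapped field name takes the function's place. A standalone sketch (the `:date` return type stands in for the `SUPPORTED_FUNCTIONS` lookup done in the real code):

```ruby
# What the grammar hands tokenize_expression for date(OriginalEntryTimestamp).
field = { :type => :function, :value => "date", :args => ["OriginalEntryTimestamp"] }

return_type = :date  # looked up from SUPPORTED_FUNCTIONS in the real code
field_args = {
  :field_function      => field[:value],
  :field_function_type => return_type,
}
field = field[:args].first  # the expression's field becomes the wrapped name

expression = { :field => field }.merge!(field_args)
puts expression[:field]           # "OriginalEntryTimestamp"
puts expression[:field_function]  # "date"
```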
data/lib/sparkql/token.rb CHANGED
@@ -10,7 +10,8 @@ module Sparkql::Token
  DECIMAL = /^\-?[0-9]+\.[0-9]+/
  CHARACTER = /^'([^'\\]*(\\.[^'\\]*)*)'/
  DATE = /^[0-9]{4}\-[0-9]{2}\-[0-9]{2}/
- DATETIME = /^[0-9]{4}\-[0-9]{2}\-[0-9]{2}T[0-9]{2}\:[0-9]{2}(\:[0-9]{2})?(\.[0-9]{1,50})?(((\+|-)[0-9]{2}\:?[0-9]{2})|Z)?/
+ TIME = /^[0-9]{2}\:[0-9]{2}((\:[0-9]{2})(\.[0-9]{1,50})?)?/
+ DATETIME = /^[0-9]{4}\-[0-9]{2}\-[0-9]{2}T[0-9]{2}\:[0-9]{2}((\:[0-9]{2})(\.[0-9]{1,50})?)?(((\+|-)[0-9]{2}\:?[0-9]{2})|Z)?/
  BOOLEAN = /^true|false/
  NULL = /NULL|null|Null/
  # Reserved words
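The new TIME pattern, and the reworked time portion of DATETIME it was factored from, now make seconds and fractional seconds optional as a unit rather than independently. A quick check against the literal forms exercised by this diff's tests:

```ruby
TIME = /^[0-9]{2}\:[0-9]{2}((\:[0-9]{2})(\.[0-9]{1,50})?)?/
DATETIME = /^[0-9]{4}\-[0-9]{2}\-[0-9]{2}T[0-9]{2}\:[0-9]{2}((\:[0-9]{2})(\.[0-9]{1,50})?)?(((\+|-)[0-9]{2}\:?[0-9]{2})|Z)?/

# Seconds and fractional seconds are optional, so all of these match:
["10:22", "10:22:15", "10:22:15.422804"].each do |t|
  raise "TIME should match #{t}" unless TIME.match(t)
end
# DATETIME additionally accepts an optional UTC offset or Z suffix:
["2013-07-26T10:22", "2013-07-26T10:22Z", "2013-07-26T10:22:15.422804-0300"].each do |t|
  raise "DATETIME should match #{t}" unless DATETIME.match(t)
end
puts "all matched"
```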
data/sparkql.gemspec CHANGED
@@ -20,14 +20,11 @@ Gem::Specification.new do |s|
  s.executables = `git ls-files -- bin/*`.split("\n").map{ |f| File.basename(f) }
  s.require_paths = ["lib"]
 
- # georuby 2.1.x adds ruby 1.9-only syntax, so that's
- # a no-go for us at the moment
- s.add_dependency 'georuby', '~> 2.0.0'
- s.add_development_dependency 'racc', '1.4.8'
+ s.add_dependency 'georuby', '~> 2.0'
+ s.add_development_dependency 'racc', '~> 1.4.8'
  s.add_development_dependency 'rake', '~> 0.9.2'
  s.add_development_dependency 'test-unit', '~> 2.1.0'
  s.add_development_dependency 'ci_reporter', '~> 1.6'
  s.add_development_dependency 'mocha', '~> 0.12.0'
- s.add_development_dependency 'rcov', '~> 0.9.9'
 
  end
@@ -1,8 +1,10 @@
  require 'test_helper'
  require 'sparkql/geo'
 
- class ParserTest < Test::Unit::TestCase
+ class FunctionResolverTest < Test::Unit::TestCase
  include Sparkql
+
+ EXAMPLE_DATE = DateTime.parse("2013-07-26T10:22:15.422804")
 
  test "function parameters and name preserved" do
  f = FunctionResolver.new('radius', [{:type => :character,
@@ -147,10 +149,59 @@ class ParserTest < Test::Unit::TestCase
  assert_nil f.call
  end
 
+ test "return_type" do
+ f = FunctionResolver.new('radius', [{:type => :character,
+ :value => "35.12 -68.33, 35.13 -68.34"},{:type => :decimal,
+ :value => 1.0}])
+ assert_equal :shape, f.return_type
+ end
+
  test "invalid function" do
  f = FunctionResolver.new('then', [])
  f.validate
  assert f.errors?, "'then' is not a function"
  end
+
+ test "time(datetime)" do
+ f = FunctionResolver.new('time', [{:type => :datetime, :value => EXAMPLE_DATE}])
+ f.validate
+ assert !f.errors?, "Errors #{f.errors.inspect}"
+ value = f.call
+ assert_equal :time, value[:type]
+ assert_equal '10:22:15.422804000', value[:value]
+ end
+
+ test "date(datetime)" do
+ f = FunctionResolver.new('date', [{:type => :datetime, :value => EXAMPLE_DATE}])
+ f.validate
+ assert !f.errors?, "Errors #{f.errors.inspect}"
+ value = f.call
+ assert_equal :date, value[:type]
+ assert_equal '2013-07-26', value[:value]
+ end
+
+ ###
+ # Delayed functions. These functions don't get run immediately and require
+ # resolution by the backing system
+ ###
 
+ test "time(field)" do
+ f = FunctionResolver.new('time', [{:type => :field, :value => "OriginalEntryTimestamp"}])
+ f.validate
+ assert !f.errors?, "Errors #{f.errors.inspect}"
+ value = f.call
+ assert_equal :function, value[:type]
+ assert_equal 'time', value[:value]
+ assert_equal "OriginalEntryTimestamp", value[:args].first
+ end
+
+ test "date(field)" do
+ f = FunctionResolver.new('date', [{:type => :field, :value => "OriginalEntryTimestamp"}])
+ f.validate
+ assert !f.errors?, "Errors #{f.errors.inspect}"
+ value = f.call
+ assert_equal :function, value[:type]
+ assert_equal 'date', value[:value]
+ assert_equal "OriginalEntryTimestamp", value[:args].first
+ end
  end
@@ -77,6 +77,22 @@ class LexerTest < Test::Unit::TestCase
  end
  end
 
+ def test_dates_matches
+ ['2013-07-26', '1999-01-01'].each do |op|
+ @lexer = Lexer.new(op)
+ token = @lexer.shift
+ assert_equal :DATE, token.first, op
+ end
+ end
+
+ def test_times_matches
+ ['10:22:15.422804', '10:22:15', '10:22'].each do |op|
+ @lexer = Lexer.new(op)
+ token = @lexer.shift
+ assert_equal :TIME, token.first, op
+ end
+ end
+
  def test_utc_offsets
  ['2013-07-26T10:22:15.422804-0300', '2013-07-26T10:22:15+0400'].each do |op|
  @lexer = Lexer.new(op)
@@ -498,8 +498,8 @@ class ParserCompatabilityTest < Test::Unit::TestCase
  expression = parser.tokenize( "DateField Eq now()").first
  assert !parser.errors?
  assert parser.send(:check_type!, expression, :date)
- assert_equal t.strftime(Sparkql::FunctionResolver::STRFTIME_FORMAT),
- parser.escape_value(expression).strftime(Sparkql::FunctionResolver::STRFTIME_FORMAT)
+ assert_equal t.strftime(Sparkql::FunctionResolver::STRFTIME_DATE_FORMAT),
+ parser.escape_value(expression).strftime(Sparkql::FunctionResolver::STRFTIME_DATE_FORMAT)
  end
 
  test "datetime->date type coercion array" do
@@ -509,9 +509,9 @@ class ParserCompatabilityTest < Test::Unit::TestCase
  assert !parser.errors?
  assert parser.send(:check_type!, expression, :date)
  yesterday = today - 3600 * 24
- assert_equal [ yesterday.strftime(Sparkql::FunctionResolver::STRFTIME_FORMAT),
- today.strftime(Sparkql::FunctionResolver::STRFTIME_FORMAT)],
- parser.escape_value(expression).map { |i| i.strftime(Sparkql::FunctionResolver::STRFTIME_FORMAT)}
+ assert_equal [ yesterday.strftime(Sparkql::FunctionResolver::STRFTIME_DATE_FORMAT),
+ today.strftime(Sparkql::FunctionResolver::STRFTIME_DATE_FORMAT)],
+ parser.escape_value(expression).map { |i| i.strftime(Sparkql::FunctionResolver::STRFTIME_DATE_FORMAT)}
  end
 
 
@@ -218,6 +218,36 @@ class ParserTest < Test::Unit::TestCase
  assert_equal '2014,days(-7)', expressions.first[:condition]
  end
 
+ def test_function_date
+ filter = "OnMarketDate Eq date(OriginalEntryTimestamp)"
+ @parser = Parser.new
+ expressions = @parser.parse(filter)
+ assert !@parser.errors?, "errors #{@parser.errors.inspect}"
+ assert_equal 'date(OriginalEntryTimestamp)', expressions.first[:condition]
+ assert_equal 'date', expressions.first[:value]
+ assert_equal :function, expressions.first[:type]
+ # Run using a static value, we just resolve the type
+ filter = "OnMarketDate Eq date(2013-07-26T10:22:15.111-0100)"
+ @parser = Parser.new
+ expressions = @parser.parse(filter)
+ assert !@parser.errors?, "errors #{@parser.errors.inspect}"
+ assert_equal 'date(2013-07-26T10:22:15.111-0100)', expressions.first[:condition]
+ assert_equal '2013-07-26', expressions.first[:value]
+ assert_equal :date, expressions.first[:type]
+ # And the grand finale: run on both sides
+ filter = "date(OriginalEntryTimestamp) Eq date(2013-07-26T10:22:15.111-0100)"
+ @parser = Parser.new
+ expression = @parser.parse(filter).first
+ assert !@parser.errors?, "errors #{@parser.errors.inspect}"
+ assert_equal 'date(2013-07-26T10:22:15.111-0100)', expression[:condition]
+ assert_equal '2013-07-26', expression[:value]
+ assert_equal :date, expression[:type]
+ # annnd the field function stuff
+ assert_equal "OriginalEntryTimestamp", expression[:field]
+ assert_equal :date, expression[:field_function_type]
+ assert_equal "date", expression[:field_function]
+ end
+
  test "regex function parses without second param" do
  filter = "ParcelNumber Eq regex('^[0-9]{3}-[0-9]{2}-[0-9]{3}$')"
  @parser = Parser.new
@@ -243,6 +273,10 @@ class ParserTest < Test::Unit::TestCase
 
  test "allow timezone offsets" do
  values = [
+ "2013-07-26",
+ "10:22",
+ "10:22:15.1111",
+ "10:22:15",
  "2013-07-26T10:22",
  "2013-07-26T10:22Z",
  "2013-07-26T10:22+01:00",
@@ -258,6 +292,7 @@ class ParserTest < Test::Unit::TestCase
  filter = "DatetimeField Eq #{value}"
  @parser = Parser.new
  expressions = @parser.parse(filter)
+ assert !@parser.errors?, "errors #{@parser.errors.inspect}"
  assert_not_nil expressions, "#{value} failed"
  assert_equal expressions.first[:value], value, "#{value} failed"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: sparkql
  version: !ruby/object:Gem::Version
- version: 0.3.24
+ version: 1.0.0
  platform: ruby
  authors:
  - Wade McEwen
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2016-01-06 00:00:00.000000000 Z
+ date: 2016-02-12 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: georuby
@@ -16,26 +16,26 @@ dependencies:
  requirements:
  - - ~>
  - !ruby/object:Gem::Version
- version: 2.0.0
+ version: '2.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ~>
  - !ruby/object:Gem::Version
- version: 2.0.0
+ version: '2.0'
  - !ruby/object:Gem::Dependency
  name: racc
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - '='
+ - - ~>
  - !ruby/object:Gem::Version
  version: 1.4.8
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - '='
+ - - ~>
  - !ruby/object:Gem::Version
  version: 1.4.8
  - !ruby/object:Gem::Dependency
@@ -94,20 +94,6 @@ dependencies:
  - - ~>
  - !ruby/object:Gem::Version
  version: 0.12.0
- - !ruby/object:Gem::Dependency
- name: rcov
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - ~>
- - !ruby/object:Gem::Version
- version: 0.9.9
- type: :development
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - ~>
- - !ruby/object:Gem::Version
- version: 0.9.9
  description: Specification and base implementation of the Spark API parsing system.
  email:
  - wade@fbsdata.com