sparkql 0.1.8 → 0.3.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,15 @@
+ ---
+ !binary "U0hBMQ==":
+   metadata.gz: !binary |-
+     OTgxODlmNjI2NzVkYTBmNjE1Njg3MmYwZmU0M2UyZjE4MDQ5Yzg5MA==
+   data.tar.gz: !binary |-
+     MDRlZTM2ODFjNWIxNDBiZTUyOGFhY2FlMjQ5OTVjZDJmMGUxZWRjZA==
+ SHA512:
+   metadata.gz: !binary |-
+     ZTFjYTAxZGRlYmFiODY2ZTFmYTgwN2MxYWNlMWJiMWFjMWYzYmIzOGUwZmYy
+     NmY0M2YwMmE3ZTNkMmNiMmQzMDRhOTExNWIyZDI3MmEwMTQzNDU2MjRkMmZm
+     MzIyNWI5OWNjNmUzZjI0MTQ5ZmM3YWIyY2Q3MDVlMzQwMGIwOTQ=
+   data.tar.gz: !binary |-
+     ZjYzMjgyZjc0NzI2YzE0MmUzOWUwOGI4NjVkZWRlODNkYTM5ZjU1ZWIxYTVi
+     ZWYyOWM3ODFjMWY1ZDVlMDg0ZDIwMzRiYjc2NDJkMmQ4MTdiOWVmMmYzYWFl
+     YmY0MWMwNmVjODc1N2E4ODAyMDBjMzU5OTdkMGQ3NjY4MzgxN2Y=
data/.gitignore CHANGED
@@ -3,3 +3,5 @@
  Gemfile.lock
  lib/sparkql/*.output
  pkg/*
+ *.swp
+ test/reports
data/.ruby-version ADDED
@@ -0,0 +1 @@
+ 1.9.3
data/CHANGELOG.md ADDED
@@ -0,0 +1,11 @@
+
+ v0.3.2, 2015-04-14 ([changes](https://github.com/sparkapi/sparkql/compare/v0.3.18...v0.3.2))
+ -------------------
+
+ * [BUGFIX] Allow seconds for ISO-8601
+
+ v0.3.18, 2015-04-10 ([changes](https://github.com/sparkapi/sparkql/compare/v0.3.17...v0.3.18))
+ -------------------
+
+ * [BUGFIX] Better support for ISO-8601
+
data/GRAMMAR.md ADDED
@@ -0,0 +1,208 @@
+ ## SparkQL BNF Grammar
+ This document explains the rules for the Spark API filter language syntax and
+ is a living document generated from the reference implementation at
+ https://github.com/sparkapi/sparkql.
+ ### Precedence Rules
+ Unless otherwise specified, SparkQL follows SQL precedence conventions for
+ operators and conjunctions.
+ Unary minus is always tied to the value, such as for negative numbers.
+
+
+ ```
+ prechigh
+   nonassoc UMINUS
+ preclow
+ ```
+
+ ### Grammar Rules
+ A filter (target) is a composition of basic filter expressions.
+
+
+ ```
+ rule
+   target
+     : expressions
+     | /* none */
+     ;
+ ```
+
+ #### Expressions
+ One or more expressions.
+
+
+ ```
+ expressions
+   : expression
+   | conjunction
+   | unary_conjunction
+   ;
+ ```
+
+ #### Expression
+ The core of the filtering system, the expression requires a field, a condition
+ and criteria for comparing the value of the field to the value(s) of the
+ condition. The result of evaluating the expression on a resource is true or
+ false for matching the criteria.
+
+
+ ```
+ expression
+   : field OPERATOR condition
+   | field RANGE_OPERATOR range
+   | group
+   ;
+ ```
+
+ #### Unary Conjunction
+ Some conjunctions don't need two expressions at all times (e.g. 'NOT').
+
+
+ ```
+ unary_conjunction
+   : UNARY_CONJUNCTION expression
+   ;
+ ```
+
+ #### Conjunction
+ Two expressions joined together using a supported conjunction.
+
+
+ ```
+ conjunction
+   : expressions CONJUNCTION expression
+   | expressions UNARY_CONJUNCTION expression
+   ;
+ ```
+
+ #### Group
+ One or more expressions enclosed in parentheses. There are limitations on nesting depth at the time of this writing.
+
+
+ ```
+ group
+   : LPAREN expressions RPAREN
+   ;
+ ```
+
+ #### Field
+ A keyword to search on; these fields should be discovered using the metadata
+ rules. In general, keywords that cannot be found will be dropped from the
+ filter.
+
+
+ ```
+ field
+   : STANDARD_FIELD
+   | CUSTOM_FIELD
+   ;
+ ```
+
+ #### Condition
+ The determinant of the filter, this is typically a value or set of values of
+ a type that the field supports (review the field metadata for support).
+ Functions are also supported on some field types, and provide more flexibility
+ when filtering values.
+
+
+ ```
+ condition
+   : literal
+   | function
+   | literal_list
+   ;
+ ```
+
+ #### Function
+ Functions may replace static values for conditions with supported field
+ types. Functions may have parameters that match types supported by
+ fields.
+
+
+ ```
+ function
+   : function_name LPAREN RPAREN
+   | function_name LPAREN function_args RPAREN
+   ;
+ function_name
+   : KEYWORD
+   ;
+ ```
+
+ #### Function Arguments
+ Functions may optionally have a comma-delimited list of parameters.
+
+
+ ```
+ function_args
+   : function_arg
+   | function_args COMMA function_arg
+   ;
+ function_arg
+   : literal
+   | literals
+   ;
+ ```
+
+ #### Literal List
+ A comma-delimited list of functions and values.
+
+
+ ```
+ literal_list
+   : literals
+   | function
+   | literal_list COMMA literals
+   | literal_list COMMA function
+   ;
+ ```
+
+ #### Range List
+ A comma-delimited list of values that support ranges for the Between operator
+ (see rangeable).
+
+
+ ```
+ range
+   : rangeable COMMA rangeable
+   ;
+ ```
+
+ #### Literals
+ Literals that support multiple values in a list for a condition.
+
+
+ ```
+ literals
+   : INTEGER
+   | DECIMAL
+   | CHARACTER
+   ;
+ ```
+
+ #### Literal
+ Literals that only support a single value in a condition.
+
+
+ ```
+ literal
+   : DATE
+   | DATETIME
+   | BOOLEAN
+   | NULL
+   ;
+ ```
+
+ #### Rangeable
+ Functions and literals that can be used in a range.
+
+
+ ```
+ rangeable
+   : INTEGER
+   | DECIMAL
+   | DATE
+   | DATETIME
+   | function
+   ;
+ ```
+
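The `literals` vs. `literal` split above determines which value types may appear in a comma-delimited list. Below is a rough standalone sketch (not the gem's actual lexer) of how raw condition tokens could be classified; the regexes are illustrative assumptions, not sparkql's real token definitions.

```ruby
# Hypothetical classifier for the grammar's literal terminals.
# INTEGER/DECIMAL/CHARACTER may appear in lists; the rest are single-value.
def classify_literal(token)
  case token
  when /\A-?\d+\.\d+\z/        then :DECIMAL    # multi-value capable
  when /\A-?\d+\z/             then :INTEGER    # multi-value capable
  when /\A'.*'\z/              then :CHARACTER  # multi-value capable
  when /\A\d{4}-\d{2}-\d{2}T/  then :DATETIME   # single-value only
  when /\A\d{4}-\d{2}-\d{2}\z/ then :DATE       # single-value only
  when /\A(true|false)\z/      then :BOOLEAN    # single-value only
  when 'NULL'                  then :NULL       # single-value only
  else :UNKNOWN
  end
end
```

Note that the DATETIME pattern must be tried before DATE, since an ISO-8601 datetime begins with a full date.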
data/Gemfile CHANGED
@@ -1,4 +1,4 @@
- source "http://gems.dev.fbsdata.com/public"
+ source "http://gems.flexmls.com/"
  source "http://rubygems.org"

  # Specify your gem's dependencies in sparkapi_parser.gemspec
data/README.md CHANGED
@@ -25,7 +25,7 @@ API.

  Here is a basic example:

-     expressions = Parser.new.parse("Hello Eq 'World')
+     expressions = Parser.new.parse("Hello Eq 'World'")

  The return value will be an array with one expression element containing the query information:

@@ -50,5 +50,5 @@ parser states (and conflicts) can be generated via

      racc -o lib/sparkql/parser.rb lib/sparkql/parser.y -v # see lib/sparkql/parser.output

- The rails/journey project was an inspiration for this gem. Look it up on github for reference.
+ The [rails/journey](https://github.com/rails/journey) project was an inspiration for this gem. Look it up on github for reference.

data/Rakefile CHANGED
@@ -1,17 +1,30 @@
  require "rubygems"
  require 'rubygems/user_interaction'
- require 'flexmls_gems/tasks'
- require 'flexmls_gems/tasks/test_unit'
- require 'flexmls_gems/tasks/rdoc'
+ require 'rake/testtask'
+ require 'ci/reporter/rake/test_unit'
+ require 'bundler/gem_tasks'
+
+ Rake::TestTask.new(:test) do |test|
+   test.libs << 'lib' << 'test'
+   test.pattern = 'test/**/*_test.rb'
+   test.verbose = true
+ end

  rule '.rb' => '.y' do |t|
    sh "racc -l -o #{t.name} #{t.source}"
  end

  desc "Compile the racc parser from the grammar"
- task :compile => "lib/sparkql/parser.rb"
+ task :compile => ["lib/sparkql/parser.rb", "grammar"]
+
+ desc "Generate grammar documentation"
+ task :grammar do
+   puts "Generating grammar documentation..."
+   sh "ruby script/markdownify.rb > GRAMMAR.md"
+ end

  Rake::Task[:test].prerequisites.unshift "lib/sparkql/parser.rb"
+ Rake::Task[:test].prerequisites.unshift "grammar"

  desc 'Default: run unit tests.'
  task :default => :test
data/VERSION CHANGED
@@ -1 +1 @@
- 0.1.8
+ 0.3.2
@@ -3,8 +3,8 @@ module Sparkql
  class ErrorsProcessor
    attr_accessor :errors

-   def initialize( errors )
-     @errors = errors || []
+   def initialize( errors = [] )
+     @errors = Array(errors)
    end

    # true if the error stack contains at least one error
@@ -39,24 +39,26 @@ end

  class ParserError
    attr_accessor :token, :expression, :message, :status, :recovered_as
+   attr_writer :syntax, :constraint

-   def initialize(error_hash=nil)
-     error_hash = {} if error_hash.nil?
+   def initialize(error_hash={})
      @token = error_hash[:token]
      @expression = error_hash[:expression]
      @message = error_hash[:message]
      @status = error_hash[:status]
      @recovered_as = error_hash[:recovered_as]
+     @recovered_as = error_hash[:recovered_as]
      self.syntax= error_hash[:syntax] == false ? false : true
+     self.constraint= error_hash[:constraint] == true
    end

-   def syntax=(syntax_error)
-     @syntax = syntax_error
-   end
-
    def syntax?
      @syntax
    end
+
+   def constraint?
+     @constraint
+   end

    def to_s
      str = case @status
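The hunk above adds a `constraint` flag to `ParserError` alongside the existing `syntax` flag, with asymmetric defaults. A minimal standalone sketch of just that flag handling (the class name here is hypothetical):

```ruby
# Sketch of the flag semantics from the diff: :syntax defaults to true
# unless explicitly false, while :constraint defaults to false unless
# explicitly true.
class SketchParserError
  attr_writer :syntax, :constraint

  def initialize(error_hash = {})
    self.syntax     = error_hash[:syntax] == false ? false : true
    self.constraint = error_hash[:constraint] == true
  end

  def syntax?
    @syntax
  end

  def constraint?
    @constraint
  end
end
```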
@@ -1,20 +1,23 @@
  # Custom fields need to add a table join to the customfieldsearch table when AND'd together,
- # but not when they are OR'd. This class maintains the state for all custom field expressions
+ # but not when they are OR'd or nested. This class maintains the state for all custom field expressions
  # lets the parser know when to do either.
  class Sparkql::ExpressionState

    def initialize
-     @expressions = []
+     @expressions = {0=>[]}
      @last_conjunction = "And" # always start with a join
+     @block_group = 0
    end

    def push(expression)
-     @expressions << expression
+     @block_group = expression[:block_group]
+     @expressions[@block_group] ||= []
+     @expressions[@block_group] << expression
      @last_conjunction = expression[:conjunction]
    end

    def needs_join?
-     return @expressions.size == 1 || "And" == @last_conjunction
+     return @expressions[@block_group].size == 1 || ["Not", "And"].include?(@last_conjunction)
    end

  end
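The change above buckets expressions by `:block_group` so that nested (parenthesized) groups track their join state independently, and treats "Not" like "And" for join purposes. A condensed, runnable sketch of that logic (class name hypothetical, mirroring the diff):

```ruby
# Sketch of the new ExpressionState behavior: a join is needed for the
# first expression in the current block group, or whenever the previous
# conjunction was "And" or "Not".
class SketchExpressionState
  def initialize
    @expressions = { 0 => [] }
    @last_conjunction = "And" # always start with a join
    @block_group = 0
  end

  def push(expression)
    @block_group = expression[:block_group]
    (@expressions[@block_group] ||= []) << expression
    @last_conjunction = expression[:conjunction]
  end

  def needs_join?
    @expressions[@block_group].size == 1 ||
      ["Not", "And"].include?(@last_conjunction)
  end
end
```

Keying on the block group is what lets an OR'd expression inside a nested group avoid reusing the outer group's join state.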
@@ -1,4 +1,6 @@
  require 'time'
+ require 'geo_ruby'
+ require 'sparkql/geo'

  # Binding class to all supported function calls in the parser. Current support requires that the
  # resolution of function calls to happen on the fly at parsing time at which point a value and
@@ -8,12 +10,37 @@ require 'time'
  # SUPPORTED_FUNCTIONS which will run validation on the function syntax prior to execution.
  class Sparkql::FunctionResolver
    SECONDS_IN_DAY = 60 * 60 * 24
+   STRFTIME_FORMAT = '%Y-%m-%d'

    SUPPORTED_FUNCTIONS = {
+     :polygon => {
+       :args => [:character],
+       :return_type => :shape
+     },
+     :rectangle => {
+       :args => [:character],
+       :return_type => :shape
+     },
+     :radius => {
+       :args => [:character, :decimal],
+       :return_type => :shape
+     },
+     :linestring => {
+       :args => [:character],
+       :return_type => :shape
+     },
      :days => {
        :args => [:integer],
        :return_type => :datetime
      },
+     :months => {
+       :args => [:integer],
+       :return_type => :datetime
+     },
+     :years => {
+       :args => [:integer],
+       :return_type => :datetime
+     },
      :now => {
        :args => [],
        :return_type => :datetime
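The new `months()` and `years()` functions (defined later in this diff) rely on Ruby's `Date#>>` month-shift operator, which clamps to the last day of shorter months; years are just 12-month shifts. A small sketch of that arithmetic, parameterized on a base date for determinism (the gem itself uses `DateTime.now`; these helper names are illustrative):

```ruby
require 'date'

# Relative-date arithmetic behind months()/years(): shift by whole
# months with Date#>>, then format with the same strftime pattern the
# resolver uses for :date values.
STRFTIME_FORMAT = '%Y-%m-%d'

def months_from(base, num_months)
  (base >> num_months).strftime(STRFTIME_FORMAT)
end

def years_from(base, num_years)
  (base >> (num_years * 12)).strftime(STRFTIME_FORMAT)
end
```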
@@ -78,7 +105,14 @@ class Sparkql::FunctionResolver
    # Execute the function
    def call()
      real_vals = @args.map { |i| i[:value]}
-     self.send(@name.to_sym, *real_vals)
+     v = self.send(@name.to_sym, *real_vals)
+
+     unless v.nil?
+       v[:function_name] = @name
+       v[:function_parameters] = real_vals
+     end
+
+     v
    end

    protected
@@ -92,7 +126,7 @@ class Sparkql::FunctionResolver
      d = Date.today + num
      {
        :type => :date,
-       :value => d.to_s
+       :value => d.strftime(STRFTIME_FORMAT)
      }
    end

@@ -103,4 +137,139 @@ class Sparkql::FunctionResolver
      :value => Time.now.iso8601
    }
  end
- end
+
+ def months num_months
+   d = DateTime.now >> num_months
+   {
+     :type => :date,
+     :value => d.strftime(STRFTIME_FORMAT)
+   }
+ end
+
+ def years num_years
+   d = DateTime.now >> (num_years * 12)
+   {
+     :type => :date,
+     :value => d.strftime(STRFTIME_FORMAT)
+   }
+ end
+
+ # TODO Donuts: to extend, we'd just replace (coords) param with (linear_ring1,linear_ring2, ...)
+ def polygon(coords)
+   new_coords = parse_coordinates(coords)
+   unless new_coords.size > 2
+     @errors << Sparkql::ParserError.new(:token => coords,
+       :message => "Function call 'polygon' requires at least three coordinates",
+       :status => :fatal )
+     return
+   end
+
+   # auto close the polygon if it's open
+   unless new_coords.first == new_coords.last
+     new_coords << new_coords.first.clone
+   end
+
+   shape = GeoRuby::SimpleFeatures::Polygon.from_coordinates([new_coords])
+   {
+     :type => :shape,
+     :value => shape
+   }
+ end
+
+ def linestring(coords)
+   new_coords = parse_coordinates(coords)
+   unless new_coords.size > 1
+     @errors << Sparkql::ParserError.new(:token => coords,
+       :message => "Function call 'linestring' requires at least two coordinates",
+       :status => :fatal )
+     return
+   end
+
+   shape = GeoRuby::SimpleFeatures::LineString.from_coordinates(new_coords)
+   {
+     :type => :shape,
+     :value => shape
+   }
+ end
+
+ def rectangle(coords)
+   bounding_box = parse_coordinates(coords)
+   unless bounding_box.size == 2
+     @errors << Sparkql::ParserError.new(:token => coords,
+       :message => "Function call 'rectangle' requires two coordinates for the bounding box",
+       :status => :fatal )
+     return
+   end
+   poly_coords = [
+     bounding_box.first,
+     [bounding_box.last.first, bounding_box.first.last],
+     bounding_box.last,
+     [bounding_box.first.first, bounding_box.last.last],
+     bounding_box.first.clone,
+   ]
+   shape = GeoRuby::SimpleFeatures::Polygon.from_coordinates([poly_coords])
+   {
+     :type => :shape,
+     :value => shape
+   }
+ end
+
+ def radius(coords, length)
+
+   unless length > 0
+     @errors << Sparkql::ParserError.new(:token => length,
+       :message => "Function call 'radius' length must be positive",
+       :status => :fatal )
+     return
+   end
+
+   # The radius() function is overloaded to allow an identifier
+   # to be specified over lat/lon. This identifier should specify a
+   # record that, in turn, references a lat/lon. Naturally, this won't be
+   # validated here.
+   shape_error = false
+   shape = if is_coords?(coords)
+     new_coords = parse_coordinates(coords)
+     if new_coords.size != 1
+       shape_error = true
+     else
+       GeoRuby::SimpleFeatures::Circle.from_coordinates(new_coords.first, length);
+     end
+   elsif Sparkql::Geo::RecordRadius.valid_record_id?(coords)
+     Sparkql::Geo::RecordRadius.new(coords, length)
+   else
+     shape_error = true
+   end
+
+   if shape_error
+     @errors << Sparkql::ParserError.new(:token => coords,
+       :message => "Function call 'radius' requires one coordinate for the center",
+       :status => :fatal )
+     return
+   end
+
+   {
+     :type => :shape,
+     :value => shape
+   }
+ end
+
+ private
+
+ def is_coords?(coord_string)
+   coord_string.split(" ").size > 1
+ end
+
+ def parse_coordinates coord_string
+   terms = coord_string.strip.split(',')
+   coords = terms.map do |term|
+     term.strip.split(/\s+/).reverse.map { |i| i.to_f }
+   end
+   coords
+ rescue => e
+   @errors << Sparkql::ParserError.new(:token => coord_string,
+     :message => "Unable to parse coordinate string.",
+     :status => :fatal )
+ end
+
+ end
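All of the new geometry functions funnel their string argument through `parse_coordinates`, which splits a comma-separated list of "lat lon" pairs and reverses each pair into [lon, lat] (x, y) order for GeoRuby. A standalone sketch of just that transformation, without the error handling:

```ruby
# Mirror of the parse_coordinates logic from the diff: each
# comma-separated term is a whitespace-separated "lat lon" pair,
# reversed to [lon, lat] and coerced to floats.
def parse_coordinates(coord_string)
  coord_string.strip.split(',').map do |term|
    term.strip.split(/\s+/).reverse.map(&:to_f)
  end
end
```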
@@ -0,0 +1,18 @@
+ module Sparkql
+   module Geo
+     class RecordRadius
+       RECORD_ID_REGEX = /\A[0-9]{26}\z/
+
+       attr_accessor :record_id, :radius
+
+       def self.valid_record_id?(record_id)
+         record_id =~ RECORD_ID_REGEX
+       end
+
+       def initialize(record_id, radius)
+         self.record_id = record_id
+         self.radius = radius
+       end
+     end
+   end
+ end
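The `valid_record_id?` check above accepts exactly 26 digits and nothing else. A quick standalone sketch of the same predicate, wrapped with `!!` to return a strict boolean (the gem's version returns the `=~` match result directly):

```ruby
# Record ids are exactly 26 digits; anchors \A and \z reject partial
# matches and embedded newlines.
RECORD_ID_REGEX = /\A[0-9]{26}\z/

def valid_record_id?(record_id)
  !!(record_id =~ RECORD_ID_REGEX)
end
```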
@@ -0,0 +1 @@
+ require 'sparkql/geo/record_circle'
data/lib/sparkql/lexer.rb CHANGED
@@ -1,5 +1,9 @@
  class Sparkql::Lexer < StringScanner
    include Sparkql::Token
+
+   attr_accessor :level, :block_group_identifier
+
+   attr_reader :last_field

    def initialize(str)
      str.freeze
@@ -21,7 +25,7 @@ class Sparkql::Lexer < StringScanner
      levelup
      [:LPAREN, value]
    when value = scan(RPAREN)
-     # leveldown do this after parsing group
+     # leveldown: do this after parsing group
      [:RPAREN, value]
    when value = scan(/\,/)
      [:COMMA,value]
@@ -58,8 +62,12 @@ class Sparkql::Lexer < StringScanner
      u_value = value.capitalize
      if OPERATORS.include?(u_value)
        [:OPERATOR,u_value]
+     elsif RANGE_OPERATOR == u_value
+       [:RANGE_OPERATOR,u_value]
      elsif CONJUNCTIONS.include?(u_value)
        [:CONJUNCTION,u_value]
+     elsif UNARY_CONJUNCTIONS.include?(u_value)
+       [:UNARY_CONJUNCTION,u_value]
      else
        [:UNKNOWN, "ERROR: '#{self.string}'"]
      end
@@ -82,14 +90,6 @@ class Sparkql::Lexer < StringScanner
      result
    end

-   def level
-     @level
-   end
-
-   def block_group_identifier
-     @block_group_identifier
-   end
-
    def levelup
      @level += 1
      @block_group_identifier += 1
@@ -107,8 +107,4 @@ class Sparkql::Lexer < StringScanner
      [symbol, node]
    end

-   def last_field
-     @last_field
-   end
-
- end
+ end
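The lexer change above slots two new token classes into the keyword dispatch: the range operator (for `Bt`/Between expressions) and unary conjunctions (for `Not`). A standalone sketch of that dispatch order; the token lists below are illustrative assumptions (the real ones live in `Sparkql::Token`):

```ruby
# Hypothetical token tables standing in for Sparkql::Token's constants.
OPERATORS          = ["Eq", "Ne", "Gt", "Ge", "Lt", "Le"]
RANGE_OPERATOR     = "Bt"
CONJUNCTIONS       = ["And", "Or"]
UNARY_CONJUNCTIONS = ["Not"]

# Mirrors the lexer's keyword branch: operators first, then the range
# operator, conjunctions, and finally the new unary conjunctions.
def keyword_token(value)
  u_value = value.capitalize
  if OPERATORS.include?(u_value)
    [:OPERATOR, u_value]
  elsif RANGE_OPERATOR == u_value
    [:RANGE_OPERATOR, u_value]
  elsif CONJUNCTIONS.include?(u_value)
    [:CONJUNCTION, u_value]
  elsif UNARY_CONJUNCTIONS.include?(u_value)
    [:UNARY_CONJUNCTION, u_value]
  else
    [:UNKNOWN, "ERROR: '#{value}'"]
  end
end
```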