salesforce_bulk_query 0.0.6 → 0.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: 4a5ed2f842b4032e2c90680ff6467feba485ba62
-  data.tar.gz: 3a15cdf85bc0707570b2e35f4db44139680f850e
+  metadata.gz: 39893db1495deba35f872109eae3fde80d55b506
+  data.tar.gz: 64d1e6f6d2b016d5257004c8e83527372c51d688
 SHA512:
-  metadata.gz: 3f1e236abefe0706f1ccfa80fc44c42feff3f5fec1ef5b6af7d3c61d9de30c6062d21ad629e785ac67c131705bd26517114426d71de37c4d21a8dd6f8cf547a3
-  data.tar.gz: b431e5601bf9a9f940d79fc089a6dc00380e73a37eb17589b6bcf6f024278a3742a7056fe2553d40383db98e59d4ab0e55695442e4fb058e0daf0c553837324a
+  metadata.gz: 8dbda4621053cb70ad98f309feaf6842e0dfa8d3041db06b5cff531f151acd2dc9f41cc09ed60a3844c64d292bf49a9660c00d3c229edc63233ac5f37bcbc5a6
+  data.tar.gz: cf03cdc3d3b34bf587cb26359648af0343ee7a5b0956a789e4e0491d3a1c2ff905d42e52503c120806f385a1814ca969a78370a472319033cf3573b03048d87f
data/.gitignore CHANGED
@@ -1,3 +1,4 @@
-test_salesforce_credentials.json
+env_setup.sh
 Gemfile.lock
-*.gem
+*.gem
+hh*
data/.travis.yml ADDED
@@ -0,0 +1,26 @@
+language: ruby
+sudo: false
+os:
+- linux
+- osx
+branches:
+  except:
+  - gh-pages
+  - next_doc_release
+env:
+  global:
+  - LOGGING=true
+  - secure: EVku6JcQcp69DksaZUo8Sluk2Mk45CHjs/7w2xDG6n58Px3zxG3i0DSxm5SYBaDvcIQCv+LLt/LyjTRs5t51GbQrFLZGxGLMHDkJ/eS4K2q27P8NGqM/Yj4WtxB7BYQNLQf00eCo9HeTyjY7AsAqZG8wn6W9DUF5Gtu7AUO0aiQ=
+  - secure: GwO5mPCUhaeC1fHuX86T3INRrrpqb2raWS+M83Ko4fnwKUryGiMOVYtzu5NSiTfN2YRHd3gvAsisempqCHndTPvuWcJhSCIGEZdIYgdxXAz4Meb78t0w/5rY7O9BOJ5+Zin098SmB/uiPvW/HDSQw+T+bBHO3ETw+h8VvFDmHM4=
+  - secure: TbguXDUV3wZhZ9ZHGYP2X0zo8gJejeuawmHQ6xey5gZzYrAH92r6wgddQ7KM2Vc+UCw0Gpnuy2AKM6EhEGbRShc5u6LmwAJIqZ6iG48ALOxha6JOoICuSbahRmnNSVOP1utzAXjOB20YO8lao5PGNfqyacFnr3VhqG22Kl5EJhc=
+  - secure: a2wb07pS1Cl6aiKN0tSVjW4Zsc0K0yQ3mI+nrR3KQpF1vaMu43gKYRGLz3NDWcaPrIgKfFyeKjucxd8o52dLn5Mi8OW7hpPWy3d9Y9umx0PpTYHl16tOw2PLpWedqGTM/eZn5Y8UtzrsAdVJKIcCXE7SA28qGmaR5KKuaJjC+3w=
+  - secure: JvH3Nr6N9GbSWDNc8vE0cwDZPQcUgwXB2MjtkPZ/fnfrR/x/cNmCyc2lzZqfExmDNn17YbR2fPP8zKBczEDvY8q03v42LfnpDSCR2F73yJzQe9dxHwACroAT8VgAz3o23ejgry31SzZz+IoUvZ2fzuoZs+uAt1yr48I2V03PqfM=
+rvm:
+- 1.9.3
+- jruby-19mode
+- 2.1
+- 2.2
+before_install:
+- gem update --system
+- gem update bundler
+script: rake ci
data/README.md CHANGED
@@ -4,6 +4,15 @@ A library for downloading data from Salesforce Bulk API. We only focus on queryi
 
 Derived from [Salesforce Bulk API](https://github.com/yatish27/salesforce_bulk_api)
 
+## Status
+
+[![Gem Version](https://badge.fury.io/rb/salesforce_bulk_query.png)](http://badge.fury.io/rb/salesforce_bulk_query)
+[![Downloads](http://img.shields.io/gem/dt/salesforce_bulk_query.svg)](http://rubygems.org/gems/salesforce_bulk_query)
+[![Dependency Status](https://gemnasium.com/cvengros/salesforce_bulk_query.png)](https://gemnasium.com/cvengros/salesforce_bulk_query)
+[![Code Climate](https://codeclimate.com/github/cvengros/salesforce_bulk_query.png)](https://codeclimate.com/github/cvengros/salesforce_bulk_query)
+[![Build Status](https://travis-ci.org/cvengros/salesforce_bulk_query.png)](https://travis-ci.org/cvengros/salesforce_bulk_query)
+[![Coverage Status](https://coveralls.io/repos/cvengros/salesforce_bulk_query/badge.png)](https://coveralls.io/r/cvengros/salesforce_bulk_query)
+
 ## Basic Usage
 To install, run:
 
@@ -54,14 +63,15 @@ puts "All the downloaded stuff is in csvs: #{result[:filenames]}"
 # if you want to just start the query asynchronously, use
 query = start_query("Task", "SELECT Id, Name FROM Task")
 
-# get a cofee
+# get a coffee
 sleep(1234)
 
-# check the status
-status = query.check_status
-if status[:finished]
-  result = query.get_results
-  puts "All the downloaded stuff is in csvs: #{result[:filenames]}"
+# get what's available and check the status
+results = query.get_available_results
+if results[:succeeded]
+  puts "All the downloaded stuff is in csvs: #{results[:filenames]}"
+else
+  puts "This is going to take a while, get another coffee"
 end
 ```
 
@@ -82,28 +92,69 @@ There are a few optional settings you can pass to the `Api` methods:
 * `filename_prefix`: prefix applied to csv files
 * `directory_path`: custom directory path for CSVs; if omitted, a new temp directory is created
 * `check_interval`: how often the results should be checked, in secs.
-* `time_limit`: maximum time the query can take. If this time limit is exceeded, available results are downloaded and the list of subqueries that didn't finished is returned. In seconds. The limti should be understood as limit for waiting. When the limit is reached the function downloads data that is ready which can take some additonal time.
+* `time_limit`: maximum time the query can take, in seconds. If this limit is exceeded, available results are downloaded and the list of subqueries that didn't finish is returned. The limit should be understood as a limit on waiting: when it is reached, the function downloads whatever data is ready, which can take some additional time. If no limit is given, the query runs until it finishes.
 * `created_from`, `created_to`: limits for the CreatedDate field. Note that queries can't contain any WHERE statements, as we're doing some manipulations to create subqueries and we don't want things to get too difficult. So this is the way to limit the query yourself. The format is like `"1999-01-01T00:00:00.000Z"`
 * `single_batch`: If true, the queries are not divided into subqueries as described above. Instead, one batch job is created with the given query. This is faster for small amounts of data, but will fail with a timeout if you have a lot of data.
 
 See specs for exact usage.
 
 ## Logging
-require 'logger'
-require 'restforce'
 
-# create the restforce client
-restforce = Restforce.new(...)
+```ruby
+require 'logger'
+require 'restforce'
+
+# create the restforce client
+restforce = Restforce.new(...)
 
-# instantiate a logger and pass it to the Api constructor
-logger = Logger.new(STDOUT)
-bulk_api = SalesforceBulkQuery::Api.new(restforce, :logger => logger)
+# instantiate a logger and pass it to the Api constructor
+logger = Logger.new(STDOUT)
+bulk_api = SalesforceBulkQuery::Api.new(restforce, :logger => logger)
 
-# switch off logging in Restforce so you don't get every message twice
-Restforce.log = false
+# switch off logging in Restforce so you don't get every message twice
+Restforce.log = false
+```
 
 If you're using Restforce as a client (which you probably are) and you want to do logging, Salesforce Bulk Query will use a custom logging middleware for Restforce. This is because the original logging middleware puts all API responses into the log, which is not something you want to do for a few gigabytes of CSVs. When you use the :logger parameter it's recommended you switch off the default logging in Restforce, otherwise you'll get all messages twice.
 
+## Notes
+
+Query (user given) -> Job (Salesforce construct that encapsulates 15 batches) -> Batch (1 SOQL with CreatedDate constraints)
+
+At the beginning the query is divided into 15 subqueries and put into a single job. When one of the subqueries fails, a new job with 15 subqueries is created, and the range of the failed query is divided into 15 sub-subqueries.
+
+## Running tests locally
+Travis CI is set up for this repository to make sure all the tests pass with each commit.
+
+To run the tests locally:
+
+* Copy the env_setup-example.sh file
+```
+cp env_setup-example.sh env_setup.sh
+```
+* Set up all the params in env_setup. USERNAME, PASSWORD and TOKEN are your Salesforce account credentials. You can get those by [registering for a free developer account](https://developer.salesforce.com/signup). You might need to [reset your security token](https://help.salesforce.com/apex/HTViewHelpDoc?id=user_security_token.htm) to put into the TOKEN variable. CLIENT_ID and CLIENT_SECRET belong to your Salesforce connected app. You can create one by following the steps outlined in [the tutorial](https://help.salesforce.com/apex/HTViewHelpDoc?id=connected_app_create.htm). Make sure you check the 'api' permission.
+* Run the env_setup
+```
+. env_setup.sh
+```
+* Run the tests
+```
+bundle exec rspec
+```
+
+Note that env_setup.sh is ignored by git in .gitignore so that you don't commit your credentials by accident.
+
+## Contributing
+
+1. Fork it ( https://github.com/[my-github-username]/salesforce_bulk_query/fork )
+2. Create your feature branch (`git checkout -b my-new-feature`)
+3. Run the tests (see above) and fix anything that fails.
+4. Commit your changes (`git commit -am 'Add some feature'`)
+5. Push to the branch (`git push origin my-new-feature`)
+6. Create a new Pull Request
+
+Make sure you run all the tests and they pass. If you create a new feature, write a test for it.
+
 ## Copyright
 
 Copyright (c) 2014 Yatish Mehta & GoodData Corporation. See [LICENSE](LICENSE) for details.
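The README's new asynchronous flow boils down to one polling loop: call `get_available_results` until it reports `:succeeded` or a time limit runs out. A minimal self-contained sketch of that loop, with a stub standing in for the real query object (assumption: the real object comes from `Api#start_query` and talks to Salesforce):

```ruby
# A stub that mimics Query#get_available_results: it reports success
# only after a given number of polls. Purely for illustration.
class StubQuery
  def initialize(polls_needed)
    @polls_left = polls_needed
  end

  def get_available_results(options = {})
    @polls_left -= 1
    if @polls_left <= 0
      { :succeeded => true, :filenames => ['Task_1.csv'], :unfinished_subqueries => [] }
    else
      { :succeeded => false, :filenames => [], :unfinished_subqueries => ['SELECT ...'] }
    end
  end
end

# The polling pattern from the README: poll, succeed, or give up on time limit
# (keeping whatever has been downloaded so far). Helper name is ours.
def poll_until_done(query, check_interval = 0, time_limit = nil)
  start_time = Time.now
  loop do
    results = query.get_available_results
    return results if results[:succeeded]
    # give up when over the time limit, keeping what's downloaded
    return results if time_limit && (Time.now - start_time > time_limit)
    sleep(check_interval)
  end
end
```

This is exactly what `Api#query` does internally; `start_query` plus this loop lets you do other work between polls.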
data/Rakefile ADDED
@@ -0,0 +1,20 @@
+require 'coveralls/rake/task'
+require 'rake/testtask'
+require 'rspec/core/rake_task'
+
+Coveralls::RakeTask.new
+
+task default: %w[ci]
+
+desc 'Run continuous integration test'
+task :ci do
+  Rake::Task['test:unit'].invoke
+  Rake::Task['coveralls:push'].invoke
+end
+
+namespace :test do
+  desc "Run unit tests"
+  RSpec::Core::RakeTask.new(:unit) do |t|
+    t.pattern = 'spec/**/*.rb'
+  end
+end
@@ -0,0 +1,13 @@
+#!/bin/sh
+
+# mandatory
+export CLIENT_ID=""
+export CLIENT_SECRET=""
+
+export USERNAME=""
+export PASSWORD=""
+export TOKEN=""
+
+# optional
+export LOGGING='true' # unset to switch off logging
+export ENTITY='Task'
@@ -1,5 +1,8 @@
 require 'tmpdir'
 
+require 'salesforce_bulk_query/utils'
+
+
 module SalesforceBulkQuery
   # Represents a Salesforce api batch. Batch contains a single subquery.
   # Many batches are contained in a Job.
@@ -11,10 +14,12 @@ module SalesforceBulkQuery
       @connection = options[:connection]
       @start = options[:start]
       @stop = options[:stop]
+      @logger = options[:logger]
       @@directory_path ||= Dir.mktmpdir
+      @filename = nil
     end
 
-    attr_reader :soql, :start, :stop
+    attr_reader :soql, :start, :stop, :filename, :fail_message, :batch_id, :csv_record_count
 
     # Do the api request
     def create
@@ -25,15 +30,48 @@
       @batch_id = response_parsed['id'][0]
     end
 
+    # check the status of the batch
+    # if it fails, don't throw an error now; let the job above collect all failures and raise them at once
     def check_status
-      # request to get the result id
-      path = "job/#{@job_id}/batch/#{@batch_id}/result"
+      succeeded = nil
+      failed = nil
+
+      # get the status of the batch
+      # https://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_batches_get_info.htm
+      status_path = "job/#{@job_id}/batch/#{@batch_id}"
+      status_response = @connection.get_xml(status_path)
+
+      # interpret the status
+      @status = status_response['state'][0]
+
+      # https://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_batches_interpret_status.htm
+      case @status
+      when 'Failed'
+        failed = true
+        @fail_message = status_response['stateMessage']
+      when 'InProgress', 'Queued'
+        succeeded = false
+      when 'Completed'
+        succeeded = true
+        failed = false
+      else
+        fail "Something weird happened, #{@batch_id} has status #{@status}."
+      end
+
+      if succeeded
+        # request to get the result id
+        # https://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_batches_get_results.htm
+        path = "job/#{@job_id}/batch/#{@batch_id}/result"
+
+        response_parsed = @connection.get_xml(path)
 
-      response_parsed = @connection.get_xml(path)
+        @result_id = response_parsed["result"] ? response_parsed["result"][0] : nil
+      end
 
-      @result_id = response_parsed["result"] ? response_parsed["result"][0] : nil
       return {
-        :finished => ! @result_id.nil?,
+        :failed => failed,
+        :fail_message => @fail_message,
+        :succeeded => succeeded,
         :result_id => @result_id
       }
     end
@@ -42,7 +80,14 @@
       return "#{@sobject}_#{@batch_id}_#{@start}-#{@stop}.csv"
     end
 
-    def get_result(directory_path=nil)
+    def get_result(options={})
+      # if it was already downloaded, no one should ask about it
+      if @filename
+        raise "This batch was already downloaded once: #{@filename}, #{@batch_id}"
+      end
+
+      directory_path = options[:directory_path]
+      skip_verification = options[:skip_verification]
 
       # request to get the actual results
       path = "job/#{@job_id}/batch/#{@batch_id}/result/#{@result_id}"
@@ -54,10 +99,39 @@
       directory_path ||= @@directory_path
 
       # write it to a file
-      filename = File.join(directory_path, get_filename)
-      @connection.get_to_file(path, filename)
+      @filename = File.join(directory_path, get_filename)
+      @connection.get_to_file(path, @filename)
+
+      # Verify the number of downloaded records is roughly the same as
+      # the count on the soql api
+      unless skip_verification
+        @verification = verification
+      end
 
-      return filename
+      return {
+        :filename => @filename,
+        :verification => @verification
+      }
+    end
+
+    def verification
+      api_count = @connection.query_count(@sobject, @start, @stop)
+      # if we weren't able to get the count, fail.
+      if api_count.nil?
+        return false
+      end
+
+      # count the records in the csv
+      @csv_record_count = Utils.line_count(@filename)
+
+      if @logger && @csv_record_count % 100 == 0
+        @logger.warn "The line count for batch #{@soql} is highly suspicious: #{@csv_record_count}"
+      end
+      if @logger && @csv_record_count != api_count
+        @logger.warn "The counts for batch #{@soql} don't match. Record count in downloaded csv #{@csv_record_count}, record count on api count(): #{api_count}"
+      end
+      return @csv_record_count >= api_count
     end
 
     def to_log
@@ -71,6 +145,5 @@ module SalesforceBulkQuery
         :directory_path => @@directory_path
       }
     end
-
   end
 end
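The new `Batch#check_status` above maps the Bulk API's batch states onto a `succeeded`/`failed` pair. That interpretation can be shown in isolation (the state strings are the Bulk API's; the helper name `interpret_state` is ours, extracted here for illustration):

```ruby
# Mirrors the case statement in Batch#check_status: 'Failed' is terminal
# failure, 'Queued'/'InProgress' mean keep polling, 'Completed' is success,
# anything else is unexpected and raises.
def interpret_state(state, state_message = nil)
  case state
  when 'Failed'
    { :succeeded => nil, :failed => true, :fail_message => state_message }
  when 'InProgress', 'Queued'
    { :succeeded => false, :failed => nil }
  when 'Completed'
    { :succeeded => true, :failed => false }
  else
    fail "Something weird happened, batch has status #{state}."
  end
end
```

Note that a batch failure is not raised immediately; the job collects all failed batches and raises once, with all the fail messages.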
@@ -7,7 +7,7 @@ module SalesforceBulkQuery
   # shared in all classes that do some requests
   class Connection
     def initialize(client, api_version, logger=nil, filename_prefix=nil)
-      @client=client
+      @client = client
       @logger = logger
       @filename_prefix = filename_prefix
 
@@ -112,6 +112,18 @@ module SalesforceBulkQuery
       end
     end
 
+    def query_count(sobject, from, to)
+      # do it with retries; if it doesn't succeed, return nil, don't fail.
+      begin
+        with_retries do
+          q = @client.query("SELECT COUNT() FROM #{sobject} WHERE CreatedDate >= #{from} AND CreatedDate < #{to}")
+          return q.size
+        end
+      rescue TimeoutError => e
+        return nil
+      end
+    end
+
     def to_log
       return {
         :client => "Restforce asi",
@@ -7,16 +7,24 @@
   class Job
     @@operation = 'query'
     @@xml_header = '<?xml version="1.0" encoding="utf-8" ?>'
-    JOB_TIME_LIMIT = 10 * 60
+    JOB_TIME_LIMIT = 15 * 60
     BATCH_COUNT = 15
 
 
-    def initialize(sobject, connection, logger=nil)
+    def initialize(sobject, connection, options={})
       @sobject = sobject
       @connection = connection
-      @logger = logger
+      @logger = options[:logger]
+      @job_time_limit = options[:job_time_limit] || JOB_TIME_LIMIT
+
+      # all batches (static)
       @batches = []
+
+      # unfinished batches as of the last get_available_results call
       @unfinished_batches = []
+
+      # filenames for the already downloaded and verified batches
+      @filenames = []
     end
 
     attr_reader :job_id
@@ -79,12 +87,14 @@
         :job_id => @job_id,
         :connection => @connection,
         :start => options[:start],
-        :stop => options[:stop]
+        :stop => options[:stop],
+        :logger => @logger
       )
       batch.create
 
       # add the batch to the list
       @batches.push(batch)
+      @unfinished_batches.push(batch)
     end
 
     def close_job
@@ -95,60 +105,81 @@
       path = "job/#{@job_id}"
 
       response_parsed = @connection.post_xml(path, xml)
-      @job_closed = Time.now
+      @job_closed_time = Time.now
     end
 
     def check_status
       path = "job/#{@job_id}"
       response_parsed = @connection.get_xml(path)
-      @completed = Integer(response_parsed["numberBatchesCompleted"][0])
-      @finished = @completed == Integer(response_parsed["numberBatchesTotal"][0])
+      @completed_count = Integer(response_parsed["numberBatchesCompleted"][0])
+      @succeeded = @completed_count == Integer(response_parsed["numberBatchesTotal"][0])
+
       return {
-        :finished => @finished,
-        :some_failed => Integer(response_parsed["numberRecordsFailed"][0]) > 0,
+        :succeeded => @succeeded,
+        :some_records_failed => Integer(response_parsed["numberRecordsFailed"][0]) > 0,
+        :some_batches_failed => Integer(response_parsed["numberBatchesFailed"][0]) > 0,
         :response => response_parsed
       }
     end
 
+    def over_limit?
+      (Time.now - @job_closed_time) > @job_time_limit
+    end
+
     # downloads whatever is available, returns as unfinished whatever is not
-    def get_results(options={})
-      filenames = []
+    def get_available_results(options={})
+      downloaded_filenames = []
       unfinished_batches = []
+      verification_fail_batches = []
+      failed_batches = []
 
       # get the result for each batch in the job
-      @batches.each do |batch|
+      @unfinished_batches.each do |batch|
        batch_status = batch.check_status
 
        # if the result is ready
-        if batch_status[:finished]
+        if batch_status[:succeeded]
+          # each finished batch should go here only once
 
          # download the result
-          filename = batch.get_result(options[:directory_path])
-          filenames.push(filename)
+          result = batch.get_result(options)
+
+          # if the verification failed, put it to failed;
+          # we will never ask about this one again.
+          if result[:verification] == false
+            verification_fail_batches << batch
+          else
+            # if verification ok and finished, put it to filenames
+            downloaded_filenames << result[:filename]
+          end
+        elsif batch_status[:failed]
+          # put it to failed and raise an error at the end
+          failed_batches << batch
        else
          # otherwise put it to unfinished
-          unfinished_batches.push(batch)
+          unfinished_batches << batch
        end
      end
+
+      unless failed_batches.empty?
+        details = failed_batches.map{ |b| "#{b.batch_id}: #{b.fail_message}"}.join("\n")
+        fail ArgumentError, "#{failed_batches.length} batches failed. Details: #{details}"
+      end
+
+      # cache the unfinished_batches till the next run
      @unfinished_batches = unfinished_batches
 
+      # accumulate filenames
+      @filenames += downloaded_filenames
+
      return {
-        :filenames => filenames,
-        :unfinished_batches => unfinished_batches
+        :filenames => @filenames,
+        :unfinished_batches => unfinished_batches,
+        :verification_fail_batches => verification_fail_batches
      }
    end
 
-    def get_available_results(options={})
-      # if we didn't reach the limit yet, do nothing
-      # if all done, do nothing
-      # if none of the batches finished, same thing
-      if (Time.now - @job_closed < JOB_TIME_LIMIT) || @finished || @completed == 0
-        return nil
-      end
-
-      return get_results(options)
-    end
-
    def to_log
      return {
        :sobject => @sobject,
@@ -19,23 +19,33 @@ module SalesforceBulkQuery
     @created_from = options[:created_from]
     @created_to = options[:created_to]
     @single_batch = options[:single_batch]
+
+    # jobs currently running
     @jobs_in_progress = []
+
+    # successfully finished jobs with no batches to split
     @jobs_done = []
+
+    # finished or timed-out jobs with some batches split into other jobs
+    @jobs_restarted = []
+
     @finished_batch_filenames = []
     @restarted_subqueries = []
   end
 
+  attr_reader :jobs_in_progress, :jobs_restarted, :jobs_done
+
   DEFAULT_MIN_CREATED = "1999-01-01T00:00:00.000Z"
 
   # Creates the first job, divides the query into subqueries, puts all the subqueries as batches into the job
-  def start
+  def start(options={})
     # order by and where not allowed
     if (!@single_batch) && (@soql =~ /WHERE/i || @soql =~ /ORDER BY/i)
       raise "You can't have WHERE or ORDER BY in your soql. If you want to download just a specific date range use created_from / created_to"
     end
 
     # create the first job
-    job = SalesforceBulkQuery::Job.new(@sobject, @connection, @logger)
+    job = SalesforceBulkQuery::Job.new(@sobject, @connection, {:logger => @logger}.merge(options))
     job.create_job
 
     # get the date when it should start
@@ -55,7 +65,7 @@
     # generate intervals
     start = DateTime.parse(min_created)
-    stop = @created_to ? DateTime.parse(@created_to) : DateTime.now - Rational(OFFSET_FROM_NOW, 1440)
+    stop = @created_to ? DateTime.parse(@created_to) : DateTime.now - Rational(options[:offset_from_now] || OFFSET_FROM_NOW, 1440)
     job.generate_batches(@soql, start, stop, @single_batch)
 
     job.close_job
@@ -63,89 +73,83 @@
     @jobs_in_progress.push(job)
   end
 
+  # Get results for all finished jobs. If there are some unfinished batches, skip them and return them as unfinished.
+  #
+  # @param options[:directory_path]
+  def get_available_results(options={})
 
-  # Check statuses of all jobs
-  def check_status
     all_done = true
-    job_statuses = []
-    # check all job statuses and put them in an array
-    @jobs_in_progress.each do |job|
-      job_status = job.check_status
-      all_done &&= job_status[:finished]
-      job_statuses.push(job_status)
-    end
-
-    return {
-      :finished => all_done,
-      :job_statuses => job_statuses,
-      :jobs_done => @jobs_done
-    }
-  end
-
-  # Get results for all jobs
-  # @param options[:directory_path]
-  def get_results(options={})
-    all_job_results = []
-    job_result_filenames = []
     unfinished_subqueries = []
-    # check each job and put it there
-    @jobs_in_progress.each do |job|
-      job_results = job.get_results(options)
-      all_job_results.push(job_results)
-      job_result_filenames += job_results[:filenames]
-      unfinished_subqueries.push(job_results[:unfinished_batches].map {|b| b.soql})
-      # if it's done add it to done
-      if job_results[:unfinished_batches].empty?
-        @jobs_done.push(job)
-      end
-    end
-    return {
-      :filenames => job_result_filenames + @finished_batch_filenames,
-      :unfinished_subqueries => unfinished_subqueries,
-      :restarted_subqueries => @restarted_subqueries,
-      :results => all_job_results,
-      :done_jobs => @jobs_done
-    }
-  end
-
-  # Restart unfinished batches in all jobs in progress, creating new jobs
-  # downloads results for finished batches
-  def get_result_or_restart(options={})
-    new_jobs = []
-    job_ids_to_remove = []
+    jobs_in_progress = []
+    jobs_restarted = []
     jobs_done = []
 
+    # check all job statuses and split what should be split
     @jobs_in_progress.each do |job|
-      # get available stuff, if not the right time yet, go on
-      available_results = job.get_available_results(options)
-      if available_results.nil?
-        next
-      end
 
-      unfinished_batches = available_results[:unfinished_batches]
-
-      # store the filenames and resturted stuff
-      @finished_batch_filenames += available_results[:filenames]
-      @restarted_subqueries += unfinished_batches.map {|b| b.soql}
+      # check the job status
+      job_status = job.check_status
+      job_over_limit = job.over_limit?
+      job_done = job_status[:succeeded] || job_over_limit
+
+      # download what's available
+      job_results = job.get_available_results(options)
+      unfinished_batches = job_results[:unfinished_batches]
+      unfinished_subqueries += unfinished_batches.map {|b| b.soql}
+
+      # split into subqueries what needs to be split
+      to_split = job_results[:verification_fail_batches]
+      to_split += unfinished_batches if job_over_limit
+
+      # delete files associated with batches that failed verification
+      job_results[:verification_fail_batches].each do |b|
+        @logger.info "Deleting #{b.filename}, verification failed."
+        File.delete(b.filename)
+      end
 
-      unfinished_batches.each do |batch|
+      to_split.each do |batch|
        # for each unfinished batch create a new job and add it to new jobs
-        @logger.info "The following subquery didn't end in time: #{batch.soql}. Dividing into multiple and running again" if @logger
-        new_job = SalesforceBulkQuery::Job.new(@sobject, @connection)
+        @logger.info "The following subquery didn't end in time / failed verification: #{batch.soql}. Dividing into multiple and running again" if @logger
+        new_job = SalesforceBulkQuery::Job.new(@sobject, @connection, {:logger => @logger}.merge(options))
        new_job.create_job
        new_job.generate_batches(@soql, batch.start, batch.stop)
        new_job.close_job
-        new_jobs.push(new_job)
+        jobs_in_progress.push(new_job)
+      end
+
+      # what to do with the current job
+      if job_done
+        if to_split.empty?
+          # done, nothing left
+          jobs_done.push(job)
+        else
+          # done, but some batches needed to be restarted
+          jobs_restarted.push(job)
+        end
+
+        # store the filenames and the restarted stuff
+        @finished_batch_filenames += job_results[:filenames]
+        @restarted_subqueries += to_split.map {|b| b.soql}
+      else
+        # still in progress
+        jobs_in_progress.push(job)
      end
-      # the current job to be removed from jobs in progress
-      job_ids_to_remove.push(job.job_id)
-      jobs_done.push(job)
+
+      # we're done if this job is done and it didn't generate any new jobs
+      all_done &&= (job_done && to_split.empty?)
    end
+
    # remove the finished jobs from progress and add the new ones there
-    @jobs_in_progress.select! {|j| ! job_ids_to_remove.include?(j.job_id)}
+    @jobs_in_progress = jobs_in_progress
    @jobs_done += jobs_done
 
-    @jobs_in_progress += new_jobs
+    return {
+      :succeeded => all_done,
+      :filenames => @finished_batch_filenames,
+      :unfinished_subqueries => unfinished_subqueries,
+      :jobs_done => @jobs_done.map { |j| j.job_id }
+    }
   end
 end
end
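The splitting that drives all of the above is simple date arithmetic: a CreatedDate range is cut into `BATCH_COUNT` (15) equal sub-intervals, one batch per sub-interval, and a restarted batch's own range is cut the same way again. A sketch of that division, assuming `Job#generate_batches` splits evenly (the helper name `split_interval` is ours):

```ruby
require 'date'

BATCH_COUNT = 15

# Split [start, stop) into `count` equal, contiguous sub-intervals.
# DateTime subtraction yields a Rational number of days, so the
# arithmetic is exact and adjacent intervals share their endpoints.
def split_interval(start, stop, count = BATCH_COUNT)
  step = (stop - start) / count
  (0...count).map do |i|
    [start + step * i, start + step * (i + 1)]
  end
end
```

Each pair then becomes one batch's `created_from`/`created_to` constraint, which is also why user-supplied WHERE clauses are rejected: the subquery WHERE is owned by the splitter.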
@@ -0,0 +1,16 @@
+require 'csv'
+
+module SalesforceBulkQuery
+  class Utils
+    # count the records in a csv, skipping the header
+    def self.line_count(f)
+      i = 0
+      CSV.foreach(f, :headers => true) {|_| i += 1}
+      i
+    end
+
+    def self.header(f)
+      File.open(f, &:readline).split(',').map{ |c| c.strip.delete('"') }
+    end
+  end
+end
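A quick sanity check of the counting convention above: `CSV.foreach` with `:headers => true` skips the header row, so `line_count` returns the number of data records, which is what `COUNT()` on the API returns for the same range (the file contents below are made up for the demo):

```ruby
require 'csv'
require 'tempfile'

# same logic as Utils.line_count
def line_count(path)
  i = 0
  CSV.foreach(path, :headers => true) { |_| i += 1 }
  i
end

# a two-record CSV with a header row
file = Tempfile.new(['tasks', '.csv'])
file.write("Id,Name\n001,Call Bob\n002,Send report\n")
file.close
```

Here `line_count(file.path)` returns 2, not 3: the header doesn't count, matching the verification comparison in `Batch#verification`.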
@@ -1,3 +1,3 @@
 module SalesforceBulkQuery
-  VERSION = '0.0.6'
+  VERSION = '0.1.0'
 end
@@ -3,6 +3,8 @@ require 'csv'
 require 'salesforce_bulk_query/connection'
 require 'salesforce_bulk_query/query'
 require 'salesforce_bulk_query/logger'
+require 'salesforce_bulk_query/utils'
+
 
 # Module where all the stuff is happening
 module SalesforceBulkQuery
@@ -36,8 +38,7 @@ module SalesforceBulkQuery
     return url
   end
 
-  CHECK_INTERVAL = 10
-  QUERY_TIME_LIMIT = 60 * 60 * 2 # two hours
+  CHECK_INTERVAL = 30
 
   # Query the Salesforce API. It's a blocking method - waits until the query is resolved
   # can take quite some time
@@ -46,7 +47,7 @@
   # @return hash with :filenames and other useful stuff
   def query(sobject, soql, options={})
     check_interval = options[:check_interval] || CHECK_INTERVAL
-    time_limit = options[:time_limit] || QUERY_TIME_LIMIT
+    time_limit = options[:time_limit] # in seconds
 
     start_time = Time.now
 
@@ -55,34 +56,27 @@
     results = nil
 
     loop do
-      # check the status
-      status = query.check_status
+      # get available results and check the status
+      results = query.get_available_results(options)
 
       # if finished get the result and we're done
-      if status[:finished]
+      if results[:succeeded]
 
-        # get the results and we're done
-        results = query.get_results(:directory_path => options[:directory_path])
-        @logger.info "Query finished. Results: #{results_to_string(results)}" if @logger
+        # we're done
+        @logger.info "Query succeeded. Results: #{results}" if @logger
        break
      end
 
      # if we've run out of time limit, go away
-      if Time.now - start_time > time_limit
+      if time_limit && (Time.now - start_time > time_limit)
        @logger.warn "Ran out of time limit, downloading what's available and terminating" if @logger
 
-        # download what's available
-        results = query.get_results(
-          :directory_path => options[:directory_path],
-        )
-
-        @logger.info "Downloaded the following files: #{results[:filenames]} The following didn't finish in time: #{results[:unfinished_subqueries]}. Results: #{results_to_string(results)}" if @logger
+        @logger.info "Downloaded the following files: #{results[:filenames]} The following didn't finish in time: #{results[:unfinished_subqueries]}." if @logger
        break
      end
 
-      # restart whatever needs to be restarted and sleep
-      query.get_result_or_restart(:directory_path => options[:directory_path])
      @logger.info "Sleeping #{check_interval}" if @logger
+      @logger.info "Downloaded files: #{results[:filenames].length} Jobs in progress: #{query.jobs_in_progress.length}"
      sleep(check_interval)
    end
 
@@ -90,7 +84,7 @@
    if @logger && ! results[:filenames].empty?
 
      @logger.info "Download finished. Downloaded files in #{File.dirname(results[:filenames][0])}. Filename size [line count]:"
-      @logger.info "\n" + results[:filenames].sort.map{|f| "#{File.basename(f)} #{File.size(f)} #{line_count(f) if options[:count_lines]}"}.join("\n")
+      @logger.info "\n" + results[:filenames].sort.map{|f| "#{File.basename(f)} #{File.size(f)} #{Utils.line_count(f) if options[:count_lines]}"}.join("\n")
    end
    return results
  end
@@ -101,31 +95,8 @@
  def start_query(sobject, soql, options={})
    # create the query, start it and return it
    query = SalesforceBulkQuery::Query.new(sobject, soql, @connection, {:logger => @logger}.merge(options))
-    query.start
+    query.start(options)
    return query
  end
-
-  private
-
-  # record count if they want to
-  def line_count(f)
-    i = 0
-    CSV.foreach(f, :headers => true) {|_| i+=1}
-    i
-  end
-
-  # create a hash with just the fields we want to show in logs
-  def results_to_string(results)
-    return results.merge({
-      :results => results[:results].map do |r|
-        r.merge({
-          :unfinished_batches => r[:unfinished_batches].map do |b|
-            b.to_log
-          end
-        })
-      end,
-      :done_jobs => results[:done_jobs].map {|j| j.to_log}
-    })
-  end
 end
end
@@ -22,6 +22,11 @@ Gem::Specification.new do |s|
  s.add_development_dependency 'restforce', '~>1.4'
  s.add_development_dependency 'rspec', '~>2.14'
  s.add_development_dependency 'pry', '~>0.9'
+ s.add_development_dependency 'pry-stack_explorer', '~>0.4' if RUBY_PLATFORM != 'java'
+ s.add_development_dependency 'rake', '~> 10.3'
+ s.add_development_dependency 'coveralls', '~> 0.7', '>= 0.7.0'
+
+


  s.files = `git ls-files`.split($/)
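With rake and coveralls now listed as development dependencies (and a `Rakefile` shipped in the gem's file list), the test suite presumably runs through rake. A plausible invocation — the default rake task running the specs is an assumption, not something this diff confirms:

```shell
# Assumed developer workflow after these changes.
bundle install        # installs rspec, rake, coveralls, etc. from the gemspec
source env_setup.sh   # credentials file (gitignored; see .gitignore above)
bundle exec rake      # assumed: default task runs the RSpec suite
```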
@@ -1,32 +1,17 @@
  require 'spec_helper'
  require 'multi_json'
- require 'restforce'
  require 'csv'
  require 'tmpdir'
  require 'logger'
+ require 'set'

- LOGGING = false
-
+ # test that somehow fakes the situation in twc
  describe SalesforceBulkQuery do
-
  before :all do
- auth = MultiJson.load(File.read('test_salesforce_credentials.json'), :symbolize_keys => true)
-
- @client = Restforce.new(
- :username => auth[:username],
- :password => auth[:password],
- :security_token => auth[:token],
- :client_id => auth[:client_id],
- :client_secret => auth[:client_secret],
- :api_version => '30.0'
- )
- @api = SalesforceBulkQuery::Api.new(@client,
- :api_version => '30.0',
- :logger => LOGGING ? Logger.new(STDOUT): nil
- )
-
- # switch off the normal logging
- Restforce.log = false
+ @client = SpecHelper.create_default_restforce
+ @api = SpecHelper.create_default_api(@client)
+ @entity = ENV['ENTITY'] || 'Opportunity'
+ @field_list = (ENV['FIELD_LIST'] || "Id,CreatedDate").split(',')
  end

  describe "instance_url" do
@@ -38,21 +23,29 @@ describe SalesforceBulkQuery do
  end

  describe "query" do
+ context "if you give it an invalid SOQL" do
+ it "fails with argument error" do
+ expect{@api.query(@entity, "SELECT Id, SomethingInvalid FROM #{@entity}")}.to raise_error(ArgumentError)
+ end
+ end
  context "when you give it no options" do
  it "downloads the data to a few files", :constraint => 'slow' do
- result = @api.query("Opportunity", "SELECT Id, Name FROM Opportunity")
- result[:filenames].should have_at_least(2).items
- result[:results].should_not be_empty
- result[:done_jobs].should_not be_empty
+ result = @api.query(@entity, "SELECT #{@field_list.join(', ')} FROM #{@entity}", :count_lines => true)
+ filenames = result[:filenames]
+ filenames.should have_at_least(2).items
+ result[:jobs_done].should_not be_empty
+
+ # no duplicate filenames
+ expect(Set.new(filenames).length).to eq(filenames.length)

- result[:filenames].each do |filename|
+ filenames.each do |filename|
  File.size?(filename).should be_true

  lines = CSV.read(filename)

  if lines.length > 1
  # first line should be the header
- lines[0].should eql(["Id", "Name"])
+ lines[0].should eql(@field_list)

  # first id shouldn't be empty
  lines[1][0].should_not be_empty
@@ -79,8 +72,7 @@ describe SalesforceBulkQuery do
  )

  result[:filenames].should have(1).items
- result[:results].should_not be_empty
- result[:done_jobs].should_not be_empty
+ result[:jobs_done].should_not be_empty

  filename = result[:filenames][0]

@@ -99,32 +91,42 @@ describe SalesforceBulkQuery do
  end
  end
  context "when you give it a short time limit" do
- it "downloads just a few files" do
+ it "downloads some data, leaving some unfinished" do
  result = @api.query(
- "Task",
- "SELECT Id, Name, CreatedDate FROM Task",
- :time_limit => 30
+ "Opportunity",
+ "SELECT Id, Name, CreatedDate FROM Opportunity",
+ :time_limit => 15
  )
- result[:results].should_not be_empty
+ # one of them should be non-empty
+ expect((! result[:unfinished_subqueries].empty?) || (! result[:filenames].empty?)).to eq true
+ end
+ end
+ context "when you pass a short job time limit" do
+ it "creates quite a few jobs quickly", :skip => true do
+ # development only
+ result = @api.query(
+ @entity,
+ "SELECT Id, CreatedDate FROM #{@entity}",
+ :count_lines => true,
+ :job_time_limit => 60
+ )
+ require 'pry'; binding.pry
  end
  end
  end

  describe "start_query" do
  it "starts a query that finishes some time later" do
- query = @api.start_query("Opportunity", "SELECT Id, Name, CreatedDate FROM Opportunity")
+ query = @api.start_query("Opportunity", "SELECT Id, Name, CreatedDate FROM Opportunity", :single_batch => true)

  # get a coffee
- sleep(40)
+ sleep(60*2)

  # check the status
- status = query.check_status
- if status[:finished]
- result = query.get_results
- result[:filenames].should have_at_least(2).items
- result[:results].should_not be_empty
- result[:done_jobs].should_not be_empty
- end
+ result = query.get_available_results
+ expect(result[:succeeded]).to eq true
+ result[:filenames].should have_at_least(1).items
+ result[:jobs_done].should_not be_empty
  end

  end
data/spec/spec_helper.rb CHANGED
@@ -1,6 +1,33 @@
  require 'salesforce_bulk_query'
+ require 'restforce'
+

  RSpec.configure do |c|
  c.filter_run :focus => true
  c.run_all_when_everything_filtered = true
+ c.filter_run_excluding :skip => true
+ end
+
+ class SpecHelper
+ DEFAULT_API_VERSION = '30.0'
+ def self.create_default_restforce
+ Restforce.new(
+ :username => ENV['USERNAME'],
+ :password => ENV['PASSWORD'],
+ :security_token => ENV['TOKEN'],
+ :client_id => ENV['CLIENT_ID'],
+ :client_secret => ENV['CLIENT_SECRET'],
+ :api_version => ENV['API_VERSION'] || DEFAULT_API_VERSION
+ )
+ end
+
+ def self.create_default_api(restforce)
+ # switch off the normal logging
+ Restforce.log = false
+
+ SalesforceBulkQuery::Api.new(restforce,
+ :api_version => ENV['API_VERSION'] || DEFAULT_API_VERSION,
+ :logger => ENV['LOGGING'] ? Logger.new(STDOUT): nil
+ )
+ end
  end
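The new `SpecHelper` reads its Salesforce credentials from the environment instead of the old `test_salesforce_credentials.json`. A setup script along these lines could provide them (the gem ships an `env_setup-example.sh`, and `env_setup.sh` is gitignored; all values below are placeholders):

```shell
# Hypothetical env_setup.sh — placeholder values; source this before running the specs.
export USERNAME='me@mycompany.com'
export PASSWORD='mypassword'
export TOKEN='security token Salesforce emailed you'
export CLIENT_ID='consumer key of your connected app'
export CLIENT_SECRET='consumer secret of your connected app'
export API_VERSION='30.0'   # optional, defaults to 30.0
export LOGGING=true         # optional, logs to STDOUT when set
```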
metadata CHANGED
@@ -1,99 +1,147 @@
  --- !ruby/object:Gem::Specification
  name: salesforce_bulk_query
  version: !ruby/object:Gem::Version
- version: 0.0.6
+ version: 0.1.0
  platform: ruby
  authors:
  - Petr Cvengros
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2015-01-09 00:00:00.000000000 Z
+ date: 2015-01-23 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: json
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.8'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.8'
  - !ruby/object:Gem::Dependency
  name: xml-simple
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.1'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.1'
  - !ruby/object:Gem::Dependency
  name: multi_json
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.9'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.9'
  - !ruby/object:Gem::Dependency
  name: restforce
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.4'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '1.4'
  - !ruby/object:Gem::Dependency
  name: rspec
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '2.14'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '2.14'
  - !ruby/object:Gem::Dependency
  name: pry
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '0.9'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: '0.9'
+ - !ruby/object:Gem::Dependency
+ name: pry-stack_explorer
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.4'
+ type: :development
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.4'
+ - !ruby/object:Gem::Dependency
+ name: rake
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '10.3'
+ type: :development
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '10.3'
+ - !ruby/object:Gem::Dependency
+ name: coveralls
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.7'
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.7.0
+ type: :development
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '0.7'
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: 0.7.0
  description: A library for downloading data from Salesforce Bulk API. We only focus
  on querying, other operations of the API aren't supported. Designed to handle a
  lot of data.
@@ -103,17 +151,20 @@ executables: []
  extensions: []
  extra_rdoc_files: []
  files:
- - .gitignore
+ - ".gitignore"
+ - ".travis.yml"
  - Gemfile
  - LICENSE
  - README.md
- - example_test_salesforce_credentials.json
+ - Rakefile
+ - env_setup-example.sh
  - lib/salesforce_bulk_query.rb
  - lib/salesforce_bulk_query/batch.rb
  - lib/salesforce_bulk_query/connection.rb
  - lib/salesforce_bulk_query/job.rb
  - lib/salesforce_bulk_query/logger.rb
  - lib/salesforce_bulk_query/query.rb
+ - lib/salesforce_bulk_query/utils.rb
  - lib/salesforce_bulk_query/version.rb
  - salesforce_bulk_query.gemspec
  - spec/salesforce_bulk_query_spec.rb
@@ -128,12 +179,12 @@ require_paths:
  - lib
  required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
- - - '>='
+ - - ">="
  - !ruby/object:Gem::Version
  version: '1.9'
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
- - - '>='
+ - - ">="
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
@@ -143,3 +194,4 @@ signing_key:
  specification_version: 4
  summary: Downloading data from Salesforce Bulk API made easy and scalable.
  test_files: []
+ has_rdoc:
@@ -1,7 +0,0 @@
1
- {
2
- "username": "me@mycompany.com",
3
- "password": "mypassword",
4
- "token": "token I got in my email",
5
- "client_id": "id for my registered SFDC app",
6
- "client_secret": "secret number for my SFDC app"
7
- }