google-cloud-bigquery 1.16.0 → 1.17.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/AUTHENTICATION.md +7 -27
- data/CHANGELOG.md +8 -0
- data/CONTRIBUTING.md +1 -1
- data/OVERVIEW.md +5 -5
- data/lib/google-cloud-bigquery.rb +5 -11
- data/lib/google/cloud/bigquery.rb +3 -5
- data/lib/google/cloud/bigquery/convert.rb +23 -46
- data/lib/google/cloud/bigquery/copy_job.rb +6 -16
- data/lib/google/cloud/bigquery/credentials.rb +5 -12
- data/lib/google/cloud/bigquery/data.rb +10 -16
- data/lib/google/cloud/bigquery/dataset.rb +58 -118
- data/lib/google/cloud/bigquery/dataset/access.rb +5 -13
- data/lib/google/cloud/bigquery/dataset/list.rb +4 -9
- data/lib/google/cloud/bigquery/external.rb +14 -35
- data/lib/google/cloud/bigquery/extract_job.rb +2 -5
- data/lib/google/cloud/bigquery/insert_response.rb +1 -3
- data/lib/google/cloud/bigquery/job.rb +5 -9
- data/lib/google/cloud/bigquery/job/list.rb +3 -7
- data/lib/google/cloud/bigquery/load_job.rb +18 -33
- data/lib/google/cloud/bigquery/model.rb +1 -4
- data/lib/google/cloud/bigquery/model/list.rb +3 -7
- data/lib/google/cloud/bigquery/project.rb +27 -56
- data/lib/google/cloud/bigquery/project/list.rb +3 -7
- data/lib/google/cloud/bigquery/query_job.rb +40 -79
- data/lib/google/cloud/bigquery/schema.rb +3 -8
- data/lib/google/cloud/bigquery/schema/field.rb +6 -11
- data/lib/google/cloud/bigquery/service.rb +26 -58
- data/lib/google/cloud/bigquery/table.rb +58 -112
- data/lib/google/cloud/bigquery/table/async_inserter.rb +10 -18
- data/lib/google/cloud/bigquery/table/list.rb +3 -7
- data/lib/google/cloud/bigquery/version.rb +1 -1
- metadata +36 -42
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 35c396da998783ed10060a16abd258256f2b14fd9f1a2bae8842a01db2edcd61
+  data.tar.gz: c942df4f34c23a7403344d532b62ad740c0073604d184f2851f4278310558250
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: d8b5a6f8dbec478cbb582f756ce27297e9e284a3146f644bc60a585ac952d7558235603d7a7a53a026cc9f2237bc6bf19890357b39b9559dcf1e4da671e19b7f
+  data.tar.gz: 86cbd63bffb1956b4d5685c79662befc444a6e15337f3da58118cc3cac6f3d7609b06f7be5ee836667f501dc7887b0acd5790fa9ec4f7f954606ab7780aa358a
data/AUTHENTICATION.md
CHANGED
@@ -2,7 +2,9 @@
 
 In general, the google-cloud-bigquery library uses [Service
 Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
-credentials to connect to Google Cloud services. When running on
+credentials to connect to Google Cloud services. When running on Google Cloud
+Platform (GCP), including Google Compute Engine (GCE), Google Kubernetes Engine
+(GKE), Google App Engine (GAE), Google Cloud Functions (GCF) and Cloud Run,
 the credentials will be discovered automatically. When running on other
 environments, the Service Account credentials can be specified by providing the
 path to the [JSON
@@ -35,32 +37,10 @@ providing **Project ID** and **Service Account Credentials** directly in code.
 
 ### Google Cloud Platform environments
 
-
-
-
-should be written as if already authenticated.
-GCE instance][gce-how-to], you add the correct scopes for the APIs you want to
-access. For example:
-
-* **All APIs**
-  * `https://www.googleapis.com/auth/cloud-platform`
-  * `https://www.googleapis.com/auth/cloud-platform.read-only`
-* **BigQuery**
-  * `https://www.googleapis.com/auth/bigquery`
-  * `https://www.googleapis.com/auth/bigquery.insertdata`
-* **Compute Engine**
-  * `https://www.googleapis.com/auth/compute`
-* **Datastore**
-  * `https://www.googleapis.com/auth/datastore`
-  * `https://www.googleapis.com/auth/userinfo.email`
-* **DNS**
-  * `https://www.googleapis.com/auth/ndev.clouddns.readwrite`
-* **Pub/Sub**
-  * `https://www.googleapis.com/auth/pubsub`
-* **Storage**
-  * `https://www.googleapis.com/auth/devstorage.full_control`
-  * `https://www.googleapis.com/auth/devstorage.read_only`
-  * `https://www.googleapis.com/auth/devstorage.read_write`
+When running on Google Cloud Platform (GCP), including Google Compute Engine (GCE),
+Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud Functions
+(GCF) and Cloud Run, the **Project ID** and **Credentials** and are discovered
+automatically. Code should be written as if already authenticated.
 
 ### Environment Variables
 
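For illustration, the two paths described in the revised AUTHENTICATION.md text look roughly like this in application code. This is a minimal sketch; the project ID and keyfile path are placeholders, not values from this diff:

```ruby
require "google/cloud/bigquery"

# On GCE, GKE, GAE, GCF or Cloud Run, project and credentials are discovered.
bigquery = Google::Cloud::Bigquery.new

# Elsewhere, point the client at a Service Account keyfile (or a parsed Hash).
bigquery = Google::Cloud::Bigquery.new(
  project_id:  "my-project",           # placeholder
  credentials: "/path/to/keyfile.json" # placeholder
)
```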
data/CHANGELOG.md
CHANGED
data/CONTRIBUTING.md
CHANGED
@@ -24,7 +24,7 @@ be able to accept your pull requests.
 In order to use the google-cloud-bigquery console and run the project's tests,
 there is a small amount of setup:
 
-1. Install Ruby. google-cloud-bigquery requires Ruby 2.
+1. Install Ruby. google-cloud-bigquery requires Ruby 2.4+. You may choose to
    manage your Ruby and gem installations with [RVM](https://rvm.io/),
    [rbenv](https://github.com/rbenv/rbenv), or
    [chruby](https://github.com/postmodern/chruby).
data/OVERVIEW.md
CHANGED
@@ -6,11 +6,11 @@ is BigQuery?](https://cloud.google.com/bigquery/what-is-bigquery).
 
 The goal of google-cloud is to provide an API that is comfortable to Rubyists.
 Your authentication credentials are detected automatically in Google Cloud
-Platform
-Google
-
-
-Guide}.
+Platform (GCP), including Google Compute Engine (GCE), Google Kubernetes Engine
+(GKE), Google App Engine (GAE), Google Cloud Functions (GCF) and Cloud Run. In
+other environments you can configure authentication easily, either directly in
+your code or via environment variables. Read more about the options for
+connecting in the {file:AUTHENTICATION.md Authentication Guide}.
 
 To help you get started quickly, the first few examples below use a public
 dataset provided by Google. As soon as you have [signed
data/lib/google-cloud-bigquery.rb
CHANGED
@@ -103,13 +103,10 @@ module Google
     #   dataset = bigquery.dataset "my_dataset"
     #   table = dataset.table "my_table"
     #
-    def self.bigquery project_id = nil, credentials = nil, scope: nil,
-                      retries: nil, timeout: nil
+    def self.bigquery project_id = nil, credentials = nil, scope: nil, retries: nil, timeout: nil
       require "google/cloud/bigquery"
-      Google::Cloud::Bigquery.new project_id: project_id,
-                                  credentials: credentials,
-                                  scope: scope, retries: retries,
-                                  timeout: timeout
+      Google::Cloud::Bigquery.new project_id: project_id, credentials: credentials,
+                                  scope: scope, retries: retries, timeout: timeout
     end
   end
 end
@@ -121,16 +118,13 @@ Google::Cloud.configure.add_config! :bigquery do |config|
   end
   default_creds = Google::Cloud::Config.deferred do
     Google::Cloud::Config.credentials_from_env(
-      "BIGQUERY_CREDENTIALS", "BIGQUERY_CREDENTIALS_JSON",
-      "BIGQUERY_KEYFILE", "BIGQUERY_KEYFILE_JSON"
+      "BIGQUERY_CREDENTIALS", "BIGQUERY_CREDENTIALS_JSON", "BIGQUERY_KEYFILE", "BIGQUERY_KEYFILE_JSON"
     )
   end
 
   config.add_field! :project_id, default_project, match: String, allow_nil: true
   config.add_alias! :project, :project_id
-  config.add_field! :credentials, default_creds,
-                    match: [String, Hash, Google::Auth::Credentials],
-                    allow_nil: true
+  config.add_field! :credentials, default_creds, match: [String, Hash, Google::Auth::Credentials], allow_nil: true
   config.add_alias! :keyfile, :credentials
   config.add_field! :scope, nil, match: [String, Array]
   config.add_field! :retries, nil, match: Integer
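The configuration fields registered above can also be set once per process. A minimal sketch, assuming the gem's standard `Google::Cloud::Bigquery.configure` block; the values are placeholders:

```ruby
require "google/cloud/bigquery"

# Sets the defaults that Google::Cloud::Bigquery.new falls back to.
Google::Cloud::Bigquery.configure do |config|
  config.project_id  = "my-project"            # placeholder
  config.credentials = "/path/to/keyfile.json" # placeholder
  config.retries     = 3
  config.timeout     = 60
end

bigquery = Google::Cloud::Bigquery.new # picks up the configured defaults
```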
data/lib/google/cloud/bigquery.rb
CHANGED
@@ -66,8 +66,8 @@ module Google
       #   dataset = bigquery.dataset "my_dataset"
       #   table = dataset.table "my_table"
       #
-      def self.new project_id: nil, credentials: nil, scope: nil, retries: nil,
-
+      def self.new project_id: nil, credentials: nil, scope: nil, retries: nil, timeout: nil, endpoint: nil,
+                   project: nil, keyfile: nil
        scope ||= configure.scope
        retries ||= configure.retries
        timeout ||= configure.timeout
@@ -121,9 +121,7 @@ module Google
       # @private Resolve project.
       def self.resolve_project_id given_project, credentials
         project_id = given_project || default_project_id
-        if credentials.respond_to? :project_id
-          project_id ||= credentials.project_id
-        end
+        project_id ||= credentials.project_id if credentials.respond_to? :project_id
         project_id.to_s # Always cast to a string
       end
 
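For context, the reflowed constructor above takes keyword arguments only; when `project_id` is omitted, `resolve_project_id` falls back to the configured project or to the project embedded in the credentials. An illustrative call, with placeholder values:

```ruby
require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new(
  project_id: "my-project", # optional; resolved from config or credentials if omitted
  retries:    5,            # retry count for API calls
  timeout:    120           # request timeout in seconds
)
```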
data/lib/google/cloud/bigquery/convert.rb
CHANGED
@@ -118,83 +118,62 @@ module Google
        def self.to_query_param value
          if TrueClass === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "BOOL"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: true)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "BOOL"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: true)
            )
          elsif FalseClass === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "BOOL"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: false)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "BOOL"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: false)
            )
          elsif Integer === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "INT64"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: value)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "INT64"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: value)
            )
          elsif Float === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "FLOAT64"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: value)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "FLOAT64"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: value)
            )
          elsif BigDecimal === value
            # Round to precision of 9
            value_str = value.finite? ? value.round(9).to_s("F") : value.to_s
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "NUMERIC"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: value_str)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "NUMERIC"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: value_str)
            )
          elsif String === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "STRING"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: value)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "STRING"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: value)
            )
          elsif DateTime === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "DATETIME"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: value.strftime("%Y-%m-%d %H:%M:%S.%6N"))
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "DATETIME"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: value.strftime("%Y-%m-%d %H:%M:%S.%6N"))
            )
          elsif Date === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "DATE"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: value.to_s)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "DATE"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: value.to_s)
            )
          elsif ::Time === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "TIMESTAMP"),
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "TIMESTAMP"),
              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
                value: value.strftime("%Y-%m-%d %H:%M:%S.%6N%:z"))
            )
          elsif Bigquery::Time === value
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "TIME"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: value.value)
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "TIME"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: value.value)
            )
          elsif value.respond_to?(:read) && value.respond_to?(:rewind)
            value.rewind
            return Google::Apis::BigqueryV2::QueryParameter.new(
-              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(
-                type: "BYTES"),
-              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-                value: Base64.strict_encode64(
+              parameter_type: Google::Apis::BigqueryV2::QueryParameterType.new(type: "BYTES"),
+              parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(value: Base64.strict_encode64(
                value.read.force_encoding("ASCII-8BIT")))
            )
          elsif Array === value
@@ -225,9 +204,7 @@ module Google
              type: "STRUCT",
              struct_types: struct_pairs.map(&:first)
            ),
-            parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(
-              struct_values: struct_values
-            )
+            parameter_value: Google::Apis::BigqueryV2::QueryParameterValue.new(struct_values: struct_values)
          )
        else
          raise "A query parameter of type #{value.class} is not supported."
@@ -390,7 +367,7 @@ module Google
        # nil if the given argument is nil.
        def self.time_to_millis time_obj
          return nil unless time_obj
-          (time_obj.to_i * 1000) + (time_obj.nsec /
+          (time_obj.to_i * 1000) + (time_obj.nsec / 1_000_000)
        end
      end
 
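The `Convert.to_query_param` cases above map Ruby values onto typed BigQuery query parameters (Integer to INT64, TrueClass/FalseClass to BOOL, Time to TIMESTAMP, IO-like objects to BYTES, and so on). A sketch of how those conversions surface in a parameterized query; the dataset, table and column names are placeholders:

```ruby
require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new

# 21 becomes an INT64 parameter, true a BOOL, and Time.now a TIMESTAMP.
data = bigquery.query "SELECT name FROM `my_dataset.my_table` " \
                      "WHERE age >= ? AND active = ? AND created_at < ?",
                      params: [21, true, Time.now]

data.each { |row| puts row[:name] }
```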
data/lib/google/cloud/bigquery/copy_job.rb
CHANGED
@@ -51,9 +51,7 @@ module Google
        def source
          table = @gapi.configuration.copy.source_table
          return nil unless table
-          retrieve_table table.project_id,
-                         table.dataset_id,
-                         table.table_id
+          retrieve_table table.project_id, table.dataset_id, table.table_id
        end
 
        ##
@@ -64,9 +62,7 @@ module Google
        def destination
          table = @gapi.configuration.copy.destination_table
          return nil unless table
-          retrieve_table table.project_id,
-                         table.dataset_id,
-                         table.table_id
+          retrieve_table table.project_id, table.dataset_id, table.table_id
        end
 
        ##
@@ -139,9 +135,7 @@ module Google
        #
        # @!group Attributes
        def encryption
-          EncryptionConfiguration.from_gapi
-            @gapi.configuration.copy.destination_encryption_configuration
-          )
+          EncryptionConfiguration.from_gapi @gapi.configuration.copy.destination_encryption_configuration
        end
 
        ##
@@ -227,8 +221,7 @@ module Google
        #
        # @!group Attributes
        def create= new_create
-          @gapi.configuration.copy.update!
-            create_disposition: Convert.create_disposition(new_create)
+          @gapi.configuration.copy.update! create_disposition: Convert.create_disposition(new_create)
        end
 
        ##
@@ -248,8 +241,7 @@ module Google
        #
        # @!group Attributes
        def write= new_write
-          @gapi.configuration.copy.update!
-            write_disposition: Convert.write_disposition(new_write)
+          @gapi.configuration.copy.update! write_disposition: Convert.write_disposition(new_write)
        end
 
        ##
@@ -273,9 +265,7 @@ module Google
        #
        # @!group Attributes
        def encryption= val
-          @gapi.configuration.copy.update!
-            destination_encryption_configuration: val.to_gapi
-          )
+          @gapi.configuration.copy.update! destination_encryption_configuration: val.to_gapi
        end
 
        ##
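The reflowed `create=`, `write=` and `encryption` accessors belong to the copy job configuration. A usage sketch for illustration only (dataset and table names are placeholders); `:needed` and `:truncate` are run through `Convert.create_disposition` and `Convert.write_disposition` as shown above:

```ruby
require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"
source   = dataset.table "source_table"
target   = dataset.table "target_table", skip_lookup: true

# The block yields the copy job configuration, whose setters appear in the diff.
job = source.copy_job target do |copy|
  copy.create = :needed   # CREATE_IF_NEEDED
  copy.write  = :truncate # WRITE_TRUNCATE
end
job.wait_until_done!
```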
data/lib/google/cloud/bigquery/credentials.rb
CHANGED
@@ -39,18 +39,11 @@ module Google
      #
      class Credentials < Google::Auth::Credentials
        SCOPE = ["https://www.googleapis.com/auth/bigquery"].freeze
-        PATH_ENV_VARS =
-
-
-
-
-        JSON_ENV_VARS = %w[BIGQUERY_CREDENTIALS_JSON
-                           GOOGLE_CLOUD_CREDENTIALS_JSON
-                           BIGQUERY_KEYFILE_JSON
-                           GOOGLE_CLOUD_KEYFILE_JSON
-                           GCLOUD_KEYFILE_JSON].freeze
-        DEFAULT_PATHS = \
-          ["~/.config/gcloud/application_default_credentials.json"].freeze
+        PATH_ENV_VARS = ["BIGQUERY_CREDENTIALS", "GOOGLE_CLOUD_CREDENTIALS", "BIGQUERY_KEYFILE", "GOOGLE_CLOUD_KEYFILE",
+                         "GCLOUD_KEYFILE"].freeze
+        JSON_ENV_VARS = ["BIGQUERY_CREDENTIALS_JSON", "GOOGLE_CLOUD_CREDENTIALS_JSON", "BIGQUERY_KEYFILE_JSON",
+                         "GOOGLE_CLOUD_KEYFILE_JSON", "GCLOUD_KEYFILE_JSON"].freeze
+        DEFAULT_PATHS = ["~/.config/gcloud/application_default_credentials.json"].freeze
      end
    end
  end
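The consolidated constant lists above control where credentials are looked up. Two illustrative ways to use them; the keyfile path is a placeholder:

```ruby
require "google/cloud/bigquery"

# Point one of the PATH_ENV_VARS at a service account keyfile ...
ENV["BIGQUERY_CREDENTIALS"] = "/path/to/keyfile.json" # placeholder

bigquery = Google::Cloud::Bigquery.new

# ... or build the credentials explicitly and pass them in.
creds = Google::Cloud::Bigquery::Credentials.new "/path/to/keyfile.json"
bigquery = Google::Cloud::Bigquery.new credentials: creds
```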
data/lib/google/cloud/bigquery/data.rb
CHANGED
@@ -227,8 +227,7 @@ module Google
        # @return [String, nil] The type of query statement.
        #
        def statement_type
-
-          job_gapi.statistics.query.statement_type
+          job_gapi&.statistics&.query&.statement_type
        end
 
        ##
@@ -249,8 +248,8 @@ module Google
        #   data.ddl? #=> true
        #
        def ddl?
-
-
+          ["CREATE_MODEL", "CREATE_TABLE", "CREATE_TABLE_AS_SELECT", "CREATE_VIEW", "\n", "DROP_MODEL", "DROP_TABLE",
+           "DROP_VIEW"].include? statement_type
        end
 
        ##
@@ -273,7 +272,7 @@ module Google
        #   data.dml? #=> true
        #
        def dml?
-
+          ["INSERT", "UPDATE", "MERGE", "DELETE"].include? statement_type
        end
 
        ##
@@ -292,8 +291,7 @@ module Google
        # @return [String, nil] The DDL operation performed.
        #
        def ddl_operation_performed
-
-          job_gapi.statistics.query.ddl_operation_performed
+          job_gapi&.statistics&.query&.ddl_operation_performed
        end
 
        ##
@@ -305,10 +303,9 @@ module Google
        # reference state.
        #
        def ddl_target_table
-          return nil unless job_gapi && job_gapi.statistics.query
          ensure_service!
-          table = job_gapi
-          return nil
+          table = job_gapi&.statistics&.query&.ddl_target_table
+          return nil if table.nil?
          Google::Cloud::Bigquery::Table.new_reference_from_gapi table, service
        end
 
@@ -320,8 +317,7 @@ module Google
        # or `nil` if the query is not a DML statement.
        #
        def num_dml_affected_rows
-
-          job_gapi.statistics.query.num_dml_affected_rows
+          job_gapi&.statistics&.query&.num_dml_affected_rows
        end
 
        ##
@@ -447,7 +443,7 @@ module Google
            results.each { |r| yield r }
            if request_limit
              request_limit -= 1
-              break if request_limit
+              break if request_limit.negative?
            end
            break unless results.next?
            results = results.next
@@ -458,9 +454,7 @@ module Google
        # @private New Data from a response object.
        def self.from_gapi_json gapi_json, table_gapi, job_gapi, service
          rows = gapi_json[:rows] || []
-          unless rows.empty?
-            rows = Convert.format_rows rows, table_gapi.schema.fields
-          end
+          rows = Convert.format_rows rows, table_gapi.schema.fields unless rows.empty?
 
          data = new rows
          data.table_gapi = table_gapi