google-cloud-bigquery-storage-v1 0.1.2 → 0.2.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/AUTHENTICATION.md +6 -6
- data/README.md +48 -1
- data/lib/google-cloud-bigquery-storage-v1.rb +21 -1
- data/lib/google/cloud/bigquery/storage/v1.rb +18 -0
- data/lib/google/cloud/bigquery/storage/v1/big_query_read.rb +34 -1
- data/lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb +121 -147
- data/lib/google/cloud/bigquery/storage/v1/big_query_read/credentials.rb +1 -1
- data/lib/google/cloud/bigquery/storage/v1/big_query_read/paths.rb +11 -11
- data/lib/google/cloud/bigquery/storage/v1/version.rb +1 -1
- data/proto_docs/google/api/resource.rb +12 -12
- data/proto_docs/google/cloud/bigquery/storage/v1/arrow.rb +7 -7
- data/proto_docs/google/cloud/bigquery/storage/v1/avro.rb +7 -7
- data/proto_docs/google/cloud/bigquery/storage/v1/storage.rb +34 -34
- data/proto_docs/google/cloud/bigquery/storage/v1/stream.rb +21 -21
- data/proto_docs/google/protobuf/timestamp.rb +4 -4
- metadata +33 -6
- data/lib/google/cloud/common_resources_pb.rb +0 -15
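The changes below are dominated by two user-visible updates: generated code and documentation now spell out library constants with a leading `::` (for example `::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client`), and the client configuration gains a `quota_project` attribute. A minimal sketch of client setup against 0.2.2, assuming Application Default Credentials; the project id is a placeholder:

```ruby
require "google/cloud/bigquery/storage/v1"

# Global configuration now accepts quota_project alongside timeout, metadata, etc.
::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
  config.timeout       = 10.0
  config.quota_project = "my-billing-project" # placeholder project id
end

# Constants are written with the leading :: throughout this release.
client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
```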
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 43d19b3d610e13d3a29f38a6bfe4912d1c53ad5818f57d30ce59cce9ad2d7831
+  data.tar.gz: a1c820f52d31c66955391ebef6be8163344e725b52774ceafb878d2301b29748
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: bd43fe7ec72516ee8cc78cdcca41f741377e5bda754bcc06ee503387288587387540a9b79a62b7879f342f469548736a3bbf38bf8f0372627019296c856824e2
+  data.tar.gz: 59106091d5359c87b4f3ddbf83fe631715d459cde23d60723b8a72e99e961fcaa5caee70ea382a3782ba6789b958214b87047a14ebc13db5ceac99c9ff39a0d8
data/AUTHENTICATION.md
CHANGED
@@ -27,7 +27,7 @@ export BIGQUERY_STORAGE_CREDENTIALS=path/to/keyfile.json
 ```ruby
 require "google/cloud/bigquery/storage/v1"

-client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
 ```

 ## Credential Lookup
@@ -64,7 +64,7 @@ containers where writing files is difficult or not encouraged.

 The environment variables that google-cloud-bigquery-storage-v1
 checks for credentials are configured on the service Credentials class (such as
-{Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Credentials}):
+{::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Credentials}):

 1. `BIGQUERY_STORAGE_CREDENTIALS` - Path to JSON file, or JSON contents
 2. `BIGQUERY_STORAGE_KEYFILE` - Path to JSON file, or JSON contents
@@ -77,7 +77,7 @@ require "google/cloud/bigquery/storage/v1"

 ENV["BIGQUERY_STORAGE_CREDENTIALS"] = "path/to/keyfile.json"

-client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
 ```

 ### Configuration
@@ -88,7 +88,7 @@ environment variables. Either on an individual client initialization:
 ```ruby
 require "google/cloud/bigquery/storage/v1"

-client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new do |config|
+client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new do |config|
   config.credentials = "path/to/keyfile.json"
 end
 ```
@@ -98,11 +98,11 @@ Or configured globally for all clients:
 ```ruby
 require "google/cloud/bigquery/storage/v1"

-Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
+::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
   config.credentials = "path/to/keyfile.json"
 end

-client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
 ```

 ### Cloud SDK
data/README.md
CHANGED
@@ -1,4 +1,4 @@
-# BigQuery Storage V1
+# Ruby Client for the BigQuery Storage V1 API

 API Client library for the BigQuery Storage V1 API

@@ -12,6 +12,53 @@ https://github.com/googleapis/google-cloud-ruby
 $ gem install google-cloud-bigquery-storage-v1
 ```

+## Before You Begin
+
+In order to use this library, you first need to go through the following steps:
+
+1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
+1. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
+1. {file:AUTHENTICATION.md Set up authentication.}
+
+## Quick Start
+
+```ruby
+require "google/cloud/bigquery/storage/v1"
+
+client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+request = my_create_request
+response = client.create_read_session request
+```
+
+View the [Client Library Documentation](https://googleapis.dev/ruby/google-cloud-bigquery-storage-v1/latest)
+for class and method documentation.
+
+## Enabling Logging
+
+To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
+The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib/libdoc/logger/rdoc/Logger.html) as shown below,
+or a [`Google::Cloud::Logging::Logger`](https://googleapis.dev/ruby/google-cloud-logging/latest)
+that will write logs to [Cloud Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
+and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.
+
+Configuring a Ruby stdlib logger:
+
+```ruby
+require "logger"
+
+module MyLogger
+  LOGGER = Logger.new $stderr, level: Logger::WARN
+  def logger
+    LOGGER
+  end
+end
+
+# Define a gRPC module-level logger method before grpc/logconfig.rb loads.
+module GRPC
+  extend MyLogger
+end
+```
+
 ## Supported Ruby Versions

 This library is supported on Ruby 2.4+.
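The Quick Start above leaves `my_create_request` undefined. Based on the `create_read_session` and `read_rows` signatures documented in the client.rb diff further down, a fuller sketch could look like the following; note that the `ReadSession` fields (`table`, `data_format`) and the `streams` accessor come from the BigQuery Storage v1 API itself rather than from this diff, and the project and table ids are placeholders:

```ruby
require "google/cloud/bigquery/storage/v1"

client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new

# create_read_session also accepts keyword arguments (parent:, read_session:, max_stream_count:).
session = client.create_read_session(
  parent: "projects/my-project", # placeholder project id
  read_session: {
    table:       "projects/my-project/datasets/my_dataset/tables/my_table", # placeholder table
    data_format: :AVRO
  },
  max_stream_count: 1
)

# read_rows returns an Enumerable of ReadRowsResponse messages for a single stream.
client.read_rows(read_stream: session.streams.first.name, offset: 0).each do |response|
  puts response.row_count
end
```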
data/lib/google-cloud-bigquery-storage-v1.rb
CHANGED
@@ -1 +1,21 @@
-
+# frozen_string_literal: true
+
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Auto-generated by gapic-generator-ruby. DO NOT EDIT!
+
+# This gem does not autoload during Bundler.require. To load this gem,
+# issue explicit require statements for the packages desired, e.g.:
+# require "google/cloud/bigquery/storage/v1"
data/lib/google/cloud/bigquery/storage/v1.rb
CHANGED
@@ -17,3 +17,21 @@
 # Auto-generated by gapic-generator-ruby. DO NOT EDIT!

 require "google/cloud/bigquery/storage/v1/big_query_read"
+require "google/cloud/bigquery/storage/v1/version"
+
+module Google
+  module Cloud
+    module Bigquery
+      module Storage
+        ##
+        # To load this package, including all its services, and instantiate a client:
+        #
+        #     require "google/cloud/bigquery/storage/v1"
+        #     client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+        #
+        module V1
+        end
+      end
+    end
+  end
+end
data/lib/google/cloud/bigquery/storage/v1/big_query_read.rb
CHANGED
@@ -16,5 +16,38 @@

 # Auto-generated by gapic-generator-ruby. DO NOT EDIT!

-require "
+require "gapic/common"
+require "gapic/config"
+require "gapic/config/method"
+
+require "google/cloud/bigquery/storage/v1/version"
+
 require "google/cloud/bigquery/storage/v1/big_query_read/credentials"
+require "google/cloud/bigquery/storage/v1/big_query_read/paths"
+require "google/cloud/bigquery/storage/v1/big_query_read/client"
+
+module Google
+  module Cloud
+    module Bigquery
+      module Storage
+        module V1
+          ##
+          # BigQuery Read API.
+          #
+          # The Read API can be used to read data from BigQuery.
+          #
+          # To load this service and instantiate a client:
+          #
+          #     require "google/cloud/bigquery/storage/v1/big_query_read"
+          #     client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+          #
+          module BigQueryRead
+          end
+        end
+      end
+    end
+  end
+end
+
+helper_path = ::File.join __dir__, "big_query_read", "helpers.rb"
+require "google/cloud/bigquery/storage/v1/big_query_read/helpers" if ::File.file? helper_path
data/lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb
CHANGED
@@ -16,15 +16,8 @@

 # Auto-generated by gapic-generator-ruby. DO NOT EDIT!

-require "gapic/common"
-require "gapic/config"
-require "gapic/config/method"
-
 require "google/cloud/errors"
-require "google/cloud/bigquery/storage/v1/version"
 require "google/cloud/bigquery/storage/v1/storage_pb"
-require "google/cloud/bigquery/storage/v1/big_query_read/credentials"
-require "google/cloud/bigquery/storage/v1/big_query_read/paths"

 module Google
   module Cloud
@@ -48,15 +41,15 @@ module Google
 ##
 # Configure the BigQueryRead Client class.
 #
-# See {Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client::Configuration}
+# See {::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client::Configuration}
 # for a description of the configuration fields.
 #
 # ## Example
 #
 # To modify the configuration for all BigQueryRead clients:
 #
-#     Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
-#       config.timeout =
+#     ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
+#       config.timeout = 10.0
 #     end
 #
 # @yield [config] Configure the Client client.
@@ -112,7 +105,7 @@ module Google
 # but structural changes (adding new fields, etc.) are not allowed. Structural changes
 # should be made on {Client.configure}.
 #
-# See {Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client::Configuration}
+# See {::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client::Configuration}
 # for a description of the configuration fields.
 #
 # @yield [config] Configure the Client client.
@@ -133,13 +126,13 @@ module Google
 # To create a new BigQueryRead client with the default
 # configuration:
 #
-#     client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+#     client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
 #
 # To create a new BigQueryRead client with a custom
 # configuration:
 #
-#     client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new do |config|
-#       config.timeout =
+#     client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new do |config|
+#       config.timeout = 10.0
 #     end
 #
 # @yield [config] Configure the BigQueryRead client.
@@ -164,10 +157,11 @@ module Google
 if credentials.is_a?(String) || credentials.is_a?(Hash)
   credentials = Credentials.new credentials, scope: @config.scope
 end
-@quota_project_id =
+@quota_project_id = @config.quota_project
+@quota_project_id ||= credentials.quota_project_id if credentials.respond_to? :quota_project_id

-@big_query_read_stub = Gapic::ServiceStub.new(
-  Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Stub,
+@big_query_read_stub = ::Gapic::ServiceStub.new(
+  ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Stub,
   credentials: credentials,
   endpoint: @config.endpoint,
   channel_args: @config.channel_args,
@@ -199,36 +193,26 @@ module Google
 # not require manual clean-up by the caller.
 #
 # @overload create_read_session(request, options = nil)
-#
-#
-#
-#
-#
-#
-#
-#   A particular row can be read by at most one stream. When the caller has
-#   reached the end of each stream in the session, then all the data in the
-#   table has been read.
-#
-#   Data is assigned to each stream such that roughly the same number of
-#   rows can be read from each stream. Because the server-side unit for
-#   assigning data is collections of rows, the API does not guarantee that
-#   each stream will return the same number or rows. Additionally, the
-#   limits are enforced based on the number of pre-filtered rows, so some
-#   filters can lead to lopsided assignments.
-#
-#   Read sessions automatically expire 24 hours after they are created and do
-#   not require manual clean-up by the caller.
-#   @param options [Gapic::CallOptions, Hash]
+#   Pass arguments to `create_read_session` via a request object, either of type
+#   {::Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest} or an equivalent Hash.
+#
+#   @param request [::Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest, ::Hash]
+#     A request object representing the call parameters. Required. To specify no
+#     parameters, or to keep all the default parameter values, pass an empty Hash.
+#   @param options [::Gapic::CallOptions, ::Hash]
 #     Overrides the default settings for this call, e.g, timeout, retries, etc. Optional.
 #
 # @overload create_read_session(parent: nil, read_session: nil, max_stream_count: nil)
-#
+#   Pass arguments to `create_read_session` via keyword arguments. Note that at
+#   least one keyword argument is required. To specify no parameters, or to keep all
+#   the default parameter values, pass an empty Hash as a request object (see above).
+#
+#   @param parent [::String]
 #     Required. The request project that owns the session, in the form of
 #     `projects/{project_id}`.
-#   @param read_session [Google::Cloud::Bigquery::Storage::V1::ReadSession
+#   @param read_session [::Google::Cloud::Bigquery::Storage::V1::ReadSession, ::Hash]
 #     Required. Session to be created.
-#   @param max_stream_count [Integer]
+#   @param max_stream_count [::Integer]
 #     Max initial number of streams. If unset or zero, the server will
 #     provide a value of streams so as to produce reasonable throughput. Must be
 #     non-negative. The number of streams may be lower than the requested number,
@@ -238,28 +222,27 @@ module Google
 #
 #     Streams must be read starting from offset 0.
 #
-#
 # @yield [response, operation] Access the result along with the RPC operation
-# @yieldparam response [Google::Cloud::Bigquery::Storage::V1::ReadSession]
-# @yieldparam operation [GRPC::ActiveCall::Operation]
+# @yieldparam response [::Google::Cloud::Bigquery::Storage::V1::ReadSession]
+# @yieldparam operation [::GRPC::ActiveCall::Operation]
 #
-# @return [Google::Cloud::Bigquery::Storage::V1::ReadSession]
+# @return [::Google::Cloud::Bigquery::Storage::V1::ReadSession]
 #
-# @raise [Google::Cloud::Error] if the RPC is aborted.
+# @raise [::Google::Cloud::Error] if the RPC is aborted.
 #
 def create_read_session request, options = nil
-  raise ArgumentError, "request must be provided" if request.nil?
+  raise ::ArgumentError, "request must be provided" if request.nil?

-  request = Gapic::Protobuf.coerce request, to: Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest
+  request = ::Gapic::Protobuf.coerce request, to: ::Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest

   # Converts hash and nil to an options object
-  options = Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h
+  options = ::Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h

   # Customize the options with defaults
   metadata = @config.rpcs.create_read_session.metadata.to_h

   # Set x-goog-api-client and x-goog-user-project headers
-  metadata[:"x-goog-api-client"] ||= Gapic::Headers.x_goog_api_client \
+  metadata[:"x-goog-api-client"] ||= ::Gapic::Headers.x_goog_api_client \
     lib_name: @config.lib_name, lib_version: @config.lib_version,
     gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
   metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
@@ -280,8 +263,8 @@ module Google
   yield response, operation if block_given?
   return response
 end
-rescue GRPC::BadStatus => e
-  raise Google::Cloud::Error.from_error(e)
+rescue ::GRPC::BadStatus => e
+  raise ::Google::Cloud::Error.from_error(e)
 end

 ##
@@ -294,47 +277,48 @@ module Google
 # state of the stream.
 #
 # @overload read_rows(request, options = nil)
-#
-#
-#
-#
-#
-#
-#
-#   state of the stream.
-#   @param options [Gapic::CallOptions, Hash]
+#   Pass arguments to `read_rows` via a request object, either of type
+#   {::Google::Cloud::Bigquery::Storage::V1::ReadRowsRequest} or an equivalent Hash.
+#
+#   @param request [::Google::Cloud::Bigquery::Storage::V1::ReadRowsRequest, ::Hash]
+#     A request object representing the call parameters. Required. To specify no
+#     parameters, or to keep all the default parameter values, pass an empty Hash.
+#   @param options [::Gapic::CallOptions, ::Hash]
 #     Overrides the default settings for this call, e.g, timeout, retries, etc. Optional.
 #
 # @overload read_rows(read_stream: nil, offset: nil)
-#
+#   Pass arguments to `read_rows` via keyword arguments. Note that at
+#   least one keyword argument is required. To specify no parameters, or to keep all
+#   the default parameter values, pass an empty Hash as a request object (see above).
+#
+#   @param read_stream [::String]
 #     Required. Stream to read rows from.
-#   @param offset [Integer]
+#   @param offset [::Integer]
 #     The offset requested must be less than the last row read from Read.
 #     Requesting a larger offset is undefined. If not specified, start reading
 #     from offset zero.
 #
-#
 # @yield [response, operation] Access the result along with the RPC operation
-# @yieldparam response [Enumerable
-# @yieldparam operation [GRPC::ActiveCall::Operation]
+# @yieldparam response [::Enumerable<::Google::Cloud::Bigquery::Storage::V1::ReadRowsResponse>]
+# @yieldparam operation [::GRPC::ActiveCall::Operation]
 #
-# @return [Enumerable
+# @return [::Enumerable<::Google::Cloud::Bigquery::Storage::V1::ReadRowsResponse>]
 #
-# @raise [Google::Cloud::Error] if the RPC is aborted.
+# @raise [::Google::Cloud::Error] if the RPC is aborted.
 #
 def read_rows request, options = nil
-  raise ArgumentError, "request must be provided" if request.nil?
+  raise ::ArgumentError, "request must be provided" if request.nil?

-  request = Gapic::Protobuf.coerce request, to: Google::Cloud::Bigquery::Storage::V1::ReadRowsRequest
+  request = ::Gapic::Protobuf.coerce request, to: ::Google::Cloud::Bigquery::Storage::V1::ReadRowsRequest

   # Converts hash and nil to an options object
-  options = Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h
+  options = ::Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h

   # Customize the options with defaults
   metadata = @config.rpcs.read_rows.metadata.to_h

   # Set x-goog-api-client and x-goog-user-project headers
-  metadata[:"x-goog-api-client"] ||= Gapic::Headers.x_goog_api_client \
+  metadata[:"x-goog-api-client"] ||= ::Gapic::Headers.x_goog_api_client \
     lib_name: @config.lib_name, lib_version: @config.lib_version,
     gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
   metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
@@ -355,8 +339,8 @@ module Google
   yield response, operation if block_given?
   return response
 end
-rescue GRPC::BadStatus => e
-  raise Google::Cloud::Error.from_error(e)
+rescue ::GRPC::BadStatus => e
+  raise ::Google::Cloud::Error.from_error(e)
 end

 ##
@@ -374,26 +358,23 @@ module Google
 # completion.
 #
 # @overload split_read_stream(request, options = nil)
-#
-#
-#
-#
-#
-#
-#
-#
-#   Moreover, the two child streams will be allocated back-to-back in the
-#   original `ReadStream`. Concretely, it is guaranteed that for streams
-#   original, primary, and residual, that original[0-j] = primary[0-j] and
-#   original[j-n] = residual[0-m] once the streams have been read to
-#   completion.
-#   @param options [Gapic::CallOptions, Hash]
+#   Pass arguments to `split_read_stream` via a request object, either of type
+#   {::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamRequest} or an equivalent Hash.
+#
+#   @param request [::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamRequest, ::Hash]
+#     A request object representing the call parameters. Required. To specify no
+#     parameters, or to keep all the default parameter values, pass an empty Hash.
+#   @param options [::Gapic::CallOptions, ::Hash]
 #     Overrides the default settings for this call, e.g, timeout, retries, etc. Optional.
 #
 # @overload split_read_stream(name: nil, fraction: nil)
-#
+#   Pass arguments to `split_read_stream` via keyword arguments. Note that at
+#   least one keyword argument is required. To specify no parameters, or to keep all
+#   the default parameter values, pass an empty Hash as a request object (see above).
+#
+#   @param name [::String]
 #     Required. Name of the stream to split.
-#   @param fraction [Float]
+#   @param fraction [::Float]
 #     A value in the range (0.0, 1.0) that specifies the fractional point at
 #     which the original stream should be split. The actual split point is
 #     evaluated on pre-filtered rows, so if a filter is provided, then there is
@@ -402,28 +383,27 @@ module Google
 #     server-side unit for assigning data is collections of rows, this fraction
 #     will always map to a data storage boundary on the server side.
 #
-#
 # @yield [response, operation] Access the result along with the RPC operation
-# @yieldparam response [Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse]
-# @yieldparam operation [GRPC::ActiveCall::Operation]
+# @yieldparam response [::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse]
+# @yieldparam operation [::GRPC::ActiveCall::Operation]
 #
-# @return [Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse]
+# @return [::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse]
 #
-# @raise [Google::Cloud::Error] if the RPC is aborted.
+# @raise [::Google::Cloud::Error] if the RPC is aborted.
 #
 def split_read_stream request, options = nil
-  raise ArgumentError, "request must be provided" if request.nil?
+  raise ::ArgumentError, "request must be provided" if request.nil?

-  request = Gapic::Protobuf.coerce request, to: Google::Cloud::Bigquery::Storage::V1::SplitReadStreamRequest
+  request = ::Gapic::Protobuf.coerce request, to: ::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamRequest

   # Converts hash and nil to an options object
-  options = Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h
+  options = ::Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h

   # Customize the options with defaults
   metadata = @config.rpcs.split_read_stream.metadata.to_h

   # Set x-goog-api-client and x-goog-user-project headers
-  metadata[:"x-goog-api-client"] ||= Gapic::Headers.x_goog_api_client \
+  metadata[:"x-goog-api-client"] ||= ::Gapic::Headers.x_goog_api_client \
     lib_name: @config.lib_name, lib_version: @config.lib_version,
     gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
   metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
@@ -444,8 +424,8 @@ module Google
   yield response, operation if block_given?
   return response
 end
-rescue GRPC::BadStatus => e
-  raise Google::Cloud::Error.from_error(e)
+rescue ::GRPC::BadStatus => e
+  raise ::Google::Cloud::Error.from_error(e)
 end

 ##
@@ -455,7 +435,7 @@ module Google
 # providing control over timeouts, retry behavior, logging, transport
 # parameters, and other low-level controls. Certain parameters can also be
 # applied individually to specific RPCs. See
-# {Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client::Configuration::Rpcs}
+# {::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client::Configuration::Rpcs}
 # for a list of RPCs that can be configured independently.
 #
 # Configuration can be applied globally to all clients, or to a single client
@@ -466,22 +446,22 @@ module Google
 # To modify the global config, setting the timeout for create_read_session
 # to 20 seconds, and all remaining timeouts to 10 seconds:
 #
-#     Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
-#       config.timeout =
-#       config.rpcs.create_read_session.timeout =
+#     ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
+#       config.timeout = 10.0
+#       config.rpcs.create_read_session.timeout = 20.0
 #     end
 #
 # To apply the above configuration only to a new client:
 #
-#     client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new do |config|
-#       config.timeout =
-#       config.rpcs.create_read_session.timeout =
+#     client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new do |config|
+#       config.timeout = 10.0
+#       config.rpcs.create_read_session.timeout = 20.0
 #     end
 #
 # @!attribute [rw] endpoint
 #   The hostname or hostname:port of the service endpoint.
 #   Defaults to `"bigquerystorage.googleapis.com"`.
-#   @return [String]
+#   @return [::String]
 # @!attribute [rw] credentials
 #   Credentials to send with calls. You may provide any of the following types:
 #    * (`String`) The path to a service account key file in JSON format
@@ -493,29 +473,29 @@ module Google
 #    * (`GRPC::Core::Channel`) a gRPC channel with included credentials
 #    * (`GRPC::Core::ChannelCredentials`) a gRPC credentails object
 #    * (`nil`) indicating no credentials
-#   @return [Object]
+#   @return [::Object]
 # @!attribute [rw] scope
 #   The OAuth scopes
-#   @return [Array
+#   @return [::Array<::String>]
 # @!attribute [rw] lib_name
 #   The library name as recorded in instrumentation and logging
-#   @return [String]
+#   @return [::String]
 # @!attribute [rw] lib_version
 #   The library version as recorded in instrumentation and logging
-#   @return [String]
+#   @return [::String]
 # @!attribute [rw] channel_args
 #   Extra parameters passed to the gRPC channel. Note: this is ignored if a
 #   `GRPC::Core::Channel` object is provided as the credential.
-#   @return [Hash]
+#   @return [::Hash]
 # @!attribute [rw] interceptors
 #   An array of interceptors that are run before calls are executed.
-#   @return [Array
+#   @return [::Array<::GRPC::ClientInterceptor>]
 # @!attribute [rw] timeout
-#   The call timeout in
-#   @return [Numeric]
+#   The call timeout in seconds.
+#   @return [::Numeric]
 # @!attribute [rw] metadata
 #   Additional gRPC headers to be sent with the call.
-#   @return [Hash{Symbol
+#   @return [::Hash{::Symbol=>::String}]
 # @!attribute [rw] retry_policy
 #   The retry policy. The value is a hash with the following keys:
 #    * `:initial_delay` (*type:* `Numeric`) - The initial delay in seconds.
@@ -523,25 +503,29 @@ module Google
 #    * `:multiplier` (*type:* `Numeric`) - The incremental backoff multiplier.
 #    * `:retry_codes` (*type:* `Array<String>`) - The error codes that should
 #      trigger a retry.
-#   @return [Hash]
+#   @return [::Hash]
+# @!attribute [rw] quota_project
+#   A separate project against which to charge quota.
+#   @return [::String]
 #
 class Configuration
-  extend Gapic::Config
+  extend ::Gapic::Config

-  config_attr :endpoint,
-  config_attr :credentials,
+  config_attr :endpoint, "bigquerystorage.googleapis.com", ::String
+  config_attr :credentials, nil do |value|
     allowed = [::String, ::Hash, ::Proc, ::Google::Auth::Credentials, ::Signet::OAuth2::Client, nil]
     allowed += [::GRPC::Core::Channel, ::GRPC::Core::ChannelCredentials] if defined? ::GRPC
     allowed.any? { |klass| klass === value }
   end
-  config_attr :scope,
-  config_attr :lib_name,
-  config_attr :lib_version,
-  config_attr(:channel_args,
-  config_attr :interceptors,
-  config_attr :timeout,
-  config_attr :metadata,
-  config_attr :retry_policy,
+  config_attr :scope, nil, ::String, ::Array, nil
+  config_attr :lib_name, nil, ::String, nil
+  config_attr :lib_version, nil, ::String, nil
+  config_attr(:channel_args, { "grpc.service_config_disable_resolution"=>1 }, ::Hash, nil)
+  config_attr :interceptors, nil, ::Array, nil
+  config_attr :timeout, nil, ::Numeric, nil
+  config_attr :metadata, nil, ::Hash, nil
+  config_attr :retry_policy, nil, ::Hash, ::Proc, nil
+  config_attr :quota_project, nil, ::String, nil

   # @private
   def initialize parent_config = nil
@@ -557,7 +541,7 @@ module Google
 def rpcs
   @rpcs ||= begin
     parent_rpcs = nil
-    parent_rpcs = @parent_config.rpcs if @parent_config&.respond_to?
+    parent_rpcs = @parent_config.rpcs if defined?(@parent_config) && @parent_config&.respond_to?(:rpcs)
     Rpcs.new parent_rpcs
   end
 end
@@ -582,28 +566,28 @@ module Google
 class Rpcs
   ##
   # RPC-specific configuration for `create_read_session`
-  # @return [Gapic::Config::Method]
+  # @return [::Gapic::Config::Method]
   #
   attr_reader :create_read_session
   ##
   # RPC-specific configuration for `read_rows`
-  # @return [Gapic::Config::Method]
+  # @return [::Gapic::Config::Method]
   #
   attr_reader :read_rows
   ##
   # RPC-specific configuration for `split_read_stream`
-  # @return [Gapic::Config::Method]
+  # @return [::Gapic::Config::Method]
   #
   attr_reader :split_read_stream

   # @private
   def initialize parent_rpcs = nil
     create_read_session_config = parent_rpcs&.create_read_session if parent_rpcs&.respond_to? :create_read_session
-    @create_read_session = Gapic::Config::Method.new create_read_session_config
+    @create_read_session = ::Gapic::Config::Method.new create_read_session_config
     read_rows_config = parent_rpcs&.read_rows if parent_rpcs&.respond_to? :read_rows
-    @read_rows = Gapic::Config::Method.new read_rows_config
+    @read_rows = ::Gapic::Config::Method.new read_rows_config
     split_read_stream_config = parent_rpcs&.split_read_stream if parent_rpcs&.respond_to? :split_read_stream
-    @split_read_stream = Gapic::Config::Method.new split_read_stream_config
+    @split_read_stream = ::Gapic::Config::Method.new split_read_stream_config

     yield self if block_given?
   end
@@ -616,13 +600,3 @@ module Google
 end
 end
 end
-
-# rubocop:disable Lint/HandleExceptions
-
-# Once client is loaded, load helpers.rb if it exists.
-begin
-  require "google/cloud/bigquery/storage/v1/big_query_read/helpers"
-rescue LoadError
-end
-
-# rubocop:enable Lint/HandleExceptions