google-cloud-bigquery-data_transfer 0.1.0 → 0.2.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 4eccdec713eb8c3f2203e1cb9d98157a78cc3e003276ce59ce740432d8f6c00c
- data.tar.gz: 70f8882c9267311213fc51018a8231f8fbdb58858eb1d6fba536929f5342ffa9
+ metadata.gz: e913d55b6a99f53b0b0eeae6c30e160e8776ffb6e5dc711647c78b2a9a7a41d2
+ data.tar.gz: 35d47eacbdd9f8fde9997dd8bd6b46f505670237c517e3085be354f1718ebb4a
  SHA512:
- metadata.gz: 80242f7fdaee23c7e197de2b27f988e3ec432f8729733066f15c02102f1caaa85c552f4059f1a729781ead6d3e5bec9c17adba8c15d00d2c3d9775ddfcb1729c
- data.tar.gz: 53f743b7de0b86f0a83eabc3b319b2832e1cd10b50ed21c6280c27abcb1c2f4e650800f149af9d1a2f5ee5fe8e12ba3c1f39c2fb69b4b7c66d32b9856e382254
+ metadata.gz: a5a7e334f95d4b0d206896237533b0caa3095d8694d3454b0452a3661893699a2359e215b8808e9642ad715f288007878f9650ed85a0db0f39bb38f451e489d4
+ data.tar.gz: 4d5ccb101655e0a1549dc9e02c7f6eee69f2c8c566a55dc8980d841ecf7edb01e9122be219b84d5fa85a93fd4daadd543c720b714e6e5645ad94bc82899e7664
data/README.md CHANGED
@@ -51,4 +51,41 @@ end
  to see the full list of Cloud APIs that we cover.

  [Client Library Documentation]: https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-bigquery-data_transfer/latest/google/cloud/bigquery/datatransfer/v1
- [Product Documentation]: https://cloud.google.com/bigquerydatatransfer
+ [Product Documentation]: https://cloud.google.com/bigquery/transfer/
+
+ ## Enabling Logging
+
+ To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
+ The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib-2.5.0/libdoc/logger/rdoc/Logger.html) as shown below,
+ or a [`Google::Cloud::Logging::Logger`](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
+ that will write logs to [Stackdriver Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
+ and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.
+
+ Configuring a Ruby stdlib logger:
+
+ ```ruby
+ require "logger"
+
+ module MyLogger
+ LOGGER = Logger.new $stderr, level: Logger::WARN
+ def logger
+ LOGGER
+ end
+ end
+
+ # Define a gRPC module-level logger method before grpc/logconfig.rb loads.
+ module GRPC
+ extend MyLogger
+ end
+ ```
+
+ ## Supported Ruby Versions
+
+ This library is supported on Ruby 2.3+.
+
+ Google provides official support for Ruby versions that are actively supported
+ by Ruby Core—that is, Ruby versions that are either in normal maintenance or
+ in security maintenance, and not end of life. Currently, this means Ruby 2.3
+ and later. Older versions of Ruby _may_ still work, but are unsupported and not
+ recommended. See https://www.ruby-lang.org/en/downloads/branches/ for details
+ about the Ruby support schedule.
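The README's logging snippet can be exercised with the Ruby stdlib alone; note that the `GRPC` module below is just the placeholder module the README defines ahead of time, not the real gRPC library, so this is a sketch of the wiring rather than a full integration:

```ruby
require "logger"

module MyLogger
  # Shared logger; WARN suppresses gRPC's debug-level chatter.
  LOGGER = Logger.new $stderr, level: Logger::WARN
  def logger
    LOGGER
  end
end

# Mimics what grpc/logconfig.rb looks for: a module-level GRPC.logger method.
module GRPC
  extend MyLogger
end

GRPC.logger.warn  "this line is emitted"
GRPC.logger.debug "this line is suppressed at WARN level"
```

When the real `grpc` gem later loads, it sees that `GRPC.logger` is already defined and uses it instead of installing its own default logger.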
@@ -67,6 +67,31 @@ module Google
  #
  # [Product Documentation]: https://cloud.google.com/bigquerydatatransfer
  #
+ # ## Enabling Logging
+ #
+ # To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
+ # The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib-2.5.0/libdoc/logger/rdoc/Logger.html) as shown below,
+ # or a [`Google::Cloud::Logging::Logger`](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
+ # that will write logs to [Stackdriver Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
+ # and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.
+ #
+ # Configuring a Ruby stdlib logger:
+ #
+ # ```ruby
+ # require "logger"
+ #
+ # module MyLogger
+ # LOGGER = Logger.new $stderr, level: Logger::WARN
+ # def logger
+ # LOGGER
+ # end
+ # end
+ #
+ # # Define a gRPC module-level logger method before grpc/logconfig.rb loads.
+ # module GRPC
+ # extend MyLogger
+ # end
+ # ```
  #
  module DataTransfer
  # rubocop:enable LineLength
@@ -113,6 +138,11 @@ module Google
  # or the specified config is missing data points.
  # @param timeout [Numeric]
  # The default timeout, in seconds, for calls made through this client.
+ # @param metadata [Hash]
+ # Default metadata to be sent with each request. This can be overridden on a per call basis.
+ # @param exception_transformer [Proc]
+ # An optional proc that intercepts any exceptions raised during an API call to inject
+ # custom error handling.
  def self.new(*args, version: :v1, **kwargs)
  unless AVAILABLE_VERSIONS.include?(version.to_s.downcase)
  raise "The version: #{version} is not available. The available versions " \
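The `exception_transformer` contract added in this hunk can be illustrated in plain Ruby. Here `MyApiError` is a hypothetical wrapper class for the sketch, and a bare `raise` stands in for a failed RPC (in the real client the transformer receives `Google::Gax::GaxError` instances):

```ruby
# Hypothetical wrapper class, for illustration only.
class MyApiError < StandardError; end

# The transformer receives the exception raised during the call and may
# re-raise it, wrap it, or swallow it.
transformer = proc do |error|
  raise MyApiError, "API call failed: #{error.message}"
end

# Simulate what the client does internally when an RPC fails.
begin
  begin
    raise "deadline exceeded" # stand-in for a failed RPC
  rescue StandardError => e
    transformer.call e
  end
rescue MyApiError => e
  puts e.message # => API call failed: deadline exceeded
end
```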
@@ -17,57 +17,82 @@ require "google/cloud/bigquery/data_transfer/v1/data_transfer_service_client"
  module Google
  module Cloud
  module Bigquery
- # rubocop:disable LineLength
-
- ##
- # # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/GoogleCloudPlatform/google-cloud-ruby#versioning))
- #
- # [BigQuery Data Transfer API][Product Documentation]:
- # Transfers data from partner SaaS applications to Google BigQuery on a
- # scheduled, managed basis.
- # - [Product Documentation][]
- #
- # ## Quick Start
- # In order to use this library, you first need to go through the following
- # steps:
- #
- # 1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
- # 2. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
- # 3. [Enable the BigQuery Data Transfer API.](https://console.cloud.google.com/apis/api/bigquerydatatransfer)
- # 4. [Setup Authentication.](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
- #
- # ### Preview
- # #### DataTransferServiceClient
- # ```rb
- # require "google/cloud/bigquery/data_transfer/v1"
- #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
- # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path(project_id)
- #
- # # Iterate over all results.
- # data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
- # # Process element.
- # end
- #
- # # Or iterate over results one page at a time.
- # data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
- # # Process each page at a time.
- # page.each do |element|
- # # Process element.
- # end
- # end
- # ```
- #
- # ### Next Steps
- # - Read the [BigQuery Data Transfer API Product documentation][Product Documentation]
- # to learn more about the product and see How-to Guides.
- # - View this [repository's main README](https://github.com/GoogleCloudPlatform/google-cloud-ruby/blob/master/README.md)
- # to see the full list of Cloud APIs that we cover.
- #
- # [Product Documentation]: https://cloud.google.com/bigquerydatatransfer
- #
- #
  module DataTransfer
+ # rubocop:disable LineLength
+
+ ##
+ # # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/GoogleCloudPlatform/google-cloud-ruby#versioning))
+ #
+ # [BigQuery Data Transfer API][Product Documentation]:
+ # Transfers data from partner SaaS applications to Google BigQuery on a
+ # scheduled, managed basis.
+ # - [Product Documentation][]
+ #
+ # ## Quick Start
+ # In order to use this library, you first need to go through the following
+ # steps:
+ #
+ # 1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
+ # 2. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
+ # 3. [Enable the BigQuery Data Transfer API.](https://console.cloud.google.com/apis/api/bigquerydatatransfer)
+ # 4. [Setup Authentication.](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
+ #
+ # ### Preview
+ # #### DataTransferServiceClient
+ # ```rb
+ # require "google/cloud/bigquery/data_transfer"
+ #
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path(project_id)
+ #
+ # # Iterate over all results.
+ # data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
+ # # Process element.
+ # end
+ #
+ # # Or iterate over results one page at a time.
+ # data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
+ # # Process each page at a time.
+ # page.each do |element|
+ # # Process element.
+ # end
+ # end
+ # ```
+ #
+ # ### Next Steps
+ # - Read the [BigQuery Data Transfer API Product documentation][Product Documentation]
+ # to learn more about the product and see How-to Guides.
+ # - View this [repository's main README](https://github.com/GoogleCloudPlatform/google-cloud-ruby/blob/master/README.md)
+ # to see the full list of Cloud APIs that we cover.
+ #
+ # [Product Documentation]: https://cloud.google.com/bigquerydatatransfer
+ #
+ # ## Enabling Logging
+ #
+ # To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
+ # The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib-2.5.0/libdoc/logger/rdoc/Logger.html) as shown below,
+ # or a [`Google::Cloud::Logging::Logger`](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
+ # that will write logs to [Stackdriver Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
+ # and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.
+ #
+ # Configuring a Ruby stdlib logger:
+ #
+ # ```ruby
+ # require "logger"
+ #
+ # module MyLogger
+ # LOGGER = Logger.new $stderr, level: Logger::WARN
+ # def logger
+ # LOGGER
+ # end
+ # end
+ #
+ # # Define a gRPC module-level logger method before grpc/logconfig.rb loads.
+ # module GRPC
+ # extend MyLogger
+ # end
+ # ```
+ #
  module V1
  # rubocop:enable LineLength

@@ -101,11 +126,18 @@ module Google
  # or the specified config is missing data points.
  # @param timeout [Numeric]
  # The default timeout, in seconds, for calls made through this client.
+ # @param metadata [Hash]
+ # Default metadata to be sent with each request. This can be overridden on a per call basis.
+ # @param exception_transformer [Proc]
+ # An optional proc that intercepts any exceptions raised during an API call to inject
+ # custom error handling.
  def self.new \
  credentials: nil,
  scopes: nil,
  client_config: nil,
  timeout: nil,
+ metadata: nil,
+ exception_transformer: nil,
  lib_name: nil,
  lib_version: nil
  kwargs = {
@@ -113,6 +145,8 @@ module Google
  scopes: scopes,
  client_config: client_config,
  timeout: timeout,
+ metadata: metadata,
+ exception_transformer: exception_transformer,
  lib_name: lib_name,
  lib_version: lib_version
  }.select { |_, v| v != nil }
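The `.select { |_, v| v != nil }` at the end of this hunk is what lets callers omit the new `metadata` and `exception_transformer` keywords entirely: unsupplied keywords default to `nil` and are dropped before the hash is forwarded. A minimal reproduction:

```ruby
# Simulate the kwargs hash built by self.new when a caller only sets timeout.
kwargs = {
  timeout: 30,
  metadata: nil,                # not supplied by the caller
  exception_transformer: nil    # not supplied by the caller
}.select { |_, v| v != nil }

p kwargs # only the non-nil entries survive
```

Dropping the `nil` entries matters because the downstream constructor distinguishes "keyword absent" (use its own default) from "keyword explicitly nil".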
@@ -0,0 +1,42 @@
+ # Copyright 2018 Google LLC
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # https://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ require "googleauth"
+
+ module Google
+ module Cloud
+ module Bigquery
+ module DataTransfer
+ module V1
+ class Credentials < Google::Auth::Credentials
+ SCOPE = [
+ "https://www.googleapis.com/auth/cloud-platform"
+ ].freeze
+ PATH_ENV_VARS = %w(DATA_TRANSFER_CREDENTIALS
+ DATA_TRANSFER_KEYFILE
+ GOOGLE_CLOUD_CREDENTIALS
+ GOOGLE_CLOUD_KEYFILE
+ GCLOUD_KEYFILE)
+ JSON_ENV_VARS = %w(DATA_TRANSFER_CREDENTIALS_JSON
+ DATA_TRANSFER_KEYFILE_JSON
+ GOOGLE_CLOUD_CREDENTIALS_JSON
+ GOOGLE_CLOUD_KEYFILE_JSON
+ GCLOUD_KEYFILE_JSON)
+ DEFAULT_PATHS = ["~/.config/gcloud/application_default_credentials.json"]
+ end
+ end
+ end
+ end
+ end
+ end
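The new `Credentials` class declares its environment variables in priority order: `googleauth` consults `PATH_ENV_VARS` front to back and uses the first one that is set. A stdlib-only sketch of that precedence (the `first_credentials_path` helper is illustrative, not the actual `googleauth` implementation):

```ruby
PATH_ENV_VARS = %w(DATA_TRANSFER_CREDENTIALS
                   DATA_TRANSFER_KEYFILE
                   GOOGLE_CLOUD_CREDENTIALS
                   GOOGLE_CLOUD_KEYFILE
                   GCLOUD_KEYFILE)

# Returns the value of the first variable in the list that is set, or nil.
def first_credentials_path(env = ENV)
  PATH_ENV_VARS.map { |var| env[var] }.compact.first
end

fake_env = { "GOOGLE_CLOUD_KEYFILE" => "/tmp/key.json" }
puts first_credentials_path(fake_env) # => /tmp/key.json
```

So a service-specific variable such as `DATA_TRANSFER_CREDENTIALS` wins over the generic `GOOGLE_CLOUD_*` and legacy `GCLOUD_*` names when several are set.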
@@ -24,8 +24,8 @@ require "pathname"

  require "google/gax"

- require "google/cloud/bigquery/data_transfer/v1/data_transfer_pb"
- require "google/cloud/bigquery/data_transfer/credentials"
+ require "google/cloud/bigquery/datatransfer/v1/datatransfer_pb"
+ require "google/cloud/bigquery/data_transfer/v1/credentials"

  module Google
  module Cloud
@@ -38,7 +38,7 @@ module Google
  # frontend.
  #
  # @!attribute [r] data_transfer_service_stub
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Stub]
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::DataTransferService::Stub]
  class DataTransferServiceClient
  attr_reader :data_transfer_service_stub
@@ -48,6 +48,9 @@ module Google
  # The default port of the service.
  DEFAULT_SERVICE_PORT = 443

+ # The default set of gRPC interceptors.
+ GRPC_INTERCEPTORS = []
+
  DEFAULT_TIMEOUT = 30

  PAGE_DESCRIPTORS = {
@@ -170,23 +173,30 @@ module Google
  # or the specified config is missing data points.
  # @param timeout [Numeric]
  # The default timeout, in seconds, for calls made through this client.
+ # @param metadata [Hash]
+ # Default metadata to be sent with each request. This can be overridden on a per call basis.
+ # @param exception_transformer [Proc]
+ # An optional proc that intercepts any exceptions raised during an API call to inject
+ # custom error handling.
  def initialize \
  credentials: nil,
  scopes: ALL_SCOPES,
  client_config: {},
  timeout: DEFAULT_TIMEOUT,
+ metadata: nil,
+ exception_transformer: nil,
  lib_name: nil,
  lib_version: ""
  # These require statements are intentionally placed here to initialize
  # the gRPC module only when it's required.
  # See https://github.com/googleapis/toolkit/issues/446
  require "google/gax/grpc"
- require "google/cloud/bigquery/data_transfer/v1/data_transfer_services_pb"
+ require "google/cloud/bigquery/datatransfer/v1/datatransfer_services_pb"

- credentials ||= Google::Cloud::Bigquery::DataTransfer::Credentials.default
+ credentials ||= Google::Cloud::Bigquery::DataTransfer::V1::Credentials.default

  if credentials.is_a?(String) || credentials.is_a?(Hash)
- updater_proc = Google::Cloud::Bigquery::DataTransfer::Credentials.new(credentials).updater_proc
+ updater_proc = Google::Cloud::Bigquery::DataTransfer::V1::Credentials.new(credentials).updater_proc
  end
  if credentials.is_a?(GRPC::Core::Channel)
  channel = credentials
@@ -210,6 +220,7 @@ module Google
  google_api_client.freeze

  headers = { :"x-goog-api-client" => google_api_client }
+ headers.merge!(metadata) unless metadata.nil?
  client_config_file = Pathname.new(__dir__).join(
  "data_transfer_service_client_config.json"
  )
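The `headers.merge!(metadata)` line above is plain `Hash#merge!`: caller-supplied metadata is layered onto the default headers, and on a key collision the caller's value wins. A sketch with a hypothetical caller header (`:"x-goog-user-project"` is just an example key, not something this diff sets):

```ruby
# Default headers built by the client.
headers = { :"x-goog-api-client" => "gl-ruby/2.5 gax/1.0" }

# Hypothetical per-client metadata supplied by the caller.
metadata = { :"x-goog-user-project" => "my-project" }

headers.merge!(metadata) unless metadata.nil?
p headers.keys # both the default and the caller-supplied keys are present
```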
@@ -222,13 +233,14 @@ module Google
  timeout,
  page_descriptors: PAGE_DESCRIPTORS,
  errors: Google::Gax::Grpc::API_ERRORS,
- kwargs: headers
+ metadata: headers
  )
  end

  # Allow overriding the service path/port in subclasses.
  service_path = self.class::SERVICE_ADDRESS
  port = self.class::DEFAULT_SERVICE_PORT
+ interceptors = self.class::GRPC_INTERCEPTORS
  @data_transfer_service_stub = Google::Gax::Grpc.create_stub(
  service_path,
  port,
@@ -236,60 +248,113 @@ module Google
  channel: channel,
  updater_proc: updater_proc,
  scopes: scopes,
- &Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Stub.method(:new)
+ interceptors: interceptors,
+ &Google::Cloud::Bigquery::Datatransfer::V1::DataTransferService::Stub.method(:new)
  )

  @get_data_source = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:get_data_source),
- defaults["get_data_source"]
+ defaults["get_data_source"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'name' => request.name}
+ end
  )
  @list_data_sources = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:list_data_sources),
- defaults["list_data_sources"]
+ defaults["list_data_sources"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'parent' => request.parent}
+ end
  )
  @create_transfer_config = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:create_transfer_config),
- defaults["create_transfer_config"]
+ defaults["create_transfer_config"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'parent' => request.parent}
+ end
  )
  @update_transfer_config = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:update_transfer_config),
- defaults["update_transfer_config"]
+ defaults["update_transfer_config"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'transfer_config.name' => request.transfer_config.name}
+ end
  )
  @delete_transfer_config = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:delete_transfer_config),
- defaults["delete_transfer_config"]
+ defaults["delete_transfer_config"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'name' => request.name}
+ end
  )
  @get_transfer_config = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:get_transfer_config),
- defaults["get_transfer_config"]
+ defaults["get_transfer_config"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'name' => request.name}
+ end
  )
  @list_transfer_configs = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:list_transfer_configs),
- defaults["list_transfer_configs"]
+ defaults["list_transfer_configs"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'parent' => request.parent}
+ end
  )
  @schedule_transfer_runs = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:schedule_transfer_runs),
- defaults["schedule_transfer_runs"]
+ defaults["schedule_transfer_runs"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'parent' => request.parent}
+ end
  )
  @get_transfer_run = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:get_transfer_run),
- defaults["get_transfer_run"]
+ defaults["get_transfer_run"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'name' => request.name}
+ end
  )
  @delete_transfer_run = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:delete_transfer_run),
- defaults["delete_transfer_run"]
+ defaults["delete_transfer_run"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'name' => request.name}
+ end
  )
  @list_transfer_runs = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:list_transfer_runs),
- defaults["list_transfer_runs"]
+ defaults["list_transfer_runs"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'parent' => request.parent}
+ end
  )
  @list_transfer_logs = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:list_transfer_logs),
- defaults["list_transfer_logs"]
+ defaults["list_transfer_logs"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'parent' => request.parent}
+ end
  )
  @check_valid_creds = Google::Gax.create_api_call(
  @data_transfer_service_stub.method(:check_valid_creds),
- defaults["check_valid_creds"]
+ defaults["check_valid_creds"],
+ exception_transformer: exception_transformer,
+ params_extractor: proc do |request|
+ {'name' => request.name}
+ end
  )
  end
@@ -304,23 +369,27 @@ module Google
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::DataSource]
+ # @yield [result, operation] Access the result along with the RPC operation
+ # @yieldparam result [Google::Cloud::Bigquery::Datatransfer::V1::DataSource]
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::DataSource]
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
  # @example
- # require "google/cloud/bigquery/data_transfer/v1"
+ # require "google/cloud/bigquery/data_transfer"
  #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  # formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_data_source_path("[PROJECT]", "[DATA_SOURCE]")
  # response = data_transfer_service_client.get_data_source(formatted_name)

  def get_data_source \
  name,
- options: nil
+ options: nil,
+ &block
  req = {
  name: name
  }.delete_if { |_, v| v.nil? }
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::GetDataSourceRequest)
- @get_data_source.call(req, options)
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::GetDataSourceRequest)
+ @get_data_source.call(req, options, &block)
  end

  # Lists supported data sources and returns their settings,
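The new `&block` parameter on each RPC method simply forwards the caller's block through to the underlying call, which yields the result together with the gRPC operation. The yield pattern in isolation, with a fake call object standing in for the GAX-generated one:

```ruby
# Minimal stand-in for an API-call object that yields result and operation.
class FakeApiCall
  def call(req, _options = nil, &block)
    result = "data_source:#{req[:name]}"
    operation = :fake_grpc_operation # stand-in for GRPC::ActiveCall::Operation
    block.call(result, operation) if block
    result
  end
end

call = FakeApiCall.new
seen = nil
# Same calling convention as get_data_source(name) { |result, operation| ... }
call.call({ name: "ds1" }) do |result, operation|
  seen = [result, operation]
end
p seen
```

Callers who pass no block get the plain return value, so the change is backward compatible.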
@@ -338,16 +407,19 @@ module Google
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
- # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::DataTransfer::V1::DataSource>]
- # An enumerable of Google::Cloud::Bigquery::DataTransfer::V1::DataSource instances.
+ # @yield [result, operation] Access the result along with the RPC operation
+ # @yieldparam result [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::DataSource>]
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
+ # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::DataSource>]
+ # An enumerable of Google::Cloud::Bigquery::Datatransfer::V1::DataSource instances.
  # See Google::Gax::PagedEnumerable documentation for other
  # operations such as per-page iteration or access to the response
  # object.
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
  # @example
- # require "google/cloud/bigquery/data_transfer/v1"
+ # require "google/cloud/bigquery/data_transfer"
  #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("[PROJECT]")
  #
  # # Iterate over all results.
@@ -366,13 +438,14 @@ module Google
  def list_data_sources \
  parent,
  page_size: nil,
- options: nil
+ options: nil,
+ &block
  req = {
  parent: parent,
  page_size: page_size
  }.delete_if { |_, v| v.nil? }
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::ListDataSourcesRequest)
- @list_data_sources.call(req, options)
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::ListDataSourcesRequest)
+ @list_data_sources.call(req, options, &block)
  end

  # Creates a new data transfer configuration.
@@ -382,9 +455,9 @@ module Google
  # Must be in the format /projects/{project_id}/locations/{location_id}
  # If specified location and location of the destination bigquery dataset
  # do not match - the request will fail.
- # @param transfer_config [Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig | Hash]
+ # @param transfer_config [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig | Hash]
  # Data transfer configuration to create.
- # A hash of the same form as `Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig`
+ # A hash of the same form as `Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig`
  # can also be provided.
  # @param authorization_code [String]
  # Optional OAuth2 authorization code to use with this transfer configuration.
@@ -406,12 +479,15 @@ module Google
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig]
+ # @yield [result, operation] Access the result along with the RPC operation
+ # @yieldparam result [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig]
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig]
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
  # @example
- # require "google/cloud/bigquery/data_transfer/v1"
+ # require "google/cloud/bigquery/data_transfer"
  #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("[PROJECT]")
  #
  # # TODO: Initialize +transfer_config+:
@@ -422,22 +498,23 @@ module Google
  parent,
  transfer_config,
  authorization_code: nil,
- options: nil
+ options: nil,
+ &block
  req = {
  parent: parent,
  transfer_config: transfer_config,
  authorization_code: authorization_code
  }.delete_if { |_, v| v.nil? }
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::CreateTransferConfigRequest)
- @create_transfer_config.call(req, options)
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::CreateTransferConfigRequest)
+ @create_transfer_config.call(req, options, &block)
  end

  # Updates a data transfer configuration.
  # All fields must be set, even if they are not updated.
  #
- # @param transfer_config [Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig | Hash]
+ # @param transfer_config [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig | Hash]
  # Data transfer configuration to create.
- # A hash of the same form as `Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig`
+ # A hash of the same form as `Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig`
  # can also be provided.
  # @param update_mask [Google::Protobuf::FieldMask | Hash]
  # Required list of fields to be updated in this request.
@@ -463,12 +540,15 @@ module Google
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig]
+ # @yield [result, operation] Access the result along with the RPC operation
+ # @yieldparam result [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig]
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig]
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
  # @example
- # require "google/cloud/bigquery/data_transfer/v1"
+ # require "google/cloud/bigquery/data_transfer"
  #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #
  # # TODO: Initialize +transfer_config+:
  # transfer_config = {}
@@ -481,14 +561,15 @@ module Google
  transfer_config,
  update_mask,
  authorization_code: nil,
- options: nil
+ options: nil,
+ &block
  req = {
  transfer_config: transfer_config,
  update_mask: update_mask,
  authorization_code: authorization_code
  }.delete_if { |_, v| v.nil? }
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::UpdateTransferConfigRequest)
- @update_transfer_config.call(req, options)
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::UpdateTransferConfigRequest)
+ @update_transfer_config.call(req, options, &block)
  end

  # Deletes a data transfer configuration,
@@ -500,22 +581,26 @@ module Google
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
+ # @yield [result, operation] Access the result along with the RPC operation
+ # @yieldparam result []
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
  # @example
- # require "google/cloud/bigquery/data_transfer/v1"
+ # require "google/cloud/bigquery/data_transfer"
  #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  # formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
  # data_transfer_service_client.delete_transfer_config(formatted_name)

  def delete_transfer_config \
  name,
- options: nil
+ options: nil,
+ &block
  req = {
  name: name
  }.delete_if { |_, v| v.nil? }
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::DeleteTransferConfigRequest)
- @delete_transfer_config.call(req, options)
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::DeleteTransferConfigRequest)
+ @delete_transfer_config.call(req, options, &block)
  nil
  end
@@ -527,23 +612,27 @@ module Google
527
612
  # @param options [Google::Gax::CallOptions]
528
613
  # Overrides the default settings for this call, e.g, timeout,
529
614
  # retries, etc.
530
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig]
615
+ # @yield [result, operation] Access the result along with the RPC operation
616
+ # @yieldparam result [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig]
617
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
618
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig]
531
619
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
532
620
  # @example
533
- # require "google/cloud/bigquery/data_transfer/v1"
621
+ # require "google/cloud/bigquery/data_transfer"
534
622
  #
535
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
623
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
536
624
  # formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
537
625
  # response = data_transfer_service_client.get_transfer_config(formatted_name)
538
626
 
539
627
  def get_transfer_config \
540
628
  name,
541
- options: nil
629
+ options: nil,
630
+ &block
542
631
  req = {
543
632
  name: name
544
633
  }.delete_if { |_, v| v.nil? }
545
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::GetTransferConfigRequest)
546
- @get_transfer_config.call(req, options)
634
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::GetTransferConfigRequest)
635
+ @get_transfer_config.call(req, options, &block)
547
636
  end
548
637
 
549
638
  # Returns information about all data transfers in the project.
@@ -562,16 +651,19 @@ module Google
562
651
  # @param options [Google::Gax::CallOptions]
563
652
  # Overrides the default settings for this call, e.g, timeout,
564
653
  # retries, etc.
565
- # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig>]
566
- # An enumerable of Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig instances.
654
+ # @yield [result, operation] Access the result along with the RPC operation
655
+ # @yieldparam result [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig>]
656
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
657
+ # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig>]
658
+ # An enumerable of Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig instances.
567
659
  # See Google::Gax::PagedEnumerable documentation for other
568
660
  # operations such as per-page iteration or access to the response
569
661
  # object.
570
662
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
571
663
  # @example
572
- # require "google/cloud/bigquery/data_transfer/v1"
664
+ # require "google/cloud/bigquery/data_transfer"
573
665
  #
574
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
666
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
575
667
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("[PROJECT]")
576
668
  #
577
669
  # # Iterate over all results.
@@ -591,14 +683,15 @@ module Google
591
683
  parent,
592
684
  data_source_ids: nil,
593
685
  page_size: nil,
594
- options: nil
686
+ options: nil,
687
+ &block
595
688
  req = {
596
689
  parent: parent,
597
690
  data_source_ids: data_source_ids,
598
691
  page_size: page_size
599
692
  }.delete_if { |_, v| v.nil? }
600
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::ListTransferConfigsRequest)
601
- @list_transfer_configs.call(req, options)
693
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::ListTransferConfigsRequest)
694
+ @list_transfer_configs.call(req, options, &block)
602
695
  end
603
696
 
604
697
  # Creates transfer runs for a time range [start_time, end_time].
@@ -622,12 +715,15 @@ module Google
622
715
  # @param options [Google::Gax::CallOptions]
623
716
  # Overrides the default settings for this call, e.g, timeout,
624
717
  # retries, etc.
625
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::ScheduleTransferRunsResponse]
718
+ # @yield [result, operation] Access the result along with the RPC operation
719
+ # @yieldparam result [Google::Cloud::Bigquery::Datatransfer::V1::ScheduleTransferRunsResponse]
720
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
721
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::ScheduleTransferRunsResponse]
626
722
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
627
723
  # @example
628
- # require "google/cloud/bigquery/data_transfer/v1"
724
+ # require "google/cloud/bigquery/data_transfer"
629
725
  #
630
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
726
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
631
727
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
632
728
  #
633
729
  # # TODO: Initialize +start_time+:
@@ -641,14 +737,15 @@ module Google
641
737
  parent,
642
738
  start_time,
643
739
  end_time,
644
- options: nil
740
+ options: nil,
741
+ &block
645
742
  req = {
646
743
  parent: parent,
647
744
  start_time: start_time,
648
745
  end_time: end_time
649
746
  }.delete_if { |_, v| v.nil? }
650
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::ScheduleTransferRunsRequest)
651
- @schedule_transfer_runs.call(req, options)
747
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::ScheduleTransferRunsRequest)
748
+ @schedule_transfer_runs.call(req, options, &block)
652
749
  end
653
750
 
654
751
  # Returns information about the particular transfer run.
@@ -659,23 +756,27 @@ module Google
659
756
  # @param options [Google::Gax::CallOptions]
660
757
  # Overrides the default settings for this call, e.g, timeout,
661
758
  # retries, etc.
662
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::TransferRun]
759
+ # @yield [result, operation] Access the result along with the RPC operation
760
+ # @yieldparam result [Google::Cloud::Bigquery::Datatransfer::V1::TransferRun]
761
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
762
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::TransferRun]
663
763
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
664
764
  # @example
665
- # require "google/cloud/bigquery/data_transfer/v1"
765
+ # require "google/cloud/bigquery/data_transfer"
666
766
  #
667
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
767
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
668
768
  # formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_run_path("[PROJECT]", "[TRANSFER_CONFIG]", "[RUN]")
669
769
  # response = data_transfer_service_client.get_transfer_run(formatted_name)
670
770
 
671
771
  def get_transfer_run \
672
772
  name,
673
- options: nil
773
+ options: nil,
774
+ &block
674
775
  req = {
675
776
  name: name
676
777
  }.delete_if { |_, v| v.nil? }
677
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::GetTransferRunRequest)
678
- @get_transfer_run.call(req, options)
778
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::GetTransferRunRequest)
779
+ @get_transfer_run.call(req, options, &block)
679
780
  end
680
781
 
681
782
  # Deletes the specified transfer run.
@@ -686,22 +787,26 @@ module Google
686
787
  # @param options [Google::Gax::CallOptions]
687
788
  # Overrides the default settings for this call, e.g, timeout,
688
789
  # retries, etc.
790
+ # @yield [result, operation] Access the result along with the RPC operation
791
+ # @yieldparam result []
792
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
689
793
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
690
794
  # @example
691
- # require "google/cloud/bigquery/data_transfer/v1"
795
+ # require "google/cloud/bigquery/data_transfer"
692
796
  #
693
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
797
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
694
798
  # formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_run_path("[PROJECT]", "[TRANSFER_CONFIG]", "[RUN]")
695
799
  # data_transfer_service_client.delete_transfer_run(formatted_name)
696
800
 
697
801
  def delete_transfer_run \
698
802
  name,
699
- options: nil
803
+ options: nil,
804
+ &block
700
805
  req = {
701
806
  name: name
702
807
  }.delete_if { |_, v| v.nil? }
703
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::DeleteTransferRunRequest)
704
- @delete_transfer_run.call(req, options)
808
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::DeleteTransferRunRequest)
809
+ @delete_transfer_run.call(req, options, &block)
705
810
  nil
706
811
  end
707
812
 
@@ -711,7 +816,7 @@ module Google
711
816
  # Name of transfer configuration for which transfer runs should be retrieved.
712
817
  # Format of transfer configuration resource name is:
713
818
  # +projects/{project_id}/transferConfigs/{config_id}+.
714
- # @param states [Array<Google::Cloud::Bigquery::DataTransfer::V1::TransferState>]
819
+ # @param states [Array<Google::Cloud::Bigquery::Datatransfer::V1::TransferState>]
715
820
  # When specified, only transfer runs with requested states are returned.
716
821
  # @param page_size [Integer]
717
822
  # The maximum number of resources contained in the underlying API
@@ -719,21 +824,24 @@ module Google
719
824
  # parameter does not affect the return value. If page streaming is
720
825
  # performed per-page, this determines the maximum number of
721
826
  # resources in a page.
722
- # @param run_attempt [Google::Cloud::Bigquery::DataTransfer::V1::ListTransferRunsRequest::RunAttempt]
827
+ # @param run_attempt [Google::Cloud::Bigquery::Datatransfer::V1::ListTransferRunsRequest::RunAttempt]
723
828
  # Indicates how run attempts are to be pulled.
724
829
  # @param options [Google::Gax::CallOptions]
725
830
  # Overrides the default settings for this call, e.g, timeout,
726
831
  # retries, etc.
727
- # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::DataTransfer::V1::TransferRun>]
728
- # An enumerable of Google::Cloud::Bigquery::DataTransfer::V1::TransferRun instances.
832
+ # @yield [result, operation] Access the result along with the RPC operation
833
+ # @yieldparam result [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::TransferRun>]
834
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
835
+ # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::TransferRun>]
836
+ # An enumerable of Google::Cloud::Bigquery::Datatransfer::V1::TransferRun instances.
729
837
  # See Google::Gax::PagedEnumerable documentation for other
730
838
  # operations such as per-page iteration or access to the response
731
839
  # object.
732
840
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
733
841
  # @example
734
- # require "google/cloud/bigquery/data_transfer/v1"
842
+ # require "google/cloud/bigquery/data_transfer"
735
843
  #
736
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
844
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
737
845
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
738
846
  #
739
847
  # # Iterate over all results.
@@ -754,15 +862,16 @@ module Google
754
862
  states: nil,
755
863
  page_size: nil,
756
864
  run_attempt: nil,
757
- options: nil
865
+ options: nil,
866
+ &block
758
867
  req = {
759
868
  parent: parent,
760
869
  states: states,
761
870
  page_size: page_size,
762
871
  run_attempt: run_attempt
763
872
  }.delete_if { |_, v| v.nil? }
764
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::ListTransferRunsRequest)
765
- @list_transfer_runs.call(req, options)
873
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::ListTransferRunsRequest)
874
+ @list_transfer_runs.call(req, options, &block)
766
875
  end
767
876
 
768
877
  # Returns user facing log messages for the data transfer run.
@@ -776,22 +885,25 @@ module Google
776
885
  # parameter does not affect the return value. If page streaming is
777
886
  # performed per-page, this determines the maximum number of
778
887
  # resources in a page.
779
- # @param message_types [Array<Google::Cloud::Bigquery::DataTransfer::V1::TransferMessage::MessageSeverity>]
888
+ # @param message_types [Array<Google::Cloud::Bigquery::Datatransfer::V1::TransferMessage::MessageSeverity>]
780
889
  # Message types to return. If not populated - INFO, WARNING and ERROR
781
890
  # messages are returned.
782
891
  # @param options [Google::Gax::CallOptions]
783
892
  # Overrides the default settings for this call, e.g, timeout,
784
893
  # retries, etc.
785
- # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::DataTransfer::V1::TransferMessage>]
786
- # An enumerable of Google::Cloud::Bigquery::DataTransfer::V1::TransferMessage instances.
894
+ # @yield [result, operation] Access the result along with the RPC operation
895
+ # @yieldparam result [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::TransferMessage>]
896
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
897
+ # @return [Google::Gax::PagedEnumerable<Google::Cloud::Bigquery::Datatransfer::V1::TransferMessage>]
898
+ # An enumerable of Google::Cloud::Bigquery::Datatransfer::V1::TransferMessage instances.
787
899
  # See Google::Gax::PagedEnumerable documentation for other
788
900
  # operations such as per-page iteration or access to the response
789
901
  # object.
790
902
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
791
903
  # @example
792
- # require "google/cloud/bigquery/data_transfer/v1"
904
+ # require "google/cloud/bigquery/data_transfer"
793
905
  #
794
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
906
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
795
907
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_run_path("[PROJECT]", "[TRANSFER_CONFIG]", "[RUN]")
796
908
  #
797
909
  # # Iterate over all results.
@@ -811,14 +923,15 @@ module Google
811
923
  parent,
812
924
  page_size: nil,
813
925
  message_types: nil,
814
- options: nil
926
+ options: nil,
927
+ &block
815
928
  req = {
816
929
  parent: parent,
817
930
  page_size: page_size,
818
931
  message_types: message_types
819
932
  }.delete_if { |_, v| v.nil? }
820
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::ListTransferLogsRequest)
821
- @list_transfer_logs.call(req, options)
933
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::ListTransferLogsRequest)
934
+ @list_transfer_logs.call(req, options, &block)
822
935
  end
823
936
 
824
937
  # Returns true if valid credentials exist for the given data source and
@@ -834,23 +947,27 @@ module Google
834
947
  # @param options [Google::Gax::CallOptions]
835
948
  # Overrides the default settings for this call, e.g, timeout,
836
949
  # retries, etc.
837
- # @return [Google::Cloud::Bigquery::DataTransfer::V1::CheckValidCredsResponse]
950
+ # @yield [result, operation] Access the result along with the RPC operation
951
+ # @yieldparam result [Google::Cloud::Bigquery::Datatransfer::V1::CheckValidCredsResponse]
952
+ # @yieldparam operation [GRPC::ActiveCall::Operation]
953
+ # @return [Google::Cloud::Bigquery::Datatransfer::V1::CheckValidCredsResponse]
838
954
  # @raise [Google::Gax::GaxError] if the RPC is aborted.
839
955
  # @example
840
- # require "google/cloud/bigquery/data_transfer/v1"
956
+ # require "google/cloud/bigquery/data_transfer"
841
957
  #
842
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer::V1.new
958
+ # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
843
959
  # formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_data_source_path("[PROJECT]", "[DATA_SOURCE]")
844
960
  # response = data_transfer_service_client.check_valid_creds(formatted_name)
845
961
 
846
962
  def check_valid_creds \
847
963
  name,
848
- options: nil
964
+ options: nil,
965
+ &block
849
966
  req = {
850
967
  name: name
851
968
  }.delete_if { |_, v| v.nil? }
852
- req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::DataTransfer::V1::CheckValidCredsRequest)
853
- @check_valid_creds.call(req, options)
969
+ req = Google::Gax::to_proto(req, Google::Cloud::Bigquery::Datatransfer::V1::CheckValidCredsRequest)
970
+ @check_valid_creds.call(req, options, &block)
854
971
  end
855
972
  end
856
973
  end
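
Taken together, the hunks above make one systematic change: every RPC wrapper now accepts an optional block (`options: nil, &block`), forwards it to the underlying callable, and the transport yields `[result, operation]` to that block, as the new `@yield`/`@yieldparam` tags document. A minimal, self-contained Ruby sketch of that forwarding pattern; `StubTransport`, `Result`, and `Operation` here are hypothetical stand-ins for the real Google::Gax stub and `GRPC::ActiveCall::Operation`:

```ruby
# Hypothetical stand-ins illustrating the pattern in the diff: the wrapper
# builds a request hash, drops nil fields, and forwards an optional block
# that the transport yields [result, operation] to.
Result    = Struct.new(:name)
Operation = Struct.new(:trailing_metadata)

class StubTransport
  # Mimics a callable like @get_transfer_config: returns the result and,
  # when a block is given, also yields it together with the RPC operation.
  def call(req, _options = nil)
    result    = Result.new(req[:name])
    operation = Operation.new({})
    yield result, operation if block_given?
    result
  end
end

def get_transfer_config(name, options: nil, &block)
  req = {
    name: name
  }.delete_if { |_, v| v.nil? }
  StubTransport.new.call(req, options, &block)
end

# Plain call: just the result, as in 0.1.0.
response = get_transfer_config("projects/p/transferConfigs/c")
puts response.name

# Block form added across the 0.2.0 client: inspect the RPC operation too.
get_transfer_config("projects/p/transferConfigs/c") do |result, operation|
  puts result.name
  puts operation.trailing_metadata.inspect
end
```

Because the block is optional and the plain return value is unchanged, callers that ignore it see no behavior change, which is why this could be added to every method without breaking 0.1.0 code.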