google-cloud-bigquery-data_transfer-v1 0.10.0 → 0.12.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 6433955b8b3f7bcdb862815a1264b7aded4630fa0e03baad0488ba4efbfd90af
- data.tar.gz: dd9ce54ca3f027af29a8ff704fb893d863bcfdc8f8c64b4f5e4d5a2f6f6fcc12
+ metadata.gz: 414dd8889ba23effda6a32a344fa660e76a977148767394098c1732f53e076a2
+ data.tar.gz: b805e9aba8f2e5edc6103a92506c7ec98de653c31ddd5cd19f9ba29813f561de
  SHA512:
- metadata.gz: a4fcd678ba1f117f0df384049f4debf475a1f1b172c8061f55534f07431dd82a6e8801326057771a323f08b0f60849af7f4f8ab5bc05cb8a95201376ff2ef0f0
- data.tar.gz: ef21e6e47e3d20a10d3c88b9894c1f3e1534fb5531ec6c677321203dd7666d243d59548136cc4605606ffc1acea2dd2d30e34df11779d0315ab2d8a11a453c15
+ metadata.gz: bc02d1135772af2432c562aa2c295022dd6f42479b3b8d4d10295fbf69ca1457a117f7cc8bce5eb218a82d5eb12db21cb43e22ee2b4630951b79203353d0f137
+ data.tar.gz: 0f7f11932f1962208bacda2c12ebfbce80ef2ed824c94b07a4574689da92b9327c4a7d4f86c7933598e04f53f0c0a6b4ef17557a525135b63352b8acbe04ff8d
data/AUTHENTICATION.md CHANGED
@@ -1,151 +1,122 @@
  # Authentication
 
- In general, the google-cloud-bigquery-data_transfer-v1 library uses
- [Service Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
- credentials to connect to Google Cloud services. When running within
- [Google Cloud Platform environments](#google-cloud-platform-environments) the
- credentials will be discovered automatically. When running on other
- environments, the Service Account credentials can be specified by providing the
- path to the
- [JSON keyfile](https://cloud.google.com/iam/docs/managing-service-account-keys)
- for the account (or the JSON itself) in
- [environment variables](#environment-variables). Additionally, Cloud SDK
- credentials can also be discovered automatically, but this is only recommended
- during development.
+ The recommended way to authenticate to the google-cloud-bigquery-data_transfer-v1 library is to use
+ [Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/application-default-credentials).
+ To review all of your authentication options, see [Credentials lookup](#credential-lookup).
 
  ## Quickstart
 
- 1. [Create a service account and credentials](#creating-a-service-account).
- 2. Set the [environment variable](#environment-variables).
+ The following example shows how to set up authentication for a local development
+ environment with your user credentials.
 
- ```sh
- export DATA_TRANSFER_CREDENTIALS=path/to/keyfile.json
- ```
-
- 3. Initialize the client.
+ **NOTE:** This method is _not_ recommended for running in production. User credentials
+ should be used only during development.
 
- ```ruby
- require "google/cloud/bigquery/data_transfer/v1"
+ 1. [Download and install the Google Cloud CLI](https://cloud.google.com/sdk).
+ 2. Set up a local ADC file with your user credentials:
 
- client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
+ ```sh
+ gcloud auth application-default login
  ```
 
- ## Credential Lookup
-
- The google-cloud-bigquery-data_transfer-v1 library aims to make authentication
- as simple as possible, and provides several mechanisms to configure your system
- without requiring **Service Account Credentials** directly in code.
-
- **Credentials** are discovered in the following order:
-
- 1. Specify credentials in method arguments
- 2. Specify credentials in configuration
- 3. Discover credentials path in environment variables
- 4. Discover credentials JSON in environment variables
- 5. Discover credentials file in the Cloud SDK's path
- 6. Discover GCP credentials
-
- ### Google Cloud Platform environments
+ 3. Write code as if already authenticated.
 
- When running on Google Cloud Platform (GCP), including Google Compute Engine
- (GCE), Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud
- Functions (GCF) and Cloud Run, **Credentials** are discovered automatically.
- Code should be written as if already authenticated.
+ For more information about setting up authentication for a local development environment, see
+ [Set up Application Default Credentials](https://cloud.google.com/docs/authentication/provide-credentials-adc#local-dev).
 
- ### Environment Variables
+ ## Credential Lookup
 
- The **Credentials JSON** can be placed in environment variables instead of
- declaring them directly in code. Each service has its own environment variable,
- allowing for different service accounts to be used for different services. (See
- the READMEs for the individual service gems for details.) The path to the
- **Credentials JSON** file can be stored in the environment variable, or the
- **Credentials JSON** itself can be stored for environments such as Docker
- containers where writing files is difficult or not encouraged.
+ The google-cloud-bigquery-data_transfer-v1 library provides several mechanisms to configure your system.
+ Generally, using Application Default Credentials to facilitate automatic
+ credentials discovery is the easiest method. But if you need to explicitly specify
+ credentials, there are several methods available to you.
 
- The environment variables that google-cloud-bigquery-data_transfer-v1
- checks for credentials are configured on the service Credentials class (such as
- {::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Credentials}):
+ Credentials are accepted in the following ways, in the following order of precedence:
 
- * `DATA_TRANSFER_CREDENTIALS` - Path to JSON file, or JSON contents
- * `DATA_TRANSFER_KEYFILE` - Path to JSON file, or JSON contents
- * `GOOGLE_CLOUD_CREDENTIALS` - Path to JSON file, or JSON contents
- * `GOOGLE_CLOUD_KEYFILE` - Path to JSON file, or JSON contents
- * `GOOGLE_APPLICATION_CREDENTIALS` - Path to JSON file
+ 1. Credentials specified in method arguments
+ 2. Credentials specified in configuration
+ 3. Credentials pointed to or included in environment variables
+ 4. Credentials found in local ADC file
+ 5. Credentials returned by the metadata server for the attached service account (GCP)
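The precedence list above amounts to "first configured source wins." A minimal sketch, purely illustrative (the helper name and keyword arguments are hypothetical, not part of the library):

```ruby
# Illustrative sketch of the documented order of precedence: the first
# non-nil credential source is used, checked in the order listed above.
def resolve_credentials(method_arg: nil, configured: nil, env: nil,
                        adc_file: nil, metadata_server: nil)
  method_arg || configured || env || adc_file || metadata_server
end

# Configuration (2) outranks a local ADC file (4):
resolve_credentials(configured: "from-config", adc_file: "from-adc")
```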
 
- ```ruby
- require "google/cloud/bigquery/data_transfer/v1"
-
- ENV["DATA_TRANSFER_CREDENTIALS"] = "path/to/keyfile.json"
+ ### Configuration
 
- client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
- ```
+ You can configure a path to a JSON credentials file, either for an individual client object or
+ globally, for all client objects. The JSON file can contain credentials created for
+ [workload identity federation](https://cloud.google.com/iam/docs/workload-identity-federation),
+ [workforce identity federation](https://cloud.google.com/iam/docs/workforce-identity-federation), or a
+ [service account key](https://cloud.google.com/docs/authentication/provide-credentials-adc#local-key).
 
- ### Configuration
+ Note: Service account keys are a security risk if not managed correctly. You should
+ [choose a more secure alternative to service account keys](https://cloud.google.com/docs/authentication#auth-decision-tree)
+ whenever possible.
 
- The path to the **Credentials JSON** file can be configured instead of storing
- it in an environment variable. Either on an individual client initialization:
+ To configure a credentials file for an individual client initialization:
 
  ```ruby
  require "google/cloud/bigquery/data_transfer/v1"
 
  client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new do |config|
- config.credentials = "path/to/keyfile.json"
+ config.credentials = "path/to/credentialfile.json"
  end
  ```
 
- Or globally for all clients:
+ To configure a credentials file globally for all clients:
 
  ```ruby
  require "google/cloud/bigquery/data_transfer/v1"
 
  ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.configure do |config|
- config.credentials = "path/to/keyfile.json"
+ config.credentials = "path/to/credentialfile.json"
  end
 
  client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
  ```
 
- ### Cloud SDK
+ ### Environment Variables
 
- This option allows for an easy way to authenticate during development. If
- credentials are not provided in code or in environment variables, then Cloud SDK
- credentials are discovered.
+ You can also use an environment variable to provide a JSON credentials file.
+ The environment variable can contain a path to the credentials file or, for
+ environments such as Docker containers where writing files is not encouraged,
+ you can include the credentials file itself.
 
- To configure your system for this, simply:
+ The JSON file can contain credentials created for
+ [workload identity federation](https://cloud.google.com/iam/docs/workload-identity-federation),
+ [workforce identity federation](https://cloud.google.com/iam/docs/workforce-identity-federation), or a
+ [service account key](https://cloud.google.com/docs/authentication/provide-credentials-adc#local-key).
 
- 1. [Download and install the Cloud SDK](https://cloud.google.com/sdk)
- 2. Authenticate using OAuth 2.0 `$ gcloud auth application-default login`
- 3. Write code as if already authenticated.
+ Note: Service account keys are a security risk if not managed correctly. You should
+ [choose a more secure alternative to service account keys](https://cloud.google.com/docs/authentication#auth-decision-tree)
+ whenever possible.
+
+ The environment variables that google-cloud-bigquery-data_transfer-v1
+ checks for credentials are:
 
- **NOTE:** This is _not_ recommended for running in production. The Cloud SDK
- *should* only be used during development.
+ * `GOOGLE_CLOUD_CREDENTIALS` - Path to JSON file, or JSON contents
+ * `GOOGLE_APPLICATION_CREDENTIALS` - Path to JSON file
 
- ## Creating a Service Account
+ ```ruby
+ require "google/cloud/bigquery/data_transfer/v1"
 
- Google Cloud requires **Service Account Credentials** to
- connect to the APIs. You will use the **JSON key file** to
- connect to most services with google-cloud-bigquery-data_transfer-v1.
+ ENV["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/credentialfile.json"
 
- If you are not running this client within
- [Google Cloud Platform environments](#google-cloud-platform-environments), you
- need a Google Developers service account.
+ client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
+ ```
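As noted above, `GOOGLE_CLOUD_CREDENTIALS` may hold either a file path or the JSON contents itself. A minimal sketch of how such a value can be interpreted (the `load_credentials_from` helper is hypothetical; the library performs its own, more thorough lookup):

```ruby
require "json"

# Interpret an env-var value that is either a path to a JSON keyfile
# or the raw JSON contents of that keyfile.
def load_credentials_from(value)
  if File.file?(value)
    JSON.parse(File.read(value)) # value is a path to a JSON file
  else
    JSON.parse(value)            # value is the JSON contents itself
  end
end

load_credentials_from('{"type":"authorized_user"}')
# => {"type"=>"authorized_user"}
```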
 
- 1. Visit the [Google Cloud Console](https://console.cloud.google.com/project).
- 2. Create a new project or click on an existing project.
- 3. Activate the menu in the upper left and select **APIs & Services**. From
- here, you will enable the APIs that your application requires.
+ ### Local ADC file
 
- *Note: You may need to enable billing in order to use these services.*
+ You can set up a local ADC file with your user credentials for authentication during
+ development. If credentials are not provided in code or in environment variables,
+ then the local ADC credentials are discovered.
 
- 4. Select **Credentials** from the side navigation.
+ Follow the steps in [Quickstart](#quickstart) to set up a local ADC file.
 
- Find the "Create credentials" drop down near the top of the page, and select
- "Service account" to be guided through downloading a new JSON key file.
+ ### Google Cloud Platform environments
 
- If you want to re-use an existing service account, you can easily generate a
- new key file. Just select the account you wish to re-use, click the pencil
- tool on the right side to edit the service account, select the **Keys** tab,
- and then select **Add Key**.
+ When running on Google Cloud Platform (GCP), including Google Compute Engine
+ (GCE), Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud
+ Functions (GCF) and Cloud Run, credentials are retrieved from the attached
+ service account automatically. Code should be written as if already authenticated.
 
- The key file you download will be used by this library to authenticate API
- requests and should be stored in a secure location.
+ For more information, see
+ [Set up ADC for Google Cloud services](https://cloud.google.com/docs/authentication/provide-credentials-adc#attached-sa).
@@ -32,6 +32,9 @@ module Google
  # This API allows users to manage their data transfers into BigQuery.
  #
  class Client
+ # @private
+ DEFAULT_ENDPOINT_TEMPLATE = "bigquerydatatransfer.$UNIVERSE_DOMAIN$"
+
  include Paths
 
  # @private
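The new `DEFAULT_ENDPOINT_TEMPLATE` constant replaces the hard-coded default endpoint: the `$UNIVERSE_DOMAIN$` placeholder is filled in with the configured universe domain. A sketch of the expansion (the `resolve_endpoint` helper is illustrative; the real resolution happens inside the gapic-common service stub):

```ruby
# The endpoint template introduced in this release.
ENDPOINT_TEMPLATE = "bigquerydatatransfer.$UNIVERSE_DOMAIN$"

# Expand the template for a given universe domain; a nil universe domain
# falls back to the default "googleapis.com" universe.
def resolve_endpoint(universe_domain = nil)
  ENDPOINT_TEMPLATE.sub("$UNIVERSE_DOMAIN$", universe_domain || "googleapis.com")
end

resolve_endpoint                # => "bigquerydatatransfer.googleapis.com"
resolve_endpoint("example.edu") # => "bigquerydatatransfer.example.edu"
```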
@@ -148,6 +151,15 @@ module Google
  @config
  end
 
+ ##
+ # The effective universe domain
+ #
+ # @return [String]
+ #
+ def universe_domain
+ @data_transfer_service_stub.universe_domain
+ end
+
  ##
  # Create a new DataTransferService client object.
  #
@@ -181,8 +193,9 @@ module Google
  credentials = @config.credentials
  # Use self-signed JWT if the endpoint is unchanged from default,
  # but only if the default endpoint does not have a region prefix.
- enable_self_signed_jwt = @config.endpoint == Configuration::DEFAULT_ENDPOINT &&
- !@config.endpoint.split(".").first.include?("-")
+ enable_self_signed_jwt = @config.endpoint.nil? ||
+ (@config.endpoint == Configuration::DEFAULT_ENDPOINT &&
+ !@config.endpoint.split(".").first.include?("-"))
  credentials ||= Credentials.default scope: @config.scope,
  enable_self_signed_jwt: enable_self_signed_jwt
  if credentials.is_a?(::String) || credentials.is_a?(::Hash)
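The hunk above widens the self-signed JWT check: now that the endpoint defaults to `nil` (meaning "use the default endpoint of the current universe domain"), a `nil` endpoint also qualifies. Extracted as a standalone predicate for illustration (the method name is hypothetical):

```ruby
DEFAULT_ENDPOINT = "bigquerydatatransfer.googleapis.com"

# Self-signed JWT is enabled when no endpoint override is set, or when the
# override equals the default endpoint and that endpoint has no region
# prefix (no "-" in the first dotted component).
def enable_self_signed_jwt?(endpoint)
  endpoint.nil? ||
    (endpoint == DEFAULT_ENDPOINT && !endpoint.split(".").first.include?("-"))
end

enable_self_signed_jwt?(nil)                                   # => true
enable_self_signed_jwt?("bigquerydatatransfer.googleapis.com") # => true
enable_self_signed_jwt?("us-central1-custom.example.com")      # => false
```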
@@ -195,14 +208,18 @@ module Google
  config.credentials = credentials
  config.quota_project = @quota_project_id
  config.endpoint = @config.endpoint
+ config.universe_domain = @config.universe_domain
  end
 
  @data_transfer_service_stub = ::Gapic::ServiceStub.new(
  ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Stub,
- credentials: credentials,
- endpoint: @config.endpoint,
+ credentials: credentials,
+ endpoint: @config.endpoint,
+ endpoint_template: DEFAULT_ENDPOINT_TEMPLATE,
+ universe_domain: @config.universe_domain,
  channel_args: @config.channel_args,
- interceptors: @config.interceptors
+ interceptors: @config.interceptors,
+ channel_pool_config: @config.channel_pool
  )
  end
 
@@ -1735,9 +1752,9 @@ module Google
  # end
  #
  # @!attribute [rw] endpoint
- # The hostname or hostname:port of the service endpoint.
- # Defaults to `"bigquerydatatransfer.googleapis.com"`.
- # @return [::String]
+ # A custom service endpoint, as a hostname or hostname:port. The default is
+ # nil, indicating to use the default endpoint in the current universe domain.
+ # @return [::String,nil]
  # @!attribute [rw] credentials
  # Credentials to send with calls. You may provide any of the following types:
  # * (`String`) The path to a service account key file in JSON format
@@ -1783,13 +1800,20 @@ module Google
  # @!attribute [rw] quota_project
  # A separate project against which to charge quota.
  # @return [::String]
+ # @!attribute [rw] universe_domain
+ # The universe domain within which to make requests. This determines the
+ # default endpoint URL. The default value of nil uses the environment
+ # universe (usually the default "googleapis.com" universe).
+ # @return [::String,nil]
  #
  class Configuration
  extend ::Gapic::Config
 
+ # @private
+ # The endpoint specific to the default "googleapis.com" universe. Deprecated.
  DEFAULT_ENDPOINT = "bigquerydatatransfer.googleapis.com"
 
- config_attr :endpoint, DEFAULT_ENDPOINT, ::String
+ config_attr :endpoint, nil, ::String, nil
  config_attr :credentials, nil do |value|
  allowed = [::String, ::Hash, ::Proc, ::Symbol, ::Google::Auth::Credentials, ::Signet::OAuth2::Client, nil]
  allowed += [::GRPC::Core::Channel, ::GRPC::Core::ChannelCredentials] if defined? ::GRPC
@@ -1804,6 +1828,7 @@ module Google
  config_attr :metadata, nil, ::Hash, nil
  config_attr :retry_policy, nil, ::Hash, ::Proc, nil
  config_attr :quota_project, nil, ::String, nil
+ config_attr :universe_domain, nil, ::String, nil
 
  # @private
  def initialize parent_config = nil
@@ -1824,6 +1849,14 @@ module Google
  end
  end
 
+ ##
+ # Configuration for the channel pool
+ # @return [::Gapic::ServiceStub::ChannelPool::Configuration]
+ #
+ def channel_pool
+ @channel_pool ||= ::Gapic::ServiceStub::ChannelPool::Configuration.new
+ end
+
  ##
  # Configuration RPC class for the DataTransferService API.
  #
@@ -34,6 +34,9 @@ module Google
  # This API allows users to manage their data transfers into BigQuery.
  #
  class Client
+ # @private
+ DEFAULT_ENDPOINT_TEMPLATE = "bigquerydatatransfer.$UNIVERSE_DOMAIN$"
+
  include Paths
 
  # @private
@@ -150,6 +153,15 @@ module Google
  @config
  end
 
+ ##
+ # The effective universe domain
+ #
+ # @return [String]
+ #
+ def universe_domain
+ @data_transfer_service_stub.universe_domain
+ end
+
  ##
  # Create a new DataTransferService REST client object.
  #
@@ -177,8 +189,9 @@ module Google
  credentials = @config.credentials
  # Use self-signed JWT if the endpoint is unchanged from default,
  # but only if the default endpoint does not have a region prefix.
- enable_self_signed_jwt = @config.endpoint == Configuration::DEFAULT_ENDPOINT &&
- !@config.endpoint.split(".").first.include?("-")
+ enable_self_signed_jwt = @config.endpoint.nil? ||
+ (@config.endpoint == Configuration::DEFAULT_ENDPOINT &&
+ !@config.endpoint.split(".").first.include?("-"))
  credentials ||= Credentials.default scope: @config.scope,
  enable_self_signed_jwt: enable_self_signed_jwt
  if credentials.is_a?(::String) || credentials.is_a?(::Hash)
@@ -192,10 +205,16 @@ module Google
  config.credentials = credentials
  config.quota_project = @quota_project_id
  config.endpoint = @config.endpoint
+ config.universe_domain = @config.universe_domain
  config.bindings_override = @config.bindings_override
  end
 
- @data_transfer_service_stub = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::ServiceStub.new endpoint: @config.endpoint, credentials: credentials
+ @data_transfer_service_stub = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::ServiceStub.new(
+ endpoint: @config.endpoint,
+ endpoint_template: DEFAULT_ENDPOINT_TEMPLATE,
+ universe_domain: @config.universe_domain,
+ credentials: credentials
+ )
  end
 
  ##
@@ -236,6 +255,22 @@ module Google
236
255
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::DataSource]
237
256
  #
238
257
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
258
+ #
259
+ # @example Basic example
260
+ # require "google/cloud/bigquery/data_transfer/v1"
261
+ #
262
+ # # Create a client object. The client can be reused for multiple calls.
263
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
264
+ #
265
+ # # Create a request. To set request fields, pass in keyword arguments.
266
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::GetDataSourceRequest.new
267
+ #
268
+ # # Call the get_data_source method.
269
+ # result = client.get_data_source request
270
+ #
271
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::DataSource.
272
+ # p result
273
+ #
239
274
  def get_data_source request, options = nil
240
275
  raise ::ArgumentError, "request must be provided" if request.nil?
241
276
 
@@ -308,6 +343,26 @@ module Google
308
343
  # @return [::Gapic::Rest::PagedEnumerable<::Google::Cloud::Bigquery::DataTransfer::V1::DataSource>]
309
344
  #
310
345
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
346
+ #
347
+ # @example Basic example
348
+ # require "google/cloud/bigquery/data_transfer/v1"
349
+ #
350
+ # # Create a client object. The client can be reused for multiple calls.
351
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
352
+ #
353
+ # # Create a request. To set request fields, pass in keyword arguments.
354
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::ListDataSourcesRequest.new
355
+ #
356
+ # # Call the list_data_sources method.
357
+ # result = client.list_data_sources request
358
+ #
359
+ # # The returned object is of type Gapic::PagedEnumerable. You can iterate
360
+ # # over elements, and API calls will be issued to fetch pages as needed.
361
+ # result.each do |item|
362
+ # # Each element is of type ::Google::Cloud::Bigquery::DataTransfer::V1::DataSource.
363
+ # p item
364
+ # end
365
+ #
311
366
  def list_data_sources request, options = nil
312
367
  raise ::ArgumentError, "request must be provided" if request.nil?
313
368
 
@@ -417,6 +472,22 @@ module Google
417
472
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig]
418
473
  #
419
474
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
475
+ #
476
+ # @example Basic example
477
+ # require "google/cloud/bigquery/data_transfer/v1"
478
+ #
479
+ # # Create a client object. The client can be reused for multiple calls.
480
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
481
+ #
482
+ # # Create a request. To set request fields, pass in keyword arguments.
483
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::CreateTransferConfigRequest.new
484
+ #
485
+ # # Call the create_transfer_config method.
486
+ # result = client.create_transfer_config request
487
+ #
488
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig.
489
+ # p result
490
+ #
420
491
  def create_transfer_config request, options = nil
421
492
  raise ::ArgumentError, "request must be provided" if request.nil?
422
493
 
@@ -522,6 +593,22 @@ module Google
522
593
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig]
523
594
  #
524
595
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
596
+ #
597
+ # @example Basic example
598
+ # require "google/cloud/bigquery/data_transfer/v1"
599
+ #
600
+ # # Create a client object. The client can be reused for multiple calls.
601
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
602
+ #
603
+ # # Create a request. To set request fields, pass in keyword arguments.
604
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::UpdateTransferConfigRequest.new
605
+ #
606
+ # # Call the update_transfer_config method.
607
+ # result = client.update_transfer_config request
608
+ #
609
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig.
610
+ # p result
611
+ #
525
612
  def update_transfer_config request, options = nil
526
613
  raise ::ArgumentError, "request must be provided" if request.nil?
527
614
 
@@ -587,6 +674,22 @@ module Google
587
674
  # @return [::Google::Protobuf::Empty]
588
675
  #
589
676
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
677
+ #
678
+ # @example Basic example
679
+ # require "google/cloud/bigquery/data_transfer/v1"
680
+ #
681
+ # # Create a client object. The client can be reused for multiple calls.
682
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
683
+ #
684
+ # # Create a request. To set request fields, pass in keyword arguments.
685
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::DeleteTransferConfigRequest.new
686
+ #
687
+ # # Call the delete_transfer_config method.
688
+ # result = client.delete_transfer_config request
689
+ #
690
+ # # The returned object is of type Google::Protobuf::Empty.
691
+ # p result
692
+ #
590
693
  def delete_transfer_config request, options = nil
591
694
  raise ::ArgumentError, "request must be provided" if request.nil?
592
695
 
@@ -651,6 +754,22 @@ module Google
651
754
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig]
652
755
  #
653
756
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
757
+ #
758
+ # @example Basic example
759
+ # require "google/cloud/bigquery/data_transfer/v1"
760
+ #
761
+ # # Create a client object. The client can be reused for multiple calls.
762
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
763
+ #
764
+ # # Create a request. To set request fields, pass in keyword arguments.
765
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::GetTransferConfigRequest.new
766
+ #
767
+ # # Call the get_transfer_config method.
768
+ # result = client.get_transfer_config request
769
+ #
770
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig.
771
+ # p result
772
+ #
654
773
  def get_transfer_config request, options = nil
655
774
  raise ::ArgumentError, "request must be provided" if request.nil?
656
775
 
@@ -726,6 +845,26 @@ module Google
726
845
  # @return [::Gapic::Rest::PagedEnumerable<::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig>]
727
846
  #
728
847
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
848
+ #
849
+ # @example Basic example
850
+ # require "google/cloud/bigquery/data_transfer/v1"
851
+ #
852
+ # # Create a client object. The client can be reused for multiple calls.
853
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
854
+ #
855
+ # # Create a request. To set request fields, pass in keyword arguments.
856
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::ListTransferConfigsRequest.new
857
+ #
858
+ # # Call the list_transfer_configs method.
859
+ # result = client.list_transfer_configs request
860
+ #
861
+ # # The returned object is of type Gapic::PagedEnumerable. You can iterate
862
+ # # over elements, and API calls will be issued to fetch pages as needed.
863
+ # result.each do |item|
864
+ # # Each element is of type ::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig.
865
+ # p item
866
+ # end
867
+ #
729
868
  def list_transfer_configs request, options = nil
730
869
  raise ::ArgumentError, "request must be provided" if request.nil?
731
870
 
@@ -801,6 +940,22 @@ module Google
801
940
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::ScheduleTransferRunsResponse]
802
941
  #
803
942
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
943
+ #
944
+ # @example Basic example
945
+ # require "google/cloud/bigquery/data_transfer/v1"
946
+ #
947
+ # # Create a client object. The client can be reused for multiple calls.
948
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
949
+ #
950
+ # # Create a request. To set request fields, pass in keyword arguments.
951
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::ScheduleTransferRunsRequest.new
952
+ #
953
+ # # Call the schedule_transfer_runs method.
954
+ # result = client.schedule_transfer_runs request
955
+ #
956
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::ScheduleTransferRunsResponse.
957
+ # p result
958
+ #
804
959
  def schedule_transfer_runs request, options = nil
805
960
  raise ::ArgumentError, "request must be provided" if request.nil?
806
961
 
@@ -878,6 +1033,22 @@ module Google
878
1033
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::StartManualTransferRunsResponse]
879
1034
  #
880
1035
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
1036
+ #
1037
+ # @example Basic example
1038
+ # require "google/cloud/bigquery/data_transfer/v1"
1039
+ #
1040
+ # # Create a client object. The client can be reused for multiple calls.
1041
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
1042
+ #
1043
+ # # Create a request. To set request fields, pass in keyword arguments.
1044
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::StartManualTransferRunsRequest.new
1045
+ #
1046
+ # # Call the start_manual_transfer_runs method.
1047
+ # result = client.start_manual_transfer_runs request
1048
+ #
1049
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::StartManualTransferRunsResponse.
1050
+ # p result
1051
+ #
881
1052
  def start_manual_transfer_runs request, options = nil
882
1053
  raise ::ArgumentError, "request must be provided" if request.nil?
883
1054
 
@@ -943,6 +1114,22 @@ module Google
943
1114
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::TransferRun]
944
1115
  #
945
1116
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
1117
+ #
1118
+ # @example Basic example
1119
+ # require "google/cloud/bigquery/data_transfer/v1"
1120
+ #
1121
+ # # Create a client object. The client can be reused for multiple calls.
1122
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
1123
+ #
1124
+ # # Create a request. To set request fields, pass in keyword arguments.
1125
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::GetTransferRunRequest.new
1126
+ #
1127
+ # # Call the get_transfer_run method.
1128
+ # result = client.get_transfer_run request
1129
+ #
1130
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::TransferRun.
1131
+ # p result
1132
+ #
946
1133
  def get_transfer_run request, options = nil
947
1134
  raise ::ArgumentError, "request must be provided" if request.nil?
948
1135
 
@@ -1008,6 +1195,22 @@ module Google
  # @return [::Google::Protobuf::Empty]
  #
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
+ #
+ # @example Basic example
+ # require "google/cloud/bigquery/data_transfer/v1"
+ #
+ # # Create a client object. The client can be reused for multiple calls.
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
+ #
+ # # Create a request. To set request fields, pass in keyword arguments.
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::DeleteTransferRunRequest.new
+ #
+ # # Call the delete_transfer_run method.
+ # result = client.delete_transfer_run request
+ #
+ # # The returned object is of type Google::Protobuf::Empty.
+ # p result
+ #
  def delete_transfer_run request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -1085,6 +1288,26 @@ module Google
  # @return [::Gapic::Rest::PagedEnumerable<::Google::Cloud::Bigquery::DataTransfer::V1::TransferRun>]
  #
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
+ #
+ # @example Basic example
+ # require "google/cloud/bigquery/data_transfer/v1"
+ #
+ # # Create a client object. The client can be reused for multiple calls.
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
+ #
+ # # Create a request. To set request fields, pass in keyword arguments.
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::ListTransferRunsRequest.new
+ #
+ # # Call the list_transfer_runs method.
+ # result = client.list_transfer_runs request
+ #
+ # # The returned object is of type Gapic::PagedEnumerable. You can iterate
+ # # over elements, and API calls will be issued to fetch pages as needed.
+ # result.each do |item|
+ # # Each element is of type ::Google::Cloud::Bigquery::DataTransfer::V1::TransferRun.
+ # p item
+ # end
+ #
  def list_transfer_runs request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -1161,6 +1384,26 @@ module Google
  # @return [::Gapic::Rest::PagedEnumerable<::Google::Cloud::Bigquery::DataTransfer::V1::TransferMessage>]
  #
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
+ #
+ # @example Basic example
+ # require "google/cloud/bigquery/data_transfer/v1"
+ #
+ # # Create a client object. The client can be reused for multiple calls.
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
+ #
+ # # Create a request. To set request fields, pass in keyword arguments.
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::ListTransferLogsRequest.new
+ #
+ # # Call the list_transfer_logs method.
+ # result = client.list_transfer_logs request
+ #
+ # # The returned object is of type Gapic::PagedEnumerable. You can iterate
+ # # over elements, and API calls will be issued to fetch pages as needed.
+ # result.each do |item|
+ # # Each element is of type ::Google::Cloud::Bigquery::DataTransfer::V1::TransferMessage.
+ # p item
+ # end
+ #
  def list_transfer_logs request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -1227,6 +1470,22 @@ module Google
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::CheckValidCredsResponse]
  #
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
+ #
+ # @example Basic example
+ # require "google/cloud/bigquery/data_transfer/v1"
+ #
+ # # Create a client object. The client can be reused for multiple calls.
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
+ #
+ # # Create a request. To set request fields, pass in keyword arguments.
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::CheckValidCredsRequest.new
+ #
+ # # Call the check_valid_creds method.
+ # result = client.check_valid_creds request
+ #
+ # # The returned object is of type Google::Cloud::Bigquery::DataTransfer::V1::CheckValidCredsResponse.
+ # p result
+ #
  def check_valid_creds request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -1299,6 +1558,22 @@ module Google
  # @return [::Google::Protobuf::Empty]
  #
  # @raise [::Google::Cloud::Error] if the REST call is aborted.
+ #
+ # @example Basic example
+ # require "google/cloud/bigquery/data_transfer/v1"
+ #
+ # # Create a client object. The client can be reused for multiple calls.
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::Client.new
+ #
+ # # Create a request. To set request fields, pass in keyword arguments.
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::EnrollDataSourcesRequest.new
+ #
+ # # Call the enroll_data_sources method.
+ # result = client.enroll_data_sources request
+ #
+ # # The returned object is of type Google::Protobuf::Empty.
+ # p result
+ #
  def enroll_data_sources request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -1364,9 +1639,9 @@ module Google
  # end
  #
  # @!attribute [rw] endpoint
- # The hostname or hostname:port of the service endpoint.
- # Defaults to `"bigquerydatatransfer.googleapis.com"`.
- # @return [::String]
+ # A custom service endpoint, as a hostname or hostname:port. The default is
+ # nil, indicating to use the default endpoint in the current universe domain.
+ # @return [::String,nil]
  # @!attribute [rw] credentials
  # Credentials to send with calls. You may provide any of the following types:
  # * (`String`) The path to a service account key file in JSON format
@@ -1403,13 +1678,20 @@ module Google
  # @!attribute [rw] quota_project
  # A separate project against which to charge quota.
  # @return [::String]
+ # @!attribute [rw] universe_domain
+ # The universe domain within which to make requests. This determines the
+ # default endpoint URL. The default value of nil uses the environment
+ # universe (usually the default "googleapis.com" universe).
+ # @return [::String,nil]
  #
  class Configuration
  extend ::Gapic::Config
 
+ # @private
+ # The endpoint specific to the default "googleapis.com" universe. Deprecated.
  DEFAULT_ENDPOINT = "bigquerydatatransfer.googleapis.com"
 
- config_attr :endpoint, DEFAULT_ENDPOINT, ::String
+ config_attr :endpoint, nil, ::String, nil
  config_attr :credentials, nil do |value|
  allowed = [::String, ::Hash, ::Proc, ::Symbol, ::Google::Auth::Credentials, ::Signet::OAuth2::Client, nil]
  allowed.any? { |klass| klass === value }
@@ -1421,6 +1703,7 @@ module Google
  config_attr :metadata, nil, ::Hash, nil
  config_attr :retry_policy, nil, ::Hash, ::Proc, nil
  config_attr :quota_project, nil, ::String, nil
+ config_attr :universe_domain, nil, ::String, nil
 
  # @private
  # Overrides for http bindings for the RPCs of this service
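The hunks above change `endpoint` from a hard-coded default to nil, resolved at runtime from the new `universe_domain` setting. The following is a minimal pure-Ruby sketch of that resolution logic, not the actual gapic-common implementation; the `resolve_endpoint` helper and its behavior here are assumptions for illustration.

```ruby
# Sketch: an explicit endpoint wins; otherwise the universe domain
# (defaulting to "googleapis.com") is substituted into an endpoint template.
ENDPOINT_TEMPLATE = "bigquerydatatransfer.$UNIVERSE_DOMAIN$"

def resolve_endpoint endpoint: nil, universe_domain: nil
  return endpoint if endpoint  # a custom endpoint overrides everything
  domain = universe_domain || "googleapis.com"
  ENDPOINT_TEMPLATE.sub "$UNIVERSE_DOMAIN$", domain
end

puts resolve_endpoint
# => bigquerydatatransfer.googleapis.com
puts resolve_endpoint(universe_domain: "example-universe.test")
# => bigquerydatatransfer.example-universe.test
puts resolve_endpoint(endpoint: "localhost:8080")
# => localhost:8080
```

This mirrors why `DEFAULT_ENDPOINT` is now marked deprecated: the effective endpoint is derived from the universe domain rather than read from a constant.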
@@ -31,16 +31,28 @@ module Google
  # including transcoding, making the REST call, and deserializing the response.
  #
  class ServiceStub
- def initialize endpoint:, credentials:
+ def initialize endpoint:, endpoint_template:, universe_domain:, credentials:
  # These require statements are intentionally placed here to initialize
  # the REST modules only when it's required.
  require "gapic/rest"
 
- @client_stub = ::Gapic::Rest::ClientStub.new endpoint: endpoint, credentials: credentials,
+ @client_stub = ::Gapic::Rest::ClientStub.new endpoint: endpoint,
+ endpoint_template: endpoint_template,
+ universe_domain: universe_domain,
+ credentials: credentials,
  numeric_enums: true,
  raise_faraday_errors: false
  end
 
+ ##
+ # The effective universe domain
+ #
+ # @return [String]
+ #
+ def universe_domain
+ @client_stub.universe_domain
+ end
+
  ##
  # Baseline implementation for the get_data_source REST call
  #
@@ -22,7 +22,7 @@ module Google
  module Bigquery
  module DataTransfer
  module V1
- VERSION = "0.10.0"
+ VERSION = "0.12.0"
  end
  end
  end
@@ -21,6 +21,7 @@ module Google
  module Api
  # Required information for every language.
  # @!attribute [rw] reference_docs_uri
+ # @deprecated This field is deprecated and may be removed in the next major version update.
  # @return [::String]
  # Link to automatically generated reference documentation. Example:
  # https://cloud.google.com/nodejs/docs/reference/asset/latest
@@ -304,6 +305,19 @@ module Google
  # seconds: 360 # 6 minutes
  # total_poll_timeout:
  # seconds: 54000 # 90 minutes
+ # @!attribute [rw] auto_populated_fields
+ # @return [::Array<::String>]
+ # List of top-level fields of the request message, that should be
+ # automatically populated by the client libraries based on their
+ # (google.api.field_info).format. Currently supported format: UUID4.
+ #
+ # Example of a YAML configuration:
+ #
+ # publishing:
+ # method_settings:
+ # - selector: google.example.v1.ExampleService.CreateExample
+ # auto_populated_fields:
+ # - request_id
  class MethodSettings
  include ::Google::Protobuf::MessageExts
  extend ::Google::Protobuf::MessageExts::ClassMethods
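The `auto_populated_fields` setting above tells generated clients to fill in certain request fields with a UUID4 when the caller leaves them unset. A hypothetical sketch of that client-side behavior, where `ExampleRequest` and `apply_auto_population` are illustrative names, not part of the gem:

```ruby
require "securerandom"

# Illustrative request message with a single auto-populatable field.
ExampleRequest = Struct.new(:request_id)

# For each configured field, generate a UUID4 only when the caller
# did not supply a value; caller-supplied values are left untouched.
def apply_auto_population request, auto_populated_fields: [:request_id]
  auto_populated_fields.each do |field|
    value = request[field]
    request[field] = SecureRandom.uuid if value.nil? || value.empty?
  end
  request
end

req = apply_auto_population ExampleRequest.new(nil)
puts req.request_id  # a freshly generated UUID4

req = apply_auto_population ExampleRequest.new("caller-supplied-id")
puts req.request_id
# => caller-supplied-id
```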
@@ -66,6 +66,20 @@ module Google
  # a non-empty value will be returned. The user will not be aware of what
  # non-empty value to expect.
  NON_EMPTY_DEFAULT = 7
+
+ # Denotes that the field in a resource (a message annotated with
+ # google.api.resource) is used in the resource name to uniquely identify the
+ # resource. For AIP-compliant APIs, this should only be applied to the
+ # `name` field on the resource.
+ #
+ # This behavior should not be applied to references to other resources within
+ # the message.
+ #
+ # The identifier field of resources often has different field behavior
+ # depending on the request it is embedded in (e.g. for Create methods name
+ # is optional and unused, while for Update methods it is required). Instead
+ # of method-specific annotations, only `IDENTIFIER` is required.
+ IDENTIFIER = 8
  end
  end
  end
@@ -128,9 +128,11 @@ module Google
  # scopes needed by a data source to prepare data and ingest them into
  # BigQuery, e.g., https://www.googleapis.com/auth/bigquery
  # @!attribute [rw] transfer_type
+ # @deprecated This field is deprecated and may be removed in the next major version update.
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::TransferType]
  # Deprecated. This field has no effect.
  # @!attribute [rw] supports_multiple_transfers
+ # @deprecated This field is deprecated and may be removed in the next major version update.
  # @return [::Boolean]
  # Deprecated. This field has no effect.
  # @!attribute [rw] update_deadline_seconds
@@ -75,9 +75,10 @@ module Google
  # @!attribute [rw] name
  # @return [::String]
  # The resource name of the transfer config.
- # Transfer config names have the form
- # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`.
- # Where `config_id` is usually a uuid, even though it is not
+ # Transfer config names have the form either
+ # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}` or
+ # `projects/{project_id}/transferConfigs/{config_id}`,
+ # where `config_id` is usually a UUID, even though it is not
  # guaranteed or required. The name is ignored when creating a transfer
  # config.
  # @!attribute [rw] destination_dataset_id
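The hunk above documents a second accepted name form (without the `locations/{region}` segment). A small illustrative sketch of matching both forms; the regexes and the `parse_transfer_config_name` helper are assumptions for illustration, not part of the gem:

```ruby
# The two documented transfer-config resource-name forms.
LOCATION_FORM = %r{\Aprojects/([^/]+)/locations/([^/]+)/transferConfigs/([^/]+)\z}
PROJECT_FORM  = %r{\Aprojects/([^/]+)/transferConfigs/([^/]+)\z}

# Returns the name's components as a hash, or nil if it matches neither form.
def parse_transfer_config_name name
  if (m = LOCATION_FORM.match(name))
    { project: m[1], location: m[2], config_id: m[3] }
  elsif (m = PROJECT_FORM.match(name))
    { project: m[1], config_id: m[2] }
  end
end

parts = parse_transfer_config_name "projects/my-proj/locations/us/transferConfigs/abc-123"
puts parts[:location]
# => us
parts = parse_transfer_config_name "projects/my-proj/transferConfigs/abc-123"
puts parts[:config_id]
# => abc-123
```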
@@ -282,6 +283,7 @@ module Google
  end
 
  # DEPRECATED. Represents data transfer type.
+ # @deprecated This enum is deprecated and may be removed in the next major version update.
  module TransferType
  # Invalid or Unknown transfer type placeholder.
  TRANSFER_TYPE_UNSPECIFIED = 0
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: google-cloud-bigquery-data_transfer-v1
  version: !ruby/object:Gem::Version
- version: 0.10.0
+ version: 0.12.0
  platform: ruby
  authors:
  - Google LLC
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-08-03 00:00:00.000000000 Z
+ date: 2024-01-11 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: gapic-common
@@ -16,7 +16,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.19.1
+ version: 0.21.1
  - - "<"
  - !ruby/object:Gem::Version
  version: 2.a
@@ -26,7 +26,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.19.1
+ version: 0.21.1
  - - "<"
  - !ruby/object:Gem::Version
  version: 2.a
@@ -50,7 +50,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: '0.4'
+ version: '0.7'
  - - "<"
  - !ruby/object:Gem::Version
  version: 2.a
@@ -60,7 +60,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: '0.4'
+ version: '0.7'
  - - "<"
  - !ruby/object:Gem::Version
  version: 2.a
@@ -239,7 +239,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.4.2
+ rubygems_version: 3.5.3
  signing_key:
  specification_version: 4
  summary: Schedule queries or transfer external data from SaaS applications to Google