google-cloud-bigquery-data_transfer-v1 0.11.0 → 0.12.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 3b56f293bce6677ee068aa41c1ba4ec764de4f61657333377423693953afa721
-  data.tar.gz: 45409cc61fd48cf43a5e3d06507aed1d2ee9470af0769d79a75cbeb4da287a47
+  metadata.gz: 414dd8889ba23effda6a32a344fa660e76a977148767394098c1732f53e076a2
+  data.tar.gz: b805e9aba8f2e5edc6103a92506c7ec98de653c31ddd5cd19f9ba29813f561de
 SHA512:
-  metadata.gz: ed2d2c5378ddac46078fe2302021af9e97a7621b1deabbec1fa61418d330af545a535e7eb719801074295c43aa804b4dd46eea67f03792a2cf97def3e8913a58
-  data.tar.gz: 68d927dcbc0edae92bf20ecaab874f6a460d9096b2c00e9901423ea5888806e82ae6f705461596036e45e3261894a159b5a330ad55f80a5e122b7e181f873751
+  metadata.gz: bc02d1135772af2432c562aa2c295022dd6f42479b3b8d4d10295fbf69ca1457a117f7cc8bce5eb218a82d5eb12db21cb43e22ee2b4630951b79203353d0f137
+  data.tar.gz: 0f7f11932f1962208bacda2c12ebfbce80ef2ed824c94b07a4574689da92b9327c4a7d4f86c7933598e04f53f0c0a6b4ef17557a525135b63352b8acbe04ff8d
data/AUTHENTICATION.md CHANGED
@@ -1,151 +1,122 @@
 # Authentication
 
-In general, the google-cloud-bigquery-data_transfer-v1 library uses
-[Service Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
-credentials to connect to Google Cloud services. When running within
-[Google Cloud Platform environments](#google-cloud-platform-environments) the
-credentials will be discovered automatically. When running on other
-environments, the Service Account credentials can be specified by providing the
-path to the
-[JSON keyfile](https://cloud.google.com/iam/docs/managing-service-account-keys)
-for the account (or the JSON itself) in
-[environment variables](#environment-variables). Additionally, Cloud SDK
-credentials can also be discovered automatically, but this is only recommended
-during development.
+The recommended way to authenticate to the google-cloud-bigquery-data_transfer-v1 library is to use
+[Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/application-default-credentials).
+To review all of your authentication options, see [Credentials lookup](#credential-lookup).
 
 ## Quickstart
 
-1. [Create a service account and credentials](#creating-a-service-account).
-2. Set the [environment variable](#environment-variables).
+The following example shows how to set up authentication for a local development
+environment with your user credentials.
 
-```sh
-export DATA_TRANSFER_CREDENTIALS=path/to/keyfile.json
-```
-
-3. Initialize the client.
+**NOTE:** This method is _not_ recommended for running in production. User credentials
+should be used only during development.
 
-```ruby
-require "google/cloud/bigquery/data_transfer/v1"
+1. [Download and install the Google Cloud CLI](https://cloud.google.com/sdk).
+2. Set up a local ADC file with your user credentials:
 
-client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
+```sh
+gcloud auth application-default login
 ```
 
-## Credential Lookup
-
-The google-cloud-bigquery-data_transfer-v1 library aims to make authentication
-as simple as possible, and provides several mechanisms to configure your system
-without requiring **Service Account Credentials** directly in code.
-
-**Credentials** are discovered in the following order:
-
-1. Specify credentials in method arguments
-2. Specify credentials in configuration
-3. Discover credentials path in environment variables
-4. Discover credentials JSON in environment variables
-5. Discover credentials file in the Cloud SDK's path
-6. Discover GCP credentials
-
-### Google Cloud Platform environments
+3. Write code as if already authenticated.
 
-When running on Google Cloud Platform (GCP), including Google Compute Engine
-(GCE), Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud
-Functions (GCF) and Cloud Run, **Credentials** are discovered automatically.
-Code should be written as if already authenticated.
+For more information about setting up authentication for a local development environment, see
+[Set up Application Default Credentials](https://cloud.google.com/docs/authentication/provide-credentials-adc#local-dev).
 
-### Environment Variables
+## Credential Lookup
 
-The **Credentials JSON** can be placed in environment variables instead of
-declaring them directly in code. Each service has its own environment variable,
-allowing for different service accounts to be used for different services. (See
-the READMEs for the individual service gems for details.) The path to the
-**Credentials JSON** file can be stored in the environment variable, or the
-**Credentials JSON** itself can be stored for environments such as Docker
-containers where writing files is difficult or not encouraged.
+The google-cloud-bigquery-data_transfer-v1 library provides several mechanisms to configure your system.
+Generally, using Application Default Credentials to facilitate automatic
+credentials discovery is the easiest method. But if you need to explicitly specify
+credentials, there are several methods available to you.
 
-The environment variables that google-cloud-bigquery-data_transfer-v1
-checks for credentials are configured on the service Credentials class (such as
-{::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Credentials}):
+Credentials are accepted in the following ways, in the following order of precedence:
 
-* `DATA_TRANSFER_CREDENTIALS` - Path to JSON file, or JSON contents
-* `DATA_TRANSFER_KEYFILE` - Path to JSON file, or JSON contents
-* `GOOGLE_CLOUD_CREDENTIALS` - Path to JSON file, or JSON contents
-* `GOOGLE_CLOUD_KEYFILE` - Path to JSON file, or JSON contents
-* `GOOGLE_APPLICATION_CREDENTIALS` - Path to JSON file
+1. Credentials specified in method arguments
+2. Credentials specified in configuration
+3. Credentials pointed to or included in environment variables
+4. Credentials found in local ADC file
+5. Credentials returned by the metadata server for the attached service account (GCP)
 
-```ruby
-require "google/cloud/bigquery/data_transfer/v1"
-
-ENV["DATA_TRANSFER_CREDENTIALS"] = "path/to/keyfile.json"
+### Configuration
 
-client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
-```
+You can configure a path to a JSON credentials file, either for an individual client object or
+globally, for all client objects. The JSON file can contain credentials created for
+[workload identity federation](https://cloud.google.com/iam/docs/workload-identity-federation),
+[workforce identity federation](https://cloud.google.com/iam/docs/workforce-identity-federation), or a
+[service account key](https://cloud.google.com/docs/authentication/provide-credentials-adc#local-key).
 
-### Configuration
+Note: Service account keys are a security risk if not managed correctly. You should
+[choose a more secure alternative to service account keys](https://cloud.google.com/docs/authentication#auth-decision-tree)
+whenever possible.
 
-The path to the **Credentials JSON** file can be configured instead of storing
-it in an environment variable. Either on an individual client initialization:
+To configure a credentials file for an individual client initialization:
 
 ```ruby
 require "google/cloud/bigquery/data_transfer/v1"
 
 client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new do |config|
-  config.credentials = "path/to/keyfile.json"
+  config.credentials = "path/to/credentialfile.json"
 end
 ```
 
-Or globally for all clients:
+To configure a credentials file globally for all clients:
 
 ```ruby
 require "google/cloud/bigquery/data_transfer/v1"
 
 ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.configure do |config|
-  config.credentials = "path/to/keyfile.json"
+  config.credentials = "path/to/credentialfile.json"
 end
 
 client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
 ```
 
-### Cloud SDK
+### Environment Variables
 
-This option allows for an easy way to authenticate during development. If
-credentials are not provided in code or in environment variables, then Cloud SDK
-credentials are discovered.
+You can also use an environment variable to provide a JSON credentials file.
+The environment variable can contain a path to the credentials file or, for
+environments such as Docker containers where writing files is not encouraged,
+you can include the credentials file itself.
 
-To configure your system for this, simply:
+The JSON file can contain credentials created for
+[workload identity federation](https://cloud.google.com/iam/docs/workload-identity-federation),
+[workforce identity federation](https://cloud.google.com/iam/docs/workforce-identity-federation), or a
+[service account key](https://cloud.google.com/docs/authentication/provide-credentials-adc#local-key).
 
-1. [Download and install the Cloud SDK](https://cloud.google.com/sdk)
-2. Authenticate using OAuth 2.0 `$ gcloud auth application-default login`
-3. Write code as if already authenticated.
+Note: Service account keys are a security risk if not managed correctly. You should
+[choose a more secure alternative to service account keys](https://cloud.google.com/docs/authentication#auth-decision-tree)
+whenever possible.
+
+The environment variables that google-cloud-bigquery-data_transfer-v1
+checks for credentials are:
 
-**NOTE:** This is _not_ recommended for running in production. The Cloud SDK
-*should* only be used during development.
+* `GOOGLE_CLOUD_CREDENTIALS` - Path to JSON file, or JSON contents
+* `GOOGLE_APPLICATION_CREDENTIALS` - Path to JSON file
 
-## Creating a Service Account
+```ruby
+require "google/cloud/bigquery/data_transfer/v1"
 
-Google Cloud requires **Service Account Credentials** to
-connect to the APIs. You will use the **JSON key file** to
-connect to most services with google-cloud-bigquery-data_transfer-v1.
+ENV["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/credentialfile.json"
 
-If you are not running this client within
-[Google Cloud Platform environments](#google-cloud-platform-environments), you
-need a Google Developers service account.
+client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
+```
 
-1. Visit the [Google Cloud Console](https://console.cloud.google.com/project).
-2. Create a new project or click on an existing project.
-3. Activate the menu in the upper left and select **APIs & Services**. From
-   here, you will enable the APIs that your application requires.
+### Local ADC file
 
-*Note: You may need to enable billing in order to use these services.*
+You can set up a local ADC file with your user credentials for authentication during
+development. If credentials are not provided in code or in environment variables,
+then the local ADC credentials are discovered.
 
-4. Select **Credentials** from the side navigation.
+Follow the steps in [Quickstart](#quickstart) to set up a local ADC file.
 
-   Find the "Create credentials" drop down near the top of the page, and select
-   "Service account" to be guided through downloading a new JSON key file.
+### Google Cloud Platform environments
 
-   If you want to re-use an existing service account, you can easily generate a
-   new key file. Just select the account you wish to re-use, click the pencil
-   tool on the right side to edit the service account, select the **Keys** tab,
-   and then select **Add Key**.
+When running on Google Cloud Platform (GCP), including Google Compute Engine
+(GCE), Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud
+Functions (GCF) and Cloud Run, credentials are retrieved from the attached
+service account automatically. Code should be written as if already authenticated.
 
-The key file you download will be used by this library to authenticate API
-requests and should be stored in a secure location.
+For more information, see
+[Set up ADC for Google Cloud services](https://cloud.google.com/docs/authentication/provide-credentials-adc#attached-sa).
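The credential lookup order described in the rewritten AUTHENTICATION.md can be sketched as plain Ruby. This is only an illustration of the documented precedence; `resolve_credentials` and its parameters are hypothetical and are not part of the gem's API.

```ruby
# Illustrative sketch of the documented precedence; not the gem's actual API.
# Each argument stands in for one credential source, highest precedence first.
def resolve_credentials(method_arg: nil, config: nil, env: {}, adc_file: nil, attached_sa: nil)
  method_arg ||                              # 1. method arguments
    config ||                                # 2. client configuration
    env["GOOGLE_CLOUD_CREDENTIALS"] ||       # 3. environment variables
    env["GOOGLE_APPLICATION_CREDENTIALS"] ||
    adc_file ||                              # 4. local ADC file
    attached_sa                              # 5. metadata server (GCP)
end
```

For example, a credentials path set in configuration wins over one set in an environment variable, matching items 2 and 3 of the list above.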
@@ -32,6 +32,9 @@ module Google
   # This API allows users to manage their data transfers into BigQuery.
   #
   class Client
+    # @private
+    DEFAULT_ENDPOINT_TEMPLATE = "bigquerydatatransfer.$UNIVERSE_DOMAIN$"
+
     include Paths
 
     # @private
@@ -148,6 +151,15 @@ module Google
     @config
   end
 
+  ##
+  # The effective universe domain
+  #
+  # @return [String]
+  #
+  def universe_domain
+    @data_transfer_service_stub.universe_domain
+  end
+
   ##
   # Create a new DataTransferService client object.
   #
@@ -181,8 +193,9 @@ module Google
   credentials = @config.credentials
   # Use self-signed JWT if the endpoint is unchanged from default,
   # but only if the default endpoint does not have a region prefix.
-  enable_self_signed_jwt = @config.endpoint == Configuration::DEFAULT_ENDPOINT &&
-                           !@config.endpoint.split(".").first.include?("-")
+  enable_self_signed_jwt = @config.endpoint.nil? ||
+                           (@config.endpoint == Configuration::DEFAULT_ENDPOINT &&
+                            !@config.endpoint.split(".").first.include?("-"))
   credentials ||= Credentials.default scope: @config.scope,
                                       enable_self_signed_jwt: enable_self_signed_jwt
   if credentials.is_a?(::String) || credentials.is_a?(::Hash)
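The change above extends the self-signed-JWT condition: it is now also enabled when no endpoint override is configured (`nil`), since a nil endpoint means the default endpoint of the current universe is in use. A standalone sketch of the predicate, with the default endpoint passed in explicitly for testing:

```ruby
# Sketch of the updated condition: self-signed JWT is enabled when the endpoint
# is nil (no override), or when it equals the default endpoint and the default
# endpoint has no region prefix (no "-" in its first DNS label).
DEFAULT = "bigquerydatatransfer.googleapis.com"

def enable_self_signed_jwt?(endpoint, default_endpoint = DEFAULT)
  endpoint.nil? ||
    (endpoint == default_endpoint &&
     !endpoint.split(".").first.include?("-"))
end
```

Previously, a nil endpoint would have failed the equality check and disabled the optimization even though the default endpoint was effectively in use.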
@@ -195,12 +208,15 @@ module Google
     config.credentials = credentials
     config.quota_project = @quota_project_id
     config.endpoint = @config.endpoint
+    config.universe_domain = @config.universe_domain
   end
 
   @data_transfer_service_stub = ::Gapic::ServiceStub.new(
     ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Stub,
-    credentials: credentials,
-    endpoint: @config.endpoint,
+    credentials: credentials,
+    endpoint: @config.endpoint,
+    endpoint_template: DEFAULT_ENDPOINT_TEMPLATE,
+    universe_domain: @config.universe_domain,
     channel_args: @config.channel_args,
     interceptors: @config.interceptors,
     channel_pool_config: @config.channel_pool
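The new `endpoint_template` and `universe_domain` arguments work together: when no explicit endpoint is configured, the effective endpoint is formed by substituting the universe domain into the `$UNIVERSE_DOMAIN$` placeholder. The substitution happens inside gapic-common; the helper below is a hypothetical sketch of that behavior, not the library's code.

```ruby
# Sketch (assumption): how the endpoint template expands. An explicit endpoint
# override wins; otherwise the universe domain replaces the placeholder.
DEFAULT_ENDPOINT_TEMPLATE = "bigquerydatatransfer.$UNIVERSE_DOMAIN$"

def effective_endpoint(template, universe_domain: "googleapis.com", override: nil)
  override || template.sub("$UNIVERSE_DOMAIN$", universe_domain)
end
```

With the default universe domain this yields the familiar `bigquerydatatransfer.googleapis.com`, while a different universe domain produces the corresponding host in that universe.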
@@ -1736,9 +1752,9 @@ module Google
   #   end
   #
   # @!attribute [rw] endpoint
-  #   The hostname or hostname:port of the service endpoint.
-  #   Defaults to `"bigquerydatatransfer.googleapis.com"`.
-  #   @return [::String]
+  #   A custom service endpoint, as a hostname or hostname:port. The default is
+  #   nil, indicating to use the default endpoint in the current universe domain.
+  #   @return [::String,nil]
   # @!attribute [rw] credentials
   #   Credentials to send with calls. You may provide any of the following types:
   #   *  (`String`) The path to a service account key file in JSON format
@@ -1784,13 +1800,20 @@ module Google
   # @!attribute [rw] quota_project
   #   A separate project against which to charge quota.
   #   @return [::String]
+  # @!attribute [rw] universe_domain
+  #   The universe domain within which to make requests. This determines the
+  #   default endpoint URL. The default value of nil uses the environment
+  #   universe (usually the default "googleapis.com" universe).
+  #   @return [::String,nil]
   #
   class Configuration
     extend ::Gapic::Config
 
+    # @private
+    # The endpoint specific to the default "googleapis.com" universe. Deprecated.
     DEFAULT_ENDPOINT = "bigquerydatatransfer.googleapis.com"
 
-    config_attr :endpoint, DEFAULT_ENDPOINT, ::String
+    config_attr :endpoint, nil, ::String, nil
     config_attr :credentials, nil do |value|
       allowed = [::String, ::Hash, ::Proc, ::Symbol, ::Google::Auth::Credentials, ::Signet::OAuth2::Client, nil]
       allowed += [::GRPC::Core::Channel, ::GRPC::Core::ChannelCredentials] if defined? ::GRPC
@@ -1805,6 +1828,7 @@ module Google
     config_attr :metadata, nil, ::Hash, nil
     config_attr :retry_policy, nil, ::Hash, ::Proc, nil
     config_attr :quota_project, nil, ::String, nil
+    config_attr :universe_domain, nil, ::String, nil
 
     # @private
     def initialize parent_config = nil
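The declaration `config_attr :universe_domain, nil, ::String, nil` means the attribute defaults to nil and accepts either a String or nil. Gapic's actual `config_attr` is metaprogrammed; the toy validator below only sketches its allowed-types check (`klass === value`), with nil in the allowed list expressed as `NilClass`.

```ruby
# Toy re-implementation (assumption) of config_attr's type check: a candidate
# value is valid if it matches any allowed type via the case-equality operator.
def valid_config_value?(value, *allowed)
  allowed.any? { |klass| klass === value }
end
```

So `universe_domain` may be set to a domain string such as `"googleapis.com"` or reset to nil, but not to, say, an Integer.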
@@ -34,6 +34,9 @@ module Google
   # This API allows users to manage their data transfers into BigQuery.
   #
   class Client
+    # @private
+    DEFAULT_ENDPOINT_TEMPLATE = "bigquerydatatransfer.$UNIVERSE_DOMAIN$"
+
     include Paths
 
     # @private
@@ -150,6 +153,15 @@ module Google
     @config
   end
 
+  ##
+  # The effective universe domain
+  #
+  # @return [String]
+  #
+  def universe_domain
+    @data_transfer_service_stub.universe_domain
+  end
+
   ##
   # Create a new DataTransferService REST client object.
   #
@@ -177,8 +189,9 @@ module Google
   credentials = @config.credentials
   # Use self-signed JWT if the endpoint is unchanged from default,
   # but only if the default endpoint does not have a region prefix.
-  enable_self_signed_jwt = @config.endpoint == Configuration::DEFAULT_ENDPOINT &&
-                           !@config.endpoint.split(".").first.include?("-")
+  enable_self_signed_jwt = @config.endpoint.nil? ||
+                           (@config.endpoint == Configuration::DEFAULT_ENDPOINT &&
+                            !@config.endpoint.split(".").first.include?("-"))
   credentials ||= Credentials.default scope: @config.scope,
                                       enable_self_signed_jwt: enable_self_signed_jwt
   if credentials.is_a?(::String) || credentials.is_a?(::Hash)
@@ -192,10 +205,16 @@ module Google
     config.credentials = credentials
     config.quota_project = @quota_project_id
     config.endpoint = @config.endpoint
+    config.universe_domain = @config.universe_domain
     config.bindings_override = @config.bindings_override
   end
 
-  @data_transfer_service_stub = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::ServiceStub.new endpoint: @config.endpoint, credentials: credentials
+  @data_transfer_service_stub = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Rest::ServiceStub.new(
+    endpoint: @config.endpoint,
+    endpoint_template: DEFAULT_ENDPOINT_TEMPLATE,
+    universe_domain: @config.universe_domain,
+    credentials: credentials
+  )
 end
@@ -1620,9 +1639,9 @@ module Google
   #   end
   #
   # @!attribute [rw] endpoint
-  #   The hostname or hostname:port of the service endpoint.
-  #   Defaults to `"bigquerydatatransfer.googleapis.com"`.
-  #   @return [::String]
+  #   A custom service endpoint, as a hostname or hostname:port. The default is
+  #   nil, indicating to use the default endpoint in the current universe domain.
+  #   @return [::String,nil]
   # @!attribute [rw] credentials
   #   Credentials to send with calls. You may provide any of the following types:
   #   *  (`String`) The path to a service account key file in JSON format
@@ -1659,13 +1678,20 @@ module Google
   # @!attribute [rw] quota_project
   #   A separate project against which to charge quota.
   #   @return [::String]
+  # @!attribute [rw] universe_domain
+  #   The universe domain within which to make requests. This determines the
+  #   default endpoint URL. The default value of nil uses the environment
+  #   universe (usually the default "googleapis.com" universe).
+  #   @return [::String,nil]
   #
   class Configuration
     extend ::Gapic::Config
 
+    # @private
+    # The endpoint specific to the default "googleapis.com" universe. Deprecated.
     DEFAULT_ENDPOINT = "bigquerydatatransfer.googleapis.com"
 
-    config_attr :endpoint, DEFAULT_ENDPOINT, ::String
+    config_attr :endpoint, nil, ::String, nil
     config_attr :credentials, nil do |value|
       allowed = [::String, ::Hash, ::Proc, ::Symbol, ::Google::Auth::Credentials, ::Signet::OAuth2::Client, nil]
       allowed.any? { |klass| klass === value }
@@ -1677,6 +1703,7 @@ module Google
     config_attr :metadata, nil, ::Hash, nil
     config_attr :retry_policy, nil, ::Hash, ::Proc, nil
     config_attr :quota_project, nil, ::String, nil
+    config_attr :universe_domain, nil, ::String, nil
 
     # @private
     # Overrides for http bindings for the RPCs of this service
@@ -31,16 +31,28 @@ module Google
   # including transcoding, making the REST call, and deserialing the response.
   #
   class ServiceStub
-    def initialize endpoint:, credentials:
+    def initialize endpoint:, endpoint_template:, universe_domain:, credentials:
       # These require statements are intentionally placed here to initialize
       # the REST modules only when it's required.
       require "gapic/rest"
 
-      @client_stub = ::Gapic::Rest::ClientStub.new endpoint: endpoint, credentials: credentials,
+      @client_stub = ::Gapic::Rest::ClientStub.new endpoint: endpoint,
+                                                   endpoint_template: endpoint_template,
+                                                   universe_domain: universe_domain,
+                                                   credentials: credentials,
                                                    numeric_enums: true,
                                                    raise_faraday_errors: false
     end
 
+    ##
+    # The effective universe domain
+    #
+    # @return [String]
+    #
+    def universe_domain
+      @client_stub.universe_domain
+    end
+
     ##
     # Baseline implementation for the get_data_source REST call
     #
@@ -22,7 +22,7 @@ module Google
   module Bigquery
     module DataTransfer
       module V1
-        VERSION = "0.11.0"
+        VERSION = "0.12.0"
       end
     end
   end
@@ -21,6 +21,7 @@ module Google
   module Api
     # Required information for every language.
     # @!attribute [rw] reference_docs_uri
+    #   @deprecated This field is deprecated and may be removed in the next major version update.
     #   @return [::String]
     #     Link to automatically generated reference documentation. Example:
     #     https://cloud.google.com/nodejs/docs/reference/asset/latest
@@ -304,6 +305,19 @@ module Google
   #           seconds: 360 # 6 minutes
   #         total_poll_timeout:
   #           seconds: 54000 # 90 minutes
+  # @!attribute [rw] auto_populated_fields
+  #   @return [::Array<::String>]
+  #     List of top-level fields of the request message, that should be
+  #     automatically populated by the client libraries based on their
+  #     (google.api.field_info).format. Currently supported format: UUID4.
+  #
+  #     Example of a YAML configuration:
+  #
+  #         publishing:
+  #           method_settings:
+  #             - selector: google.example.v1.ExampleService.CreateExample
+  #               auto_populated_fields:
+  #               - request_id
   class MethodSettings
     include ::Google::Protobuf::MessageExts
     extend ::Google::Protobuf::MessageExts::ClassMethods
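The `auto_populated_fields` setting tells generated clients to fill listed request fields (for example `request_id`) with a UUID4 when the caller leaves them unset. A sketch of that behavior, assuming a hash-like request for illustration (the helper and request shape are hypothetical, not the generated client's internals):

```ruby
require "securerandom"

# Populate the listed request fields with a UUID4 only when they are unset,
# mirroring the auto_populated_fields behavior described above.
def auto_populate(request, fields)
  fields.each do |field|
    request[field] ||= SecureRandom.uuid
  end
  request
end
```

A caller-supplied value is never overwritten; only missing fields are filled, which keeps request IDs stable across client-side retries of the same logical request.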
@@ -128,9 +128,11 @@ module Google
   #     scopes needed by a data source to prepare data and ingest them into
   #     BigQuery, e.g., https://www.googleapis.com/auth/bigquery
   # @!attribute [rw] transfer_type
+  #   @deprecated This field is deprecated and may be removed in the next major version update.
   #   @return [::Google::Cloud::Bigquery::DataTransfer::V1::TransferType]
   #     Deprecated. This field has no effect.
   # @!attribute [rw] supports_multiple_transfers
+  #   @deprecated This field is deprecated and may be removed in the next major version update.
   #   @return [::Boolean]
   #     Deprecated. This field has no effect.
   # @!attribute [rw] update_deadline_seconds
@@ -283,6 +283,7 @@ module Google
   end
 
   # DEPRECATED. Represents data transfer type.
+  # @deprecated This enum is deprecated and may be removed in the next major version update.
   module TransferType
     # Invalid or Unknown transfer type placeholder.
     TRANSFER_TYPE_UNSPECIFIED = 0
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: google-cloud-bigquery-data_transfer-v1
 version: !ruby/object:Gem::Version
-  version: 0.11.0
+  version: 0.12.0
 platform: ruby
 authors:
 - Google LLC
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2023-09-12 00:00:00.000000000 Z
+date: 2024-01-11 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: gapic-common
@@ -16,7 +16,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: 0.20.0
+      version: 0.21.1
   - - "<"
     - !ruby/object:Gem::Version
       version: 2.a
@@ -26,7 +26,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: 0.20.0
+      version: 0.21.1
  - - "<"
     - !ruby/object:Gem::Version
       version: 2.a
@@ -50,7 +50,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: '0.4'
+      version: '0.7'
   - - "<"
     - !ruby/object:Gem::Version
       version: 2.a
@@ -60,7 +60,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: '0.4'
+      version: '0.7'
   - - "<"
     - !ruby/object:Gem::Version
       version: 2.a
@@ -239,7 +239,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
   - !ruby/object:Gem::Version
     version: '0'
 requirements: []
-rubygems_version: 3.4.19
+rubygems_version: 3.5.3
 signing_key:
 specification_version: 4
 summary: Schedule queries or transfer external data from SaaS applications to Google