google-cloud-bigquery-data_transfer-v1 0.4.5 → 0.5.1

This diff shows the changes between two publicly released versions of the package, as they appear in their public registry. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 858ac9046282b3f9f1ad16896571a7ec0ff7167e6a41f4a677e3623b4e08eaaa
- data.tar.gz: caefa5a2b16a67126c8915ddcb5bc9d85cf62a06bce500d35fb6c7c5630b2b75
+ metadata.gz: 06a68fb75520b8a8d83f87ff124f7060d3a0f25d674f772ea2811eda730a646e
+ data.tar.gz: 19ba48eb638f9ad858ef5732dc310fb8bf88b606f2610754f981817853cbf787
  SHA512:
- metadata.gz: cbe0687f2bb91583fec9192380f098421d1bec4f6726aa4155cba8ba855e21dc551d063eec692974aa3955003bb8fb8bd91e3605bcb9e0f9110b97a04544e361
- data.tar.gz: 1587893e7feeb969dcc82f8973facf0e835c19cfcbb46b80c538c4ef91d7edf091c26b7ab37d4eeabb769c89b160bcdb3d4a1b370809626a3fbadd19ca74872e
+ metadata.gz: 3c2bcc7b82cfe3c6d70600f018cf4f55a3aa26bb35b8bcf6b1fad6b47109eeb3a6aed1e6d7f28fa7745b62a15d51898108fae997dc6dc67897f75b6003773b80
+ data.tar.gz: 29ea71fdfa058051760d6d5769b0572f3bf0fd3785b2fba98be8162a278ac6b5d737253f8562d774af982dfa8da2d525ea073fa62d4539a80f80d2632c061608
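After downloading the released artifacts, the checksum entries above can be verified locally with Ruby's standard library. A minimal sketch; the demo writes a throwaway temp file rather than assuming the `.gem` artifacts are present on disk:

```ruby
require "digest"
require "tempfile"

# Compute the hex SHA256 of a file, for comparison against the
# metadata.gz / data.tar.gz entries listed in checksums.yaml.
def sha256_hex(path)
  Digest::SHA256.file(path).hexdigest
end

# Demonstration on a temporary file; point sha256_hex at the real
# downloaded artifacts to verify the values above.
Tempfile.create("demo") do |f|
  f.write("hello")
  f.flush
  puts sha256_hex(f.path)
end
```

The same approach works for the SHA512 entries by substituting `Digest::SHA512`.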
data/.yardopts CHANGED
@@ -1,5 +1,5 @@
  --no-private
- --title=BigQuery Data Transfer Service V1 API
+ --title="BigQuery Data Transfer Service V1 API"
  --exclude _pb\.rb$
  --markup markdown
  --markup-provider redcarpet
data/AUTHENTICATION.md CHANGED
@@ -120,15 +120,6 @@ To configure your system for this, simply:
  **NOTE:** This is _not_ recommended for running in production. The Cloud SDK
  *should* only be used during development.
 
- [gce-how-to]: https://cloud.google.com/compute/docs/authentication#using
- [dev-console]: https://console.cloud.google.com/project
-
- [enable-apis]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/enable-apis.png
-
- [create-new-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account.png
- [create-new-service-account-existing-keys]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account-existing-keys.png
- [reuse-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/reuse-service-account.png
-
  ## Creating a Service Account
 
  Google Cloud requires **Service Account Credentials** to
@@ -139,31 +130,22 @@ If you are not running this client within
  [Google Cloud Platform environments](#google-cloud-platform-environments), you
  need a Google Developers service account.
 
- 1. Visit the [Google Developers Console][dev-console].
+ 1. Visit the [Google Cloud Console](https://console.cloud.google.com/project).
  2. Create a new project or click on an existing project.
- 3. Activate the slide-out navigation tray and select **API Manager**. From
+ 3. Activate the menu in the upper left and select **APIs & Services**. From
  here, you will enable the APIs that your application requires.
 
- ![Enable the APIs that your application requires][enable-apis]
-
  *Note: You may need to enable billing in order to use these services.*
 
  4. Select **Credentials** from the side navigation.
 
- You should see a screen like one of the following.
-
- ![Create a new service account][create-new-service-account]
-
- ![Create a new service account With Existing Keys][create-new-service-account-existing-keys]
-
- Find the "Add credentials" drop down and select "Service account" to be
- guided through downloading a new JSON key file.
+ Find the "Create credentials" drop down near the top of the page, and select
+ "Service account" to be guided through downloading a new JSON key file.
 
  If you want to re-use an existing service account, you can easily generate a
- new key file. Just select the account you wish to re-use, and click "Generate
- new JSON key":
-
- ![Re-use an existing service account][reuse-service-account]
+ new key file. Just select the account you wish to re-use, click the pencil
+ tool on the right side to edit the service account, select the **Keys** tab,
+ and then select **Add Key**.
 
  The key file you download will be used by this library to authenticate API
  requests and should be stored in a secure location.
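A common way to supply the downloaded key file to this library is the standard `GOOGLE_APPLICATION_CREDENTIALS` environment variable used by Application Default Credentials. A sketch; the path is a placeholder:

```ruby
# Point Application Default Credentials at the downloaded JSON key file.
# The path below is a placeholder; substitute the secure location where
# you stored the key.
ENV["GOOGLE_APPLICATION_CREDENTIALS"] = "/secure/path/keyfile.json"

puts ENV["GOOGLE_APPLICATION_CREDENTIALS"]
```

Set this before the client is constructed so the library can discover the credentials.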
data/README.md CHANGED
@@ -37,7 +37,7 @@ request = ::Google::Cloud::Bigquery::DataTransfer::V1::GetDataSourceRequest.new
  response = client.get_data_source request
  ```
 
- View the [Client Library Documentation](https://googleapis.dev/ruby/google-cloud-bigquery-data_transfer-v1/latest)
+ View the [Client Library Documentation](https://cloud.google.com/ruby/docs/reference/google-cloud-bigquery-data_transfer-v1/latest)
  for class and method documentation.
 
  See also the [Product Documentation](https://cloud.google.com/bigquery/transfer)
@@ -28,10 +28,7 @@ module Google
  ##
  # Client for the DataTransferService service.
  #
- # The Google BigQuery Data Transfer Service API enables BigQuery users to
- # configure the transfer of their data from other Google Products into
- # BigQuery. This service contains methods that are end user exposed. It backs
- # up the frontend.
+ # This API allows users to manage their data transfers into BigQuery.
  #
  class Client
  include Paths
@@ -205,8 +202,7 @@ module Google
  # Service calls
 
  ##
- # Retrieves a supported data source and returns its settings,
- # which can be used for UI rendering.
+ # Retrieves a supported data source and returns its settings.
  #
  # @overload get_data_source(request, options = nil)
  # Pass arguments to `get_data_source` via a request object, either of type
@@ -293,8 +289,7 @@ module Google
  end
 
  ##
- # Lists supported data sources and returns their settings,
- # which can be used for UI rendering.
+ # Lists supported data sources and returns their settings.
  #
  # @overload list_data_sources(request, options = nil)
  # Pass arguments to `list_data_sources` via a request object, either of type
@@ -633,8 +628,8 @@ module Google
  end
 
  ##
- # Deletes a data transfer configuration,
- # including any associated transfer runs and logs.
+ # Deletes a data transfer configuration, including any associated transfer
+ # runs and logs.
  #
  # @overload delete_transfer_config(request, options = nil)
  # Pass arguments to `delete_transfer_config` via a request object, either of type
@@ -1281,7 +1276,7 @@ module Google
  end
 
  ##
- # Returns information about running and completed jobs.
+ # Returns information about running and completed transfer runs.
  #
  # @overload list_transfer_runs(request, options = nil)
  # Pass arguments to `list_transfer_runs` via a request object, either of type
@@ -1388,7 +1383,7 @@ module Google
  end
 
  ##
- # Returns user facing log messages for the data transfer run.
+ # Returns log messages for the transfer run.
  #
  # @overload list_transfer_logs(request, options = nil)
  # Pass arguments to `list_transfer_logs` via a request object, either of type
@@ -1495,10 +1490,6 @@ module Google
  ##
  # Returns true if valid credentials exist for the given data source and
  # requesting user.
- # Some data sources doesn't support service account, so we need to talk to
- # them on behalf of the end user. This API just checks whether we have OAuth
- # token for the particular user, which is a pre-requisite before user can
- # create a transfer config.
  #
  # @overload check_valid_creds(request, options = nil)
  # Pass arguments to `check_valid_creds` via a request object, either of type
@@ -1584,6 +1575,100 @@ module Google
  raise ::Google::Cloud::Error.from_error(e)
  end
 
+ ##
+ # Enroll data sources in a user project. This allows users to create transfer
+ # configurations for these data sources. They will also appear in the
+ # ListDataSources RPC and as such, will appear in the BigQuery UI
+ # 'https://bigquery.cloud.google.com' (and the documents can be found at
+ # https://cloud.google.com/bigquery/bigquery-web-ui and
+ # https://cloud.google.com/bigquery/docs/working-with-transfers).
+ #
+ # @overload enroll_data_sources(request, options = nil)
+ # Pass arguments to `enroll_data_sources` via a request object, either of type
+ # {::Google::Cloud::Bigquery::DataTransfer::V1::EnrollDataSourcesRequest} or an equivalent Hash.
+ #
+ # @param request [::Google::Cloud::Bigquery::DataTransfer::V1::EnrollDataSourcesRequest, ::Hash]
+ # A request object representing the call parameters. Required. To specify no
+ # parameters, or to keep all the default parameter values, pass an empty Hash.
+ # @param options [::Gapic::CallOptions, ::Hash]
+ # Overrides the default settings for this call, e.g, timeout, retries, etc. Optional.
+ #
+ # @overload enroll_data_sources(name: nil, data_source_ids: nil)
+ # Pass arguments to `enroll_data_sources` via keyword arguments. Note that at
+ # least one keyword argument is required. To specify no parameters, or to keep all
+ # the default parameter values, pass an empty Hash as a request object (see above).
+ #
+ # @param name [::String]
+ # The name of the project resource in the form:
+ # `projects/{project_id}`
+ # @param data_source_ids [::Array<::String>]
+ # Data sources that are enrolled. It is required to provide at least one
+ # data source id.
+ #
+ # @yield [response, operation] Access the result along with the RPC operation
+ # @yieldparam response [::Google::Protobuf::Empty]
+ # @yieldparam operation [::GRPC::ActiveCall::Operation]
+ #
+ # @return [::Google::Protobuf::Empty]
+ #
+ # @raise [::Google::Cloud::Error] if the RPC is aborted.
+ #
+ # @example Basic example
+ # require "google/cloud/bigquery/data_transfer/v1"
+ #
+ # # Create a client object. The client can be reused for multiple calls.
+ # client = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
+ #
+ # # Create a request. To set request fields, pass in keyword arguments.
+ # request = Google::Cloud::Bigquery::DataTransfer::V1::EnrollDataSourcesRequest.new
+ #
+ # # Call the enroll_data_sources method.
+ # result = client.enroll_data_sources request
+ #
+ # # The returned object is of type Google::Protobuf::Empty.
+ # p result
+ #
+ def enroll_data_sources request, options = nil
+ raise ::ArgumentError, "request must be provided" if request.nil?
+
+ request = ::Gapic::Protobuf.coerce request, to: ::Google::Cloud::Bigquery::DataTransfer::V1::EnrollDataSourcesRequest
+
+ # Converts hash and nil to an options object
+ options = ::Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h
+
+ # Customize the options with defaults
+ metadata = @config.rpcs.enroll_data_sources.metadata.to_h
+
+ # Set x-goog-api-client and x-goog-user-project headers
+ metadata[:"x-goog-api-client"] ||= ::Gapic::Headers.x_goog_api_client \
+ lib_name: @config.lib_name, lib_version: @config.lib_version,
+ gapic_version: ::Google::Cloud::Bigquery::DataTransfer::V1::VERSION
+ metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
+
+ header_params = {}
+ if request.name
+ header_params["name"] = request.name
+ end
+
+ request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
+ metadata[:"x-goog-request-params"] ||= request_params_header
+
+ options.apply_defaults timeout: @config.rpcs.enroll_data_sources.timeout,
+ metadata: metadata,
+ retry_policy: @config.rpcs.enroll_data_sources.retry_policy
+
+ options.apply_defaults timeout: @config.timeout,
+ metadata: @config.metadata,
+ retry_policy: @config.retry_policy
+
+ @data_transfer_service_stub.call_rpc :enroll_data_sources, request, options: options do |response, operation|
+ yield response, operation if block_given?
+ return response
+ end
+ rescue ::GRPC::BadStatus => e
+ raise ::Google::Cloud::Error.from_error(e)
+ end
+
  ##
  # Configuration class for the DataTransferService API.
  #
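The `x-goog-request-params` routing header built inside the new `enroll_data_sources` method above can be reduced to a few lines of plain Ruby. A sketch mirroring the generated code; the project name is a placeholder:

```ruby
# Mirrors the generated client's routing-header construction: only fields
# that are actually set become key=value pairs, joined with "&".
def request_params_header(name: nil)
  header_params = {}
  header_params["name"] = name if name
  header_params.map { |k, v| "#{k}=#{v}" }.join("&")
end

puts request_params_header(name: "projects/my-project")
# => name=projects/my-project
```

When no routed field is set, the header is simply the empty string, which the client still assigns with `||=` so an explicit caller-supplied value wins.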
@@ -1789,6 +1874,11 @@ module Google
  # @return [::Gapic::Config::Method]
  #
  attr_reader :check_valid_creds
+ ##
+ # RPC-specific configuration for `enroll_data_sources`
+ # @return [::Gapic::Config::Method]
+ #
+ attr_reader :enroll_data_sources
 
  # @private
  def initialize parent_rpcs = nil
@@ -1820,6 +1910,8 @@ module Google
  @list_transfer_logs = ::Gapic::Config::Method.new list_transfer_logs_config
  check_valid_creds_config = parent_rpcs.check_valid_creds if parent_rpcs.respond_to? :check_valid_creds
  @check_valid_creds = ::Gapic::Config::Method.new check_valid_creds_config
+ enroll_data_sources_config = parent_rpcs.enroll_data_sources if parent_rpcs.respond_to? :enroll_data_sources
+ @enroll_data_sources = ::Gapic::Config::Method.new enroll_data_sources_config
 
  yield self if block_given?
  end
@@ -32,10 +32,7 @@ module Google
  module DataTransfer
  module V1
  ##
- # The Google BigQuery Data Transfer Service API enables BigQuery users to
- # configure the transfer of their data from other Google Products into
- # BigQuery. This service contains methods that are end user exposed. It backs
- # up the frontend.
+ # This API allows users to manage their data transfers into BigQuery.
  #
  # To load this service and instantiate a client:
  #
@@ -22,7 +22,7 @@ module Google
  module Bigquery
  module DataTransfer
  module V1
- VERSION = "0.4.5"
+ VERSION = "0.5.1"
  end
  end
  end
@@ -26,6 +26,8 @@ module Google
  ##
  # To load this package, including all its services, and instantiate a client:
  #
+ # @example
+ #
  # require "google/cloud/bigquery/data_transfer/v1"
  # client = ::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.new
  #
@@ -174,6 +174,10 @@ Google::Protobuf::DescriptorPool.generated_pool.build do
  add_message "google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsResponse" do
  repeated :runs, :message, 1, "google.cloud.bigquery.datatransfer.v1.TransferRun"
  end
+ add_message "google.cloud.bigquery.datatransfer.v1.EnrollDataSourcesRequest" do
+ optional :name, :string, 1
+ repeated :data_source_ids, :string, 2
+ end
  end
  end
 
@@ -210,6 +214,7 @@ module Google
  StartManualTransferRunsRequest = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsRequest").msgclass
  StartManualTransferRunsRequest::TimeRange = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsRequest.TimeRange").msgclass
  StartManualTransferRunsResponse = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.StartManualTransferRunsResponse").msgclass
+ EnrollDataSourcesRequest = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.EnrollDataSourcesRequest").msgclass
  end
  end
  end
@@ -25,10 +25,7 @@ module Google
  module DataTransfer
  module V1
  module DataTransferService
- # The Google BigQuery Data Transfer Service API enables BigQuery users to
- # configure the transfer of their data from other Google Products into
- # BigQuery. This service contains methods that are end user exposed. It backs
- # up the frontend.
+ # This API allows users to manage their data transfers into BigQuery.
  class Service
 
  include ::GRPC::GenericService
@@ -37,19 +34,17 @@ module Google
  self.unmarshal_class_method = :decode
  self.service_name = 'google.cloud.bigquery.datatransfer.v1.DataTransferService'
 
- # Retrieves a supported data source and returns its settings,
- # which can be used for UI rendering.
+ # Retrieves a supported data source and returns its settings.
  rpc :GetDataSource, ::Google::Cloud::Bigquery::DataTransfer::V1::GetDataSourceRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::DataSource
- # Lists supported data sources and returns their settings,
- # which can be used for UI rendering.
+ # Lists supported data sources and returns their settings.
  rpc :ListDataSources, ::Google::Cloud::Bigquery::DataTransfer::V1::ListDataSourcesRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::ListDataSourcesResponse
  # Creates a new data transfer configuration.
  rpc :CreateTransferConfig, ::Google::Cloud::Bigquery::DataTransfer::V1::CreateTransferConfigRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig
  # Updates a data transfer configuration.
  # All fields must be set, even if they are not updated.
  rpc :UpdateTransferConfig, ::Google::Cloud::Bigquery::DataTransfer::V1::UpdateTransferConfigRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig
- # Deletes a data transfer configuration,
- # including any associated transfer runs and logs.
+ # Deletes a data transfer configuration, including any associated transfer
+ # runs and logs.
  rpc :DeleteTransferConfig, ::Google::Cloud::Bigquery::DataTransfer::V1::DeleteTransferConfigRequest, ::Google::Protobuf::Empty
  # Returns information about a data transfer config.
  rpc :GetTransferConfig, ::Google::Cloud::Bigquery::DataTransfer::V1::GetTransferConfigRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig
@@ -71,17 +66,20 @@ module Google
  rpc :GetTransferRun, ::Google::Cloud::Bigquery::DataTransfer::V1::GetTransferRunRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::TransferRun
  # Deletes the specified transfer run.
  rpc :DeleteTransferRun, ::Google::Cloud::Bigquery::DataTransfer::V1::DeleteTransferRunRequest, ::Google::Protobuf::Empty
- # Returns information about running and completed jobs.
+ # Returns information about running and completed transfer runs.
  rpc :ListTransferRuns, ::Google::Cloud::Bigquery::DataTransfer::V1::ListTransferRunsRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::ListTransferRunsResponse
- # Returns user facing log messages for the data transfer run.
+ # Returns log messages for the transfer run.
  rpc :ListTransferLogs, ::Google::Cloud::Bigquery::DataTransfer::V1::ListTransferLogsRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::ListTransferLogsResponse
  # Returns true if valid credentials exist for the given data source and
  # requesting user.
- # Some data sources doesn't support service account, so we need to talk to
- # them on behalf of the end user. This API just checks whether we have OAuth
- # token for the particular user, which is a pre-requisite before user can
- # create a transfer config.
  rpc :CheckValidCreds, ::Google::Cloud::Bigquery::DataTransfer::V1::CheckValidCredsRequest, ::Google::Cloud::Bigquery::DataTransfer::V1::CheckValidCredsResponse
+ # Enroll data sources in a user project. This allows users to create transfer
+ # configurations for these data sources. They will also appear in the
+ # ListDataSources RPC and as such, will appear in the BigQuery UI
+ # 'https://bigquery.cloud.google.com' (and the documents can be found at
+ # https://cloud.google.com/bigquery/bigquery-web-ui and
+ # https://cloud.google.com/bigquery/docs/working-with-transfers).
+ rpc :EnrollDataSources, ::Google::Cloud::Bigquery::DataTransfer::V1::EnrollDataSourcesRequest, ::Google::Protobuf::Empty
  end
 
  Stub = Service.rpc_stub_class
@@ -3,7 +3,6 @@
 
  require 'google/api/field_behavior_pb'
  require 'google/api/resource_pb'
- require 'google/protobuf/duration_pb'
  require 'google/protobuf/struct_pb'
  require 'google/protobuf/timestamp_pb'
  require 'google/rpc/status_pb'
@@ -19,6 +18,9 @@ Google::Protobuf::DescriptorPool.generated_pool.build do
  optional :start_time, :message, 1, "google.protobuf.Timestamp"
  optional :end_time, :message, 2, "google.protobuf.Timestamp"
  end
+ add_message "google.cloud.bigquery.datatransfer.v1.UserInfo" do
+ proto3_optional :email, :string, 1
+ end
  add_message "google.cloud.bigquery.datatransfer.v1.TransferConfig" do
  optional :name, :string, 1
  optional :display_name, :string, 3
@@ -35,6 +37,7 @@ Google::Protobuf::DescriptorPool.generated_pool.build do
  optional :dataset_region, :string, 14
  optional :notification_pubsub_topic, :string, 15
  optional :email_preferences, :message, 18, "google.cloud.bigquery.datatransfer.v1.EmailPreferences"
+ proto3_optional :owner_info, :message, 27, "google.cloud.bigquery.datatransfer.v1.UserInfo"
  oneof :destination do
  optional :destination_dataset_id, :string, 2
  end
@@ -92,6 +95,7 @@ module Google
  module V1
  EmailPreferences = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.EmailPreferences").msgclass
  ScheduleOptions = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.ScheduleOptions").msgclass
+ UserInfo = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.UserInfo").msgclass
  TransferConfig = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.TransferConfig").msgclass
  TransferRun = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.TransferRun").msgclass
  TransferMessage = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.datatransfer.v1.TransferMessage").msgclass
@@ -33,11 +33,7 @@ module Google
  # // For Kubernetes resources, the format is {api group}/{kind}.
  # option (google.api.resource) = {
  # type: "pubsub.googleapis.com/Topic"
- # name_descriptor: {
- # pattern: "projects/{project}/topics/{topic}"
- # parent_type: "cloudresourcemanager.googleapis.com/Project"
- # parent_name_extractor: "projects/{project}"
- # }
+ # pattern: "projects/{project}/topics/{topic}"
  # };
  # }
  #
@@ -45,10 +41,7 @@ module Google
  #
  # resources:
  # - type: "pubsub.googleapis.com/Topic"
- # name_descriptor:
- # - pattern: "projects/{project}/topics/{topic}"
- # parent_type: "cloudresourcemanager.googleapis.com/Project"
- # parent_name_extractor: "projects/{project}"
+ # pattern: "projects/{project}/topics/{topic}"
  #
  # Sometimes, resources have multiple patterns, typically because they can
  # live under multiple parents.
@@ -58,26 +51,10 @@ module Google
  # message LogEntry {
  # option (google.api.resource) = {
  # type: "logging.googleapis.com/LogEntry"
- # name_descriptor: {
- # pattern: "projects/{project}/logs/{log}"
- # parent_type: "cloudresourcemanager.googleapis.com/Project"
- # parent_name_extractor: "projects/{project}"
- # }
- # name_descriptor: {
- # pattern: "folders/{folder}/logs/{log}"
- # parent_type: "cloudresourcemanager.googleapis.com/Folder"
- # parent_name_extractor: "folders/{folder}"
- # }
- # name_descriptor: {
- # pattern: "organizations/{organization}/logs/{log}"
- # parent_type: "cloudresourcemanager.googleapis.com/Organization"
- # parent_name_extractor: "organizations/{organization}"
- # }
- # name_descriptor: {
- # pattern: "billingAccounts/{billing_account}/logs/{log}"
- # parent_type: "billing.googleapis.com/BillingAccount"
- # parent_name_extractor: "billingAccounts/{billing_account}"
- # }
+ # pattern: "projects/{project}/logs/{log}"
+ # pattern: "folders/{folder}/logs/{log}"
+ # pattern: "organizations/{organization}/logs/{log}"
+ # pattern: "billingAccounts/{billing_account}/logs/{log}"
  # };
  # }
  #
@@ -85,48 +62,10 @@ module Google
  #
  # resources:
  # - type: 'logging.googleapis.com/LogEntry'
- # name_descriptor:
- # - pattern: "projects/{project}/logs/{log}"
- # parent_type: "cloudresourcemanager.googleapis.com/Project"
- # parent_name_extractor: "projects/{project}"
- # - pattern: "folders/{folder}/logs/{log}"
- # parent_type: "cloudresourcemanager.googleapis.com/Folder"
- # parent_name_extractor: "folders/{folder}"
- # - pattern: "organizations/{organization}/logs/{log}"
- # parent_type: "cloudresourcemanager.googleapis.com/Organization"
- # parent_name_extractor: "organizations/{organization}"
- # - pattern: "billingAccounts/{billing_account}/logs/{log}"
- # parent_type: "billing.googleapis.com/BillingAccount"
- # parent_name_extractor: "billingAccounts/{billing_account}"
- #
- # For flexible resources, the resource name doesn't contain parent names, but
- # the resource itself has parents for policy evaluation.
- #
- # Example:
- #
- # message Shelf {
- # option (google.api.resource) = {
- # type: "library.googleapis.com/Shelf"
- # name_descriptor: {
- # pattern: "shelves/{shelf}"
- # parent_type: "cloudresourcemanager.googleapis.com/Project"
- # }
- # name_descriptor: {
- # pattern: "shelves/{shelf}"
- # parent_type: "cloudresourcemanager.googleapis.com/Folder"
- # }
- # };
- # }
- #
- # The ResourceDescriptor Yaml config will look like:
- #
- # resources:
- # - type: 'library.googleapis.com/Shelf'
- # name_descriptor:
- # - pattern: "shelves/{shelf}"
- # parent_type: "cloudresourcemanager.googleapis.com/Project"
- # - pattern: "shelves/{shelf}"
- # parent_type: "cloudresourcemanager.googleapis.com/Folder"
+ # pattern: "projects/{project}/logs/{log}"
+ # pattern: "folders/{folder}/logs/{log}"
+ # pattern: "organizations/{organization}/logs/{log}"
+ # pattern: "billingAccounts/{billing_account}/logs/{log}"
  # @!attribute [rw] type
  # @return [::String]
  # The resource type. It must be in the format of
@@ -22,12 +22,7 @@ module Google
  module Bigquery
  module DataTransfer
  module V1
- # Represents a data source parameter with validation rules, so that
- # parameters can be rendered in the UI. These parameters are given to us by
- # supported data sources, and include all needed information for rendering
- # and validation.
- # Thus, whoever uses this api can decide to generate either generic ui,
- # or custom data source specific forms.
+ # A parameter used to define custom fields in a data source definition.
  # @!attribute [rw] param_id
  # @return [::String]
  # Parameter identifier.
@@ -108,8 +103,7 @@ module Google
  end
  end
 
- # Represents data source metadata. Metadata is sufficient to
- # render UI and request proper OAuth tokens.
+ # Defines the properties and custom parameters for a data source.
  # @!attribute [r] name
  # @return [::String]
  # Output only. Data source resource name.
@@ -266,9 +260,9 @@ module Google
  # A request to create a data transfer configuration. If new credentials are
  # needed for this transfer configuration, an authorization code must be
  # provided. If an authorization code is provided, the transfer configuration
- # will be associated with the user id corresponding to the
- # authorization code. Otherwise, the transfer configuration will be associated
- # with the calling user.
+ # will be associated with the user id corresponding to the authorization code.
+ # Otherwise, the transfer configuration will be associated with the calling
+ # user.
  # @!attribute [rw] parent
  # @return [::String]
  # Required. The BigQuery project id where the transfer configuration should be created.
@@ -445,9 +439,7 @@ module Google
  extend ::Google::Protobuf::MessageExts::ClassMethods
  end
 
- # A request to list data transfer runs. UI can use this method to show/filter
- # specific data transfer runs. The data source can use this method to request
- # all scheduled transfer runs.
+ # A request to list data transfer runs.
  # @!attribute [rw] parent
  # @return [::String]
  # Required. Name of transfer configuration for which transfer runs should be retrieved.
@@ -637,6 +629,21 @@ module Google
  include ::Google::Protobuf::MessageExts
  extend ::Google::Protobuf::MessageExts::ClassMethods
  end
+
+ # A request to enroll a set of data sources so they are visible in the
+ # BigQuery UI's `Transfer` tab.
+ # @!attribute [rw] name
+ # @return [::String]
+ # The name of the project resource in the form:
+ # `projects/{project_id}`
+ # @!attribute [rw] data_source_ids
+ # @return [::Array<::String>]
+ # Data sources that are enrolled. It is required to provide at least one
+ # data source id.
+ class EnrollDataSourcesRequest
+ include ::Google::Protobuf::MessageExts
+ extend ::Google::Protobuf::MessageExts::ClassMethods
+ end
  end
  end
  end
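Per the client's `@overload` documentation, an `EnrollDataSourcesRequest` can also be supplied as an equivalent Hash, which the client coerces into the protobuf message. A sketch; the project id and data source id are placeholders:

```ruby
# Hash form of an EnrollDataSourcesRequest. "scheduled_query" stands in
# for whichever data source ids you want to enroll; at least one is
# required by the API.
request = {
  name: "projects/my-project",
  data_source_ids: ["scheduled_query"]
}

puts request[:name]
# => projects/my-project
```

This Hash would be passed directly to `client.enroll_data_sources`.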
@@ -57,6 +57,15 @@ module Google
  extend ::Google::Protobuf::MessageExts::ClassMethods
  end
 
+ # Information about a user.
+ # @!attribute [rw] email
+ # @return [::String]
+ # E-mail address of the user.
+ class UserInfo
+ include ::Google::Protobuf::MessageExts
+ extend ::Google::Protobuf::MessageExts::ClassMethods
+ end
+
  # Represents a data transfer configuration. A transfer configuration
  # contains all metadata needed to perform a data transfer. For example,
  # `destination_dataset_id` specifies where data should be stored.
@@ -99,7 +108,9 @@ module Google
  # `first sunday of quarter 00:00`.
  # See more explanation about the format here:
  # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
- # NOTE: the granularity should be at least 8 hours, or less frequent.
+ #
+ # NOTE: The minimum interval time between recurring transfers depends on the
+ # data source; refer to the documentation for your data source.
  # @!attribute [rw] schedule_options
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::ScheduleOptions]
  # Options customizing the data transfer schedule.
@@ -141,6 +152,11 @@ module Google
  # @return [::Google::Cloud::Bigquery::DataTransfer::V1::EmailPreferences]
  # Email notifications will be sent according to these preferences
  # to the email address of the user who owns this transfer config.
+ # @!attribute [r] owner_info
+ # @return [::Google::Cloud::Bigquery::DataTransfer::V1::UserInfo]
+ # Output only. Information about the user whose credentials are used to transfer data.
+ # Populated only for `transferConfigs.get` requests. In case the user
+ # information is not available, this field will not be populated.
  class TransferConfig
  include ::Google::Protobuf::MessageExts
  extend ::Google::Protobuf::MessageExts::ClassMethods
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: google-cloud-bigquery-data_transfer-v1
  version: !ruby/object:Gem::Version
- version: 0.4.5
+ version: 0.5.1
  platform: ruby
  authors:
  - Google LLC
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2021-11-08 00:00:00.000000000 Z
+ date: 2022-04-01 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: gapic-common
@@ -212,7 +212,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.2.17
+ rubygems_version: 3.3.5
  signing_key:
  specification_version: 4
  summary: API Client library for the BigQuery Data Transfer Service V1 API