google-cloud-bigquery-data_transfer 0.2.3 → 0.2.4

This diff shows the content of publicly available package versions as they appear in their respective public registries, and is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 657a3f3f47798f10e04643f1b87cfee8fcb1eb32b9ad0f60de1692efb20b12c6
- data.tar.gz: 78517e940938cee138dd28078d96f0989f19890a6a4156e570d2741bf21c3fbc
+ metadata.gz: dc8f0001f8d494c4c0caa52faa1365faf4eac9f526284a1ce5dad1d371714cbb
+ data.tar.gz: 65389fd9f60a871f1f1e89d703d425d3d8ed732bd2d561be2da56cf803d98389
  SHA512:
- metadata.gz: 61e1693d79ad694f24b4bf79b911ee6175947e2fdf69812354ae4011309168430634949839be8aa71224b60ffbe5ecc313839ce0bf5edfb05734fdc35d9ff450
- data.tar.gz: 1de3b9a686d0a7d445e92b9da24268d3ec51e845b4a26c835f0618af25c7b83d5f0d6a2a08cd9815cf9d8bc9927109aeecad8dc45f0644129cdaecf257a77310
+ metadata.gz: 0147e39b896cd0d646b661480303fbae70d2b6de5b39b1509a167712155b139cff4b44cf339d11a0c242bc5d8f625b219dedeae78d3c03b33822e26fa1a49d7f
+ data.tar.gz: 8061c0b0292be433bc6455327c0c9c9fec477dc9f21c850ad1c50586c25de1a2b3efbbaed7642fb434937fc9d002c0c34a48f125c65cfbff95b24b2c7f15a074
data/.yardopts CHANGED
@@ -7,3 +7,5 @@
  ./lib/**/*.rb
  -
  README.md
+ AUTHENTICATION.md
+ LICENSE
data/AUTHENTICATION.md ADDED
@@ -0,0 +1,199 @@
+ # Authentication
+
+ In general, the google-cloud-bigquery-data_transfer library uses [Service
+ Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
+ credentials to connect to Google Cloud services. When running within [Google
+ Cloud Platform environments](#google-cloud-platform-environments)
+ the credentials will be discovered automatically. When running in other
+ environments, the Service Account credentials can be specified by providing the
+ path to the [JSON
+ keyfile](https://cloud.google.com/iam/docs/managing-service-account-keys) for
+ the account (or the JSON itself) in [environment
+ variables](#environment-variables). Cloud SDK credentials can also be
+ discovered automatically, but this is only recommended during development.
+
+ ## Quickstart
+
+ 1. [Create a service account and credentials](#creating-a-service-account).
+ 2. Set the [environment variable](#environment-variables).
+
+ ```sh
+ export DATA_TRANSFER_CREDENTIALS=/path/to/keyfile.json
+ ```
+
+ 3. Initialize the client.
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ client = Google::Cloud::Bigquery::DataTransfer.new
+ ```
+
+ ## Project and Credential Lookup
+
+ The google-cloud-bigquery-data_transfer library aims to make authentication
+ as simple as possible, and provides several mechanisms to configure your system
+ without providing **Project ID** and **Service Account Credentials** directly in
+ code.
+
+ **Project ID** is discovered in the following order:
+
+ 1. Specify project ID in method arguments
+ 2. Specify project ID in configuration
+ 3. Discover project ID in environment variables
+ 4. Discover GCE project ID
+ 5. Discover project ID in credentials JSON
+
+ **Credentials** are discovered in the following order:
+
+ 1. Specify credentials in method arguments
+ 2. Specify credentials in configuration
+ 3. Discover credentials path in environment variables
+ 4. Discover credentials JSON in environment variables
+ 5. Discover credentials file in the Cloud SDK's path
+ 6. Discover GCE credentials
+
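+ For example, credentials passed in method arguments take precedence over every
+ mechanism below them in this list. A minimal sketch (the `credentials` argument
+ accepts a keyfile path, the JSON contents, or a credentials object, per the
+ client constructor's documentation):
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ # Explicit credentials override configuration, environment variables,
+ # Cloud SDK credentials, and GCE discovery.
+ client = Google::Cloud::Bigquery::DataTransfer.new(
+   credentials: "/path/to/keyfile.json"
+ )
+ ```
+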
+ ### Google Cloud Platform environments
+
+ While running on Google Cloud Platform environments such as Google Compute
+ Engine, Google App Engine and Google Kubernetes Engine, no extra work is needed.
+ The **Project ID** and **Credentials** are discovered automatically. Code
+ should be written as if already authenticated (see the sketch after the scope
+ list below). Just be sure that when you [set up the GCE instance][gce-how-to],
+ you add the correct scopes for the APIs you want to access. For example:
+
+ * **All APIs**
+   * `https://www.googleapis.com/auth/cloud-platform`
+   * `https://www.googleapis.com/auth/cloud-platform.read-only`
+ * **BigQuery**
+   * `https://www.googleapis.com/auth/bigquery`
+   * `https://www.googleapis.com/auth/bigquery.insertdata`
+ * **Compute Engine**
+   * `https://www.googleapis.com/auth/compute`
+ * **Datastore**
+   * `https://www.googleapis.com/auth/datastore`
+   * `https://www.googleapis.com/auth/userinfo.email`
+ * **DNS**
+   * `https://www.googleapis.com/auth/ndev.clouddns.readwrite`
+ * **Pub/Sub**
+   * `https://www.googleapis.com/auth/pubsub`
+ * **Storage**
+   * `https://www.googleapis.com/auth/devstorage.full_control`
+   * `https://www.googleapis.com/auth/devstorage.read_only`
+   * `https://www.googleapis.com/auth/devstorage.read_write`
+
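+ On such an environment the client needs no explicit configuration at all. A
+ minimal sketch (`"my-project-id"` is a placeholder for your own project):
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ # Project ID and credentials are discovered from the instance metadata.
+ client = Google::Cloud::Bigquery::DataTransfer.new
+
+ formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("my-project-id")
+ client.list_data_sources(formatted_parent).each do |element|
+   # Process element.
+ end
+ ```
+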
+ ### Environment Variables
+
+ The **Project ID** and **Credentials JSON** can be placed in environment
+ variables instead of declaring them directly in code. Each service has its own
+ environment variable, allowing for different service accounts to be used for
+ different services. (See the READMEs for the individual service gems for
+ details.) The path to the **Credentials JSON** file can be stored in the
+ environment variable, or the **Credentials JSON** itself can be stored for
+ environments such as Docker containers where writing files is difficult or not
+ encouraged.
+
+ The environment variables that google-cloud-bigquery-data_transfer checks for project ID are:
+
+ 1. `DATA_TRANSFER_PROJECT`
+ 2. `GOOGLE_CLOUD_PROJECT`
+
+ The environment variables that google-cloud-bigquery-data_transfer checks for credentials are configured on {Google::Cloud::Bigquery::DataTransfer::V1::Credentials}:
+
+ 1. `DATA_TRANSFER_CREDENTIALS` - Path to JSON file, or JSON contents
+ 2. `DATA_TRANSFER_KEYFILE` - Path to JSON file, or JSON contents
+ 3. `GOOGLE_CLOUD_CREDENTIALS` - Path to JSON file, or JSON contents
+ 4. `GOOGLE_CLOUD_KEYFILE` - Path to JSON file, or JSON contents
+ 5. `GOOGLE_APPLICATION_CREDENTIALS` - Path to JSON file
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ ENV["DATA_TRANSFER_PROJECT"] = "my-project-id"
+ ENV["DATA_TRANSFER_CREDENTIALS"] = "path/to/keyfile.json"
+
+ client = Google::Cloud::Bigquery::DataTransfer.new
+ ```
+
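+ Because the variable may hold the JSON contents directly, no keyfile ever has
+ to be written inside the container. A sketch, assuming a key was downloaded to
+ the (hypothetical) path below:
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ # Store the credentials JSON itself rather than a path to it.
+ ENV["DATA_TRANSFER_CREDENTIALS"] = File.read "/secure/keyfile.json"
+
+ client = Google::Cloud::Bigquery::DataTransfer.new
+ ```
+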
+ ### Configuration
+
+ The **Project ID** and **Credentials JSON** can be configured instead of placing
+ them in environment variables or providing them as arguments.
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ Google::Cloud::Bigquery::DataTransfer.configure do |config|
+   config.project_id = "my-project-id"
+   config.credentials = "path/to/keyfile.json"
+ end
+
+ client = Google::Cloud::Bigquery::DataTransfer.new
+ ```
+
+ ### Cloud SDK
+
+ This option provides an easy way to authenticate during development. If
+ credentials are not provided in code or in environment variables, then Cloud SDK
+ credentials are discovered.
+
+ To configure your system for this, simply:
+
+ 1. [Download and install the Cloud SDK](https://cloud.google.com/sdk)
+ 2. Authenticate using OAuth 2.0: `$ gcloud auth login`
+ 3. Write code as if already authenticated, as in the sketch below.
+
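+ A minimal sketch of step 3: no credentials appear anywhere in the code.
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ # Cloud SDK credentials are discovered automatically; no keyfile is needed.
+ client = Google::Cloud::Bigquery::DataTransfer.new
+ ```
+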
+ **NOTE:** This is _not_ recommended for running in production. The Cloud SDK
+ *should* only be used during development.
+
+ [gce-how-to]: https://cloud.google.com/compute/docs/authentication#using
+ [dev-console]: https://console.cloud.google.com/project
+
+ [enable-apis]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/enable-apis.png
+
+ [create-new-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account.png
+ [create-new-service-account-existing-keys]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account-existing-keys.png
+ [reuse-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/reuse-service-account.png
+
+ ## Creating a Service Account
+
+ Google Cloud requires a **Project ID** and **Service Account Credentials** to
+ connect to the APIs. You will use the **Project ID** and **JSON key file** to
+ connect to most services with google-cloud-bigquery-data_transfer.
+
+ If you are not running this client within [Google Cloud Platform
+ environments](#google-cloud-platform-environments), you need a Google
+ Developers service account.
+
+ 1. Visit the [Google Developers Console][dev-console].
+ 1. Create a new project or click on an existing project.
+ 1. Activate the slide-out navigation tray and select **API Manager**. From
+    here, you will enable the APIs that your application requires.
+
+    ![Enable the APIs that your application requires][enable-apis]
+
+    *Note: You may need to enable billing in order to use these services.*
+
+ 1. Select **Credentials** from the side navigation.
+
+    You should see a screen like one of the following.
+
+    ![Create a new service account][create-new-service-account]
+
+    ![Create a new service account With Existing Keys][create-new-service-account-existing-keys]
+
+    Find the "Add credentials" drop down and select "Service account" to be
+    guided through downloading a new JSON key file.
+
+    If you want to re-use an existing service account, you can easily generate a
+    new key file. Just select the account you wish to re-use, and click "Generate
+    new JSON key":
+
+    ![Re-use an existing service account][reuse-service-account]
+
+ The key file you download will be used by this library to authenticate API
+ requests and should be stored in a secure location.
+
+ ## Troubleshooting
+
+ If you're having trouble authenticating, you can ask for help by following the
+ {file:TROUBLESHOOTING.md Troubleshooting Guide}.
data/README.md CHANGED
@@ -25,16 +25,16 @@ $ gem install google-cloud-bigquery-data_transfer
  ```rb
  require "google/cloud/bigquery/data_transfer"
 
- data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new
+ data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new
  formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path(project_id)
 
  # Iterate over all results.
- data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
+ data_transfer_client.list_data_sources(formatted_parent).each do |element|
    # Process element.
  end
 
  # Or iterate over results one page at a time.
- data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
+ data_transfer_client.list_data_sources(formatted_parent).each_page do |page|
    # Process each page at a time.
    page.each do |element|
      # Process element.
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -48,16 +48,16 @@ module Google
  # ```rb
  # require "google/cloud/bigquery/data_transfer"
  #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new
+ # data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path(project_id)
  #
  # # Iterate over all results.
- # data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
+ # data_transfer_client.list_data_sources(formatted_parent).each do |element|
  #   # Process element.
  # end
  #
  # # Or iterate over results one page at a time.
- # data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
+ # data_transfer_client.list_data_sources(formatted_parent).each_page do |page|
  #   # Process each page at a time.
  #   page.each do |element|
  #     # Process element.
@@ -112,9 +112,9 @@ module Google
 
  ##
  # The Google BigQuery Data Transfer Service API enables BigQuery users to
- # configure the transfer of their data from other Google Products into BigQuery.
- # This service contains methods that are end user exposed. It backs up the
- # frontend.
+ # configure the transfer of their data from other Google Products into
+ # BigQuery. This service contains methods that are end user exposed. It backs
+ # up the frontend.
  #
  # @param version [Symbol, String]
  #   The major version of the service to be used. By default :v1
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -48,16 +48,16 @@ module Google
  # ```rb
  # require "google/cloud/bigquery/data_transfer"
  #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ # data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path(project_id)
  #
  # # Iterate over all results.
- # data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
+ # data_transfer_client.list_data_sources(formatted_parent).each do |element|
  #   # Process element.
  # end
  #
  # # Or iterate over results one page at a time.
- # data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
+ # data_transfer_client.list_data_sources(formatted_parent).each_page do |page|
  #   # Process each page at a time.
  #   page.each do |element|
  #     # Process element.
@@ -104,9 +104,9 @@ module Google
 
  ##
  # The Google BigQuery Data Transfer Service API enables BigQuery users to
- # configure the transfer of their data from other Google Products into BigQuery.
- # This service contains methods that are end user exposed. It backs up the
- # frontend.
+ # configure the transfer of their data from other Google Products into
+ # BigQuery. This service contains methods that are end user exposed. It backs
+ # up the frontend.
  #
  # @param credentials [Google::Auth::Credentials, String, Hash, GRPC::Core::Channel, GRPC::Core::ChannelCredentials, Proc]
  #   Provides the means for authenticating requests made by the client. This parameter can
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -34,9 +34,9 @@ module Google
  module DataTransfer
    module V1
      # The Google BigQuery Data Transfer Service API enables BigQuery users to
-     # configure the transfer of their data from other Google Products into BigQuery.
-     # This service contains methods that are end user exposed. It backs up the
-     # frontend.
+     # configure the transfer of their data from other Google Products into
+     # BigQuery. This service contains methods that are end user exposed. It backs
+     # up the frontend.
      #
      # @!attribute [r] data_transfer_service_stub
      #   @return [Google::Cloud::Bigquery::Datatransfer::V1::DataTransferService::Stub]
@@ -83,23 +83,17 @@ module Google
  ].freeze
 
 
- PROJECT_DATA_SOURCE_PATH_TEMPLATE = Google::Gax::PathTemplate.new(
-   "projects/{project}/dataSources/{data_source}"
- )
-
- private_constant :PROJECT_DATA_SOURCE_PATH_TEMPLATE
-
  PROJECT_PATH_TEMPLATE = Google::Gax::PathTemplate.new(
    "projects/{project}"
  )
 
  private_constant :PROJECT_PATH_TEMPLATE
 
- PROJECT_TRANSFER_CONFIG_PATH_TEMPLATE = Google::Gax::PathTemplate.new(
-   "projects/{project}/transferConfigs/{transfer_config}"
+ PROJECT_DATA_SOURCE_PATH_TEMPLATE = Google::Gax::PathTemplate.new(
+   "projects/{project}/dataSources/{data_source}"
  )
 
- private_constant :PROJECT_TRANSFER_CONFIG_PATH_TEMPLATE
+ private_constant :PROJECT_DATA_SOURCE_PATH_TEMPLATE
 
  PROJECT_RUN_PATH_TEMPLATE = Google::Gax::PathTemplate.new(
    "projects/{project}/transferConfigs/{transfer_config}/runs/{run}"
@@ -107,16 +101,11 @@ module Google
 
  private_constant :PROJECT_RUN_PATH_TEMPLATE
 
- # Returns a fully-qualified project_data_source resource name string.
- # @param project [String]
- # @param data_source [String]
- # @return [String]
- def self.project_data_source_path project, data_source
-   PROJECT_DATA_SOURCE_PATH_TEMPLATE.render(
-     :"project" => project,
-     :"data_source" => data_source
-   )
- end
+ PROJECT_TRANSFER_CONFIG_PATH_TEMPLATE = Google::Gax::PathTemplate.new(
+   "projects/{project}/transferConfigs/{transfer_config}"
+ )
+
+ private_constant :PROJECT_TRANSFER_CONFIG_PATH_TEMPLATE
 
  # Returns a fully-qualified project resource name string.
  # @param project [String]
@@ -127,14 +116,14 @@ module Google
    )
  end
 
- # Returns a fully-qualified project_transfer_config resource name string.
+ # Returns a fully-qualified project_data_source resource name string.
  # @param project [String]
- # @param transfer_config [String]
+ # @param data_source [String]
  # @return [String]
- def self.project_transfer_config_path project, transfer_config
-   PROJECT_TRANSFER_CONFIG_PATH_TEMPLATE.render(
+ def self.project_data_source_path project, data_source
+   PROJECT_DATA_SOURCE_PATH_TEMPLATE.render(
      :"project" => project,
-     :"transfer_config" => transfer_config
+     :"data_source" => data_source
    )
  end
 
@@ -151,6 +140,17 @@ module Google
    )
  end
 
+ # Returns a fully-qualified project_transfer_config resource name string.
+ # @param project [String]
+ # @param transfer_config [String]
+ # @return [String]
+ def self.project_transfer_config_path project, transfer_config
+   PROJECT_TRANSFER_CONFIG_PATH_TEMPLATE.render(
+     :"project" => project,
+     :"transfer_config" => transfer_config
+   )
+ end
+
  # @param credentials [Google::Auth::Credentials, String, Hash, GRPC::Core::Channel, GRPC::Core::ChannelCredentials, Proc]
  #   Provides the means for authenticating requests made by the client. This parameter can
  #   be many types.
@@ -379,9 +379,9 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_data_source_path("[PROJECT]", "[DATA_SOURCE]")
- #   response = data_transfer_service_client.get_data_source(formatted_name)
+ #   response = data_transfer_client.get_data_source(formatted_name)
 
  def get_data_source \
      name,
@@ -421,16 +421,16 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("[PROJECT]")
  #
  #   # Iterate over all results.
- #   data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
+ #   data_transfer_client.list_data_sources(formatted_parent).each do |element|
  #     # Process element.
  #   end
  #
  #   # Or iterate over results one page at a time.
- #   data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
+ #   data_transfer_client.list_data_sources(formatted_parent).each_page do |page|
  #     # Process each page at a time.
  #     page.each do |element|
  #       # Process element.
@@ -489,12 +489,12 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("[PROJECT]")
  #
  #   # TODO: Initialize `transfer_config`:
  #   transfer_config = {}
- #   response = data_transfer_service_client.create_transfer_config(formatted_parent, transfer_config)
+ #   response = data_transfer_client.create_transfer_config(formatted_parent, transfer_config)
 
  def create_transfer_config \
      parent,
@@ -550,14 +550,14 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #
  #   # TODO: Initialize `transfer_config`:
  #   transfer_config = {}
  #
  #   # TODO: Initialize `update_mask`:
  #   update_mask = {}
- #   response = data_transfer_service_client.update_transfer_config(transfer_config, update_mask)
+ #   response = data_transfer_client.update_transfer_config(transfer_config, update_mask)
 
  def update_transfer_config \
      transfer_config,
@@ -590,9 +590,9 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
- #   data_transfer_service_client.delete_transfer_config(formatted_name)
+ #   data_transfer_client.delete_transfer_config(formatted_name)
 
  def delete_transfer_config \
      name,
@@ -622,9 +622,9 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
- #   response = data_transfer_service_client.get_transfer_config(formatted_name)
+ #   response = data_transfer_client.get_transfer_config(formatted_name)
 
  def get_transfer_config \
      name,
@@ -665,16 +665,16 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("[PROJECT]")
  #
  #   # Iterate over all results.
- #   data_transfer_service_client.list_transfer_configs(formatted_parent).each do |element|
+ #   data_transfer_client.list_transfer_configs(formatted_parent).each do |element|
  #     # Process element.
  #   end
  #
  #   # Or iterate over results one page at a time.
- #   data_transfer_service_client.list_transfer_configs(formatted_parent).each_page do |page|
+ #   data_transfer_client.list_transfer_configs(formatted_parent).each_page do |page|
  #     # Process each page at a time.
  #     page.each do |element|
  #       # Process element.
@@ -725,7 +725,7 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
  #
  #   # TODO: Initialize `start_time`:
@@ -733,7 +733,7 @@ module Google
  #
  #   # TODO: Initialize `end_time`:
  #   end_time = {}
- #   response = data_transfer_service_client.schedule_transfer_runs(formatted_parent, start_time, end_time)
+ #   response = data_transfer_client.schedule_transfer_runs(formatted_parent, start_time, end_time)
 
  def schedule_transfer_runs \
      parent,
@@ -766,9 +766,9 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_run_path("[PROJECT]", "[TRANSFER_CONFIG]", "[RUN]")
- #   response = data_transfer_service_client.get_transfer_run(formatted_name)
+ #   response = data_transfer_client.get_transfer_run(formatted_name)
 
  def get_transfer_run \
      name,
@@ -796,9 +796,9 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_run_path("[PROJECT]", "[TRANSFER_CONFIG]", "[RUN]")
- #   data_transfer_service_client.delete_transfer_run(formatted_name)
+ #   data_transfer_client.delete_transfer_run(formatted_name)
 
  def delete_transfer_run \
      name,
@@ -843,16 +843,16 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
  #
  #   # Iterate over all results.
- #   data_transfer_service_client.list_transfer_runs(formatted_parent).each do |element|
+ #   data_transfer_client.list_transfer_runs(formatted_parent).each do |element|
  #     # Process element.
  #   end
  #
  #   # Or iterate over results one page at a time.
- #   data_transfer_service_client.list_transfer_runs(formatted_parent).each_page do |page|
+ #   data_transfer_client.list_transfer_runs(formatted_parent).each_page do |page|
  #     # Process each page at a time.
  #     page.each do |element|
  #       # Process element.
@@ -905,16 +905,16 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_run_path("[PROJECT]", "[TRANSFER_CONFIG]", "[RUN]")
  #
  #   # Iterate over all results.
- #   data_transfer_service_client.list_transfer_logs(formatted_parent).each do |element|
+ #   data_transfer_client.list_transfer_logs(formatted_parent).each do |element|
  #     # Process element.
  #   end
  #
  #   # Or iterate over results one page at a time.
- #   data_transfer_service_client.list_transfer_logs(formatted_parent).each_page do |page|
+ #   data_transfer_client.list_transfer_logs(formatted_parent).each_page do |page|
  #     # Process each page at a time.
  #     page.each do |element|
  #       # Process element.
@@ -957,9 +957,9 @@ module Google
  # @example
  #   require "google/cloud/bigquery/data_transfer"
  #
- #   data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
+ #   data_transfer_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #   formatted_name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_data_source_path("[PROJECT]", "[DATA_SOURCE]")
- #   response = data_transfer_service_client.check_valid_creds(formatted_name)
+ #   response = data_transfer_client.check_valid_creds(formatted_name)
 
  def check_valid_creds \
      name,
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -174,19 +174,6 @@ module Google
  end
  end
 
- # DEPRECATED. Represents data transfer type.
- module TransferType
-   # Invalid or Unknown transfer type placeholder.
-   TRANSFER_TYPE_UNSPECIFIED = 0
-
-   # Batch data transfer.
-   BATCH = 1
-
-   # Streaming data transfer. Streaming data source currently doesn't
-   # support multiple transfer configs per project.
-   STREAMING = 2
- end
-
  # Represents data transfer run state.
  module TransferState
    # State placeholder.
@@ -208,6 +195,19 @@ module Google
    # Data transfer is cancelled.
    CANCELLED = 6
  end
+
+ # DEPRECATED. Represents data transfer type.
+ module TransferType
+   # Invalid or Unknown transfer type placeholder.
+   TRANSFER_TYPE_UNSPECIFIED = 0
+
+   # Batch data transfer.
+   BATCH = 1
+
+   # Streaming data transfer. Streaming data source currently doesn't
+   # support multiple transfer configs per project.
+   STREAMING = 2
+ end
  end
  end
  end
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -97,7 +97,8 @@ module Google
  # @!attribute [rw] type_url
  #   @return [String]
  #     A URL/resource name that uniquely identifies the type of the serialized
- #     protocol buffer message. The last segment of the URL's path must represent
+ #     protocol buffer message. This string must contain at least
+ #     one "/" character. The last segment of the URL's path must represent
  #     the fully qualified name of the type (as in
  #     `path/google.protobuf.Duration`). The name should be in a canonical form
  #     (e.g., leading "." is not accepted).
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -83,57 +83,49 @@ module Google
  # describe the updated values, the API ignores the values of all
  # fields not covered by the mask.
  #
- # If a repeated field is specified for an update operation, the existing
- # repeated values in the target resource will be overwritten by the new values.
- # Note that a repeated field is only allowed in the last position of a `paths`
- # string.
+ # If a repeated field is specified for an update operation, new values will
+ # be appended to the existing repeated field in the target resource. Note that
+ # a repeated field is only allowed in the last position of a `paths` string.
  #
  # If a sub-message is specified in the last position of the field mask for an
- # update operation, then the existing sub-message in the target resource is
- # overwritten. Given the target message:
+ # update operation, then new value will be merged into the existing sub-message
+ # in the target resource.
+ #
+ # For example, given the target message:
  #
  #     f {
  #       b {
- #         d : 1
- #         x : 2
+ #         d: 1
+ #         x: 2
  #       }
- #       c : 1
+ #       c: [1]
  #     }
  #
  # And an update message:
  #
  #     f {
  #       b {
- #         d : 10
+ #         d: 10
  #       }
+ #       c: [2]
  #     }
  #
  # then if the field mask is:
  #
- #     paths: "f.b"
+ #     paths: ["f.b", "f.c"]
  #
  # then the result will be:
  #
  #     f {
  #       b {
- #         d : 10
+ #         d: 10
+ #         x: 2
  #       }
- #       c : 1
+ #       c: [1, 2]
  #     }
  #
- # However, if the update mask was:
- #
- #     paths: "f.b.d"
- #
- # then the result would be:
- #
- #     f {
- #       b {
- #         d : 10
- #         x : 2
- #       }
- #       c : 1
- #     }
+ # An implementation may provide options to override this default behavior for
+ # repeated and message fields.
  #
  # In order to reset a field's value to the default, the field must
  # be in the mask and set to the default value in the provided resource.
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -15,17 +15,19 @@
 
  module Google
    module Protobuf
-     # A Timestamp represents a point in time independent of any time zone
-     # or calendar, represented as seconds and fractions of seconds at
-     # nanosecond resolution in UTC Epoch time. It is encoded using the
-     # Proleptic Gregorian Calendar which extends the Gregorian calendar
-     # backwards to year one. It is encoded assuming all minutes are 60
-     # seconds long, i.e. leap seconds are "smeared" so that no leap second
-     # table is needed for interpretation. Range is from
-     # 0001-01-01T00:00:00Z to 9999-12-31T23:59:59.999999999Z.
-     # By restricting to that range, we ensure that we can convert to
-     # and from RFC 3339 date strings.
-     # See [https://www.ietf.org/rfc/rfc3339.txt](https://www.ietf.org/rfc/rfc3339.txt).
+     # A Timestamp represents a point in time independent of any time zone or local
+     # calendar, encoded as a count of seconds and fractions of seconds at
+     # nanosecond resolution. The count is relative to an epoch at UTC midnight on
+     # January 1, 1970, in the proleptic Gregorian calendar which extends the
+     # Gregorian calendar backwards to year one.
+     #
+     # All minutes are 60 seconds long. Leap seconds are "smeared" so that no leap
+     # second table is needed for interpretation, using a [24-hour linear
+     # smear](https://developers.google.com/time/smear).
+     #
+     # The range is from 0001-01-01T00:00:00Z to 9999-12-31T23:59:59.999999999Z. By
+     # restricting to that range, we ensure that we can convert to and from [RFC
+     # 3339](https://www.ietf.org/rfc/rfc3339.txt) date strings.
      #
      # = Examples
      #
@@ -86,12 +88,12 @@ module Google
  # 01:30 UTC on January 15, 2017.
  #
  # In JavaScript, one can convert a Date object to this format using the
- # standard [toISOString()](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/toISOString]
+ # standard [toISOString()](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/toISOString)
  # method. In Python, a standard `datetime.datetime` object can be converted
  # to this format using [`strftime`](https://docs.python.org/2/library/time.html#time.strftime)
  # with the time format spec '%Y-%m-%dT%H:%M:%S.%fZ'. Likewise, in Java, one
  # can use the Joda Time's [`ISODateTimeFormat.dateTime()`](
- # http://www.joda.org/joda-time/apidocs/org/joda/time/format/ISODateTimeFormat.html#dateTime--
+ # http://www.joda.org/joda-time/apidocs/org/joda/time/format/ISODateTimeFormat.html#dateTime%2D%2D
  # ) to obtain a formatter capable of generating timestamps in this format.
  # @!attribute [rw] seconds
  #   @return [Integer]
@@ -1,4 +1,4 @@
- # Copyright 2018 Google LLC
+ # Copyright 2019 Google LLC
  #
  # Licensed under the Apache License, Version 2.0 (the "License");
  # you may not use this file except in compliance with the License.
@@ -15,24 +15,25 @@
 
  module Google
    module Rpc
-     # The `Status` type defines a logical error model that is suitable for different
-     # programming environments, including REST APIs and RPC APIs. It is used by
-     # [gRPC](https://github.com/grpc). The error model is designed to be:
+     # The `Status` type defines a logical error model that is suitable for
+     # different programming environments, including REST APIs and RPC APIs. It is
+     # used by [gRPC](https://github.com/grpc). The error model is designed to be:
      #
      # * Simple to use and understand for most users
      # * Flexible enough to meet unexpected needs
      #
      # = Overview
      #
-     # The `Status` message contains three pieces of data: error code, error message,
-     # and error details. The error code should be an enum value of
-     # {Google::Rpc::Code}, but it may accept additional error codes if needed. The
-     # error message should be a developer-facing English message that helps
-     # developers *understand* and *resolve* the error. If a localized user-facing
-     # error message is needed, put the localized message in the error details or
-     # localize it in the client. The optional error details may contain arbitrary
-     # information about the error. There is a predefined set of error detail types
-     # in the package `google.rpc` that can be used for common error conditions.
+     # The `Status` message contains three pieces of data: error code, error
+     # message, and error details. The error code should be an enum value of
+     # {Google::Rpc::Code}, but it may accept additional error codes
+     # if needed. The error message should be a developer-facing English message
+     # that helps developers *understand* and *resolve* the error. If a localized
+     # user-facing error message is needed, put the localized message in the error
+     # details or localize it in the client. The optional error details may contain
+     # arbitrary information about the error. There is a predefined set of error
+     # detail types in the package `google.rpc` that can be used for common error
+     # conditions.
      #
      # = Language mapping
      #
@@ -69,12 +70,14 @@ module Google
  # be used directly after any stripping needed for security/privacy reasons.
  # @!attribute [rw] code
  #   @return [Integer]
- #     The status code, which should be an enum value of {Google::Rpc::Code}.
+ #     The status code, which should be an enum value of
+ #     {Google::Rpc::Code}.
  # @!attribute [rw] message
  #   @return [String]
  #     A developer-facing error message, which should be in English. Any
  #     user-facing error message should be localized and sent in the
- #     {Google::Rpc::Status#details} field, or localized by the client.
+ #     {Google::Rpc::Status#details} field, or localized
+ #     by the client.
  # @!attribute [rw] details
  #   @return [Array<Google::Protobuf::Any>]
  #     A list of messages that carry the error details. There is a common set of
@@ -27,9 +27,9 @@ module Google
  module V1
    module DataTransferService
      # The Google BigQuery Data Transfer Service API enables BigQuery users to
-     # configure the transfer of their data from other Google Products into BigQuery.
-     # This service contains methods that are end user exposed. It backs up the
-     # frontend.
+     # configure the transfer of their data from other Google Products into
+     # BigQuery. This service contains methods that are end user exposed. It backs
+     # up the frontend.
      class Service
 
        include GRPC::GenericService
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: google-cloud-bigquery-data_transfer
  version: !ruby/object:Gem::Version
-   version: 0.2.3
+   version: 0.2.4
  platform: ruby
  authors:
  - Google LLC
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-09-20 00:00:00.000000000 Z
+ date: 2019-04-29 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: google-gax
@@ -58,14 +58,14 @@ dependencies:
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: 0.50.0
+         version: 0.64.0
    type: :development
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: 0.50.0
+         version: 0.64.0
  - !ruby/object:Gem::Dependency
    name: simplecov
    requirement: !ruby/object:Gem::Requirement
@@ -102,6 +102,7 @@ extensions: []
  extra_rdoc_files: []
  files:
  - ".yardopts"
+ - AUTHENTICATION.md
  - LICENSE
  - README.md
  - lib/google/cloud/bigquery/data_transfer.rb
@@ -147,8 +148,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
    - !ruby/object:Gem::Version
      version: '0'
  requirements: []
- rubyforge_project:
- rubygems_version: 2.7.7
+ rubygems_version: 3.0.3
  signing_key:
  specification_version: 4
  summary: API Client library for BigQuery Data Transfer API