google-cloud-bigquery-data_transfer 0.9.0 → 1.0.0

Files changed (36)
  1. checksums.yaml +4 -4
  2. data/.yardopts +2 -1
  3. data/AUTHENTICATION.md +51 -54
  4. data/LICENSE.md +203 -0
  5. data/MIGRATING.md +301 -0
  6. data/README.md +33 -45
  7. data/lib/{google/cloud/bigquery/data_transfer/v1/doc/google/cloud/bigquery/datatransfer/v1/datasource.rb → google-cloud-bigquery-data_transfer.rb} +5 -11
  8. data/lib/google/cloud/bigquery/data_transfer.rb +82 -140
  9. data/lib/google/cloud/bigquery/data_transfer/version.rb +6 -2
  10. metadata +85 -65
  11. data/LICENSE +0 -201
  12. data/lib/google/cloud/bigquery/data_transfer/credentials.rb +0 -33
  13. data/lib/google/cloud/bigquery/data_transfer/v1.rb +0 -173
  14. data/lib/google/cloud/bigquery/data_transfer/v1/credentials.rb +0 -43
  15. data/lib/google/cloud/bigquery/data_transfer/v1/data_transfer_pb.rb +0 -190
  16. data/lib/google/cloud/bigquery/data_transfer/v1/data_transfer_service_client.rb +0 -1230
  17. data/lib/google/cloud/bigquery/data_transfer/v1/data_transfer_service_client_config.json +0 -96
  18. data/lib/google/cloud/bigquery/data_transfer/v1/data_transfer_services_pb.rb +0 -87
  19. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/cloud/bigquery/data_transfer/v1/data_transfer.rb +0 -500
  20. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/cloud/bigquery/data_transfer/v1/transfer.rb +0 -217
  21. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/cloud/bigquery/datatransfer/v1/datatransfer.rb +0 -570
  22. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/cloud/bigquery/datatransfer/v1/transfer.rb +0 -257
  23. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/any.rb +0 -131
  24. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/duration.rb +0 -91
  25. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/empty.rb +0 -29
  26. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/field_mask.rb +0 -222
  27. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/struct.rb +0 -74
  28. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/timestamp.rb +0 -113
  29. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/wrappers.rb +0 -26
  30. data/lib/google/cloud/bigquery/data_transfer/v1/doc/google/rpc/status.rb +0 -39
  31. data/lib/google/cloud/bigquery/data_transfer/v1/transfer_pb.rb +0 -83
  32. data/lib/google/cloud/bigquery/datatransfer/v1/datasource_pb.rb +0 -170
  33. data/lib/google/cloud/bigquery/datatransfer/v1/datasource_services_pb.rb +0 -103
  34. data/lib/google/cloud/bigquery/datatransfer/v1/datatransfer_pb.rb +0 -217
  35. data/lib/google/cloud/bigquery/datatransfer/v1/datatransfer_services_pb.rb +0 -94
  36. data/lib/google/cloud/bigquery/datatransfer/v1/transfer_pb.rb +0 -104
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: cb1b49118f9f9a1acb642c3441952ac3177ded8b27941b7b9ef7e9e7fb2d0747
- data.tar.gz: cbcfcd84ed59aea97fe159411c31880e840f96815a4fcd39ed421eb0c6fd218b
+ metadata.gz: 7ac493bd5c515735214a61f57e8b52677c9f3acdde6679218bd64dbc0e63169f
+ data.tar.gz: e781e7658fffcc76f3ed4655ed0f4193486c71c72e150deee50585c67142921d
  SHA512:
- metadata.gz: 9e76cd854cd7a764f34ca205f9ed5d4cc0b87fd2e0b664605ad22c52713e5a403c026413defe3e899d1322f8b3bc9e471b9ac4ad8d11ee6f1733802aee059e22
- data.tar.gz: e82f85d963081fe0819e76c3e10b2c311e274bfb89065947bcb2ddba49929ea18e931f678538add63e50c7de863e4d10d9cf59608db7504c719d54095dd80c32
+ metadata.gz: 8be9e383c066904277e8ba5ddc758310686ef75faccc7b67c8a12d38b20c42747e787ed76a67110a885e6100df66fc7b3a733980951e27f7bf02189b57b07bc3
+ data.tar.gz: 37879e27ca3c13709e8c3bc9b1e7e46f6744f4ee422dbdf7bac6db31160cca3ace8939c8605ae4af799f29795f5faec4768f410daa406c8cd43a3769a52fd966
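The checksum changes above record the new SHA256/SHA512 digests for the 1.0.0 release artifacts. As a rough, runnable illustration of how such a digest is verified (the file and its bytes here are stand-ins created for the demo, not the real gem artifacts):

```ruby
require "digest"

# Stand-in bytes so the sketch runs anywhere; in practice this would be the
# downloaded .gem artifact (or its metadata.gz / data.tar.gz members).
File.binwrite "demo-artifact.bin", "example gem bytes"

# Record the digest once (the role checksums.yaml plays for a release)...
expected = Digest::SHA256.file("demo-artifact.bin").hexdigest

# ...then recompute and compare before trusting the artifact.
actual = Digest::SHA256.file("demo-artifact.bin").hexdigest
puts(actual == expected ? "checksum OK" : "checksum MISMATCH")
```

A mismatch would indicate the artifact was corrupted or tampered with in transit.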
data/.yardopts CHANGED
@@ -8,4 +8,5 @@
  -
  README.md
  AUTHENTICATION.md
- LICENSE
+ MIGRATING.md
+ LICENSE.md
data/AUTHENTICATION.md CHANGED
@@ -1,16 +1,17 @@
  # Authentication

- In general, the google-cloud-bigquery-data_transfer library uses [Service
- Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
- credentials to connect to Google Cloud services. When running within [Google
- Cloud Platform environments](#google-cloud-platform-environments)
- the credentials will be discovered automatically. When running on other
+ In general, the google-cloud-bigquery-data_transfer library uses
+ [Service Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
+ credentials to connect to Google Cloud services. When running within
+ [Google Cloud Platform environments](#google-cloud-platform-environments) the
+ credentials will be discovered automatically. When running on other
  environments, the Service Account credentials can be specified by providing the
- path to the [JSON
- keyfile](https://cloud.google.com/iam/docs/managing-service-account-keys) for
- the account (or the JSON itself) in [environment
- variables](#environment-variables). Additionally, Cloud SDK credentials can also
- be discovered automatically, but this is only recommended during development.
+ path to the
+ [JSON keyfile](https://cloud.google.com/iam/docs/managing-service-account-keys)
+ for the account (or the JSON itself) in
+ [environment variables](#environment-variables). Additionally, Cloud SDK
+ credentials can also be discovered automatically, but this is only recommended
+ during development.

  ## Quickstart

@@ -18,7 +19,7 @@ be discovered automatically, but this is only recommended during development.
  2. Set the [environment variable](#environment-variables).

  ```sh
- export DATA_TRANSFER_CREDENTIALS=/path/to/json`
+ export DATA_TRANSFER_CREDENTIALS=path/to/keyfile.json
  ```

  3. Initialize the client.
@@ -26,23 +27,14 @@ export DATA_TRANSFER_CREDENTIALS=/path/to/json`
  ```ruby
  require "google/cloud/bigquery/data_transfer"

- client = Google::Cloud::Bigquery::DataTransfer.new
+ client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service
  ```

- ## Project and Credential Lookup
+ ## Credential Lookup

  The google-cloud-bigquery-data_transfer library aims to make authentication
  as simple as possible, and provides several mechanisms to configure your system
- without providing **Project ID** and **Service Account Credentials** directly in
- code.
-
- **Project ID** is discovered in the following order:
-
- 1. Specify project ID in method arguments
- 2. Specify project ID in configuration
- 3. Discover project ID in environment variables
- 4. Discover GCP project ID
- 5. Discover project ID in credentials JSON
+ without requiring **Service Account Credentials** directly in code.

  **Credentials** are discovered in the following order:

@@ -55,28 +47,24 @@ code.

  ### Google Cloud Platform environments

- When running on Google Cloud Platform (GCP), including Google Compute Engine (GCE),
- Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud Functions
- (GCF) and Cloud Run, the **Project ID** and **Credentials** and are discovered
- automatically. Code should be written as if already authenticated.
+ When running on Google Cloud Platform (GCP), including Google Compute Engine
+ (GCE), Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud
+ Functions (GCF) and Cloud Run, **Credentials** are discovered automatically.
+ Code should be written as if already authenticated.

  ### Environment Variables

- The **Project ID** and **Credentials JSON** can be placed in environment
- variables instead of declaring them directly in code. Each service has its own
- environment variable, allowing for different service accounts to be used for
- different services. (See the READMEs for the individual service gems for
- details.) The path to the **Credentials JSON** file can be stored in the
- environment variable, or the **Credentials JSON** itself can be stored for
- environments such as Docker containers where writing files is difficult or not
- encouraged.
-
- The environment variables that google-cloud-bigquery-data_transfer checks for project ID are:
+ The **Credentials JSON** can be placed in environment variables instead of
+ declaring them directly in code. Each service has its own environment variable,
+ allowing for different service accounts to be used for different services. (See
+ the READMEs for the individual service gems for details.) The path to the
+ **Credentials JSON** file can be stored in the environment variable, or the
+ **Credentials JSON** itself can be stored for environments such as Docker
+ containers where writing files is difficult or not encouraged.

- 1. `DATA_TRANSFER_PROJECT`
- 2. `GOOGLE_CLOUD_PROJECT`
-
- The environment variables that google-cloud-bigquery-data_transfer checks for credentials are configured on {Google::Cloud::Bigquery::DataTransfer::V1::Credentials}:
+ The environment variables that google-cloud-bigquery-data_transfer
+ checks for credentials are configured on the service Credentials class (such as
+ `::Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Credentials`):

  1. `DATA_TRANSFER_CREDENTIALS` - Path to JSON file, or JSON contents
  2. `DATA_TRANSFER_KEYFILE` - Path to JSON file, or JSON contents
@@ -87,25 +75,34 @@ The environment variables that google-cloud-bigquery-data_transfer checks for cr
  ```ruby
  require "google/cloud/bigquery/data_transfer"

- ENV["DATA_TRANSFER_PROJECT"] = "my-project-id"
  ENV["DATA_TRANSFER_CREDENTIALS"] = "path/to/keyfile.json"

- client = Google::Cloud::Bigquery::DataTransfer.new
+ client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service
  ```

  ### Configuration

- The **Project ID** and **Credentials JSON** can be configured instead of placing them in environment variables or providing them as arguments.
+ The **Credentials JSON** can be configured instead of placing them in
+ environment variables. Either on an individual client initialization:
+
+ ```ruby
+ require "google/cloud/bigquery/data_transfer"
+
+ client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service do |config|
+ config.credentials = "path/to/keyfile.json"
+ end
+ ```
+
+ Or configured globally for all clients:

  ```ruby
  require "google/cloud/bigquery/data_transfer"

  Google::Cloud::Bigquery::DataTransfer.configure do |config|
- config.project_id = "my-project-id"
  config.credentials = "path/to/keyfile.json"
  end

- client = Google::Cloud::Bigquery::DataTransfer.new
+ client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service
  ```

  ### Cloud SDK
@@ -134,24 +131,24 @@ To configure your system for this, simply:

  ## Creating a Service Account

- Google Cloud requires a **Project ID** and **Service Account Credentials** to
- connect to the APIs. You will use the **Project ID** and **JSON key file** to
+ Google Cloud requires **Service Account Credentials** to
+ connect to the APIs. You will use the **JSON key file** to
  connect to most services with google-cloud-bigquery-data_transfer.

- If you are not running this client within [Google Cloud Platform
- environments](#google-cloud-platform-environments), you need a Google
- Developers service account.
+ If you are not running this client within
+ [Google Cloud Platform environments](#google-cloud-platform-environments), you
+ need a Google Developers service account.

  1. Visit the [Google Developers Console][dev-console].
- 1. Create a new project or click on an existing project.
- 1. Activate the slide-out navigation tray and select **API Manager**. From
+ 2. Create a new project or click on an existing project.
+ 3. Activate the slide-out navigation tray and select **API Manager**. From
  here, you will enable the APIs that your application requires.

  ![Enable the APIs that your application requires][enable-apis]

  *Note: You may need to enable billing in order to use these services.*

- 1. Select **Credentials** from the side navigation.
+ 4. Select **Credentials** from the side navigation.

  You should see a screen like one of the following.

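The revised AUTHENTICATION.md above describes an ordered lookup across credential environment variables. The order can be illustrated with a small standalone sketch (illustrative Ruby only, not the library's actual implementation; it covers just the two variables visible in the diff hunks above):

```ruby
# Hypothetical illustration of environment-variable credential lookup order.
CREDENTIAL_VARS = [
  "DATA_TRANSFER_CREDENTIALS",
  "DATA_TRANSFER_KEYFILE"
].freeze

# Returns the first configured value (a JSON path or JSON contents), or nil.
def first_credential_setting(env)
  CREDENTIAL_VARS.each do |name|
    value = env[name]
    return value if value && !value.empty?
  end
  nil
end

puts first_credential_setting("DATA_TRANSFER_KEYFILE" => "path/to/keyfile.json")
```

In the real library the earlier variable wins when both are set, which is what the ordered scan above models.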
data/LICENSE.md ADDED
@@ -0,0 +1,203 @@
+ Apache License
+ ==============
+
+ * Version 2.0, January 2004
+ * https://www.apache.org/licenses/
+
+ ### TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. **Definitions.**
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. **Grant of Copyright License.** Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. **Grant of Patent License.** Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. **Redistribution.** You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ * **(a)** You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ * **(b)** You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ * **(c)** You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ * **(d)** If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. **Submission of Contributions.** Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. **Trademarks.** This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. **Disclaimer of Warranty.** Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. **Limitation of Liability.** In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. **Accepting Warranty or Additional Liability.** While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ _END OF TERMS AND CONDITIONS_
+
+ ### APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "`[]`"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ https://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/MIGRATING.md ADDED
@@ -0,0 +1,301 @@
+ ## Migrating to google-cloud-bigquery-data_transfer 1.0
+
+ The 1.0 release of the google-cloud-bigquery-data_transfer client is a significant upgrade
+ based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-ruby),
+ and includes substantial interface changes. Existing code written for earlier
+ versions of this library will likely require updates to use this version.
+ This document describes the changes that have been made, and what you need to
+ do to update your usage.
+
+ To summarize:
+
+ * The library has been broken out into multiple libraries. The new gem
+ `google-cloud-bigquery-data_transfer-v1` contains the
+ actual client classes for version V1 of the BigQuery DataTransfer service,
+ and the gem `google-cloud-bigquery-data_transfer` now simply provides a convenience wrapper.
+ See [Library Structure](#library-structure) for more info.
+ * The library uses a new configuration mechanism giving you closer control
+ over endpoint address, network timeouts, and retry. See
+ [Client Configuration](#client-configuration) for more info. Furthermore,
+ when creating a client object, you can customize its configuration in a
+ block rather than passing arguments to the constructor. See
+ [Creating Clients](#creating-clients) for more info.
+ * Previously, positional arguments were used to indicate required arguments.
+ Now, all method arguments are keyword arguments, with documentation that
+ specifies whether they are required or optional. Additionally, you can pass
+ a proto request object instead of separate arguments. See
+ [Passing Arguments](#passing-arguments) for more info.
+ * Previously, some client classes included class methods for constructing
+ resource paths. These paths are now instance methods on the client objects,
+ and are also available in a separate paths module. See
+ [Resource Path Helpers](#resource-path-helpers) for more info.
+ * Some classes have moved into different namespaces. See
+ [Class Namespaces](#class-namespaces) for more info.
+
+ ### Library Structure
+
+ Older 0.x releases of the `google-cloud-bigquery-data_transfer` gem were all-in-one gems
+ that included potentially multiple clients for multiple versions of the BigQuery
+ DataTransfer service. The `Google::Cloud::Bigquery::DataTransfer.new` factory method would
+ return you an instance of a `Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient`
+ object for the V1 version of the service. All these classes were defined in the same gem.
+
+ With the 1.0 release, the `google-cloud-bigquery-data_transfer` gem still provides factory
+ methods for obtaining clients. (The method signatures will have changed. See
+ [Creating Clients](#creating-clients) for details.) However, the actual client
+ classes have been moved into separate gems, one per service version. The
+ `Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client` class, along with its
+ helpers and data types, is now part of the `google-cloud-bigquery-data_transfer-v1` gem.
+ Future versions will similarly be located in additional gems.
+
+ For normal usage, you can continue to install the `google-cloud-bigquery-data_transfer` gem
+ (which will bring in the versioned client gems as dependencies) and continue to
+ use factory methods to create clients. However, you may alternatively choose to
+ install only one of the versioned gems. For example, if you know you will only
+ use `V1` of the service, you can install `google-cloud-bigquery-data_transfer-v1` by
+ itself, and construct instances of the
+ `Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client` client class directly.
+
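The Library Structure section above describes two installation options: the wrapper gem or a single versioned client gem. A minimal Gemfile sketch of both (the gem names come from this document; the version constraint is an assumption for illustration):

```ruby
source "https://rubygems.org"

# Option 1: the wrapper gem, which pulls in the versioned client gems
# as dependencies and provides the factory methods.
gem "google-cloud-bigquery-data_transfer", "~> 1.0"

# Option 2: depend on a single versioned client gem directly and
# construct V1 client objects yourself.
# gem "google-cloud-bigquery-data_transfer-v1"
```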
59
+ ### Client Configuration
60
+
61
+ In older releases, if you wanted to customize performance parameters or
62
+ low-level behavior of the client (such as credentials, timeouts, or
63
+ instrumentation), you would pass a variety of keyword arguments to the client
64
+ constructor. It was also extremely difficult to customize the default settings.
65
+
66
+ With the 1.0 release, a configuration interface provides control over these
67
+ parameters, including defaults for all instances of a client, and settings for
68
+ each specific client instance. For example, to set default credentials and
69
+ timeout for all BigQuery DataTransfer V1 clients:
70
+
71
+ ```
72
+ Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.configure do |config|
73
+ config.credentials = "/path/to/credentials.json"
74
+ config.timeout = 10.0
75
+ end
76
+ ```
77
+
78
+ Individual RPCs can also be configured independently. For example, to set the
79
+ timeout for the `get_data_source` call:
80
+
81
+ ```
82
+ Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client.configure do |config|
83
+ config.rpcs.get_data_source.timeout = 20.0
84
+ end
85
+ ```
86
+
87
+ Defaults for certain configurations can be set for all BigQuery DataTransfer versions and
88
+ services globally:
89
+
90
+ ```
91
+ Google::Cloud::Bigquery::DataTransfer.configure do |config|
92
+ config.credentials = "/path/to/credentials.json"
93
+ config.timeout = 10.0
94
+ end
95
+ ```
96
+
97
+ Finally, you can override the configuration for each client instance. See the
98
+ next section on [Creating Clients](#creating-clients) for details.
99
+
100
+ ### Creating Clients
101
+
102
+ In older releases, to create a client object, you would use the
103
+ `Google::Cloud::Bigquery::DataTransfer.new` class method. Keyword arguments were available to
104
+ select a service version and to configure parameters such as credentials and
105
+ timeouts.
106
+
107
+ With the 1.0 release, use the `Google::Cloud::Bigquery::DataTransfer.data_transfer_service` class
108
+ method to create a client object. You may select a service version using the
109
+ `:version` keyword argument. (Currently `:v1` is the only supported version.)
110
+ However, other configuration parameters should be set in a configuration block when you create the client.
111
+
112
+ Old:
113
+ ```
114
+ client = Google::Cloud::Bigquery::DataTransfer.new credentials: "/path/to/credentials.json"
115
+ ```
116
+
117
+ New:
118
+ ```
119
+ client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service do |config|
120
+ config.credentials = "/path/to/credentials.json"
121
+ end
122
+ ```
123
+
124
+ The configuration block is optional. If you do not provide it, or you do not
125
+ set some configuration parameters, then the default configuration is used. See
126
+ [Client Configuration](#client-configuration).

### Passing Arguments

In older releases, required arguments would be passed as positional method
arguments, while most optional arguments would be passed as keyword arguments.

With the 1.0 release, all RPC arguments are passed as keyword arguments,
regardless of whether they are required or optional. For example:

Old:
```
client = Google::Cloud::Bigquery::DataTransfer.new

name = "projects/my-project/dataSources/my-source"

# Name is a positional argument
response = client.get_data_source name
```

New:
```
client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

name = "projects/my-project/dataSources/my-source"

# Name is a keyword argument
response = client.get_data_source name: name
```

In the 1.0 release, it is also possible to pass a request object, either
as a hash or as a protocol buffer.

New:
```
client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

request = Google::Cloud::Bigquery::DataTransfer::V1::GetDataSourceRequest.new(
  name: "projects/my-project/dataSources/my-source"
)

# Pass a request object as a positional argument:
response = client.get_data_source request
```
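
The hash form of the same call looks like this (a sketch following the same
request shape):

```
client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

# The same request expressed as a plain hash:
request = { name: "projects/my-project/dataSources/my-source" }
response = client.get_data_source request
```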

Finally, in older releases, to provide call options, you would pass a
`Google::Gax::CallOptions` object with the `:options` keyword argument. In the
1.0 release, pass call options using a _second set_ of keyword arguments.

Old:
```
client = Google::Cloud::Bigquery::DataTransfer.new

name = "projects/my-project/dataSources/my-source"

options = Google::Gax::CallOptions.new timeout: 10.0

response = client.get_data_source name, options: options
```

New:
```
client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

name = "projects/my-project/dataSources/my-source"

# Use a hash to wrap the normal call arguments (or pass a request object), and
# then add further keyword arguments for the call options.
response = client.get_data_source({ name: name }, timeout: 10.0)
```
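
This dual convention is plain Ruby rather than magic: the method takes one
positional request (a hash or a request object) plus separate keyword
arguments for call options. A minimal pure-Ruby sketch of the pattern
(`GetDataSourceRequest` here is a stand-in `Struct`, not the real generated
class):

```
# Stand-in for the generated request class.
GetDataSourceRequest = Struct.new(:name, keyword_init: true)

# Sketch of the 1.0 calling convention: one positional request
# (hash or request object), plus keyword arguments for call options.
def get_data_source request, timeout: nil
  # Normalize a hash request into a request object.
  request = GetDataSourceRequest.new(**request) if request.is_a?(Hash)
  "fetched #{request.name} (timeout=#{timeout.inspect})"
end

get_data_source({ name: "my-source" }, timeout: 10.0)
# => "fetched my-source (timeout=10.0)"
```

This is why the braces around `{ name: name }` matter: they mark the request
hash as the positional argument, keeping it distinct from the call options.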

### Resource Path Helpers

The client library includes helper methods for generating the resource path
strings passed to many calls. These helpers have changed in two ways:

* In older releases, they are _class_ methods on the client class. In the 1.0
  release, they are _instance_ methods on the client. They are also available
  on a separate paths module that you can include elsewhere for convenience.
* In older releases, arguments to a resource path helper are passed as
  _positional_ arguments. In the 1.0 release, they are passed as named _keyword_
  arguments. Some helpers also support different sets of arguments, each set
  corresponding to a different type of path.

Following is an example of using a resource path helper.

Old:
```
client = Google::Cloud::Bigquery::DataTransfer.new

# Call the helper on the client class
name = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.data_source_path(
  "my-project", "my-source"
)

response = client.get_data_source name
```

New:
```
client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

# Call the helper on the client instance, and use keyword arguments
name = client.data_source_path project: "my-project", data_source: "my-source"

response = client.get_data_source name: name
```

Because IDs are passed as keyword arguments, some closely related paths
have been combined. For example, `data_source_path` and `location_data_source_path`
used to be separate helpers, one that took a location argument and one that did not.
In the 1.0 client, use `data_source_path` for both cases, and either pass or
omit the `location:` keyword argument.

Old:
```
name1 = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.data_source_path(
  "my-project", "my-source"
)
name2 = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.location_data_source_path(
  "my-project", "my-location", "my-source"
)
```

New:
```
client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service
name1 = client.data_source_path project: "my-project",
                                data_source: "my-source"
name2 = client.data_source_path project: "my-project",
                                location: "my-location",
                                data_source: "my-source"
```

Finally, in the 1.0 client, you can also use the paths module as a convenience module.

New:
```
# Bring the data_source_path method into the current class
include Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Paths

def foo
  client = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

  # Call the included helper method
  name = data_source_path project: "my-project", data_source: "my-source"

  response = client.get_data_source name: name

  # Do something with response...
end
```

### Class Namespaces

In older releases, data type classes were generally located under the module
`Google::Cloud::Bigquery::Datatransfer::V1`. (Note the lower-case "t" in
"Datatransfer".) In the 1.0 release, these classes have been moved into the
same `Google::Cloud::Bigquery::DataTransfer::V1` (upper-case "T") module used
by the client object, for consistency.
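
For example, using `TransferConfig` as an illustrative data type, a reference
moves as follows (a sketch of the namespace change only):

```
# Old: Google::Cloud::Bigquery::Datatransfer::V1::TransferConfig
# New:
transfer_config = Google::Cloud::Bigquery::DataTransfer::V1::TransferConfig.new
```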

In older releases, the client object was of class
`Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient`.
In the 1.0 release, the client object is of a different class:
`Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Client`.
Note that most users will use factory methods such as
`Google::Cloud::Bigquery::DataTransfer.data_transfer_service` to create
instances of the client object, so you may not need to reference the actual
class directly. See [Creating Clients](#creating-clients).

In older releases, the credentials object was of class
`Google::Cloud::Bigquery::DataTransfer::V1::Credentials`.
In the 1.0 release, each service has its own credentials class, e.g.
`Google::Cloud::Bigquery::DataTransfer::V1::DataTransferService::Credentials`.
Again, most users will not need to reference this class directly.
See [Client Configuration](#client-configuration).