google-cloud-dataproc 0.9.1 → 1.1.2

Files changed (72)
  1. checksums.yaml +4 -4
  2. data/.yardopts +3 -2
  3. data/AUTHENTICATION.md +51 -59
  4. data/LICENSE.md +203 -0
  5. data/MIGRATING.md +361 -0
  6. data/README.md +33 -44
  7. data/lib/{google/cloud/dataproc/v1/doc/google/protobuf/empty.rb → google-cloud-dataproc.rb} +4 -14
  8. data/lib/google/cloud/dataproc.rb +173 -305
  9. data/lib/google/cloud/dataproc/version.rb +6 -2
  10. metadata +71 -102
  11. data/LICENSE +0 -201
  12. data/lib/google/cloud/dataproc/v1.rb +0 -371
  13. data/lib/google/cloud/dataproc/v1/autoscaling_policies_pb.rb +0 -80
  14. data/lib/google/cloud/dataproc/v1/autoscaling_policies_services_pb.rb +0 -59
  15. data/lib/google/cloud/dataproc/v1/autoscaling_policy_service_client.rb +0 -491
  16. data/lib/google/cloud/dataproc/v1/autoscaling_policy_service_client_config.json +0 -51
  17. data/lib/google/cloud/dataproc/v1/cluster_controller_client.rb +0 -821
  18. data/lib/google/cloud/dataproc/v1/cluster_controller_client_config.json +0 -59
  19. data/lib/google/cloud/dataproc/v1/clusters_pb.rb +0 -234
  20. data/lib/google/cloud/dataproc/v1/clusters_services_pb.rb +0 -69
  21. data/lib/google/cloud/dataproc/v1/credentials.rb +0 -41
  22. data/lib/google/cloud/dataproc/v1/doc/google/cloud/dataproc/v1/autoscaling_policies.rb +0 -238
  23. data/lib/google/cloud/dataproc/v1/doc/google/cloud/dataproc/v1/clusters.rb +0 -819
  24. data/lib/google/cloud/dataproc/v1/doc/google/cloud/dataproc/v1/jobs.rb +0 -759
  25. data/lib/google/cloud/dataproc/v1/doc/google/cloud/dataproc/v1/workflow_templates.rb +0 -566
  26. data/lib/google/cloud/dataproc/v1/doc/google/longrunning/operations.rb +0 -51
  27. data/lib/google/cloud/dataproc/v1/doc/google/protobuf/any.rb +0 -131
  28. data/lib/google/cloud/dataproc/v1/doc/google/protobuf/duration.rb +0 -91
  29. data/lib/google/cloud/dataproc/v1/doc/google/protobuf/field_mask.rb +0 -222
  30. data/lib/google/cloud/dataproc/v1/doc/google/protobuf/timestamp.rb +0 -113
  31. data/lib/google/cloud/dataproc/v1/doc/google/rpc/status.rb +0 -39
  32. data/lib/google/cloud/dataproc/v1/job_controller_client.rb +0 -589
  33. data/lib/google/cloud/dataproc/v1/job_controller_client_config.json +0 -59
  34. data/lib/google/cloud/dataproc/v1/jobs_pb.rb +0 -273
  35. data/lib/google/cloud/dataproc/v1/jobs_services_pb.rb +0 -61
  36. data/lib/google/cloud/dataproc/v1/operations_pb.rb +0 -45
  37. data/lib/google/cloud/dataproc/v1/shared_pb.rb +0 -26
  38. data/lib/google/cloud/dataproc/v1/workflow_template_service_client.rb +0 -767
  39. data/lib/google/cloud/dataproc/v1/workflow_template_service_client_config.json +0 -64
  40. data/lib/google/cloud/dataproc/v1/workflow_templates_pb.rb +0 -184
  41. data/lib/google/cloud/dataproc/v1/workflow_templates_services_pb.rb +0 -105
  42. data/lib/google/cloud/dataproc/v1beta2.rb +0 -371
  43. data/lib/google/cloud/dataproc/v1beta2/autoscaling_policies_pb.rb +0 -80
  44. data/lib/google/cloud/dataproc/v1beta2/autoscaling_policies_services_pb.rb +0 -59
  45. data/lib/google/cloud/dataproc/v1beta2/autoscaling_policy_service_client.rb +0 -491
  46. data/lib/google/cloud/dataproc/v1beta2/autoscaling_policy_service_client_config.json +0 -51
  47. data/lib/google/cloud/dataproc/v1beta2/cluster_controller_client.rb +0 -830
  48. data/lib/google/cloud/dataproc/v1beta2/cluster_controller_client_config.json +0 -59
  49. data/lib/google/cloud/dataproc/v1beta2/clusters_pb.rb +0 -241
  50. data/lib/google/cloud/dataproc/v1beta2/clusters_services_pb.rb +0 -69
  51. data/lib/google/cloud/dataproc/v1beta2/credentials.rb +0 -41
  52. data/lib/google/cloud/dataproc/v1beta2/doc/google/cloud/dataproc/v1beta2/autoscaling_policies.rb +0 -238
  53. data/lib/google/cloud/dataproc/v1beta2/doc/google/cloud/dataproc/v1beta2/clusters.rb +0 -841
  54. data/lib/google/cloud/dataproc/v1beta2/doc/google/cloud/dataproc/v1beta2/jobs.rb +0 -728
  55. data/lib/google/cloud/dataproc/v1beta2/doc/google/cloud/dataproc/v1beta2/workflow_templates.rb +0 -579
  56. data/lib/google/cloud/dataproc/v1beta2/doc/google/longrunning/operations.rb +0 -51
  57. data/lib/google/cloud/dataproc/v1beta2/doc/google/protobuf/any.rb +0 -131
  58. data/lib/google/cloud/dataproc/v1beta2/doc/google/protobuf/duration.rb +0 -91
  59. data/lib/google/cloud/dataproc/v1beta2/doc/google/protobuf/empty.rb +0 -29
  60. data/lib/google/cloud/dataproc/v1beta2/doc/google/protobuf/field_mask.rb +0 -222
  61. data/lib/google/cloud/dataproc/v1beta2/doc/google/protobuf/timestamp.rb +0 -113
  62. data/lib/google/cloud/dataproc/v1beta2/doc/google/rpc/status.rb +0 -39
  63. data/lib/google/cloud/dataproc/v1beta2/job_controller_client.rb +0 -589
  64. data/lib/google/cloud/dataproc/v1beta2/job_controller_client_config.json +0 -59
  65. data/lib/google/cloud/dataproc/v1beta2/jobs_pb.rb +0 -261
  66. data/lib/google/cloud/dataproc/v1beta2/jobs_services_pb.rb +0 -61
  67. data/lib/google/cloud/dataproc/v1beta2/operations_pb.rb +0 -44
  68. data/lib/google/cloud/dataproc/v1beta2/shared_pb.rb +0 -30
  69. data/lib/google/cloud/dataproc/v1beta2/workflow_template_service_client.rb +0 -775
  70. data/lib/google/cloud/dataproc/v1beta2/workflow_template_service_client_config.json +0 -64
  71. data/lib/google/cloud/dataproc/v1beta2/workflow_templates_pb.rb +0 -186
  72. data/lib/google/cloud/dataproc/v1beta2/workflow_templates_services_pb.rb +0 -105
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 8f4dc9ab9721db6accf4279d8963bc946bffef23d3480d5b661479df3cf76203
-  data.tar.gz: 9919cb5e80780916635366bdabe88f9b4675f5cfaedaf07596633a5c316c8745
+  metadata.gz: 8943e37f3457adc755f782910a0dcb7cf0439861d33809c05dc043d486f5df3c
+  data.tar.gz: 303a9a20524d50f2ee0ea882ecff257090e40a1862cf4751489dbb44e0027fe4
 SHA512:
-  metadata.gz: 9b59bad93eadf97880799b451199ab06ff8fb4957463cfcc35277176352ad38098916931ee78cabf8e0507505c49f1f3c3636dbeebb8868c1b7ec0937bcbb9f3
-  data.tar.gz: 9d60a85d4b95872d332bce6a501be08be0c6d61a4a4f0700392f4f3e2e00ff85b886e2a01b23db2502cfd346fd37269eedb284d54d261466a7354dbbc9437e53
+  metadata.gz: 13bf49f8e0cb38ee07c98adeb37392b8dc6271e8db98938f760e2546c29ed43c6a46b6fe5ae63d2994fee0dd272d9878b033e2c868feca742231011502307cb1
+  data.tar.gz: 8658db6a60ddbac6961ca2ffddbe68f2928437d8f9611c9f9b62d4e2d4632cdb2a9d169e68713608c6026d2198ddad4e3fdac7680411275b0e6a4fbf4120367e
data/.yardopts CHANGED
@@ -1,5 +1,5 @@
 --no-private
---title=Google Cloud Dataproc API
+--title=Cloud Dataproc API
 --exclude _pb\.rb$
 --markup markdown
 --markup-provider redcarpet
@@ -8,4 +8,5 @@
 -
 README.md
 AUTHENTICATION.md
-LICENSE
+MIGRATING.md
+LICENSE.md
data/AUTHENTICATION.md CHANGED
@@ -1,16 +1,17 @@
 # Authentication

-In general, the google-cloud-dataproc library uses [Service
-Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
-credentials to connect to Google Cloud services. When running within [Google
-Cloud Platform environments](#google-cloud-platform-environments)
-the credentials will be discovered automatically. When running on other
+In general, the google-cloud-dataproc library uses
+[Service Account](https://cloud.google.com/iam/docs/creating-managing-service-accounts)
+credentials to connect to Google Cloud services. When running within
+[Google Cloud Platform environments](#google-cloud-platform-environments) the
+credentials will be discovered automatically. When running on other
 environments, the Service Account credentials can be specified by providing the
-path to the [JSON
-keyfile](https://cloud.google.com/iam/docs/managing-service-account-keys) for
-the account (or the JSON itself) in [environment
-variables](#environment-variables). Additionally, Cloud SDK credentials can also
-be discovered automatically, but this is only recommended during development.
+path to the
+[JSON keyfile](https://cloud.google.com/iam/docs/managing-service-account-keys)
+for the account (or the JSON itself) in
+[environment variables](#environment-variables). Additionally, Cloud SDK
+credentials can also be discovered automatically, but this is only recommended
+during development.

 ## Quickstart

@@ -18,7 +19,7 @@ be discovered automatically, but this is only recommended during development.
 2. Set the [environment variable](#environment-variables).

 ```sh
-export DATAPROC_CREDENTIALS=/path/to/json`
+export DATAPROC_CREDENTIALS=path/to/keyfile.json
 ```

 3. Initialize the client.
@@ -26,23 +27,14 @@ export DATAPROC_CREDENTIALS=/path/to/json`
 ```ruby
 require "google/cloud/dataproc"

-client = Google::Cloud::Dataproc::AutoscalingPolicyService.new
+client = Google::Cloud::Dataproc.autoscaling_policy_service
 ```

-## Project and Credential Lookup
+## Credential Lookup

 The google-cloud-dataproc library aims to make authentication
 as simple as possible, and provides several mechanisms to configure your system
-without providing **Project ID** and **Service Account Credentials** directly in
-code.
-
-**Project ID** is discovered in the following order:
-
-1. Specify project ID in method arguments
-2. Specify project ID in configuration
-3. Discover project ID in environment variables
-4. Discover GCP project ID
-5. Discover project ID in credentials JSON
+without requiring **Service Account Credentials** directly in code.

 **Credentials** are discovered in the following order:

@@ -55,28 +47,24 @@ code.

 ### Google Cloud Platform environments

-When running on Google Cloud Platform (GCP), including Google Compute Engine (GCE),
-Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud Functions
-(GCF) and Cloud Run, the **Project ID** and **Credentials** and are discovered
-automatically. Code should be written as if already authenticated.
+When running on Google Cloud Platform (GCP), including Google Compute Engine
+(GCE), Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud
+Functions (GCF) and Cloud Run, **Credentials** are discovered automatically.
+Code should be written as if already authenticated.

 ### Environment Variables

-The **Project ID** and **Credentials JSON** can be placed in environment
-variables instead of declaring them directly in code. Each service has its own
-environment variable, allowing for different service accounts to be used for
-different services. (See the READMEs for the individual service gems for
-details.) The path to the **Credentials JSON** file can be stored in the
-environment variable, or the **Credentials JSON** itself can be stored for
-environments such as Docker containers where writing files is difficult or not
-encouraged.
-
-The environment variables that google-cloud-dataproc checks for project ID are:
-
-1. `DATAPROC_PROJECT`
-2. `GOOGLE_CLOUD_PROJECT`
+The **Credentials JSON** can be placed in environment variables instead of
+declaring them directly in code. Each service has its own environment variable,
+allowing for different service accounts to be used for different services. (See
+the READMEs for the individual service gems for details.) The path to the
+**Credentials JSON** file can be stored in the environment variable, or the
+**Credentials JSON** itself can be stored for environments such as Docker
+containers where writing files is difficult or not encouraged.

-The environment variables that google-cloud-dataproc checks for credentials are configured on {Google::Cloud::Dataproc::V1::Credentials}:
+The environment variables that google-cloud-dataproc
+checks for credentials are configured on the service Credentials class (such as
+`::Google::Cloud::Dataproc::V1::AutoscalingPolicyService::Credentials`):

 1. `DATAPROC_CREDENTIALS` - Path to JSON file, or JSON contents
 2. `DATAPROC_KEYFILE` - Path to JSON file, or JSON contents
@@ -87,25 +75,34 @@ The environment variables that google-cloud-dataproc checks for credentials are
 ```ruby
 require "google/cloud/dataproc"

-ENV["DATAPROC_PROJECT"] = "my-project-id"
 ENV["DATAPROC_CREDENTIALS"] = "path/to/keyfile.json"

-client = Google::Cloud::Dataproc::AutoscalingPolicyService.new
+client = Google::Cloud::Dataproc.autoscaling_policy_service
 ```

 ### Configuration

-The **Project ID** and **Credentials JSON** can be configured instead of placing them in environment variables or providing them as arguments.
+The **Credentials JSON** can be configured instead of placing them in
+environment variables. Either on an individual client initialization:
+
+```ruby
+require "google/cloud/dataproc"
+
+client = Google::Cloud::Dataproc.autoscaling_policy_service do |config|
+  config.credentials = "path/to/keyfile.json"
+end
+```
+
+Or configured globally for all clients:

 ```ruby
 require "google/cloud/dataproc"

 Google::Cloud::Dataproc.configure do |config|
-  config.project_id = "my-project-id"
   config.credentials = "path/to/keyfile.json"
 end

-client = Google::Cloud::Dataproc::AutoscalingPolicyService.new
+client = Google::Cloud::Dataproc.autoscaling_policy_service
 ```

 ### Cloud SDK
@@ -134,24 +131,24 @@ To configure your system for this, simply:

 ## Creating a Service Account

-Google Cloud requires a **Project ID** and **Service Account Credentials** to
-connect to the APIs. You will use the **Project ID** and **JSON key file** to
+Google Cloud requires **Service Account Credentials** to
+connect to the APIs. You will use the **JSON key file** to
 connect to most services with google-cloud-dataproc.

-If you are not running this client within [Google Cloud Platform
-environments](#google-cloud-platform-environments), you need a Google
-Developers service account.
+If you are not running this client within
+[Google Cloud Platform environments](#google-cloud-platform-environments), you
+need a Google Developers service account.

 1. Visit the [Google Developers Console][dev-console].
-1. Create a new project or click on an existing project.
-1. Activate the slide-out navigation tray and select **API Manager**. From
+2. Create a new project or click on an existing project.
+3. Activate the slide-out navigation tray and select **API Manager**. From
    here, you will enable the APIs that your application requires.

    ![Enable the APIs that your application requires][enable-apis]

   *Note: You may need to enable billing in order to use these services.*

-1. Select **Credentials** from the side navigation.
+4. Select **Credentials** from the side navigation.

   You should see a screen like one of the following.

 ...
@@ -170,8 +167,3 @@ Developers service account.

 The key file you download will be used by this library to authenticate API
 requests and should be stored in a secure location.
-
-## Troubleshooting
-
-If you're having trouble authenticating you can ask for help by following the
-{file:TROUBLESHOOTING.md Troubleshooting Guide}.
data/LICENSE.md ADDED
@@ -0,0 +1,203 @@
Apache License
==============

* Version 2.0, January 2004
* https://www.apache.org/licenses/

### TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. **Definitions.**

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. **Grant of Copyright License.** Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. **Grant of Patent License.** Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. **Redistribution.** You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

* **(a)** You must give any other recipients of the Work or
Derivative Works a copy of this License; and

* **(b)** You must cause any modified files to carry prominent notices
stating that You changed the files; and

* **(c)** You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

* **(d)** If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. **Submission of Contributions.** Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. **Trademarks.** This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. **Disclaimer of Warranty.** Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. **Limitation of Liability.** In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. **Accepting Warranty or Additional Liability.** While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

_END OF TERMS AND CONDITIONS_

### APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "`[]`"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

    Copyright [yyyy] [name of copyright owner]

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        https://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
data/MIGRATING.md ADDED
@@ -0,0 +1,361 @@
## Migrating to google-cloud-dataproc 1.0

The 1.0 release of the google-cloud-dataproc client is a significant upgrade
based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-ruby),
and includes substantial interface changes. Existing code written for earlier
versions of this library will likely require updates to use this version.
This document describes the changes that have been made, and what you need to
do to update your usage.

To summarize:

* The library has been broken out into multiple libraries. The new gems
  `google-cloud-dataproc-v1` and `google-cloud-dataproc-v1beta2` contain the
  actual client classes for versions V1 and V1beta2 of the Dataproc service,
  and the gem `google-cloud-dataproc` now simply provides a convenience wrapper.
  See [Library Structure](#library-structure) for more info.
* The library uses a new configuration mechanism giving you closer control
  over endpoint address, network timeouts, and retry. See
  [Client Configuration](#client-configuration) for more info. Furthermore,
  when creating a client object, you can customize its configuration in a
  block rather than passing arguments to the constructor. See
  [Creating Clients](#creating-clients) for more info.
* Previously, positional arguments were used to indicate required arguments.
  Now, all method arguments are keyword arguments, with documentation that
  specifies whether they are required or optional. Additionally, you can pass
  a proto request object instead of separate arguments. See
  [Passing Arguments](#passing-arguments) for more info.
* Previously, some client classes included class methods for constructing
  resource paths. These paths are now instance methods on the client objects,
  and are also available in a separate paths module. See
  [Resource Path Helpers](#resource-path-helpers) for more info.
* Previously, clients reported RPC errors by raising instances of
  `Google::Gax::GaxError` and its subclasses. Now, RPC exceptions are of type
  `Google::Cloud::Error` and its subclasses. See
  [Handling Errors](#handling-errors) for more info.
* Some classes have moved into different namespaces. See
  [Class Namespaces](#class-namespaces) for more info.

### Library Structure

Older 0.x releases of the `google-cloud-dataproc` gem were all-in-one gems
that included potentially multiple clients for multiple versions of the
Dataproc service. Factory methods such as `Google::Cloud::Dataproc::ClusterController.new`
would return you instances of client classes such as
`Google::Cloud::Dataproc::V1::ClusterControllerClient` or
`Google::Cloud::Dataproc::V1beta2::ClusterControllerClient`, depending on which
version of the API was requested. These classes were all defined in the same gem.

With the 1.0 release, the `google-cloud-dataproc` gem still provides factory
methods for obtaining clients. (The method signatures will have changed. See
[Creating Clients](#creating-clients) for details.) However, the actual client
classes have been moved into separate gems, one per service version. The
`Google::Cloud::Dataproc::V1::ClusterController::Client` class, along with its
helpers and data types, is now part of the `google-cloud-dataproc-v1` gem.
Similarly, the `Google::Cloud::Dataproc::V1beta2::ClusterController::Client`
class is part of the `google-cloud-dataproc-v1beta2` gem.

For normal usage, you can continue to install the `google-cloud-dataproc` gem
(which will bring in the versioned client gems as dependencies) and continue to
use factory methods to create clients. However, you may alternatively choose to
install only one of the versioned gems. For example, if you know you will only
use `V1` of the service, you can install `google-cloud-dataproc-v1` by
itself, and construct instances of the
`Google::Cloud::Dataproc::V1::ClusterController::Client` client class directly.

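As an illustration of that last point, here is a minimal sketch of constructing
the versioned client directly, assuming only `google-cloud-dataproc-v1` is
installed (the credentials path is a placeholder):

```
require "google/cloud/dataproc/v1"

# Build the versioned client class directly rather than going through the
# Google::Cloud::Dataproc factory method.
client = Google::Cloud::Dataproc::V1::ClusterController::Client.new do |config|
  config.credentials = "/path/to/credentials.json"
end
```
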
### Client Configuration

In older releases, if you wanted to customize performance parameters or
low-level behavior of the client (such as credentials, timeouts, or
instrumentation), you would pass a variety of keyword arguments to the client
constructor. It was also extremely difficult to customize the default settings.

With the 1.0 release, a configuration interface provides control over these
parameters, including defaults for all instances of a client, and settings for
each specific client instance. For example, to set default credentials and
timeout for all Dataproc V1 ClusterController clients:

```
Google::Cloud::Dataproc::V1::ClusterController::Client.configure do |config|
  config.credentials = "/path/to/credentials.json"
  config.timeout = 10.0
end
```

Individual RPCs can also be configured independently. For example, to set the
timeout for the `create_cluster` call:

```
Google::Cloud::Dataproc::V1::ClusterController::Client.configure do |config|
  config.rpcs.create_cluster.timeout = 20.0
end
```

Defaults for certain configurations can be set for all Dataproc versions and
services globally:

```
Google::Cloud::Dataproc.configure do |config|
  config.credentials = "/path/to/credentials.json"
  config.timeout = 10.0
end
```

Finally, you can override the configuration for each client instance. See the
next section on [Creating Clients](#creating-clients) for details.

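The summary above also mentions endpoint and retry control. As a sketch only
(assuming the generated configuration exposes `endpoint` and the `retry_policy`
hash used by other gapic-generated clients; the values here are purely
illustrative), those can be tuned in the same configure block:

```
Google::Cloud::Dataproc::V1::ClusterController::Client.configure do |config|
  # Illustrative endpoint override (for example, a regional endpoint).
  config.endpoint = "us-central1-dataproc.googleapis.com"
  # Illustrative retry tuning for a single RPC (14 = UNAVAILABLE).
  config.rpcs.get_cluster.retry_policy = {
    initial_delay: 0.1, max_delay: 60.0, multiplier: 1.3, retry_codes: [14]
  }
end
```
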
### Creating Clients

In older releases, to create a client object, you would use the `new` method
of modules under `Google::Cloud::Dataproc`. For example, you might call
`Google::Cloud::Dataproc::ClusterController.new`. Keyword arguments were available to
select a service version and to configure parameters such as credentials and
timeouts.

With the 1.0 release, use named class methods of `Google::Cloud::Dataproc` to
create a client object. For example, `Google::Cloud::Dataproc.cluster_controller`.
You may select a service version using the `:version` keyword argument.
However, other configuration parameters should be set in a configuration block
when you create the client.

Old:
```
client = Google::Cloud::Dataproc::ClusterController.new credentials: "/path/to/credentials.json"
```

New:
```
client = Google::Cloud::Dataproc.cluster_controller do |config|
  config.credentials = "/path/to/credentials.json"
end
```

The configuration block is optional. If you do not provide it, or you do not
set some configuration parameters, then the default configuration is used. See
[Client Configuration](#client-configuration).

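As a small illustration of the `:version` keyword mentioned above, for the case
where you specifically want the V1beta2 service rather than the default (the
configuration block remains available for other settings):

```
# Request the V1beta2 client explicitly instead of the default version.
client = Google::Cloud::Dataproc.cluster_controller version: :v1beta2 do |config|
  config.timeout = 10.0
end
```
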
### Passing Arguments

In older releases, required arguments would be passed as positional method
arguments, while most optional arguments would be passed as keyword arguments.

With the 1.0 release, all RPC arguments are passed as keyword arguments,
regardless of whether they are required or optional. For example:

Old:
```
client = Google::Cloud::Dataproc::ClusterController.new

project_id = "my-project"
region = "us-central1"
cluster_name = "my_cluster"

# Arguments are positional
response = client.get_cluster project_id, region, cluster_name
```

New:
```
client = Google::Cloud::Dataproc.cluster_controller

project_id = "my-project"
region = "us-central1"
cluster_name = "my_cluster"

# All arguments are keyword arguments
response = client.get_cluster project_id: project_id, region: region,
                              cluster_name: cluster_name
```

In the 1.0 release, it is also possible to pass a request object, either
as a hash or as a protocol buffer.

New:
```
client = Google::Cloud::Dataproc.cluster_controller

request = Google::Cloud::Dataproc::V1::GetClusterRequest.new(
  project_id: "my-project",
  region: "us-central1",
  cluster_name: "my_cluster"
)

# Pass a request object as a positional argument:
response = client.get_cluster request
```

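For completeness, the hash form mentioned above looks like this (a sketch
equivalent to the proto request example):

```
client = Google::Cloud::Dataproc.cluster_controller

# The same request expressed as a plain hash; the client converts it to a
# GetClusterRequest internally.
response = client.get_cluster({
  project_id:   "my-project",
  region:       "us-central1",
  cluster_name: "my_cluster"
})
```
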
Finally, in older releases, to provide call options, you would pass a
`Google::Gax::CallOptions` object with the `:options` keyword argument. In the
1.0 release, pass call options using a _second set_ of keyword arguments.

Old:
```
client = Google::Cloud::Dataproc::ClusterController.new

project_id = "my-project"
region = "us-central1"
cluster_name = "my_cluster"

options = Google::Gax::CallOptions.new timeout: 10.0

response = client.get_cluster project_id, region, cluster_name, options: options
```

New:
```
client = Google::Cloud::Dataproc.cluster_controller

project_id = "my-project"
region = "us-central1"
cluster_name = "my_cluster"

# Use a hash to wrap the normal call arguments (or pass a request object), and
# then add further keyword arguments for the call options.
response = client.get_cluster(
  { project_id: project_id, region: region, cluster_name: cluster_name },
  timeout: 10.0)
```

### Resource Path Helpers

The client library includes helper methods for generating the resource path
strings passed to many calls. These helpers have changed in two ways:

* In older releases, they are _class_ methods on the client class. In the 1.0
  release, they are _instance_ methods on the client. They are also available
  on a separate paths module that you can include elsewhere for convenience.
* In older releases, arguments to a resource path helper are passed as
  _positional_ arguments. In the 1.0 release, they are passed as named _keyword_
  arguments. Some helpers also support different sets of arguments, each set
  corresponding to a different type of path.

Following is an example of using a resource path helper.

Old:
```
client = Google::Cloud::Dataproc::WorkflowTemplateService.new

# Call the helper on the client class
name = Google::Cloud::Dataproc::V1::WorkflowTemplateServiceClient.
  workflow_template_path("my-project", "us-central1", "my-template")

response = client.get_workflow_template name
```

New:
```
client = Google::Cloud::Dataproc.workflow_template_service

# Call the helper on the client instance, and use keyword arguments
name = client.workflow_template_path project: "my-project",
                                      region: "us-central1",
                                      workflow_template: "my-template"

response = client.get_workflow_template name: name
```

Because helpers take keyword arguments, some can now generate several different
variations on the path that were not available under earlier versions of the
library. For example, `workflow_template_path` can generate paths with either
a region or location as the parent resource.

New:
```
client = Google::Cloud::Dataproc.workflow_template_service

# Create paths with different parent resource types
name1 = client.workflow_template_path project: "my-project",
                                       region: "us-central1",
                                       workflow_template: "my-template"
# => "projects/my-project/regions/us-central1/workflowTemplates/my-template"
name2 = client.workflow_template_path project: "my-project",
                                       location: "my-location",
                                       workflow_template: "my-template"
# => "projects/my-project/locations/my-location/workflowTemplates/my-template"
```

Finally, in the 1.0 client, you can also use the paths module as a convenience module.

New:
```
# Bring the path helper methods into the current class
include Google::Cloud::Dataproc::V1::WorkflowTemplateService::Paths

def foo
  client = Google::Cloud::Dataproc.workflow_template_service

  # Call the included helper method
  name = workflow_template_path project: "my-project",
                                location: "my-location",
                                workflow_template: "my-template"

  response = client.get_workflow_template name: name

  # Do something with response...
end
```

### Handling Errors

The client reports standard
[gRPC error codes](https://github.com/grpc/grpc/blob/master/doc/statuscodes.md)
by raising exceptions. In older releases, these exceptions were located in the
`Google::Gax` namespace and were subclasses of the `Google::Gax::GaxError` base
exception class, defined in the `google-gax` gem. However, these classes were
different from the standard exceptions (subclasses of `Google::Cloud::Error`)
thrown by other client libraries such as `google-cloud-storage`.

The 1.0 client library now uses the `Google::Cloud::Error` exception hierarchy,
for consistency across all the Google Cloud client libraries. In general, these
exceptions have the same name as their counterparts from older releases, but
are located in the `Google::Cloud` namespace rather than the `Google::Gax`
namespace.

Old:
```
client = Google::Cloud::Dataproc::ClusterController.new

project_id = "my-project"
region = "us-central1"
cluster_name = "my_cluster"

begin
  response = client.get_cluster project_id, region, cluster_name
rescue Google::Gax::GaxError => e
  # Handle exceptions that subclass Google::Gax::GaxError
end
```

New:
```
client = Google::Cloud::Dataproc.cluster_controller

project_id = "my-project"
region = "us-central1"
cluster_name = "my_cluster"

begin
  response = client.get_cluster project_id: project_id, region: region,
                                cluster_name: cluster_name
rescue Google::Cloud::Error => e
  # Handle exceptions that subclass Google::Cloud::Error
end
```

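Because the new exceptions all subclass `Google::Cloud::Error`, you can also
rescue a more specific subclass first when only one failure mode needs special
handling. Continuing the example above, a sketch (assuming the standard
`google-cloud-errors` mapping, where gRPC `NOT_FOUND` surfaces as
`Google::Cloud::NotFoundError`):

```
begin
  response = client.get_cluster project_id: project_id, region: region,
                                cluster_name: cluster_name
rescue Google::Cloud::NotFoundError
  # The requested cluster does not exist; handle that case specifically.
rescue Google::Cloud::Error => e
  # Any other RPC failure.
end
```
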
### Class Namespaces

In older releases, the client object was an instance of a class with a name like
`Google::Cloud::Dataproc::V1::ClusterControllerClient`.
In the 1.0 release, the client object is of a different class:
`Google::Cloud::Dataproc::V1::ClusterController::Client`.
Note that most users will use the factory methods such as
`Google::Cloud::Dataproc.cluster_controller` to create instances of the client object,
so you may not need to reference the actual class directly.
See [Creating Clients](#creating-clients).

In older releases, the credentials object was of class
`Google::Cloud::Dataproc::V1::Credentials`.
In the 1.0 release, each service has its own credentials class, e.g.
`Google::Cloud::Dataproc::V1::ClusterController::Credentials`.
Again, most users will not need to reference this class directly.
See [Client Configuration](#client-configuration).