google-cloud-bigquery 1.38.1 → 1.42.0
- checksums.yaml +4 -4
- data/AUTHENTICATION.md +8 -26
- data/CHANGELOG.md +28 -0
- data/LOGGING.md +1 -1
- data/OVERVIEW.md +1 -1
- data/lib/google/cloud/bigquery/convert.rb +7 -9
- data/lib/google/cloud/bigquery/copy_job.rb +36 -5
- data/lib/google/cloud/bigquery/dataset/access.rb +152 -1
- data/lib/google/cloud/bigquery/dataset/tag.rb +67 -0
- data/lib/google/cloud/bigquery/dataset.rb +90 -14
- data/lib/google/cloud/bigquery/extract_job.rb +11 -5
- data/lib/google/cloud/bigquery/job.rb +3 -3
- data/lib/google/cloud/bigquery/load_job.rb +10 -4
- data/lib/google/cloud/bigquery/project.rb +3 -3
- data/lib/google/cloud/bigquery/query_job.rb +10 -3
- data/lib/google/cloud/bigquery/schema/field.rb +1 -1
- data/lib/google/cloud/bigquery/schema.rb +1 -1
- data/lib/google/cloud/bigquery/service.rb +26 -4
- data/lib/google/cloud/bigquery/table.rb +294 -17
- data/lib/google/cloud/bigquery/version.rb +1 -1
- metadata +9 -8
checksums.yaml
CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 779c0f4690b370db6eb7811fa6fb8dbca101dff34ff478dcdc51ed1e9c83c083
+  data.tar.gz: d1456024ae30a18c93c05d6d8be7eef88dd3f0610cdbf816cf0c46322c9574be
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 6a5da0e2565b0c224f46a6330d138c8fb35831e874ebc2364fd3b91578d400354011d1e9df7d2986dbe2398f610776b130b7eb9305af8282bf3a6265dd84aa05
+  data.tar.gz: 0d14faa0d89ba6a114e247c362f8a5d4775ae4dc19de1aad0a64083da7e41fc93ac810e876b434056340b27412bb08063af05edf33355800ca1c528ff02f437b
data/AUTHENTICATION.md
CHANGED

@@ -106,15 +106,6 @@ To configure your system for this, simply:
 **NOTE:** This is _not_ recommended for running in production. The Cloud SDK
 *should* only be used during development.
 
-[gce-how-to]: https://cloud.google.com/compute/docs/authentication#using
-[dev-console]: https://console.cloud.google.com/project
-
-[enable-apis]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/enable-apis.png
-
-[create-new-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account.png
-[create-new-service-account-existing-keys]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account-existing-keys.png
-[reuse-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/reuse-service-account.png
-
 ## Creating a Service Account
 
 Google Cloud requires a **Project ID** and **Service Account Credentials** to
@@ -124,31 +115,22 @@ connect to most services with google-cloud-bigquery.
 If you are not running this client on Google Compute Engine, you need a Google
 Developers service account.
 
-1. Visit the [Google
+1. Visit the [Google Cloud Console](https://console.cloud.google.com/project).
 1. Create a new project or click on an existing project.
-1. Activate the
+1. Activate the menu in the upper left and select **APIs & Services**. From
    here, you will enable the APIs that your application requires.
 
-   ![Enable the APIs that your application requires][enable-apis]
-
    *Note: You may need to enable billing in order to use these services.*
 
 1. Select **Credentials** from the side navigation.
 
-
-
-   ![Create a new service account][create-new-service-account]
-
-   ![Create a new service account With Existing Keys][create-new-service-account-existing-keys]
-
-   Find the "Add credentials" drop down and select "Service account" to be
-   guided through downloading a new JSON key file.
-
-   If you want to re-use an existing service account, you can easily generate a
-   new key file. Just select the account you wish to re-use, and click "Generate
-   new JSON key":
+   Find the "Create credentials" drop down near the top of the page, and select
+   "Service account" to be guided through downloading a new JSON key file.
 
-
+   If you want to re-use an existing service account, you can easily generate
+   a new key file. Just select the account you wish to re-use click the pencil
+   tool on the right side to edit the service account, select the **Keys** tab,
+   and then select **Add Key**.
 
 The key file you download will be used by this library to authenticate API
 requests and should be stored in a secure location.
data/CHANGELOG.md
CHANGED

@@ -1,5 +1,33 @@
 # Release History
 
+### 1.42.0 (2023-01-15)
+
+#### Features
+
+* Added support for authorized dataset ([#19442](https://github.com/googleapis/google-cloud-ruby/issues/19442))
+* Added support for tags in dataset ([#19350](https://github.com/googleapis/google-cloud-ruby/issues/19350))
+
+### 1.41.0 (2023-01-05)
+
+#### Features
+
+* Add support for partial projection of table metadata
+#### Bug Fixes
+
+* Fix querying of array of structs in named parameters ([#19466](https://github.com/googleapis/google-cloud-ruby/issues/19466))
+
+### 1.40.0 (2022-12-14)
+
+#### Features
+
+* support table snapshot and clone ([#19354](https://github.com/googleapis/google-cloud-ruby/issues/19354))
+
+### 1.39.0 (2022-07-27)
+
+#### Features
+
+* Update minimum Ruby version to 2.6 ([#18871](https://github.com/googleapis/google-cloud-ruby/issues/18871))
+
 ### 1.38.1 / 2022-01-13
 
 #### Bug Fixes
data/LOGGING.md
CHANGED

@@ -4,7 +4,7 @@ To enable logging for this library, set the logger for the underlying [Google
 API
 Client](https://github.com/google/google-api-ruby-client/blob/master/README.md#logging)
 library. The logger that you set may be a Ruby stdlib
-[`Logger`](https://ruby-doc.org/
+[`Logger`](https://ruby-doc.org/current/stdlibs/logger/Logger.html) as
 shown below, or a
 [`Google::Cloud::Logging::Logger`](https://googleapis.dev/ruby/google-cloud-logging/latest)
 that will write logs to [Stackdriver
data/OVERVIEW.md
CHANGED

@@ -276,7 +276,7 @@ To follow along with these examples, you will need to set up billing on the
 [Google Developers Console](https://console.developers.google.com).
 
 In addition to CSV, data can be imported from files that are formatted as
-[Newline-delimited JSON](
+[Newline-delimited JSON](https://jsonlines.org/),
 [Avro](http://avro.apache.org/),
 [ORC](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-orc),
 [Parquet](https://parquet.apache.org/) or from a Google Cloud Datastore backup.
data/lib/google/cloud/bigquery/convert.rb
CHANGED

@@ -54,10 +54,9 @@ module Google
         end

         def self.format_row row, fields
-
+          fields.zip(row[:f]).to_h do |f, v|
             [f.name.to_sym, format_value(v, f)]
           end
-          Hash[row_pairs]
         end

         # rubocop:disable all
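The refactor above collapses the old two-step build (collect pairs, then `Hash[...]`) into the block form of `Enumerable#to_h`, available since Ruby 2.6, which this release of the gem now requires. A minimal standalone sketch: the `Field` struct and the simplified cell extraction stand in for the gem's schema fields and `format_value`, and are illustrative only.

```ruby
# Sketch of the format_row refactor above. Field and the row layout
# ({ f: [{ v: ... }] }) mimic the BigQuery tabledata row shape; the gem's
# format_value is simplified here to plain extraction of the :v cell.
Field = Struct.new(:name)

def format_row row, fields
  # One pass: zip field metadata with cell values and build the hash directly.
  fields.zip(row[:f]).to_h do |f, v|
    [f.name.to_sym, v[:v]]
  end
end

fields = [Field.new("id"), Field.new("name")]
row = { f: [{ v: 1 }, { v: "Alice" }] }
p format_row(row, fields)
```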
@@ -123,10 +122,9 @@ module Google
           array_values = json_value.map { |v| to_query_param_value v, type }
           Google::Apis::BigqueryV2::QueryParameterValue.new array_values: array_values
         when Hash
-
+          struct_values = json_value.to_h do |k, v|
             [String(k), to_query_param_value(v, type)]
           end
-          struct_values = Hash[struct_pairs]
           Google::Apis::BigqueryV2::QueryParameterValue.new struct_values: struct_values
         else
           # Everything else is converted to a string, per the API expectations.
@@ -239,7 +237,7 @@ module Google
           type = extract_array_type type
           value.map { |x| to_json_value x, type }
         elsif Hash === value
-
+          value.to_h { |k, v| [k.to_s, to_json_value(v, type)] }
         else
           value
         end
@@ -256,17 +254,17 @@ module Google

         ##
         # Lists are specified by providing the type code in an array. For example, an array of integers are specified as
-        # `[:INT64]`. Extracts the symbol.
+        # `[:INT64]`. Extracts the symbol/hash.
         def self.extract_array_type type
           return nil if type.nil?
-          unless type.is_a?(Array) && type.count == 1 && type.first.is_a?(Symbol)
-            raise ArgumentError, "types Array #{type.inspect} should include only a single symbol element."
+          unless type.is_a?(Array) && type.count == 1 && (type.first.is_a?(Symbol) || type.first.is_a?(Hash))
+            raise ArgumentError, "types Array #{type.inspect} should include only a single symbol or hash element."
           end
           type.first
         end

         def self.to_json_row row
-
+          row.to_h { |k, v| [k.to_s, to_json_value(v)] }
         end

         def self.resolve_legacy_sql standard_sql, legacy_sql
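The widened check above is what lets named parameters describe an array of structs (the bug fixed in 1.41.0): the single element of a type array may now be a Hash describing a struct, not only a Symbol. A standalone sketch copying the method body:

```ruby
# Sketch of the relaxed extract_array_type above: the single element of a
# type array may be a Symbol (scalar element type, e.g. [:INT64]) or a Hash
# (struct element type, e.g. [{ name: :STRING }]).
def extract_array_type type
  return nil if type.nil?
  unless type.is_a?(Array) && type.count == 1 && (type.first.is_a?(Symbol) || type.first.is_a?(Hash))
    raise ArgumentError, "types Array #{type.inspect} should include only a single symbol or hash element."
  end
  type.first
end

p extract_array_type([:INT64])             # => :INT64
p extract_array_type([{ name: :STRING }])  # element type for an array of structs
```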
data/lib/google/cloud/bigquery/copy_job.rb
CHANGED

@@ -17,6 +17,24 @@ require "google/cloud/bigquery/encryption_configuration"
 module Google
   module Cloud
     module Bigquery
+      module OperationType
+        # Different operation types supported in table copy job.
+        # https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#operationtype
+
+        # The source and destination table have the same table type.
+        COPY = "COPY".freeze
+
+        # The source table type is TABLE and the destination table type is SNAPSHOT.
+        SNAPSHOT = "SNAPSHOT".freeze
+
+        # The source table type is SNAPSHOT and the destination table type is TABLE.
+        RESTORE = "RESTORE".freeze
+
+        # The source and destination table have the same table type, but only bill for
+        # unique data.
+        CLONE = "CLONE".freeze
+      end
+
       ##
       # # CopyJob
       #
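The new constants back the snapshot/clone feature from 1.40.0. Restated as a self-contained sketch (this mirrors the definitions above but is not the gem's namespaced `Google::Cloud::Bigquery::OperationType`):

```ruby
# Standalone restatement of the OperationType module above. Each constant is a
# frozen string matching the REST API's Job operationType enum values, passed
# through to the copy job configuration.
module OperationType
  COPY     = "COPY".freeze      # plain table copy
  SNAPSHOT = "SNAPSHOT".freeze  # TABLE -> SNAPSHOT
  RESTORE  = "RESTORE".freeze   # SNAPSHOT -> TABLE
  CLONE    = "CLONE".freeze     # copy billed only for unique data
end

p OperationType.constants.sort
# => [:CLONE, :COPY, :RESTORE, :SNAPSHOT]
```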
@@ -46,23 +64,35 @@ module Google
         # The table from which data is copied. This is the table on
         # which {Table#copy_job} was called.
         #
+        # @param [String] view Specifies the view that determines which table information is returned.
+        #   By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+        #   Accepted values include `:unspecified`, `:basic`, `:storage`, and
+        #   `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+        #   The default value is the `:unspecified` view type.
+        #
         # @return [Table] A table instance.
         #
-        def source
+        def source view: nil
           table = @gapi.configuration.copy.source_table
           return nil unless table
-          retrieve_table table.project_id, table.dataset_id, table.table_id
+          retrieve_table table.project_id, table.dataset_id, table.table_id, metadata_view: view
         end
 
         ##
         # The table to which data is copied.
         #
+        # @param [String] view Specifies the view that determines which table information is returned.
+        #   By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+        #   Accepted values include `:unspecified`, `:basic`, `:storage`, and
+        #   `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+        #   The default value is the `:unspecified` view type.
+        #
         # @return [Table] A table instance.
         #
-        def destination
+        def destination view: nil
           table = @gapi.configuration.copy.destination_table
           return nil unless table
-          retrieve_table table.project_id, table.dataset_id, table.table_id
+          retrieve_table table.project_id, table.dataset_id, table.table_id, metadata_view: view
         end
 
         ##
@@ -157,7 +187,8 @@ module Google
           job_ref = service.job_ref_from options[:job_id], options[:prefix]
           copy_cfg = Google::Apis::BigqueryV2::JobConfigurationTableCopy.new(
             source_table: source,
-            destination_table: target
+            destination_table: target,
+            operation_type: options[:operation_type]
           )
           req = Google::Apis::BigqueryV2::Job.new(
             job_reference: job_ref,
data/lib/google/cloud/bigquery/dataset/access.rb
CHANGED

@@ -61,8 +61,10 @@ module Google
           "user" => :user_by_email,
           "user_by_email" => :user_by_email,
           "userByEmail" => :user_by_email,
-          "view" => :view
+          "view" => :view,
+          "dataset" => :dataset
         }.freeze
+        attr_reader :rules
 
         # @private
         GROUPS = {
@@ -267,6 +269,44 @@ module Google
           add_access_view view
         end
 
+        ##
+        # Add reader access to a dataset.
+        #
+        # @param [Google::Cloud::Bigquery::DatasetAccessEntry, Hash<String,String>] dataset A DatasetAccessEntry
+        #   or a Hash object. Required.
+        #
+        # @example
+        #   require "google/cloud/bigquery"
+        #
+        #   bigquery = Google::Cloud::Bigquery.new
+        #   dataset = bigquery.dataset "my_dataset"
+        #   other_dataset = bigquery.dataset "my_other_dataset", skip_lookup: true
+        #
+        #   params = {
+        #     dataset_id: other_dataset.dataset_id,
+        #     project_id: other_dataset.project_id,
+        #     target_types: ["VIEWS"]
+        #   }
+        #
+        #   dataset.access do |access|
+        #     access.add_reader_dataset params
+        #   end
+        #
+        # @example
+        #   require "google/cloud/bigquery"
+        #
+        #   bigquery = Google::Cloud::Bigquery.new
+        #   dataset = bigquery.dataset "my_dataset"
+        #   other_dataset = bigquery.dataset "my_other_dataset", skip_lookup: true
+        #
+        #   dataset.access do |access|
+        #     access.add_reader_dataset other_dataset.access_entry(target_types: ["VIEWS"])
+        #   end
+        #
+        def add_reader_dataset dataset
+          add_access_dataset dataset
+        end
+
         ##
         # Add writer access to a user.
         #
@@ -610,6 +650,44 @@ module Google
           remove_access_view view
         end
 
+        ##
+        # Removes reader access of a dataset.
+        #
+        # @param [Google::Cloud::Bigquery::DatasetAccessEntry, Hash<String,String>] dataset A DatasetAccessEntry
+        #   or a Hash object. Required.
+        #
+        # @example
+        #   require "google/cloud/bigquery"
+        #
+        #   bigquery = Google::Cloud::Bigquery.new
+        #   dataset = bigquery.dataset "my_dataset"
+        #   other_dataset = bigquery.dataset "my_other_dataset", skip_lookup: true
+        #
+        #   params = {
+        #     dataset_id: other_dataset.dataset_id,
+        #     project_id: other_dataset.project_id,
+        #     target_types: ["VIEWS"]
+        #   }
+        #
+        #   dataset.access do |access|
+        #     access.remove_reader_dataset params
+        #   end
+        #
+        # @example
+        #   require "google/cloud/bigquery"
+        #
+        #   bigquery = Google::Cloud::Bigquery.new
+        #   dataset = bigquery.dataset "my_dataset"
+        #   other_dataset = bigquery.dataset "my_other_dataset", skip_lookup: true
+        #
+        #   dataset.access do |access|
+        #     access.remove_reader_dataset other_dataset.access_entry(target_types: ["VIEWS"])
+        #   end
+        #
+        def remove_reader_dataset dataset
+          remove_access_dataset dataset
+        end
+
         ##
         # Remove writer access from a user.
         #
@@ -951,6 +1029,40 @@ module Google
           lookup_access_view view
         end
 
+        ##
+        # Checks reader access for a dataset.
+        #
+        # @param [Google::Cloud::Bigquery::DatasetAccessEntry, Hash<String,String>] dataset A DatasetAccessEntry
+        #   or a Hash object. Required.
+        #
+        # @example
+        #   require "google/cloud/bigquery"
+        #
+        #   bigquery = Google::Cloud::Bigquery.new
+        #   dataset = bigquery.dataset "my_dataset"
+        #   other_dataset = bigquery.dataset "my_other_dataset", skip_lookup: true
+        #
+        #   params = {
+        #     dataset_id: other_dataset.dataset_id,
+        #     project_id: other_dataset.project_id,
+        #     target_types: ["VIEWS"]
+        #   }
+        #
+        #   dataset.access.reader_dataset? params
+        #
+        # @example
+        #   require "google/cloud/bigquery"
+        #
+        #   bigquery = Google::Cloud::Bigquery.new
+        #   dataset = bigquery.dataset "my_dataset"
+        #   other_dataset = bigquery.dataset "my_other_dataset", skip_lookup: true
+        #
+        #   dataset.access.reader_dataset? other_dataset.access_entry(target_types: ["VIEWS"])
+        #
+        def reader_dataset? dataset
+          lookup_access_dataset dataset
+        end
+
         ##
         # Checks writer access for a user.
         #
@@ -1184,6 +1296,18 @@ module Google
           end
         end
 
+        # @private
+        #
+        # Checks the type of user input and converts it to acceptable format.
+        #
+        def validate_dataset dataset
+          if dataset.is_a? Google::Apis::BigqueryV2::DatasetAccessEntry
+            dataset
+          else
+            Service.dataset_access_entry_from_hash dataset
+          end
+        end
+
         # @private
         def add_access_role_scope_value role, scope, value
           role = validate_role role
@@ -1218,6 +1342,17 @@ module Google
           @rules << Google::Apis::BigqueryV2::Dataset::Access.new(**opts)
         end
 
+        # @private
+        def add_access_dataset dataset
+          # scope is dataset, make sure value is in the right format
+          value = validate_dataset dataset
+          # Remove existing rule for input dataset, if any
+          @rules.reject!(&find_by_scope_and_resource_ref(:dataset, value))
+          # Add new rule for this role, scope, and value
+          opts = { dataset: value }
+          @rules << Google::Apis::BigqueryV2::Dataset::Access.new(**opts)
+        end
+
         # @private
         def remove_access_role_scope_value role, scope, value
           role = validate_role role
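The remove-then-append pattern in `add_access_dataset` above keeps at most one rule per dataset, so re-adding the same dataset is idempotent. A minimal sketch, with a plain Struct standing in for `Google::Apis::BigqueryV2::Dataset::Access` and a simple equality test standing in for `find_by_scope_and_resource_ref`:

```ruby
# Sketch of the idempotent rule update above: any existing rule for the same
# dataset value is dropped before the new rule is appended, so adding the
# same dataset twice leaves exactly one rule for it.
Rule = Struct.new(:dataset)

def add_access_dataset rules, value
  rules.reject! { |r| r.dataset == value }  # remove existing rule, if any
  rules << Rule.new(value)                  # append the fresh rule
end

rules = [Rule.new("ds_a"), Rule.new("ds_b")]
add_access_dataset rules, "ds_a"
p rules.map(&:dataset)
# => ["ds_b", "ds_a"]
```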
@@ -1244,6 +1379,14 @@ module Google
           @rules.reject!(&find_by_scope_and_resource_ref(:view, value))
         end
 
+        # @private
+        def remove_access_dataset dataset
+          # scope is dataset, make sure value is in the right format
+          value = validate_dataset dataset
+          # Remove existing rule for input dataset, if any
+          @rules.reject!(&find_by_scope_and_resource_ref(:dataset, value))
+        end
+
         # @private
         def lookup_access_role_scope_value role, scope, value
           role = validate_role role
@@ -1268,6 +1411,14 @@ module Google
           !(!@rules.detect(&find_by_scope_and_resource_ref(:view, value)))
         end
 
+        # @private
+        def lookup_access_dataset dataset
+          # scope is dataset, make sure value is in the right format
+          value = validate_dataset dataset
+          # Detect existing rule for input dataset, if any
+          !(!@rules.detect(&find_by_scope_and_resource_ref(:dataset, value)))
+        end
+
         # @private
         def find_by_role_and_scope_and_value role, scope, value
           lambda do |a|
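`lookup_access_dataset` coerces the result of `detect` (a matching rule or `nil`) into a strict boolean with the double negation `!(!…)`, which is what the `reader_dataset?` predicate promises to return. A tiny illustration:

```ruby
# Enumerable#detect returns the first match or nil; !(!value) (equivalent to
# !!value) maps that to true/false rather than leaking the rule object.
rules = [:a, :b, :c]
p !(!rules.detect { |r| r == :b })  # true  (a match was found)
p !(!rules.detect { |r| r == :z })  # false (detect returned nil)
```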
data/lib/google/cloud/bigquery/dataset/tag.rb
ADDED

@@ -0,0 +1,67 @@
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+require "google/apis/bigquery_v2"
+
+module Google
+  module Cloud
+    module Bigquery
+      class Dataset
+        ##
+        # A global tag managed by Resource Manager.
+        #
+        # @see https://cloud.google.com/iam/docs/tags-access-control#definitions
+        #
+        class Tag
+          ##
+          # @private The Google API Client object.
+          attr_accessor :gapi
+
+          ##
+          # @private Create an empty Tag object.
+          def initialize
+            @gapi = Google::Apis::BigqueryV2::Dataset::Tag.new
+          end
+
+          ##
+          # The namespaced friendly name of the tag key, e.g. "12345/environment" where
+          # 12345 is org id.
+          #
+          # @return [String]
+          #
+          def tag_key
+            @gapi.tag_key
+          end
+
+          ##
+          # The friendly short name of the tag value, e.g. "production".
+          #
+          # @return [String]
+          #
+          def tag_value
+            @gapi.tag_value
+          end
+
+          ##
+          # @private Google API Client object.
+          def self.from_gapi gapi
+            new_tag = new
+            new_tag.instance_variable_set :@gapi, gapi
+            new_tag
+          end
+        end
+      end
+    end
+  end
+end
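The `from_gapi` constructor in `Tag` above wraps an API resource object without exposing a public writer, by setting the `@gapi` instance variable directly. A sketch of the same pattern, with a Hash standing in for the `Google::Apis::BigqueryV2::Dataset::Tag` resource so it runs without the API client gem:

```ruby
# Sketch of the from_gapi wrapping pattern above: allocate the wrapper via
# new, then set its @gapi ivar with instance_variable_set, keeping the class
# free of a public gapi= writer. A Hash stands in for the API object.
class Tag
  def self.from_gapi gapi
    new_tag = new
    new_tag.instance_variable_set :@gapi, gapi
    new_tag
  end

  def tag_key
    @gapi[:tag_key]
  end

  def tag_value
    @gapi[:tag_value]
  end
end

tag = Tag.from_gapi(tag_key: "12345/environment", tag_value: "production")
puts tag.tag_key   # 12345/environment
puts tag.tag_value # production
```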