google-cloud-bigquery 1.39.0 → 1.41.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 599d3411dcba02466b7d161bdf01c9b1d2b6f8874c98b5bf309afb4e4f6e324a
- data.tar.gz: 2a186e09cd38a6140c92d1df3fef992091b4448211625e4a564ea3f29802ec9a
+ metadata.gz: 8f13891ae7f46f3a0561140fabe4eed4cc426923258e17ed4cc10cc8155f9129
+ data.tar.gz: 65eb5cfc6fb0feadae0d4503bcf2bd1942a0a9740a6f4f1b6f6aaddf7ee244c0
  SHA512:
- metadata.gz: 2f1eb30d1ed33b4e961eb4bfd5714ad3e92afc25ca22e1668c6a2d00fc0ad50319bc634b3e2d957fb826ee2991b15802c18a1fcf50bd96a7c3c846030485adda
- data.tar.gz: f439398806448a13ba5f8b70f7c2846e68e6594c379b75145837d67e81fcb8b59eaaed3512569b553b93562c0aaae96a5570e8b95521a21d0c7f7854056f5214
+ metadata.gz: cecf4d1450deb04f33772ebc0bf18892816dda84e9e2b43668615dc37c49b3319465c3d0f3cb9c07ca9cc5a5e7a7118daab1f1aa51c25bf83ebadc925bd52eef
+ data.tar.gz: c678e3b5420f82894c1d4935f4f89959fda4ea6e66faffa7fb649c0ff3c0e2fec91daf3e77208e02eb4747cf1a3d41ef5764a4b4da8b76b2a03328861121c7c2
data/AUTHENTICATION.md CHANGED
@@ -106,15 +106,6 @@ To configure your system for this, simply:
  **NOTE:** This is _not_ recommended for running in production. The Cloud SDK
  *should* only be used during development.

- [gce-how-to]: https://cloud.google.com/compute/docs/authentication#using
- [dev-console]: https://console.cloud.google.com/project
-
- [enable-apis]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/enable-apis.png
-
- [create-new-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account.png
- [create-new-service-account-existing-keys]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account-existing-keys.png
- [reuse-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/reuse-service-account.png
-
  ## Creating a Service Account

  Google Cloud requires a **Project ID** and **Service Account Credentials** to
@@ -124,31 +115,22 @@ connect to most services with google-cloud-bigquery.
  If you are not running this client on Google Compute Engine, you need a Google
  Developers service account.

- 1. Visit the [Google Developers Console][dev-console].
+ 1. Visit the [Google Cloud Console](https://console.cloud.google.com/project).
  1. Create a new project or click on an existing project.
- 1. Activate the slide-out navigation tray and select **API Manager**. From
+ 1. Activate the menu in the upper left and select **APIs & Services**. From
     here, you will enable the APIs that your application requires.

-    ![Enable the APIs that your application requires][enable-apis]
-
     *Note: You may need to enable billing in order to use these services.*

  1. Select **Credentials** from the side navigation.

-    You should see a screen like one of the following.
-
-    ![Create a new service account][create-new-service-account]
-
-    ![Create a new service account With Existing Keys][create-new-service-account-existing-keys]
-
-    Find the "Add credentials" drop down and select "Service account" to be
-    guided through downloading a new JSON key file.
-
-    If you want to re-use an existing service account, you can easily generate a
-    new key file. Just select the account you wish to re-use, and click "Generate
-    new JSON key":
+    Find the "Create credentials" drop down near the top of the page, and select
+    "Service account" to be guided through downloading a new JSON key file.

-    ![Re-use an existing service account][reuse-service-account]
+    If you want to re-use an existing service account, you can easily generate
+    a new key file. Just select the account you wish to re-use, click the pencil
+    tool on the right side to edit the service account, select the **Keys** tab,
+    and then select **Add Key**.

  The key file you download will be used by this library to authenticate API
  requests and should be stored in a secure location.
data/CHANGELOG.md CHANGED
@@ -1,5 +1,20 @@
  # Release History

+ ### 1.41.0 (2023-01-05)
+
+ #### Features
+
+ * Add support for partial projection of table metadata
+
+ #### Bug Fixes
+
+ * Fix querying of array of structs in named parameters ([#19466](https://github.com/googleapis/google-cloud-ruby/issues/19466))
+
+ ### 1.40.0 (2022-12-14)
+
+ #### Features
+
+ * support table snapshot and clone ([#19354](https://github.com/googleapis/google-cloud-ruby/issues/19354))
+
  ### 1.39.0 (2022-07-27)

  #### Features
data/LOGGING.md CHANGED
@@ -4,7 +4,7 @@ To enable logging for this library, set the logger for the underlying [Google
  API
  Client](https://github.com/google/google-api-ruby-client/blob/master/README.md#logging)
  library. The logger that you set may be a Ruby stdlib
- [`Logger`](https://ruby-doc.org/stdlib/libdoc/logger/rdoc/Logger.html) as
+ [`Logger`](https://ruby-doc.org/current/stdlibs/logger/Logger.html) as
  shown below, or a
  [`Google::Cloud::Logging::Logger`](https://googleapis.dev/ruby/google-cloud-logging/latest)
  that will write logs to [Stackdriver
data/OVERVIEW.md CHANGED
@@ -276,7 +276,7 @@ To follow along with these examples, you will need to set up billing on the
  [Google Developers Console](https://console.developers.google.com).

  In addition to CSV, data can be imported from files that are formatted as
- [Newline-delimited JSON](http://jsonlines.org/),
+ [Newline-delimited JSON](https://jsonlines.org/),
  [Avro](http://avro.apache.org/),
  [ORC](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-orc),
  [Parquet](https://parquet.apache.org/) or from a Google Cloud Datastore backup.
@@ -254,11 +254,11 @@ module Google

  ##
  # Lists are specified by providing the type code in an array. For example, an array of integers are specified as
- # `[:INT64]`. Extracts the symbol.
+ # `[:INT64]`. Extracts the symbol/hash.
  def self.extract_array_type type
  return nil if type.nil?
- unless type.is_a?(Array) && type.count == 1 && type.first.is_a?(Symbol)
- raise ArgumentError, "types Array #{type.inspect} should include only a single symbol element."
+ unless type.is_a?(Array) && type.count == 1 && (type.first.is_a?(Symbol) || type.first.is_a?(Hash))
+ raise ArgumentError, "types Array #{type.inspect} should include only a single symbol or hash element."
  end
  type.first
  end
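The updated check above can be exercised outside its module context; a minimal standalone sketch (method name kept, module scaffolding dropped, the nested-struct hash is an illustrative value):

```ruby
# Sketch of the updated extract_array_type contract: the single-element
# array may now hold either a Symbol (`[:INT64]`) or a Hash describing a
# nested struct type. Anything else raises ArgumentError.
def extract_array_type type
  return nil if type.nil?
  unless type.is_a?(Array) && type.count == 1 && (type.first.is_a?(Symbol) || type.first.is_a?(Hash))
    raise ArgumentError, "types Array #{type.inspect} should include only a single symbol or hash element."
  end
  type.first
end
```

Accepting a Hash here is what enables the 1.41.0 bug fix for arrays of structs in named query parameters.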
@@ -17,6 +17,24 @@ require "google/cloud/bigquery/encryption_configuration"
  module Google
  module Cloud
  module Bigquery
+ module OperationType
+ # Different operation types supported in table copy job.
+ # https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#operationtype
+
+ # The source and destination table have the same table type.
+ COPY = "COPY".freeze
+
+ # The source table type is TABLE and the destination table type is SNAPSHOT.
+ SNAPSHOT = "SNAPSHOT".freeze
+
+ # The source table type is SNAPSHOT and the destination table type is TABLE.
+ RESTORE = "RESTORE".freeze
+
+ # The source and destination table have the same table type, but only bill for
+ # unique data.
+ CLONE = "CLONE".freeze
+ end
+
  ##
  # # CopyJob
  #
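The new module is a plain namespace of frozen strings, one per value of the REST API's `Job#operationType` enum. A standalone sketch of the same shape:

```ruby
# Standalone sketch of the new OperationType module: each constant is the
# frozen string the BigQuery REST API expects for Job#operationType.
module OperationType
  COPY     = "COPY".freeze     # source and destination share a table type
  SNAPSHOT = "SNAPSHOT".freeze # source is TABLE, destination is SNAPSHOT
  RESTORE  = "RESTORE".freeze  # source is SNAPSHOT, destination is TABLE
  CLONE    = "CLONE".freeze    # same table type, billed only for unique data
end
```

Later hunks in this diff thread one of these constants through `copy_job` as the new `operation_type:` option.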
@@ -46,23 +64,35 @@ module Google
  # The table from which data is copied. This is the table on
  # which {Table#copy_job} was called.
  #
+ # @param [String] view Specifies the view that determines which table information is returned.
+ # By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+ # Accepted values include `:unspecified`, `:basic`, `:storage`, and
+ # `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+ # The default value is the `:unspecified` view type.
+ #
  # @return [Table] A table instance.
  #
- def source
+ def source view: nil
  table = @gapi.configuration.copy.source_table
  return nil unless table
- retrieve_table table.project_id, table.dataset_id, table.table_id
+ retrieve_table table.project_id, table.dataset_id, table.table_id, metadata_view: view
  end

  ##
  # The table to which data is copied.
  #
+ # @param [String] view Specifies the view that determines which table information is returned.
+ # By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+ # Accepted values include `:unspecified`, `:basic`, `:storage`, and
+ # `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+ # The default value is the `:unspecified` view type.
+ #
  # @return [Table] A table instance.
  #
- def destination
+ def destination view: nil
  table = @gapi.configuration.copy.destination_table
  return nil unless table
- retrieve_table table.project_id, table.dataset_id, table.table_id
+ retrieve_table table.project_id, table.dataset_id, table.table_id, metadata_view: view
  end

  ##
@@ -157,7 +187,8 @@ module Google
  job_ref = service.job_ref_from options[:job_id], options[:prefix]
  copy_cfg = Google::Apis::BigqueryV2::JobConfigurationTableCopy.new(
  source_table: source,
- destination_table: target
+ destination_table: target,
+ operation_type: options[:operation_type]
  )
  req = Google::Apis::BigqueryV2::Job.new(
  job_reference: job_ref,
@@ -793,6 +793,11 @@ module Google
  # object without verifying that the resource exists on the BigQuery
  # service. Calls made on this object will raise errors if the resource
  # does not exist. Default is `false`. Optional.
+ # @param [String] view Specifies the view that determines which table information is returned.
+ # By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+ # Accepted values include `:unspecified`, `:basic`, `:storage`, and
+ # `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+ # The default value is the `:unspecified` view type.
  #
  # @return [Google::Cloud::Bigquery::Table, nil] Returns `nil` if the
  # table does not exist.
@@ -815,13 +820,22 @@ module Google
  #
  # table = dataset.table "my_table", skip_lookup: true
  #
+ # @example Avoid retrieving transient stats of the table with `view`:
+ # require "google/cloud/bigquery"
+ #
+ # bigquery = Google::Cloud::Bigquery.new
+ #
+ # dataset = bigquery.dataset "my_dataset"
+ #
+ # table = dataset.table "my_table", view: "basic"
+ #
  # @!group Table
  #
- def table table_id, skip_lookup: nil
+ def table table_id, skip_lookup: nil, view: nil
  ensure_service!
  return Table.new_reference project_id, dataset_id, table_id, service if skip_lookup
- gapi = service.get_table dataset_id, table_id
- Table.from_gapi gapi, service
+ gapi = service.get_table dataset_id, table_id, metadata_view: view
+ Table.from_gapi gapi, service, metadata_view: view
  rescue Google::Cloud::NotFoundError
  nil
  end
@@ -1816,7 +1830,7 @@ module Google
  # The following values are supported:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  # * `sheets` - Google Sheets
  # * `datastore_backup` - Cloud Datastore backup
@@ -1879,7 +1893,7 @@ module Google
  # The following values are supported:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  # * `orc` - [ORC](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-orc)
  # * `parquet` - [Parquet](https://parquet.apache.org/)
@@ -2141,7 +2155,7 @@ module Google
  # The following values are supported:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  # * `orc` - [ORC](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-orc)
  # * `parquet` - [Parquet](https://parquet.apache.org/)
@@ -2658,6 +2672,11 @@ module Google
  # messages before the batch is published. Default is 10.
  # @attr_reader [Numeric] threads The number of threads used to insert
  # batches of rows. Default is 4.
+ # @param [String] view Specifies the view that determines which table information is returned.
+ # By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+ # Accepted values include `:unspecified`, `:basic`, `:storage`, and
+ # `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+ # The default value is the `:unspecified` view type.
  # @yield [response] the callback for when a batch of rows is inserted
  # @yieldparam [Table::AsyncInserter::Result] result the result of the
  # asynchronous insert
@@ -2686,13 +2705,35 @@ module Google
  #
  # inserter.stop.wait!
  #
+ # @example Avoid retrieving transient stats of the table while inserting:
+ # require "google/cloud/bigquery"
+ #
+ # bigquery = Google::Cloud::Bigquery.new
+ # dataset = bigquery.dataset "my_dataset"
+ # inserter = dataset.insert_async("my_table", view: "basic") do |result|
+ # if result.error?
+ # log_error result.error
+ # else
+ # log_insert "inserted #{result.insert_count} rows " \
+ # "with #{result.error_count} errors"
+ # end
+ # end
+ #
+ # rows = [
+ # { "first_name" => "Alice", "age" => 21 },
+ # { "first_name" => "Bob", "age" => 22 }
+ # ]
+ # inserter.insert rows
+ #
+ # inserter.stop.wait!
+ #
  def insert_async table_id, skip_invalid: nil, ignore_unknown: nil, max_bytes: 10_000_000, max_rows: 500,
- interval: 10, threads: 4, &block
+ interval: 10, threads: 4, view: nil, &block
  ensure_service!

  # Get table, don't use Dataset#table which handles NotFoundError
- gapi = service.get_table dataset_id, table_id
- table = Table.from_gapi gapi, service
+ gapi = service.get_table dataset_id, table_id, metadata_view: view
+ table = Table.from_gapi gapi, service, metadata_view: view
  # Get the AsyncInserter from the table
  table.insert_async skip_invalid: skip_invalid,
  ignore_unknown: ignore_unknown,
@@ -2944,8 +2985,6 @@ module Google
  @access
  end

- # rubocop:disable Style/MethodDefParentheses
-
  ##
  # @raise [RuntimeError] not implemented
  def delete(*)
@@ -3049,8 +3088,6 @@ module Google
  end
  alias refresh! reload!

- # rubocop:enable Style/MethodDefParentheses
-
  ##
  # @private Make sure any access changes are saved
  def check_for_mutated_access!
@@ -65,11 +65,17 @@ module Google
  ##
  # The table or model which is exported.
  #
+ # @param [String] view Specifies the view that determines which table information is returned.
+ # By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+ # Accepted values include `:unspecified`, `:basic`, `:storage`, and
+ # `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+ # The default value is the `:unspecified` view type.
+ #
  # @return [Table, Model, nil] A table or model instance, or `nil`.
  #
- def source
+ def source view: nil
  if (table = @gapi.configuration.extract.source_table)
- retrieve_table table.project_id, table.dataset_id, table.table_id
+ retrieve_table table.project_id, table.dataset_id, table.table_id, metadata_view: view
  elsif (model = @gapi.configuration.extract.source_model)
  retrieve_model model.project_id, model.dataset_id, model.model_id
  end
@@ -108,7 +114,7 @@ module Google

  ##
  # Checks if the destination format for the table data is [newline-delimited
- # JSON](http://jsonlines.org/). The default is `false`. Not applicable when
+ # JSON](https://jsonlines.org/). The default is `false`. Not applicable when
  # extracting models.
  #
  # @return [Boolean] `true` when `NEWLINE_DELIMITED_JSON`, `false` if not
@@ -362,7 +368,7 @@ module Google
  # Supported values for tables:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  #
  # Supported values for models:
@@ -710,9 +710,9 @@ module Google
  raise "Must have active connection" unless service
  end

- def retrieve_table project_id, dataset_id, table_id
+ def retrieve_table project_id, dataset_id, table_id, metadata_view: nil
  ensure_service!
- gapi = service.get_project_table project_id, dataset_id, table_id
+ gapi = service.get_project_table project_id, dataset_id, table_id, metadata_view: metadata_view
  Table.from_gapi gapi, service
  rescue Google::Cloud::NotFoundError
  nil
@@ -62,12 +62,18 @@ module Google
  # The table into which the operation loads data. This is the table on
  # which {Table#load_job} was invoked.
  #
+ # @param [String] view Specifies the view that determines which table information is returned.
+ # By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+ # Accepted values include `:unspecified`, `:basic`, `:storage`, and
+ # `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+ # The default value is the `:unspecified` view type.
+ #
  # @return [Table] A table instance.
  #
- def destination
+ def destination view: nil
  table = @gapi.configuration.load.destination_table
  return nil unless table
- retrieve_table table.project_id, table.dataset_id, table.table_id
+ retrieve_table table.project_id, table.dataset_id, table.table_id, metadata_view: view
  end

  ##
@@ -188,7 +194,7 @@ module Google

  ##
  # Checks if the format of the source data is [newline-delimited
- # JSON](http://jsonlines.org/). The default is `false`.
+ # JSON](https://jsonlines.org/). The default is `false`.
  #
  # @return [Boolean] `true` when the source format is
  # `NEWLINE_DELIMITED_JSON`, `false` otherwise.
@@ -1269,7 +1275,7 @@ module Google
  # The following values are supported:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  # * `orc` - [ORC](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-orc)
  # * `parquet` - [Parquet](https://parquet.apache.org/)
@@ -961,7 +961,7 @@ module Google
  # The following values are supported:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  # * `sheets` - Google Sheets
  # * `datastore_backup` - Cloud Datastore backup
@@ -1554,7 +1554,7 @@ module Google
  # Supported values for tables:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  #
  # Supported values for models:
@@ -1683,7 +1683,7 @@ module Google
  # Supported values for tables:
  #
  # * `csv` - CSV
- # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+ # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
  # * `avro` - [Avro](http://avro.apache.org/)
  #
  # Supported values for models:
@@ -437,14 +437,21 @@ module Google
  ##
  # The table in which the query results are stored.
  #
+ # @param [String] view Specifies the view that determines which table information is returned.
+ # By default, basic table information and storage statistics (STORAGE_STATS) are returned.
+ # Accepted values include `:unspecified`, `:basic`, `:storage`, and
+ # `:full`. For more information, see [BigQuery Classes](@todo: Update the link).
+ # The default value is the `:unspecified` view type.
+ #
  # @return [Table] A table instance.
  #
- def destination
+ def destination view: nil
  table = @gapi.configuration.query.destination_table
  return nil unless table
  retrieve_table table.project_id,
  table.dataset_id,
- table.table_id
+ table.table_id,
+ metadata_view: view
  end

  ##
@@ -144,10 +144,11 @@ module Google

  ##
  # Gets the specified table resource by full table reference.
- def get_project_table project_id, dataset_id, table_id
+ def get_project_table project_id, dataset_id, table_id, metadata_view: nil
+ metadata_view = table_metadata_view_type_for metadata_view
  # The get operation is considered idempotent
  execute backoff: true do
- service.get_table project_id, dataset_id, table_id
+ service.get_table project_id, dataset_id, table_id, view: metadata_view
  end
  end

@@ -156,8 +157,8 @@ module Google
  # This method does not return the data in the table,
  # it only returns the table resource,
  # which describes the structure of this table.
- def get_table dataset_id, table_id
- get_project_table @project, dataset_id, table_id
+ def get_table dataset_id, table_id, metadata_view: nil
+ get_project_table @project, dataset_id, table_id, metadata_view: metadata_view
  end

  ##
@@ -572,6 +573,14 @@ module Google
  raise Google::Cloud::Error.from_error e
  end

+ def table_metadata_view_type_for str
+ return nil if str.nil?
+ { "unspecified" => "TABLE_METADATA_VIEW_UNSPECIFIED",
+ "basic" => "BASIC",
+ "storage" => "STORAGE_STATS",
+ "full" => "FULL" }[str.to_s.downcase]
+ end
+
  class Backoff
  class << self
  attr_accessor :retries
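The new helper above normalizes whatever the caller passed for `view:` (symbol or string, any case) into the enum names the BigQuery API expects; unrecognized values simply miss the hash lookup and come back `nil`. A standalone sketch of the same mapping:

```ruby
# Standalone sketch of the new view-type mapping: user-facing symbols or
# strings are downcased and looked up against the API's enum names.
# Unknown values fall through to nil (the hash lookup just misses).
def table_metadata_view_type_for str
  return nil if str.nil?
  { "unspecified" => "TABLE_METADATA_VIEW_UNSPECIFIED",
    "basic"       => "BASIC",
    "storage"     => "STORAGE_STATS",
    "full"        => "FULL" }[str.to_s.downcase]
end
```

This is why the earlier hunks can accept `view: "basic"` and `view: :basic` interchangeably.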
@@ -108,6 +108,10 @@ module Google
  # @private A Google API Client Table Reference object.
  attr_reader :reference

+ ##
+ # @private The metadata view type string.
+ attr_accessor :metadata_view
+
  ##
  # @private Create an empty Table object.
  def initialize
@@ -154,6 +158,45 @@ module Google
  @gapi.table_reference.project_id
  end

+ ##
+ # The type of the table, such as `TABLE`, `VIEW`, or `SNAPSHOT`.
+ #
+ # @return [String, nil] Type of the table, or
+ # `nil` if the object is a reference (see {#reference?}).
+ #
+ # @!group Attributes
+ #
+ def type
+ return nil if reference?
+ @gapi.type
+ end
+
+ ##
+ # Information about the base table and the snapshot time of the table.
+ #
+ # @return [Google::Apis::BigqueryV2::SnapshotDefinition, nil] Snapshot definition of table snapshot, or
+ # `nil` if not a snapshot or the object is a reference (see {#reference?}).
+ #
+ # @!group Attributes
+ #
+ def snapshot_definition
+ return nil if reference?
+ @gapi.snapshot_definition
+ end
+
+ ##
+ # Information about the base table and the clone time of the table.
+ #
+ # @return [Google::Apis::BigqueryV2::CloneDefinition, nil] Clone definition of table clone, or
+ # `nil` if not a clone or the object is a reference (see {#reference?}).
+ #
+ # @!group Attributes
+ #
+ def clone_definition
+ return nil if reference?
+ @gapi.clone_definition
+ end
+
  ##
  # @private The gapi fragment containing the Project ID, Dataset ID, and
  # Table ID.
@@ -820,6 +863,40 @@ module Google
  @gapi.type == "VIEW"
  end

+ ##
+ # Checks if the table's type is `SNAPSHOT`, indicating that the table
+ # represents a BigQuery table snapshot.
+ #
+ # @see https://cloud.google.com/bigquery/docs/table-snapshots-intro
+ #
+ # @return [Boolean, nil] `true` when the type is `SNAPSHOT`, `false`
+ # otherwise, if the object is a resource (see {#resource?}); `nil` if
+ # the object is a reference (see {#reference?}).
+ #
+ # @!group Attributes
+ #
+ def snapshot?
+ return nil if reference?
+ @gapi.type == "SNAPSHOT"
+ end
+
+ ##
+ # Checks if the table's type is `CLONE`, indicating that the table
+ # represents a BigQuery table clone.
+ #
+ # @see https://cloud.google.com/bigquery/docs/table-clones-intro
+ #
+ # @return [Boolean, nil] `true` when the type is `CLONE`, `false`
+ # otherwise, if the object is a resource (see {#resource?}); `nil` if
+ # the object is a reference (see {#reference?}).
+ #
+ # @!group Attributes
+ #
+ def clone?
+ return nil if reference?
+ !@gapi.clone_definition.nil?
+ end
+
  ##
  # Checks if the table's type is `MATERIALIZED_VIEW`, indicating that
  # the table represents a BigQuery materialized view.
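Note that the two new predicates inspect different fields: `snapshot?` compares the type string, while `clone?` tests for the presence of a clone definition rather than a type value. A minimal sketch of that distinction, using a `Struct` as an illustrative stand-in for the real `Google::Apis::BigqueryV2::Table` resource:

```ruby
# Illustrative stand-in for the gapi table resource (not the real class).
GapiStub = Struct.new(:type, :clone_definition)

# snapshot? keys off the table's type string.
def snapshot? gapi
  gapi.type == "SNAPSHOT"
end

# clone? keys off the presence of a clone definition; the type string
# alone does not identify a clone.
def clone? gapi
  !gapi.clone_definition.nil?
end
```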
@@ -1697,9 +1774,16 @@ module Google
  #
  # @!group Data
  #
- def copy_job destination_table, create: nil, write: nil, job_id: nil, prefix: nil, labels: nil, dryrun: nil
+ def copy_job destination_table, create: nil, write: nil, job_id: nil, prefix: nil, labels: nil, dryrun: nil,
+ operation_type: nil
  ensure_service!
- options = { create: create, write: write, dryrun: dryrun, labels: labels, job_id: job_id, prefix: prefix }
+ options = { create: create,
+ write: write,
+ dryrun: dryrun,
+ labels: labels,
+ job_id: job_id,
+ prefix: prefix,
+ operation_type: operation_type }
  updater = CopyJob::Updater.from_options(
  service,
  table_ref,
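The change to `copy_job` is purely additive: the new `operation_type:` keyword is threaded into the options hash unchanged, defaulting to `nil` so existing callers behave as before. A sketch of just that hash assembly (the method name `copy_job_options` is hypothetical, extracted for illustration):

```ruby
# Sketch of the options hash copy_job now assembles; operation_type is
# simply carried through to the job configuration, nil when unset.
def copy_job_options create: nil, write: nil, dryrun: nil, labels: nil,
                     job_id: nil, prefix: nil, operation_type: nil
  { create: create, write: write, dryrun: dryrun, labels: labels,
    job_id: job_id, prefix: prefix, operation_type: operation_type }
end
```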
@@ -1780,10 +1864,195 @@ module Google
  # @!group Data
  #
  def copy destination_table, create: nil, write: nil, &block
- job = copy_job destination_table, create: create, write: write, &block
- job.wait_until_done!
- ensure_job_succeeded! job
- true
+ copy_job_with_operation_type destination_table,
+ create: create,
+ write: write,
+ operation_type: OperationType::COPY,
+ &block
+ end
+
+ ##
+ # Clones the data from the table to another table using a synchronous
+ # method that blocks for a response.
+ # The source and destination table have the same table type, but only bill for
+ # unique data.
+ # Timeouts and transient errors are generally handled as needed to complete the job.
+ # See also {#copy_job}.
+ #
+ # The geographic location for the job ("US", "EU", etc.) can be set via
+ # {CopyJob::Updater#location=} in a block passed to this method. If the
+ # table is a full resource representation (see {#resource_full?}), the
+ # location of the job will be automatically set to the location of the
+ # table.
+ #
+ # @param [Table, String] destination_table The destination for the
+ # copied data. This can also be a string identifier as specified by
+ # the [Standard SQL Query
+ # Reference](https://cloud.google.com/bigquery/docs/reference/standard-sql/query-syntax#from-clause)
+ # (`project-name.dataset_id.table_id`) or the [Legacy SQL Query
+ # Reference](https://cloud.google.com/bigquery/query-reference#from)
+ # (`project-name:dataset_id.table_id`). This is useful for referencing
+ # tables in other projects and datasets.
+ #
+ # @yield [job] a job configuration object
+ # @yieldparam [Google::Cloud::Bigquery::CopyJob::Updater] job a job
+ # configuration object for setting additional options.
+ #
+ # @return [Boolean] Returns `true` if the copy operation succeeded.
+ #
+ # @example
+ # require "google/cloud/bigquery"
+ #
+ # bigquery = Google::Cloud::Bigquery.new
+ # dataset = bigquery.dataset "my_dataset"
+ # table = dataset.table "my_table"
+ # destination_table = dataset.table "my_destination_table"
+ #
+ # table.clone destination_table
+ #
+ # @example Passing a string identifier for the destination table:
+ # require "google/cloud/bigquery"
+ #
+ # bigquery = Google::Cloud::Bigquery.new
+ # dataset = bigquery.dataset "my_dataset"
+ # table = dataset.table "my_table"
+ #
+ # table.clone "other-project:other_dataset.other_table"
+ #
+ # @!group Data
+ #
+ def clone destination_table, &block
+ copy_job_with_operation_type destination_table,
+ operation_type: OperationType::CLONE,
+ &block
+ end
+
+ ##
+ # Takes a snapshot of the data from the table to another table using a synchronous
+ # method that blocks for a response.
+ # The source table type is TABLE and the destination table type is SNAPSHOT.
+ # Timeouts and transient errors are generally handled as needed to complete the job.
+ # See also {#copy_job}.
+ #
+ # The geographic location for the job ("US", "EU", etc.) can be set via
+ # {CopyJob::Updater#location=} in a block passed to this method. If the
+ # table is a full resource representation (see {#resource_full?}), the
+ # location of the job will be automatically set to the location of the
+ # table.
+ #
+ # @param [Table, String] destination_table The destination for the
+ # copied data. This can also be a string identifier as specified by
+ # the [Standard SQL Query
+ # Reference](https://cloud.google.com/bigquery/docs/reference/standard-sql/query-syntax#from-clause)
+ # (`project-name.dataset_id.table_id`) or the [Legacy SQL Query
+ # Reference](https://cloud.google.com/bigquery/query-reference#from)
+ # (`project-name:dataset_id.table_id`). This is useful for referencing
+ # tables in other projects and datasets.
+ #
+ # @yield [job] a job configuration object
+ # @yieldparam [Google::Cloud::Bigquery::CopyJob::Updater] job a job
+ # configuration object for setting additional options.
+ #
+ # @return [Boolean] Returns `true` if the copy operation succeeded.
+ #
+ # @example
+ # require "google/cloud/bigquery"
+ #
+ # bigquery = Google::Cloud::Bigquery.new
+ # dataset = bigquery.dataset "my_dataset"
+ # table = dataset.table "my_table"
+ # destination_table = dataset.table "my_destination_table"
+ #
+ # table.snapshot destination_table
+ #
+ # @example Passing a string identifier for the destination table:
+ # require "google/cloud/bigquery"
+ #
+ # bigquery = Google::Cloud::Bigquery.new
+ # dataset = bigquery.dataset "my_dataset"
+ # table = dataset.table "my_table"
+ #
+ # table.snapshot "other-project:other_dataset.other_table"
+ #
+ # @!group Data
+ #
+ def snapshot destination_table, &block
+ copy_job_with_operation_type destination_table,
+ operation_type: OperationType::SNAPSHOT,
+ &block
+ end
+
+ ##
+ # Restores the data from the table to another table using a synchronous
+ # method that blocks for a response.
+ # The source table type is SNAPSHOT and the destination table type is TABLE.
+ # Timeouts and transient errors are generally handled as needed to complete the job.
+ # See also {#copy_job}.
+ #
+ # The geographic location for the job ("US", "EU", etc.) can be set via
+ # {CopyJob::Updater#location=} in a block passed to this method. If the
+ # table is a full resource representation (see {#resource_full?}), the
+ # location of the job will be automatically set to the location of the
+ # table.
+ #
+ # @param [Table, String] destination_table The destination for the
+ # copied data. This can also be a string identifier as specified by
2000
+ # the [Standard SQL Query
2001
+ # Reference](https://cloud.google.com/bigquery/docs/reference/standard-sql/query-syntax#from-clause)
2002
+ # (`project-name.dataset_id.table_id`) or the [Legacy SQL Query
2003
+ # Reference](https://cloud.google.com/bigquery/query-reference#from)
2004
+ # (`project-name:dataset_id.table_id`). This is useful for referencing
2005
+ # tables in other projects and datasets.
2006
+ # @param [String] create Specifies whether the job is allowed to create
2007
+ # new tables. The default value is `needed`.
2008
+ #
2009
+ # The following values are supported:
2010
+ #
2011
+ # * `needed` - Create the table if it does not exist.
2012
+ # * `never` - The table must already exist. A 'notFound' error is
2013
+ # raised if the table does not exist.
2014
+ # @param [String] write Specifies how to handle data already present in
2015
+ # the destination table. The default value is `empty`.
2016
+ #
2017
+ # The following values are supported:
2018
+ #
2019
+ # * `truncate` - BigQuery overwrites the table data.
2020
+ # * `append` - BigQuery appends the data to the table.
2021
+ # * `empty` - An error will be returned if the destination table
2022
+ # already contains data.
2023
+ # @yield [job] a job configuration object
2024
+ # @yieldparam [Google::Cloud::Bigquery::CopyJob::Updater] job a job
2025
+ # configuration object for setting additional options.
2026
+ #
2027
+ # @return [Boolean] Returns `true` if the copy operation succeeded.
2028
+ #
2029
+ # @example
2030
+ # require "google/cloud/bigquery"
2031
+ #
2032
+ # bigquery = Google::Cloud::Bigquery.new
2033
+ # dataset = bigquery.dataset "my_dataset"
2034
+ # table = dataset.table "my_table"
2035
+ # destination_table = dataset.table "my_destination_table"
2036
+ #
2037
+ # table.restore destination_table
2038
+ #
2039
+ # @example Passing a string identifier for the destination table:
2040
+ # require "google/cloud/bigquery"
2041
+ #
2042
+ # bigquery = Google::Cloud::Bigquery.new
2043
+ # dataset = bigquery.dataset "my_dataset"
2044
+ # table = dataset.table "my_table"
2045
+ #
2046
+ # table.restore "other-project:other_dataset.other_table"
2047
+ #
2048
+ # @!group Data
2049
+ #
2050
+ def restore destination_table, create: nil, write: nil, &block
2051
+ copy_job_with_operation_type destination_table,
2052
+ create: create,
2053
+ write: write,
2054
+ operation_type: OperationType::RESTORE,
2055
+ &block
1787
2056
  end
1788
2057
 
1789
2058
  ##
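The three new wrappers (`clone`, `snapshot`, `restore`) differ only in the operation type they hand to a shared synchronous copy-job helper. The sketch below illustrates that dispatch pattern with hypothetical stand-in classes (`FakeCopyJob`, `FakeTable` are not the real gem API, which issues jobs against the BigQuery service and polls for completion):

```ruby
# Operation types mirroring the constants the wrappers pass through.
module OperationType
  COPY     = "COPY".freeze
  CLONE    = "CLONE".freeze
  SNAPSHOT = "SNAPSHOT".freeze
  RESTORE  = "RESTORE".freeze
end

# A fake job that "finishes" immediately; the real CopyJob polls the API.
class FakeCopyJob
  attr_reader :operation_type

  def initialize operation_type
    @operation_type = operation_type
    @done = false
  end

  def wait_until_done!
    @done = true
  end

  def failed?
    false
  end
end

class FakeTable
  # Stand-in for Table#copy_job: returns a job tagged with its operation type.
  def copy_job destination, operation_type: OperationType::COPY
    FakeCopyJob.new operation_type
  end

  def clone destination
    copy_sync destination, OperationType::CLONE
  end

  def snapshot destination
    copy_sync destination, OperationType::SNAPSHOT
  end

  def restore destination
    copy_sync destination, OperationType::RESTORE
  end

  private

  # Mirrors copy_job_with_operation_type: start the job, block until it
  # finishes, raise on failure, and return true on success.
  def copy_sync destination, operation_type
    job = copy_job destination, operation_type: operation_type
    job.wait_until_done!
    raise "copy job failed" if job.failed?
    true
  end
end
```

Keeping one private helper parameterized by operation type means the public wrappers stay one-liners and the wait-and-raise behavior is defined in exactly one place.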
@@ -1812,7 +2081,7 @@ module Google
       # The following values are supported:
       #
       # * `csv` - CSV
-      # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+      # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
       # * `avro` - [Avro](http://avro.apache.org/)
       # @param [String] compression The compression type to use for exported
       #   files. Possible values include `GZIP` and `NONE`. The default value
@@ -1915,7 +2184,7 @@ module Google
       # The following values are supported:
       #
       # * `csv` - CSV
-      # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+      # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
       # * `avro` - [Avro](http://avro.apache.org/)
       # @param [String] compression The compression type to use for exported
       #   files. Possible values include `GZIP` and `NONE`. The default value
@@ -1986,7 +2255,7 @@ module Google
       # The following values are supported:
       #
       # * `csv` - CSV
-      # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+      # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
       # * `avro` - [Avro](http://avro.apache.org/)
       # * `orc` - [ORC](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-orc)
       # * `parquet` - [Parquet](https://parquet.apache.org/)
@@ -2199,7 +2468,7 @@ module Google
       # The following values are supported:
       #
       # * `csv` - CSV
-      # * `json` - [Newline-delimited JSON](http://jsonlines.org/)
+      # * `json` - [Newline-delimited JSON](https://jsonlines.org/)
       # * `avro` - [Avro](http://avro.apache.org/)
       # * `orc` - [ORC](https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-orc)
       # * `parquet` - [Parquet](https://parquet.apache.org/)
@@ -2571,7 +2840,7 @@ module Google
       #
       def reload!
         ensure_service!
-        @gapi = service.get_table dataset_id, table_id
+        @gapi = service.get_table dataset_id, table_id, metadata_view: metadata_view
         @reference = nil
         @exists = nil
         self
@@ -2705,10 +2974,11 @@ module Google
 
       ##
       # @private New Table from a Google API Client object.
-      def self.from_gapi gapi, service
+      def self.from_gapi gapi, service, metadata_view: nil
        new.tap do |f|
          f.gapi = gapi
          f.service = service
+          f.metadata_view = metadata_view
        end
      end
 
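`from_gapi` above constructs the table with Ruby's `Object#tap`: create the instance, assign its attributes inside the block, and return the instance itself. A standalone sketch of the same idiom, with a `Struct` stand-in (`TableStruct` and this `from_gapi` are hypothetical, not the gem's class or method):

```ruby
# Stand-in for the table resource: holds the API object, the service
# handle, and the new optional metadata_view setting.
TableStruct = Struct.new(:gapi, :service, :metadata_view)

# Build the object, set attributes inside the tap block, return the object.
# metadata_view defaults to nil, matching the new keyword argument above.
def from_gapi gapi, service, metadata_view: nil
  TableStruct.new.tap do |f|
    f.gapi = gapi
    f.service = service
    f.metadata_view = metadata_view
  end
end

table = from_gapi({ id: "my_table" }, :service, metadata_view: "BASIC")
```

Because `tap` yields the receiver and then returns it, the constructor reads as a single expression with no temporary variable leaking out of the method.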
@@ -2739,6 +3009,17 @@ module Google
 
       protected
 
+      def copy_job_with_operation_type destination_table, create: nil, write: nil, operation_type: nil, &block
+        job = copy_job destination_table,
+                       create: create,
+                       write: write,
+                       operation_type: operation_type,
+                       &block
+        job.wait_until_done!
+        ensure_job_succeeded! job
+        true
+      end
+
       ##
       # Raise an error unless an active service is available.
       def ensure_service!
@@ -3706,8 +3987,6 @@ module Google
         schema.record name, description: description, mode: mode, &block
       end
 
-      # rubocop:disable Style/MethodDefParentheses
-
       ##
       # @raise [RuntimeError] not implemented
       def data(*)
@@ -3793,8 +4072,6 @@ module Google
       end
       alias refresh! reload!
 
-      # rubocop:enable Style/MethodDefParentheses
-
       ##
       # @private Make sure any access changes are saved
       def check_for_mutated_schema!
@@ -16,7 +16,7 @@
 module Google
   module Cloud
     module Bigquery
-      VERSION = "1.39.0".freeze
+      VERSION = "1.41.0".freeze
     end
   end
 end
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: google-cloud-bigquery
 version: !ruby/object:Gem::Version
-  version: 1.39.0
+  version: 1.41.0
 platform: ruby
 authors:
 - Mike Moore
@@ -9,7 +9,7 @@ authors:
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2022-07-28 00:00:00.000000000 Z
+date: 2023-01-06 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: concurrent-ruby