google-cloud-bigquery-storage-v1 0.9.0 → 0.10.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 3321ce1006bfab7c7513dedf48c006e0f30a930a0b2bc5e1ae9dfe775dd276d4
-  data.tar.gz: dc10dc964e656decaa880405ec57ce0aa2c61edef92cf29f897b252d8a1ba815
+  metadata.gz: 3a95d34616ffc1455f203b27be3825854e709526a5cfe3e4c72db03faa5351b2
+  data.tar.gz: 8af44f131742eaf8ce8f843d85509b7f89edfd5e4098a85770e78a537d1ab13a
 SHA512:
-  metadata.gz: fa4151bc016d5daea090d012dc307b73dc05dd4006c760b42e330ec0267c075dbc2aaefc989ad1dda861e967c94b1eafa38fc2fe62fb50f54ae8d2dd74cb5b60
-  data.tar.gz: ee14302ce6f9129d58897a79ac7e5d1eb6c13dd3a21c9d5e2edfbd062122653b2bdc4ac4d93b3ea23b0467b0e77c60658fffd16279b3000c9a6d57b6c8bbad69
+  metadata.gz: 77d2465381c89df36bfebfb685ff9ba73f7f7f099c61996277e44d8ed37b0f922f11cebfd52cf6e143923d0935d39f62a24d0e473718faa27c8c0d489e1955ff
+  data.tar.gz: fe39b9210c5843f6e3ad6fc5f5aaa068139dff0f941681b0b0003defdd566a13ac773fb0e66e7fbfa351acbec535a0d65a2904f0ba29e87ff7a59fde9fdf87b4
data/.yardopts CHANGED
@@ -1,5 +1,5 @@
 --no-private
---title=BigQuery Storage V1 API
+--title="BigQuery Storage V1 API"
 --exclude _pb\.rb$
 --markup markdown
 --markup-provider redcarpet
data/AUTHENTICATION.md CHANGED
@@ -120,15 +120,6 @@ To configure your system for this, simply:
 **NOTE:** This is _not_ recommended for running in production. The Cloud SDK
 *should* only be used during development.

-[gce-how-to]: https://cloud.google.com/compute/docs/authentication#using
-[dev-console]: https://console.cloud.google.com/project
-
-[enable-apis]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/enable-apis.png
-
-[create-new-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account.png
-[create-new-service-account-existing-keys]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account-existing-keys.png
-[reuse-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/reuse-service-account.png
-
 ## Creating a Service Account

 Google Cloud requires **Service Account Credentials** to
@@ -139,31 +130,22 @@ If you are not running this client within
 [Google Cloud Platform environments](#google-cloud-platform-environments), you
 need a Google Developers service account.

-1. Visit the [Google Developers Console][dev-console].
+1. Visit the [Google Cloud Console](https://console.cloud.google.com/project).
 2. Create a new project or click on an existing project.
-3. Activate the slide-out navigation tray and select **API Manager**. From
+3. Activate the menu in the upper left and select **APIs & Services**. From
    here, you will enable the APIs that your application requires.

-   ![Enable the APIs that your application requires][enable-apis]
-
    *Note: You may need to enable billing in order to use these services.*

 4. Select **Credentials** from the side navigation.

-   You should see a screen like one of the following.
-
-   ![Create a new service account][create-new-service-account]
-
-   ![Create a new service account With Existing Keys][create-new-service-account-existing-keys]
-
-   Find the "Add credentials" drop down and select "Service account" to be
-   guided through downloading a new JSON key file.
+   Find the "Create credentials" drop down near the top of the page, and select
+   "Service account" to be guided through downloading a new JSON key file.

 If you want to re-use an existing service account, you can easily generate a
-new key file. Just select the account you wish to re-use, and click "Generate
-new JSON key":
-
-![Re-use an existing service account][reuse-service-account]
+new key file. Just select the account you wish to re-use, click the pencil
+tool on the right side to edit the service account, select the **Keys** tab,
+and then select **Add Key**.

 The key file you download will be used by this library to authenticate API
 requests and should be stored in a secure location.
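The walkthrough above ends with a downloaded JSON key file. As a minimal sketch (the helper name is ours; `type`, `private_key`, and `client_email` are standard fields of Google service-account key files), a client-side sanity check before handing the file to the library might look like:

```ruby
require "json"

# Hypothetical helper: quick sanity check that a downloaded file looks
# like a service-account key (right "type" plus the two key fields).
def service_account_key?(json_text)
  key = JSON.parse(json_text)
  key["type"] == "service_account" &&
    key.key?("private_key") &&
    key.key?("client_email")
rescue JSON::ParserError
  false
end
```

A real key file contains more fields (project ID, token URIs, and so on); this only guards against pointing the library at the wrong file.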
data/README.md CHANGED
@@ -37,7 +37,7 @@ request = ::Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest.new #
 response = client.create_read_session request
 ```

-View the [Client Library Documentation](https://googleapis.dev/ruby/google-cloud-bigquery-storage-v1/latest)
+View the [Client Library Documentation](https://cloud.google.com/ruby/docs/reference/google-cloud-bigquery-storage-v1/latest)
 for class and method documentation.

 See also the [Product Documentation](https://cloud.google.com/bigquery/docs/reference/storage)
@@ -28,7 +28,6 @@ module Google
 class Credentials < ::Google::Auth::Credentials
   self.scope = [
     "https://www.googleapis.com/auth/bigquery",
-    "https://www.googleapis.com/auth/bigquery.readonly",
     "https://www.googleapis.com/auth/cloud-platform"
   ]
   self.env_vars = [
@@ -306,6 +306,13 @@ module Google
 # finalized (via the `FinalizeWriteStream` rpc), and the stream is explicitly
 # committed via the `BatchCommitWriteStreams` rpc.
 #
+# Note: For users coding against the gRPC api directly, it may be
+# necessary to supply the x-goog-request-params system parameter
+# with `write_stream=<full_write_stream_name>`.
+#
+# More information about system parameters:
+# https://cloud.google.com/apis/docs/system-parameters
+#
 # @param request [::Gapic::StreamInput, ::Enumerable<::Google::Cloud::Bigquery::Storage::V1::AppendRowsRequest, ::Hash>]
 #   An enumerable of {::Google::Cloud::Bigquery::Storage::V1::AppendRowsRequest} instances.
 # @param options [::Gapic::CallOptions, ::Hash]
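The note added in this hunk concerns raw gRPC callers. As an illustration only (the helper name is hypothetical, and the assumption that the stream name is percent-encoded follows the URL-style `key=value` shape of system parameters), the header value could be built like so:

```ruby
require "erb"

# Build a value for the x-goog-request-params metadata header that, per the
# note above, raw-gRPC callers may need to supply on AppendRows. The full
# stream name is percent-encoded (slashes become %2F).
def request_params_for(write_stream_name)
  "write_stream=#{ERB::Util.url_encode(write_stream_name)}"
end

puts request_params_for("projects/p/datasets/d/tables/t/streams/_default")
```

Clients generated from this gem attach the parameter automatically; only hand-rolled gRPC stubs need to do this themselves.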
@@ -121,6 +121,8 @@ Google::Protobuf::DescriptorPool.generated_pool.build do
 value :INVALID_STREAM_STATE, 5
 value :STREAM_FINALIZED, 6
 value :SCHEMA_MISMATCH_EXTRA_FIELDS, 7
+value :OFFSET_ALREADY_EXISTS, 8
+value :OFFSET_OUT_OF_RANGE, 9
 end
 end
 end
@@ -134,6 +134,13 @@ module Google
 # * For PENDING streams, data is not made visible until the stream itself is
 #   finalized (via the `FinalizeWriteStream` rpc), and the stream is explicitly
 #   committed via the `BatchCommitWriteStreams` rpc.
+#
+# Note: For users coding against the gRPC api directly, it may be
+# necessary to supply the x-goog-request-params system parameter
+# with `write_stream=<full_write_stream_name>`.
+#
+# More information about system parameters:
+# https://cloud.google.com/apis/docs/system-parameters
 rpc :AppendRows, stream(::Google::Cloud::Bigquery::Storage::V1::AppendRowsRequest), stream(::Google::Cloud::Bigquery::Storage::V1::AppendRowsResponse)
 # Gets information about a write stream.
 rpc :GetWriteStream, ::Google::Cloud::Bigquery::Storage::V1::GetWriteStreamRequest, ::Google::Cloud::Bigquery::Storage::V1::WriteStream
@@ -20,6 +20,7 @@ Google::Protobuf::DescriptorPool.generated_pool.build do
 optional :read_options, :message, 8, "google.cloud.bigquery.storage.v1.ReadSession.TableReadOptions"
 repeated :streams, :message, 10, "google.cloud.bigquery.storage.v1.ReadStream"
 optional :estimated_total_bytes_scanned, :int64, 12
+optional :trace_id, :string, 13
 oneof :schema do
   optional :avro_schema, :message, 4, "google.cloud.bigquery.storage.v1.AvroSchema"
   optional :arrow_schema, :message, 5, "google.cloud.bigquery.storage.v1.ArrowSchema"
@@ -22,7 +22,7 @@ module Google
 module Bigquery
 module Storage
 module V1
-  VERSION = "0.9.0"
+  VERSION = "0.10.0"
 end
 end
 end
@@ -27,6 +27,8 @@ module Google
 ##
 # To load this package, including all its services, and instantiate a client:
 #
+# @example
+#
 #     require "google/cloud/bigquery/storage/v1"
 #     client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
 #
@@ -33,11 +33,7 @@ module Google
 #     // For Kubernetes resources, the format is {api group}/{kind}.
 #     option (google.api.resource) = {
 #       type: "pubsub.googleapis.com/Topic"
-#       name_descriptor: {
-#         pattern: "projects/{project}/topics/{topic}"
-#         parent_type: "cloudresourcemanager.googleapis.com/Project"
-#         parent_name_extractor: "projects/{project}"
-#       }
+#       pattern: "projects/{project}/topics/{topic}"
 #     };
 #   }
 #
@@ -45,10 +41,7 @@ module Google
 #
 #     resources:
 #     - type: "pubsub.googleapis.com/Topic"
-#       name_descriptor:
-#         - pattern: "projects/{project}/topics/{topic}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Project"
-#           parent_name_extractor: "projects/{project}"
+#       pattern: "projects/{project}/topics/{topic}"
 #
 # Sometimes, resources have multiple patterns, typically because they can
 # live under multiple parents.
@@ -58,26 +51,10 @@ module Google
 #     message LogEntry {
 #       option (google.api.resource) = {
 #         type: "logging.googleapis.com/LogEntry"
-#         name_descriptor: {
-#           pattern: "projects/{project}/logs/{log}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Project"
-#           parent_name_extractor: "projects/{project}"
-#         }
-#         name_descriptor: {
-#           pattern: "folders/{folder}/logs/{log}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Folder"
-#           parent_name_extractor: "folders/{folder}"
-#         }
-#         name_descriptor: {
-#           pattern: "organizations/{organization}/logs/{log}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Organization"
-#           parent_name_extractor: "organizations/{organization}"
-#         }
-#         name_descriptor: {
-#           pattern: "billingAccounts/{billing_account}/logs/{log}"
-#           parent_type: "billing.googleapis.com/BillingAccount"
-#           parent_name_extractor: "billingAccounts/{billing_account}"
-#         }
+#         pattern: "projects/{project}/logs/{log}"
+#         pattern: "folders/{folder}/logs/{log}"
+#         pattern: "organizations/{organization}/logs/{log}"
+#         pattern: "billingAccounts/{billing_account}/logs/{log}"
 #       };
 #     }
 #
@@ -85,48 +62,10 @@ module Google
 #
 #     resources:
 #     - type: 'logging.googleapis.com/LogEntry'
-#       name_descriptor:
-#         - pattern: "projects/{project}/logs/{log}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Project"
-#           parent_name_extractor: "projects/{project}"
-#         - pattern: "folders/{folder}/logs/{log}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Folder"
-#           parent_name_extractor: "folders/{folder}"
-#         - pattern: "organizations/{organization}/logs/{log}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Organization"
-#           parent_name_extractor: "organizations/{organization}"
-#         - pattern: "billingAccounts/{billing_account}/logs/{log}"
-#           parent_type: "billing.googleapis.com/BillingAccount"
-#           parent_name_extractor: "billingAccounts/{billing_account}"
-#
-# For flexible resources, the resource name doesn't contain parent names, but
-# the resource itself has parents for policy evaluation.
-#
-# Example:
-#
-#     message Shelf {
-#       option (google.api.resource) = {
-#         type: "library.googleapis.com/Shelf"
-#         name_descriptor: {
-#           pattern: "shelves/{shelf}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Project"
-#         }
-#         name_descriptor: {
-#           pattern: "shelves/{shelf}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Folder"
-#         }
-#       };
-#     }
-#
-# The ResourceDescriptor Yaml config will look like:
-#
-#     resources:
-#     - type: 'library.googleapis.com/Shelf'
-#       name_descriptor:
-#         - pattern: "shelves/{shelf}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Project"
-#         - pattern: "shelves/{shelf}"
-#           parent_type: "cloudresourcemanager.googleapis.com/Folder"
+#       pattern: "projects/{project}/logs/{log}"
+#       pattern: "folders/{folder}/logs/{log}"
+#       pattern: "organizations/{organization}/logs/{log}"
+#       pattern: "billingAccounts/{billing_account}/logs/{log}"
 # @!attribute [rw] type
 #   @return [::String]
 #     The resource type. It must be in the format of
@@ -189,10 +189,12 @@ module Google
 #     request.
 #
 #     For explicitly created write streams, the format is:
-#     `projects/{project}/datasets/{dataset}/tables/{table}/streams/{id}`
+#
+#     * `projects/{project}/datasets/{dataset}/tables/{table}/streams/{id}`
 #
 #     For the special default stream, the format is:
-#     `projects/{project}/datasets/{dataset}/tables/{table}/_default`.
+#
+#     * `projects/{project}/datasets/{dataset}/tables/{table}/streams/_default`.
 # @!attribute [rw] offset
 #   @return [::Google::Protobuf::Int64Value]
 #     If present, the write is only performed if the next append offset is same
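The two stream-name formats documented in this hunk (note the default stream now sits under the `streams/` collection) can be sketched as plain string builders; the helper names below are ours, not part of the gem:

```ruby
# Hypothetical helpers mirroring the write-stream name formats above.
def write_stream_name(project, dataset, table, id)
  "projects/#{project}/datasets/#{dataset}/tables/#{table}/streams/#{id}"
end

# The special default stream is just the "_default" id in that collection.
def default_write_stream_name(project, dataset, table)
  write_stream_name(project, dataset, table, "_default")
end

puts default_write_stream_name("my-project", "my_dataset", "my_table")
# projects/my-project/datasets/my_dataset/tables/my_table/streams/_default
```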
@@ -403,6 +405,12 @@ module Google
 # There is a schema mismatch and it is caused by user schema has extra
 # field than bigquery schema.
 SCHEMA_MISMATCH_EXTRA_FIELDS = 7
+
+# Offset already exists.
+OFFSET_ALREADY_EXISTS = 8
+
+# Offset out of range.
+OFFSET_OUT_OF_RANGE = 9
 end
 end
 end
@@ -64,6 +64,14 @@ module Google
 #     Output only. An estimate on the number of bytes this session will scan when
 #     all streams are completely consumed. This estimate is based on
 #     metadata from the table which might be incomplete or stale.
+# @!attribute [rw] trace_id
+#   @return [::String]
+#     Optional. ID set by client to annotate a session identity. This does not need
+#     to be strictly unique, but instead the same ID should be used to group
+#     logically connected sessions (e.g. All using the same ID for all sessions
+#     needed to complete a Spark SQL query is reasonable).
+#
+#     Maximum length is 256 bytes.
 class ReadSession
 include ::Google::Protobuf::MessageExts
 extend ::Google::Protobuf::MessageExts::ClassMethods
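The new `trace_id` documentation caps the value at 256 bytes. A minimal client-side guard might look like this (the helper name is hypothetical; `bytesize` is used because the documented limit is in bytes, not characters):

```ruby
# Hypothetical guard for ReadSession trace_id, whose documented maximum
# length is 256 bytes. Multi-byte characters count per byte, hence bytesize.
def valid_trace_id?(trace_id)
  trace_id.bytesize <= 256
end

# The same ID can be reused across sessions belonging to one logical job,
# e.g. every session created for a single Spark SQL query.
puts valid_trace_id?("spark-sql-query-42")  # true
```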
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: google-cloud-bigquery-storage-v1
 version: !ruby/object:Gem::Version
-  version: 0.9.0
+  version: 0.10.0
 platform: ruby
 authors:
 - Google LLC
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2021-12-08 00:00:00.000000000 Z
+date: 2022-03-04 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: gapic-common
@@ -219,7 +219,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
 - !ruby/object:Gem::Version
   version: '0'
 requirements: []
-rubygems_version: 3.2.17
+rubygems_version: 3.3.5
 signing_key:
 specification_version: 4
 summary: API Client library for the BigQuery Storage V1 API