google-cloud-bigquery-storage-v1 0.8.0 → 0.9.2

This diff shows the changes between two publicly released versions of the package, as published to their respective public registries. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 6693e2ad11b7bf193b9fc692f91d7ffb59251df9fa28ba3ea24f6e77e385bba1
- data.tar.gz: d52647196c983df0ae4cba7fe40267b3ef1d394693257a35a6eefa1b5f75502c
+ metadata.gz: dd6247e85a5ffc569f96771fba5c9a3662c25fbd53e6f8e2e9adb7aee9480376
+ data.tar.gz: 0b501022ceac5a5bf209143d5639eb229adaa604d2aeb7b591910642c20e3dd5
  SHA512:
- metadata.gz: 1a384535064401bb446fc3464358e076be4f6c99dd3d16e15e43e1d73c5d165dbc3cd2e3176dd9ccba0ca7eb728dca959bc26103e88ac433bd3fd3ce14edb4d1
- data.tar.gz: 338a6b1f618ec95fcd662d0b033b0543324e0de62660e51972a52d3872f7c7071387e055474abd2877c2cd75768407f35d3535e6064cafcd284673d815d4f20d
+ metadata.gz: b0018a10cb61fd33add4559fc6cd7c2323d79dd70244ffdaa0ee45d9ab79ab4139f0563739868c8fcf8be9ee5810a3ecb1378955e992f024a784ca8a0ef7f64c
+ data.tar.gz: 6f45f00046cb5b0723f7bfb214b5e7918ef5ecb151e0f3cc7768a2eeeb9fc1e0f1a4753cd89eb38208dc10552dbeb34bcaae3f49c774b1115c5ad91146935010
data/.yardopts CHANGED
@@ -1,5 +1,5 @@
  --no-private
- --title=BigQuery Storage V1 API
+ --title="BigQuery Storage V1 API"
  --exclude _pb\.rb$
  --markup markdown
  --markup-provider redcarpet
data/AUTHENTICATION.md CHANGED
@@ -120,15 +120,6 @@ To configure your system for this, simply:
  **NOTE:** This is _not_ recommended for running in production. The Cloud SDK
  *should* only be used during development.
 
- [gce-how-to]: https://cloud.google.com/compute/docs/authentication#using
- [dev-console]: https://console.cloud.google.com/project
-
- [enable-apis]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/enable-apis.png
-
- [create-new-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account.png
- [create-new-service-account-existing-keys]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/create-new-service-account-existing-keys.png
- [reuse-service-account]: https://raw.githubusercontent.com/GoogleCloudPlatform/gcloud-common/master/authentication/reuse-service-account.png
-
  ## Creating a Service Account
 
  Google Cloud requires **Service Account Credentials** to
@@ -139,31 +130,22 @@ If you are not running this client within
  [Google Cloud Platform environments](#google-cloud-platform-environments), you
  need a Google Developers service account.
 
- 1. Visit the [Google Developers Console][dev-console].
+ 1. Visit the [Google Cloud Console](https://console.cloud.google.com/project).
  2. Create a new project or click on an existing project.
- 3. Activate the slide-out navigation tray and select **API Manager**. From
+ 3. Activate the menu in the upper left and select **APIs & Services**. From
     here, you will enable the APIs that your application requires.
 
-    ![Enable the APIs that your application requires][enable-apis]
-
     *Note: You may need to enable billing in order to use these services.*
 
  4. Select **Credentials** from the side navigation.
 
-    You should see a screen like one of the following.
-
-    ![Create a new service account][create-new-service-account]
-
-    ![Create a new service account With Existing Keys][create-new-service-account-existing-keys]
-
-    Find the "Add credentials" drop down and select "Service account" to be
-    guided through downloading a new JSON key file.
+    Find the "Create credentials" drop down near the top of the page, and select
+    "Service account" to be guided through downloading a new JSON key file.
 
  If you want to re-use an existing service account, you can easily generate a
- new key file. Just select the account you wish to re-use, and click "Generate
- new JSON key":
-
- ![Re-use an existing service account][reuse-service-account]
+ new key file. Just select the account you wish to re-use, click the pencil
+ tool on the right side to edit the service account, select the **Keys** tab,
+ and then select **Add Key**.
 
  The key file you download will be used by this library to authenticate API
  requests and should be stored in a secure location.
data/README.md CHANGED
@@ -37,7 +37,7 @@ request = ::Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest.new #
  response = client.create_read_session request
  ```
 
- View the [Client Library Documentation](https://googleapis.dev/ruby/google-cloud-bigquery-storage-v1/latest)
+ View the [Client Library Documentation](https://cloud.google.com/ruby/docs/reference/google-cloud-bigquery-storage-v1/latest)
  for class and method documentation.
 
  See also the [Product Documentation](https://cloud.google.com/bigquery/docs/reference/storage)
@@ -221,6 +221,21 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest.new
+ #
+ #   # Call the create_read_session method.
+ #   result = client.create_read_session request
+ #
+ #   # The returned object is of type Google::Cloud::Bigquery::Storage::V1::ReadSession.
+ #   p result
+ #
  def create_read_session request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -238,9 +253,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "read_session.table" => request.read_session.table
- }
+ header_params = {}
+ if request.read_session&.table
+   header_params["read_session.table"] = request.read_session.table
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
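The guarded form introduced above avoids raising when the optional `read_session` sub-message is unset: safe navigation (`&.`) yields `nil` instead of a `NoMethodError`, so the routing header is simply omitted. A minimal standalone sketch of the pattern, where `ReadSession` and `ReadRequest` are hypothetical stand-ins for the generated protobuf classes:

```ruby
# Hypothetical stand-ins for the generated protobuf request classes;
# only the nil-guard pattern itself mirrors the change above.
ReadSession = Struct.new(:table)
ReadRequest = Struct.new(:read_session)

def request_params_header(request)
  header_params = {}
  # Safe navigation returns nil (rather than raising NoMethodError)
  # when read_session is unset, so the header is simply omitted.
  if request.read_session&.table
    header_params["read_session.table"] = request.read_session.table
  end
  header_params.map { |k, v| "#{k}=#{v}" }.join("&")
end
```

With a populated session this produces `read_session.table=...`; with `read_session` unset it produces an empty string instead of raising, which appears to be the motivation for the guard.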
@@ -299,6 +316,24 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::ReadRowsRequest.new
+ #
+ #   # Call the read_rows method.
+ #   result = client.read_rows request
+ #
+ #   # The returned object is a streamed enumerable yielding elements of
+ #   # type ::Google::Cloud::Bigquery::Storage::V1::ReadRowsResponse.
+ #   result.each do |response|
+ #     p response
+ #   end
+ #
  def read_rows request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -316,9 +351,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "read_stream" => request.read_stream
- }
+ header_params = {}
+ if request.read_stream
+   header_params["read_stream"] = request.read_stream
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
@@ -386,6 +423,21 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::SplitReadStreamRequest.new
+ #
+ #   # Call the split_read_stream method.
+ #   result = client.split_read_stream request
+ #
+ #   # The returned object is of type Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse.
+ #   p result
+ #
  def split_read_stream request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -403,9 +455,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "name" => request.name
- }
+ header_params = {}
+ if request.name
+   header_params["name"] = request.name
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
@@ -28,7 +28,6 @@ module Google
  class Credentials < ::Google::Auth::Credentials
    self.scope = [
      "https://www.googleapis.com/auth/bigquery",
-     "https://www.googleapis.com/auth/bigquery.readonly",
      "https://www.googleapis.com/auth/cloud-platform"
    ]
    self.env_vars = [
@@ -217,6 +217,21 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryWrite::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::CreateWriteStreamRequest.new
+ #
+ #   # Call the create_write_stream method.
+ #   result = client.create_write_stream request
+ #
+ #   # The returned object is of type Google::Cloud::Bigquery::Storage::V1::WriteStream.
+ #   p result
+ #
  def create_write_stream request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -234,9 +249,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "parent" => request.parent
- }
+ header_params = {}
+ if request.parent
+   header_params["parent"] = request.parent
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
@@ -302,6 +319,30 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryWrite::Client.new
+ #
+ #   # Create an input stream
+ #   input = Gapic::StreamInput.new
+ #
+ #   # Call the append_rows method to start streaming.
+ #   output = client.append_rows input
+ #
+ #   # Send requests on the stream. For each request, pass in keyword
+ #   # arguments to set fields. Be sure to close the stream when done.
+ #   input << Google::Cloud::Bigquery::Storage::V1::AppendRowsRequest.new
+ #   input << Google::Cloud::Bigquery::Storage::V1::AppendRowsRequest.new
+ #   input.close
+ #
+ #   # Handle streamed responses. These may be interleaved with inputs.
+ #   # Each response is of type ::Google::Cloud::Bigquery::Storage::V1::AppendRowsResponse.
+ #   output.each do |response|
+ #     p response
+ #   end
+ #
  def append_rows request, options = nil
    unless request.is_a? ::Enumerable
      raise ::ArgumentError, "request must be an Enumerable" unless request.respond_to? :to_enum
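The `Gapic::StreamInput` object in the example above lets the caller push requests onto an open bidirectional stream with `<<` and terminate it with `close`. A simplified, queue-backed sketch of that pattern (this is not the real `gapic-common` class; `StreamSketch` is a hypothetical name used only for illustration):

```ruby
# Queue-backed input stream: `<<` pushes a request, `close` ends the
# stream, and `to_enum` exposes the pushed items as an Enumerator,
# which is the shape a streaming RPC method would consume.
class StreamSketch
  CLOSE = Object.new # sentinel marking end-of-stream

  def initialize
    @queue = Queue.new
  end

  def <<(request)
    @queue << request
    self # return self so pushes can be chained
  end

  def close
    @queue << CLOSE
    self
  end

  def to_enum
    Enumerator.new do |yielder|
      loop do
        item = @queue.pop
        break if item.equal?(CLOSE)
        yielder << item
      end
    end
  end
end
```

Because the consumer side pops from a thread-safe `Queue`, requests can be pushed from one thread while an RPC drains the enumerator on another, which is the essence of the bidirectional streaming call shape.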
@@ -370,6 +411,21 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryWrite::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::GetWriteStreamRequest.new
+ #
+ #   # Call the get_write_stream method.
+ #   result = client.get_write_stream request
+ #
+ #   # The returned object is of type Google::Cloud::Bigquery::Storage::V1::WriteStream.
+ #   p result
+ #
  def get_write_stream request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -387,9 +443,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "name" => request.name
- }
+ header_params = {}
+ if request.name
+   header_params["name"] = request.name
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
@@ -440,6 +498,21 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryWrite::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::FinalizeWriteStreamRequest.new
+ #
+ #   # Call the finalize_write_stream method.
+ #   result = client.finalize_write_stream request
+ #
+ #   # The returned object is of type Google::Cloud::Bigquery::Storage::V1::FinalizeWriteStreamResponse.
+ #   p result
+ #
  def finalize_write_stream request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -457,9 +530,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "name" => request.name
- }
+ header_params = {}
+ if request.name
+   header_params["name"] = request.name
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
@@ -516,6 +591,21 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryWrite::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::BatchCommitWriteStreamsRequest.new
+ #
+ #   # Call the batch_commit_write_streams method.
+ #   result = client.batch_commit_write_streams request
+ #
+ #   # The returned object is of type Google::Cloud::Bigquery::Storage::V1::BatchCommitWriteStreamsResponse.
+ #   p result
+ #
  def batch_commit_write_streams request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -533,9 +623,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "parent" => request.parent
- }
+ header_params = {}
+ if request.parent
+   header_params["parent"] = request.parent
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
@@ -594,6 +686,21 @@ module Google
  #
  # @raise [::Google::Cloud::Error] if the RPC is aborted.
  #
+ # @example Basic example
+ #   require "google/cloud/bigquery/storage/v1"
+ #
+ #   # Create a client object. The client can be reused for multiple calls.
+ #   client = Google::Cloud::Bigquery::Storage::V1::BigQueryWrite::Client.new
+ #
+ #   # Create a request. To set request fields, pass in keyword arguments.
+ #   request = Google::Cloud::Bigquery::Storage::V1::FlushRowsRequest.new
+ #
+ #   # Call the flush_rows method.
+ #   result = client.flush_rows request
+ #
+ #   # The returned object is of type Google::Cloud::Bigquery::Storage::V1::FlushRowsResponse.
+ #   p result
+ #
  def flush_rows request, options = nil
    raise ::ArgumentError, "request must be provided" if request.nil?
 
@@ -611,9 +718,11 @@ module Google
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id
 
- header_params = {
-   "write_stream" => request.write_stream
- }
+ header_params = {}
+ if request.write_stream
+   header_params["write_stream"] = request.write_stream
+ end
+
  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header
 
@@ -44,6 +44,7 @@ Google::Protobuf::DescriptorPool.generated_pool.build do
    optional :create_time, :message, 3, "google.protobuf.Timestamp"
    optional :commit_time, :message, 4, "google.protobuf.Timestamp"
    optional :table_schema, :message, 5, "google.cloud.bigquery.storage.v1.TableSchema"
+   optional :write_mode, :enum, 7, "google.cloud.bigquery.storage.v1.WriteStream.WriteMode"
  end
  add_enum "google.cloud.bigquery.storage.v1.WriteStream.Type" do
    value :TYPE_UNSPECIFIED, 0
@@ -51,6 +52,10 @@ Google::Protobuf::DescriptorPool.generated_pool.build do
    value :PENDING, 2
    value :BUFFERED, 3
  end
+ add_enum "google.cloud.bigquery.storage.v1.WriteStream.WriteMode" do
+   value :WRITE_MODE_UNSPECIFIED, 0
+   value :INSERT, 1
+ end
  add_enum "google.cloud.bigquery.storage.v1.DataFormat" do
    value :DATA_FORMAT_UNSPECIFIED, 0
    value :AVRO, 1
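Generated protobuf enums surface in Ruby as modules of integer constants, and proto3 reports an unset enum field as its zero value, here `WRITE_MODE_UNSPECIFIED`. A small standalone approximation of the new enum's behavior (plain Ruby, not the generated `enummodule`; `name_for` is a hypothetical helper added for illustration):

```ruby
# Plain-Ruby approximation of the generated WriteMode enum module.
module WriteMode
  WRITE_MODE_UNSPECIFIED = 0 # proto3 default when the field is unset
  INSERT = 1

  # Reverse lookup from integer value to symbolic name, similar in
  # spirit to protobuf enum resolution helpers.
  def self.name_for(value)
    constants.find { |name| const_get(name) == value }
  end
end
```

An unrecognized integer maps to no name, which mirrors how protobuf tolerates unknown enum values from newer servers.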
@@ -70,6 +75,7 @@ module Google
  ReadStream = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.storage.v1.ReadStream").msgclass
  WriteStream = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.storage.v1.WriteStream").msgclass
  WriteStream::Type = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.storage.v1.WriteStream.Type").enummodule
+ WriteStream::WriteMode = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.storage.v1.WriteStream.WriteMode").enummodule
  DataFormat = ::Google::Protobuf::DescriptorPool.generated_pool.lookup("google.cloud.bigquery.storage.v1.DataFormat").enummodule
  end
  end
@@ -22,7 +22,7 @@ module Google
  module Bigquery
    module Storage
      module V1
-       VERSION = "0.8.0"
+       VERSION = "0.9.2"
      end
    end
  end
@@ -33,11 +33,7 @@ module Google
  # // For Kubernetes resources, the format is {api group}/{kind}.
  # option (google.api.resource) = {
  #   type: "pubsub.googleapis.com/Topic"
- #   name_descriptor: {
- #     pattern: "projects/{project}/topics/{topic}"
- #     parent_type: "cloudresourcemanager.googleapis.com/Project"
- #     parent_name_extractor: "projects/{project}"
- #   }
+ #   pattern: "projects/{project}/topics/{topic}"
  # };
  # }
  #
@@ -45,10 +41,7 @@ module Google
  #
  # resources:
  # - type: "pubsub.googleapis.com/Topic"
- #   name_descriptor:
- #     - pattern: "projects/{project}/topics/{topic}"
- #       parent_type: "cloudresourcemanager.googleapis.com/Project"
- #       parent_name_extractor: "projects/{project}"
+ #   pattern: "projects/{project}/topics/{topic}"
  #
  # Sometimes, resources have multiple patterns, typically because they can
  # live under multiple parents.
@@ -58,26 +51,10 @@ module Google
  # message LogEntry {
  #   option (google.api.resource) = {
  #     type: "logging.googleapis.com/LogEntry"
- #     name_descriptor: {
- #       pattern: "projects/{project}/logs/{log}"
- #       parent_type: "cloudresourcemanager.googleapis.com/Project"
- #       parent_name_extractor: "projects/{project}"
- #     }
- #     name_descriptor: {
- #       pattern: "folders/{folder}/logs/{log}"
- #       parent_type: "cloudresourcemanager.googleapis.com/Folder"
- #       parent_name_extractor: "folders/{folder}"
- #     }
- #     name_descriptor: {
- #       pattern: "organizations/{organization}/logs/{log}"
- #       parent_type: "cloudresourcemanager.googleapis.com/Organization"
- #       parent_name_extractor: "organizations/{organization}"
- #     }
- #     name_descriptor: {
- #       pattern: "billingAccounts/{billing_account}/logs/{log}"
- #       parent_type: "billing.googleapis.com/BillingAccount"
- #       parent_name_extractor: "billingAccounts/{billing_account}"
- #     }
+ #     pattern: "projects/{project}/logs/{log}"
+ #     pattern: "folders/{folder}/logs/{log}"
+ #     pattern: "organizations/{organization}/logs/{log}"
+ #     pattern: "billingAccounts/{billing_account}/logs/{log}"
  #   };
  # }
  #
@@ -85,48 +62,10 @@ module Google
  #
  # resources:
  # - type: 'logging.googleapis.com/LogEntry'
- #   name_descriptor:
- #     - pattern: "projects/{project}/logs/{log}"
- #       parent_type: "cloudresourcemanager.googleapis.com/Project"
- #       parent_name_extractor: "projects/{project}"
- #     - pattern: "folders/{folder}/logs/{log}"
- #       parent_type: "cloudresourcemanager.googleapis.com/Folder"
- #       parent_name_extractor: "folders/{folder}"
- #     - pattern: "organizations/{organization}/logs/{log}"
- #       parent_type: "cloudresourcemanager.googleapis.com/Organization"
- #       parent_name_extractor: "organizations/{organization}"
- #     - pattern: "billingAccounts/{billing_account}/logs/{log}"
- #       parent_type: "billing.googleapis.com/BillingAccount"
- #       parent_name_extractor: "billingAccounts/{billing_account}"
- #
- # For flexible resources, the resource name doesn't contain parent names, but
- # the resource itself has parents for policy evaluation.
- #
- # Example:
- #
- #     message Shelf {
- #       option (google.api.resource) = {
- #         type: "library.googleapis.com/Shelf"
- #         name_descriptor: {
- #           pattern: "shelves/{shelf}"
- #           parent_type: "cloudresourcemanager.googleapis.com/Project"
- #         }
- #         name_descriptor: {
- #           pattern: "shelves/{shelf}"
- #           parent_type: "cloudresourcemanager.googleapis.com/Folder"
- #         }
- #       };
- #     }
- #
- # The ResourceDescriptor Yaml config will look like:
- #
- #     resources:
- #     - type: 'library.googleapis.com/Shelf'
- #       name_descriptor:
- #         - pattern: "shelves/{shelf}"
- #           parent_type: "cloudresourcemanager.googleapis.com/Project"
- #         - pattern: "shelves/{shelf}"
- #           parent_type: "cloudresourcemanager.googleapis.com/Folder"
+ #   pattern: "projects/{project}/logs/{log}"
+ #   pattern: "folders/{folder}/logs/{log}"
+ #   pattern: "organizations/{organization}/logs/{log}"
+ #   pattern: "billingAccounts/{billing_account}/logs/{log}"
  # @!attribute [rw] type
  #   @return [::String]
  #     The resource type. It must be in the format of
@@ -141,6 +141,9 @@ module Google
  #     `CreateWriteStream` response. Caller should generate data that's
  #     compatible with this schema to send in initial `AppendRowsRequest`.
  #     The table schema could go out of date during the life time of the stream.
+ # @!attribute [rw] write_mode
+ #   @return [::Google::Cloud::Bigquery::Storage::V1::WriteStream::WriteMode]
+ #     Immutable. Mode of the stream.
  class WriteStream
    include ::Google::Protobuf::MessageExts
    extend ::Google::Protobuf::MessageExts::ClassMethods
@@ -160,6 +163,16 @@ module Google
      # Data is only visible up to the offset to which it was flushed.
      BUFFERED = 3
    end
+
+   # Mode enum of the stream.
+   module WriteMode
+     # Unknown type.
+     WRITE_MODE_UNSPECIFIED = 0
+
+     # Insert new records into the table.
+     # It is the default value if customers do not specify it.
+     INSERT = 1
+   end
  end
 
  # Data format for input or output data.
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: google-cloud-bigquery-storage-v1
  version: !ruby/object:Gem::Version
-   version: 0.8.0
+   version: 0.9.2
  platform: ruby
  authors:
  - Google LLC
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2021-10-18 00:00:00.000000000 Z
+ date: 2022-02-18 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: gapic-common
@@ -219,7 +219,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
    version: '0'
  requirements: []
- rubygems_version: 3.2.17
+ rubygems_version: 3.3.5
  signing_key:
  specification_version: 4
  summary: API Client library for the BigQuery Storage V1 API