google-cloud-bigquery-data_transfer 0.2.2 → 0.2.3

This diff shows the changes between publicly released versions of this package, as they appear in their respective public registries. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 33aea81129379f3601b6ba0f86f43963dc5e9ef660bbb7c297e3e64ae7072843
- data.tar.gz: c8b2d67e1f3a58aafc4f79d794ec68188b6e712f8d5dd64f007bbca61711bb09
+ metadata.gz: 657a3f3f47798f10e04643f1b87cfee8fcb1eb32b9ad0f60de1692efb20b12c6
+ data.tar.gz: 78517e940938cee138dd28078d96f0989f19890a6a4156e570d2741bf21c3fbc
  SHA512:
- metadata.gz: 4132dcbe28e774b73706398ec5677c2bd112ecd0fa0e018749aaec3febf303f6dd150cf4192604f5183a30062796ae88d0ff72a2f0eda848cc9a8daff3538d2f
- data.tar.gz: 438739124cbef463ac281a9320715431ca649e1d646a9c4caa08e81e19700698e108a06ff49a30b48a108aa3f0e48c2c98cf1fdcbc08eba7a63aa721a9d501f1
+ metadata.gz: 61e1693d79ad694f24b4bf79b911ee6175947e2fdf69812354ae4011309168430634949839be8aa71224b60ffbe5ecc313839ce0bf5edfb05734fdc35d9ff450
+ data.tar.gz: 1de3b9a686d0a7d445e92b9da24268d3ec51e845b4a26c835f0618af25c7b83d5f0d6a2a08cd9815cf9d8bc9927109aeecad8dc45f0644129cdaecf257a77310
data/README.md CHANGED
@@ -1,4 +1,4 @@
- # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/GoogleCloudPlatform/google-cloud-ruby#versioning))
+ # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/googleapis/google-cloud-ruby#versioning))

  [BigQuery Data Transfer API][Product Documentation]:
  Transfers data from partner SaaS applications to Google BigQuery on a
@@ -13,7 +13,7 @@ steps:
  1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
  2. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
  3. [Enable the BigQuery Data Transfer API.](https://console.cloud.google.com/apis/library/bigquerydatatransfer.googleapis.com)
- 4. [Setup Authentication.](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
+ 4. [Setup Authentication.](https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)

  ### Installation
  ```
@@ -47,17 +47,17 @@ end
  to see other available methods on the client.
  - Read the [BigQuery Data Transfer API Product documentation][Product Documentation]
  to learn more about the product and see How-to Guides.
- - View this [repository's main README](https://github.com/GoogleCloudPlatform/google-cloud-ruby/blob/master/README.md)
+ - View this [repository's main README](https://github.com/googleapis/google-cloud-ruby/blob/master/README.md)
  to see the full list of Cloud APIs that we cover.

- [Client Library Documentation]: https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-bigquery-data_transfer/latest/google/cloud/bigquery/datatransfer/v1
+ [Client Library Documentation]: https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud-bigquery-data_transfer/latest/google/cloud/bigquery/datatransfer/v1
  [Product Documentation]: https://cloud.google.com/bigquery/transfer/

  ## Enabling Logging

  To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
  The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib-2.5.0/libdoc/logger/rdoc/Logger.html) as shown below,
- or a [`Google::Cloud::Logging::Logger`](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
+ or a [`Google::Cloud::Logging::Logger`](https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
  that will write logs to [Stackdriver Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
  and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.

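The README's logging instructions can be sketched in plain Ruby. This is a minimal sketch of the standard gRPC Ruby logging hook (a `logger` method defined on the `GRPC` module before gRPC initializes), not code shipped in this gem:

```ruby
require "logger"

# Minimal sketch: route gRPC's internal logging to a Ruby stdlib Logger.
# gRPC's logconfig.rb falls back to a noop logger only when the GRPC
# module does not already respond to `logger`, so defining this hook
# before gRPC loads redirects its log output.
module MyLogger
  LOGGER = Logger.new $stderr, level: Logger::WARN
  def logger
    LOGGER
  end
end

# Re-open the GRPC namespace and mix in the logger method.
module GRPC
  extend MyLogger
end
```

A `Google::Cloud::Logging::Logger` could be substituted for the stdlib `Logger` to send the same output to Stackdriver Logging, as the README notes.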
@@ -22,7 +22,7 @@ module Google
  # rubocop:disable LineLength

  ##
- # # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/GoogleCloudPlatform/google-cloud-ruby#versioning))
+ # # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/googleapis/google-cloud-ruby#versioning))
  #
  # [BigQuery Data Transfer API][Product Documentation]:
  # Transfers data from partner SaaS applications to Google BigQuery on a
@@ -36,7 +36,7 @@ module Google
  # 1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
  # 2. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
  # 3. [Enable the BigQuery Data Transfer API.](https://console.cloud.google.com/apis/library/bigquerydatatransfer.googleapis.com)
- # 4. [Setup Authentication.](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
+ # 4. [Setup Authentication.](https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
  #
  # ### Installation
  # ```
@@ -68,7 +68,7 @@ module Google
  # ### Next Steps
  # - Read the [BigQuery Data Transfer API Product documentation][Product Documentation]
  # to learn more about the product and see How-to Guides.
- # - View this [repository's main README](https://github.com/GoogleCloudPlatform/google-cloud-ruby/blob/master/README.md)
+ # - View this [repository's main README](https://github.com/googleapis/google-cloud-ruby/blob/master/README.md)
  # to see the full list of Cloud APIs that we cover.
  #
  # [Product Documentation]: https://cloud.google.com/bigquery/transfer/
@@ -77,7 +77,7 @@ module Google
  #
  # To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
  # The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib-2.5.0/libdoc/logger/rdoc/Logger.html) as shown below,
- # or a [`Google::Cloud::Logging::Logger`](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
+ # or a [`Google::Cloud::Logging::Logger`](https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
  # that will write logs to [Stackdriver Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
  # and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.
  #
@@ -22,7 +22,7 @@ module Google
  # rubocop:disable LineLength

  ##
- # # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/GoogleCloudPlatform/google-cloud-ruby#versioning))
+ # # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/googleapis/google-cloud-ruby#versioning))
  #
  # [BigQuery Data Transfer API][Product Documentation]:
  # Transfers data from partner SaaS applications to Google BigQuery on a
@@ -36,7 +36,7 @@ module Google
  # 1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
  # 2. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
  # 3. [Enable the BigQuery Data Transfer API.](https://console.cloud.google.com/apis/library/bigquerydatatransfer.googleapis.com)
- # 4. [Setup Authentication.](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
+ # 4. [Setup Authentication.](https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
  #
  # ### Installation
  # ```
@@ -68,7 +68,7 @@ module Google
  # ### Next Steps
  # - Read the [BigQuery Data Transfer API Product documentation][Product Documentation]
  # to learn more about the product and see How-to Guides.
- # - View this [repository's main README](https://github.com/GoogleCloudPlatform/google-cloud-ruby/blob/master/README.md)
+ # - View this [repository's main README](https://github.com/googleapis/google-cloud-ruby/blob/master/README.md)
  # to see the full list of Cloud APIs that we cover.
  #
  # [Product Documentation]: https://cloud.google.com/bigquery/transfer/
@@ -77,7 +77,7 @@ module Google
  #
  # To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
  # The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib-2.5.0/libdoc/logger/rdoc/Logger.html) as shown below,
- # or a [`Google::Cloud::Logging::Logger`](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
+ # or a [`Google::Cloud::Logging::Logger`](https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
  # that will write logs to [Stackdriver Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
  # and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.
  #
@@ -367,7 +367,7 @@ module Google
  #
  # @param name [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/dataSources/\\{data_source_id}+
+ # `projects/{project_id}/dataSources/{data_source_id}`
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
@@ -399,7 +399,7 @@ module Google
  #
  # @param parent [String]
  # The BigQuery project id for which data sources should be returned.
- # Must be in the form: +projects/\\{project_id}+
+ # Must be in the form: `projects/{project_id}`
  # @param page_size [Integer]
  # The maximum number of resources contained in the underlying API
  # response. If page streaming is performed per-resource, this
@@ -464,7 +464,7 @@ module Google
  # @param authorization_code [String]
  # Optional OAuth2 authorization code to use with this transfer configuration.
  # This is required if new credentials are needed, as indicated by
- # +CheckValidCreds+.
+ # `CheckValidCreds`.
  # In order to obtain authorization_code, please make a
  # request to
  # https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?client_id=<datatransferapiclientid>&scope=<data_source_scopes>&redirect_uri=<redirect_uri>
@@ -492,7 +492,7 @@ module Google
  # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path("[PROJECT]")
  #
- # # TODO: Initialize +transfer_config+:
+ # # TODO: Initialize `transfer_config`:
  # transfer_config = {}
  # response = data_transfer_service_client.create_transfer_config(formatted_parent, transfer_config)

@@ -552,10 +552,10 @@ module Google
  #
  # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  #
- # # TODO: Initialize +transfer_config+:
+ # # TODO: Initialize `transfer_config`:
  # transfer_config = {}
  #
- # # TODO: Initialize +update_mask+:
+ # # TODO: Initialize `update_mask`:
  # update_mask = {}
  # response = data_transfer_service_client.update_transfer_config(transfer_config, update_mask)

@@ -579,7 +579,7 @@ module Google
  #
  # @param name [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}`
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
@@ -610,7 +610,7 @@ module Google
  #
  # @param name [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}`
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
@@ -641,7 +641,7 @@ module Google
  #
  # @param parent [String]
  # The BigQuery project id for which data sources
- # should be returned: +projects/\\{project_id}+.
+ # should be returned: `projects/{project_id}`.
  # @param data_source_ids [Array<String>]
  # When specified, only configurations of requested data sources are returned.
  # @param page_size [Integer]
@@ -703,15 +703,15 @@ module Google
  #
  # @param parent [String]
  # Transfer configuration name in the form:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+.
+ # `projects/{project_id}/transferConfigs/{config_id}`.
  # @param start_time [Google::Protobuf::Timestamp | Hash]
  # Start time of the range of transfer runs. For example,
- # +"2017-05-25T00:00:00+00:00"+.
+ # `"2017-05-25T00:00:00+00:00"`.
  # A hash of the same form as `Google::Protobuf::Timestamp`
  # can also be provided.
  # @param end_time [Google::Protobuf::Timestamp | Hash]
  # End time of the range of transfer runs. For example,
- # +"2017-05-30T00:00:00+00:00"+.
+ # `"2017-05-30T00:00:00+00:00"`.
  # A hash of the same form as `Google::Protobuf::Timestamp`
  # can also be provided.
  # @param options [Google::Gax::CallOptions]
@@ -728,10 +728,10 @@ module Google
  # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new(version: :v1)
  # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_transfer_config_path("[PROJECT]", "[TRANSFER_CONFIG]")
  #
- # # TODO: Initialize +start_time+:
+ # # TODO: Initialize `start_time`:
  # start_time = {}
  #
- # # TODO: Initialize +end_time+:
+ # # TODO: Initialize `end_time`:
  # end_time = {}
  # response = data_transfer_service_client.schedule_transfer_runs(formatted_parent, start_time, end_time)

@@ -754,7 +754,7 @@ module Google
  #
  # @param name [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}/runs/\\{run_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}/runs/{run_id}`
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
@@ -785,7 +785,7 @@ module Google
  #
  # @param name [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}/runs/\\{run_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}/runs/{run_id}`
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
@@ -817,7 +817,7 @@ module Google
  # @param parent [String]
  # Name of transfer configuration for which transfer runs should be retrieved.
  # Format of transfer configuration resource name is:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+.
+ # `projects/{project_id}/transferConfigs/{config_id}`.
  # @param states [Array<Google::Cloud::Bigquery::Datatransfer::V1::TransferState>]
  # When specified, only transfer runs with requested states are returned.
  # @param page_size [Integer]
@@ -880,7 +880,7 @@ module Google
  #
  # @param parent [String]
  # Transfer run name in the form:
- # +projects/\\{project_id}/transferConfigs/\\{config_Id}/runs/\\{run_id}+.
+ # `projects/{project_id}/transferConfigs/{config_Id}/runs/{run_id}`.
  # @param page_size [Integer]
  # The maximum number of resources contained in the underlying API
  # response. If page streaming is performed per-resource, this
@@ -945,7 +945,7 @@ module Google
  #
  # @param name [String]
  # The data source in the form:
- # +projects/\\{project_id}/dataSources/\\{data_source_id}+
+ # `projects/{project_id}/dataSources/{data_source_id}`
  # @param options [Google::Gax::CallOptions]
  # Overrides the default settings for this call, e.g, timeout,
  # retries, etc.
@@ -25,8 +25,8 @@ module Google
  # | [DataTransferServiceClient][] | The Google BigQuery Data Transfer Service API enables BigQuery users to configure the transfer of their data from other Google Products into BigQuery. |
  # | [Data Types][] | Data types for Google::Cloud::Bigquery::DataTransfer::V1 |
  #
- # [DataTransferServiceClient]: https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-bigquery-data_transfer/latest/google/cloud/bigquery/datatransfer/v1/datatransferserviceclient
- # [Data Types]: https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-bigquery-data_transfer/latest/google/cloud/bigquery/datatransfer/v1/datatypes
+ # [DataTransferServiceClient]: https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud-bigquery-data_transfer/latest/google/cloud/bigquery/datatransfer/v1/datatransferserviceclient
+ # [Data Types]: https://googleapis.github.io/google-cloud-ruby/#/docs/google-cloud-bigquery-data_transfer/latest/google/cloud/bigquery/datatransfer/v1/datatypes
  #
  module V1
  # Represents a data source parameter with validation rules, so that
@@ -119,7 +119,7 @@ module Google
  # @!attribute [rw] scopes
  # @return [Array<String>]
  # Api auth scopes for which refresh token needs to be obtained. Only valid
- # when +client_id+ is specified. Ignored otherwise. These are scopes needed
+ # when `client_id` is specified. Ignored otherwise. These are scopes needed
  # by a data source to prepare data and ingest them into BigQuery,
  # e.g., https://www.googleapis.com/auth/bigquery
  # @!attribute [rw] transfer_type
@@ -137,14 +137,14 @@ module Google
  # @return [String]
  # Default data transfer schedule.
  # Examples of valid schedules include:
- # +1st,3rd monday of month 15:30+,
- # +every wed,fri of jan,jun 13:15+, and
- # +first sunday of quarter 00:00+.
+ # `1st,3rd monday of month 15:30`,
+ # `every wed,fri of jan,jun 13:15`, and
+ # `first sunday of quarter 00:00`.
  # @!attribute [rw] supports_custom_schedule
  # @return [true, false]
  # Specifies whether the data source supports a user defined schedule, or
  # operates on the default schedule.
- # When set to +true+, user can override default schedule.
+ # When set to `true`, user can override default schedule.
  # @!attribute [rw] parameters
  # @return [Array<Google::Cloud::Bigquery::Datatransfer::V1::DataSourceParameter>]
  # Data source parameters.
@@ -163,7 +163,7 @@ module Google
  # @!attribute [rw] default_data_refresh_window_days
  # @return [Integer]
  # Default data refresh window on days.
- # Only meaningful when +data_refresh_type+ = +SLIDING_WINDOW+.
+ # Only meaningful when `data_refresh_type` = `SLIDING_WINDOW`.
  # @!attribute [rw] manual_runs_disabled
  # @return [true, false]
  # Disables backfilling and manual run scheduling
@@ -207,21 +207,21 @@ module Google
  # @!attribute [rw] name
  # @return [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/dataSources/\\{data_source_id}+
+ # `projects/{project_id}/dataSources/{data_source_id}`
  class GetDataSourceRequest; end

  # Request to list supported data sources and their data transfer settings.
  # @!attribute [rw] parent
  # @return [String]
  # The BigQuery project id for which data sources should be returned.
- # Must be in the form: +projects/\\{project_id}+
+ # Must be in the form: `projects/{project_id}`
  # @!attribute [rw] page_token
  # @return [String]
  # Pagination token, which can be used to request a specific page
- # of +ListDataSourcesRequest+ list results. For multiple-page
- # results, +ListDataSourcesResponse+ outputs
- # a +next_page+ token, which can be used as the
- # +page_token+ value to request the next page of list results.
+ # of `ListDataSourcesRequest` list results. For multiple-page
+ # results, `ListDataSourcesResponse` outputs
+ # a `next_page` token, which can be used as the
+ # `page_token` value to request the next page of list results.
  # @!attribute [rw] page_size
  # @return [Integer]
  # Page size. The default page size is the maximum value of 1000 results.
@@ -235,7 +235,7 @@ module Google
  # @return [String]
  # Output only. The next-pagination token. For multiple-page list results,
  # this token can be used as the
- # +ListDataSourcesRequest.page_token+
+ # `ListDataSourcesRequest.page_token`
  # to request the next page of list results.
  class ListDataSourcesResponse; end

@@ -258,7 +258,7 @@ module Google
  # @return [String]
  # Optional OAuth2 authorization code to use with this transfer configuration.
  # This is required if new credentials are needed, as indicated by
- # +CheckValidCreds+.
+ # `CheckValidCreds`.
  # In order to obtain authorization_code, please make a
  # request to
  # https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?client_id=<datatransferapiclientid>&scope=<data_source_scopes>&redirect_uri=<redirect_uri>
@@ -306,7 +306,7 @@ module Google
  # @!attribute [rw] name
  # @return [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}`
  class GetTransferConfigRequest; end

  # A request to delete data transfer information. All associated transfer runs
@@ -314,38 +314,38 @@ module Google
  # @!attribute [rw] name
  # @return [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}`
  class DeleteTransferConfigRequest; end

  # A request to get data transfer run information.
  # @!attribute [rw] name
  # @return [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}/runs/\\{run_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}/runs/{run_id}`
  class GetTransferRunRequest; end

  # A request to delete data transfer run information.
  # @!attribute [rw] name
  # @return [String]
  # The field will contain name of the resource requested, for example:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}/runs/\\{run_id}+
+ # `projects/{project_id}/transferConfigs/{config_id}/runs/{run_id}`
  class DeleteTransferRunRequest; end

  # A request to list data transfers configured for a BigQuery project.
  # @!attribute [rw] parent
  # @return [String]
  # The BigQuery project id for which data sources
- # should be returned: +projects/\\{project_id}+.
+ # should be returned: `projects/{project_id}`.
  # @!attribute [rw] data_source_ids
  # @return [Array<String>]
  # When specified, only configurations of requested data sources are returned.
  # @!attribute [rw] page_token
  # @return [String]
  # Pagination token, which can be used to request a specific page
- # of +ListTransfersRequest+ list results. For multiple-page
- # results, +ListTransfersResponse+ outputs
- # a +next_page+ token, which can be used as the
- # +page_token+ value to request the next page of list results.
+ # of `ListTransfersRequest` list results. For multiple-page
+ # results, `ListTransfersResponse` outputs
+ # a `next_page` token, which can be used as the
+ # `page_token` value to request the next page of list results.
  # @!attribute [rw] page_size
  # @return [Integer]
  # Page size. The default page size is the maximum value of 1000 results.
@@ -359,7 +359,7 @@ module Google
  # @return [String]
  # Output only. The next-pagination token. For multiple-page list results,
  # this token can be used as the
- # +ListTransferConfigsRequest.page_token+
+ # `ListTransferConfigsRequest.page_token`
  # to request the next page of list results.
  class ListTransferConfigsResponse; end

@@ -370,17 +370,17 @@ module Google
  # @return [String]
  # Name of transfer configuration for which transfer runs should be retrieved.
  # Format of transfer configuration resource name is:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+.
+ # `projects/{project_id}/transferConfigs/{config_id}`.
  # @!attribute [rw] states
  # @return [Array<Google::Cloud::Bigquery::Datatransfer::V1::TransferState>]
  # When specified, only transfer runs with requested states are returned.
  # @!attribute [rw] page_token
  # @return [String]
  # Pagination token, which can be used to request a specific page
- # of +ListTransferRunsRequest+ list results. For multiple-page
- # results, +ListTransferRunsResponse+ outputs
- # a +next_page+ token, which can be used as the
- # +page_token+ value to request the next page of list results.
+ # of `ListTransferRunsRequest` list results. For multiple-page
+ # results, `ListTransferRunsResponse` outputs
+ # a `next_page` token, which can be used as the
+ # `page_token` value to request the next page of list results.
  # @!attribute [rw] page_size
  # @return [Integer]
  # Page size. The default page size is the maximum value of 1000 results.
@@ -406,7 +406,7 @@ module Google
  # @return [String]
  # Output only. The next-pagination token. For multiple-page list results,
  # this token can be used as the
- # +ListTransferRunsRequest.page_token+
+ # `ListTransferRunsRequest.page_token`
  # to request the next page of list results.
  class ListTransferRunsResponse; end

@@ -414,14 +414,14 @@ module Google
  # @!attribute [rw] parent
  # @return [String]
  # Transfer run name in the form:
- # +projects/\\{project_id}/transferConfigs/\\{config_Id}/runs/\\{run_id}+.
+ # `projects/{project_id}/transferConfigs/{config_Id}/runs/{run_id}`.
  # @!attribute [rw] page_token
  # @return [String]
  # Pagination token, which can be used to request a specific page
- # of +ListTransferLogsRequest+ list results. For multiple-page
- # results, +ListTransferLogsResponse+ outputs
- # a +next_page+ token, which can be used as the
- # +page_token+ value to request the next page of list results.
+ # of `ListTransferLogsRequest` list results. For multiple-page
+ # results, `ListTransferLogsResponse` outputs
+ # a `next_page` token, which can be used as the
+ # `page_token` value to request the next page of list results.
  # @!attribute [rw] page_size
  # @return [Integer]
  # Page size. The default page size is the maximum value of 1000 results.
@@ -439,7 +439,7 @@ module Google
  # @return [String]
  # Output only. The next-pagination token. For multiple-page list results,
  # this token can be used as the
- # +GetTransferRunLogRequest.page_token+
+ # `GetTransferRunLogRequest.page_token`
  # to request the next page of list results.
  class ListTransferLogsResponse; end

@@ -452,28 +452,28 @@ module Google
  # @!attribute [rw] name
  # @return [String]
  # The data source in the form:
- # +projects/\\{project_id}/dataSources/\\{data_source_id}+
+ # `projects/{project_id}/dataSources/{data_source_id}`
  class CheckValidCredsRequest; end

  # A response indicating whether the credentials exist and are valid.
  # @!attribute [rw] has_valid_creds
  # @return [true, false]
- # If set to +true+, the credentials exist and are valid.
+ # If set to `true`, the credentials exist and are valid.
  class CheckValidCredsResponse; end

  # A request to schedule transfer runs for a time range.
  # @!attribute [rw] parent
  # @return [String]
  # Transfer configuration name in the form:
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+.
+ # `projects/{project_id}/transferConfigs/{config_id}`.
  # @!attribute [rw] start_time
  # @return [Google::Protobuf::Timestamp]
  # Start time of the range of transfer runs. For example,
- # +"2017-05-25T00:00:00+00:00"+.
+ # `"2017-05-25T00:00:00+00:00"`.
  # @!attribute [rw] end_time
  # @return [Google::Protobuf::Timestamp]
  # End time of the range of transfer runs. For example,
- # +"2017-05-30T00:00:00+00:00"+.
+ # `"2017-05-30T00:00:00+00:00"`.
  class ScheduleTransferRunsRequest; end

  # A response to schedule transfer runs for a time range.
@@ -20,16 +20,16 @@ module Google
  module V1
  # Represents a data transfer configuration. A transfer configuration
  # contains all metadata needed to perform a data transfer. For example,
- # +destination_dataset_id+ specifies where data should be stored.
+ # `destination_dataset_id` specifies where data should be stored.
  # When a new transfer configuration is created, the specified
- # +destination_dataset_id+ is created when needed and shared with the
+ # `destination_dataset_id` is created when needed and shared with the
  # appropriate data source service account.
  # @!attribute [rw] name
  # @return [String]
  # The resource name of the transfer config.
  # Transfer config names have the form
- # +projects/\\{project_id}/transferConfigs/\\{config_id}+.
- # Where +config_id+ is usually a uuid, even though it is not
+ # `projects/{project_id}/transferConfigs/{config_id}`.
+ # Where `config_id` is usually a uuid, even though it is not
  # guaranteed or required. The name is ignored when creating a transfer
  # config.
  # @!attribute [rw] destination_dataset_id
@@ -52,16 +52,16 @@ module Google
  # used.
  # The specified times are in UTC.
  # Examples of valid format:
- # +1st,3rd monday of month 15:30+,
- # +every wed,fri of jan,jun 13:15+, and
- # +first sunday of quarter 00:00+.
+ # `1st,3rd monday of month 15:30`,
+ # `every wed,fri of jan,jun 13:15`, and
+ # `first sunday of quarter 00:00`.
  # See more explanation about the format here:
  # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
  # NOTE: the granularity should be at least 8 hours, or less frequent.
  # @!attribute [rw] data_refresh_window_days
  # @return [Integer]
  # The number of days to look back to automatically refresh the data.
- # For example, if +data_refresh_window_days = 10+, then every day
+ # For example, if `data_refresh_window_days = 10`, then every day
  # BigQuery reingests data for [today-10, today-1], rather than ingesting data
  # for just [today-1].
  # Only valid if the data source supports the feature. Set the value to 0
@@ -96,7 +96,7 @@ module Google
  # @return [String]
  # The resource name of the transfer run.
  # Transfer run names have the form
- # +projects/\\{project_id}/locations/\\{location}/transferConfigs/\\{config_id}/runs/\\{run_id}+.
+ # `projects/{project_id}/locations/{location}/transferConfigs/{config_id}/runs/{run_id}`.
  # The name is ignored when creating a transfer run.
  # @!attribute [rw] schedule_time
  # @return [Google::Protobuf::Timestamp]
@@ -144,7 +144,7 @@ module Google
  # created as part of a regular schedule. For batch transfer runs that are
  # scheduled manually, this is empty.
  # NOTE: the system might choose to delay the schedule depending on the
- # current load, so +schedule_time+ doesn't always matches this.
+ # current load, so `schedule_time` doesn't always matches this.
  class TransferRun; end

  # Represents a user facing message for a particular data transfer run.
@@ -15,7 +15,7 @@

  module Google
  module Protobuf
- # +Any+ contains an arbitrary serialized protocol buffer message along with a
+ # `Any` contains an arbitrary serialized protocol buffer message along with a
  # URL that describes the type of the serialized message.
  #
  # Protobuf library provides support to pack/unpack Any values in the form
@@ -69,9 +69,9 @@ module Google
  #
  # = JSON
  #
- # The JSON representation of an +Any+ value uses the regular
+ # The JSON representation of an `Any` value uses the regular
  # representation of the deserialized, embedded message, with an
- # additional field +@type+ which contains the type URL. Example:
+ # additional field `@type` which contains the type URL. Example:
  #
  # package google.profile;
  # message Person {
@@ -87,7 +87,7 @@ module Google
  #
  # If the embedded message type is well-known and has a custom JSON
  # representation, that representation will be embedded adding a field
- # +value+ which holds the custom JSON in addition to the +@type+
+ # `value` which holds the custom JSON in addition to the `@type`
  # field. Example (for message {Google::Protobuf::Duration}):
  #
  # {
@@ -99,15 +99,15 @@ module Google
  # A URL/resource name that uniquely identifies the type of the serialized
  # protocol buffer message. The last segment of the URL's path must represent
  # the fully qualified name of the type (as in
- # +path/google.protobuf.Duration+). The name should be in a canonical form
+ # `path/google.protobuf.Duration`). The name should be in a canonical form
  # (e.g., leading "." is not accepted).
  #
  # In practice, teams usually precompile into the binary all types that they
  # expect it to use in the context of Any. However, for URLs which use the
- # scheme +http+, +https+, or no scheme, one can optionally set up a type
+ # scheme `http`, `https`, or no scheme, one can optionally set up a type
  # server that maps type URLs to message definitions as follows:
  #
- # * If no scheme is provided, +https+ is assumed.
+ # * If no scheme is provided, `https` is assumed.
  # * An HTTP GET on the URL must yield a {Google::Protobuf::Type}
  # value in binary format, or produce an error.
  # * Applications are allowed to cache lookup results based on the
@@ -120,7 +120,7 @@ module Google
  # protobuf release, and it is not used for type URLs beginning with
  # type.googleapis.com.
  #
- # Schemes other than +http+, +https+ (or the empty scheme) might be
+ # Schemes other than `http`, `https` (or the empty scheme) might be
  # used with implementation specific semantics.
  # @!attribute [rw] value
  # @return [String]
@@ -82,9 +82,9 @@ module Google
  # @return [Integer]
  # Signed fractions of a second at nanosecond resolution of the span
  # of time. Durations less than one second are represented with a 0
- # +seconds+ field and a positive or negative +nanos+ field. For durations
- # of one second or more, a non-zero value for the +nanos+ field must be
- # of the same sign as the +seconds+ field. Must be from -999,999,999
+ # `seconds` field and a positive or negative `nanos` field. For durations
+ # of one second or more, a non-zero value for the `nanos` field must be
+ # of the same sign as the `seconds` field. Must be from -999,999,999
  # to +999,999,999 inclusive.
  class Duration; end
  end
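The `seconds`/`nanos` sign rule quoted in the hunk above can be illustrated with a small pure-Ruby sketch. `duration_fields` is a hypothetical helper for illustration only; it is not part of this gem or of the protobuf library.

```ruby
# Split a fractional number of seconds into protobuf-Duration-style
# `seconds` and `nanos` fields. Integer truncation toward zero keeps
# the two fields on the same side of zero, matching the documented rule.
def duration_fields(total_seconds)
  seconds = total_seconds.truncate
  nanos = ((total_seconds - seconds) * 1_000_000_000).round
  { seconds: seconds, nanos: nanos }
end

duration_fields(1.5)   # => { seconds: 1, nanos: 500_000_000 }
duration_fields(-1.5)  # => { seconds: -1, nanos: -500_000_000 }
duration_fields(-0.25) # => { seconds: 0, nanos: -250_000_000 }
```

Note how a sub-second negative duration is carried entirely by a negative `nanos` with a zero `seconds`, exactly the case the comment singles out.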
@@ -23,7 +23,7 @@ module Google
  # rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty);
  # }
  #
- # The JSON representation for +Empty+ is empty JSON object +{}+.
+ # The JSON representation for `Empty` is empty JSON object `{}`.
  class Empty; end
  end
  end
@@ -15,14 +15,14 @@

  module Google
  module Protobuf
- # +FieldMask+ represents a set of symbolic field paths, for example:
+ # `FieldMask` represents a set of symbolic field paths, for example:
  #
  # paths: "f.a"
  # paths: "f.b.d"
  #
- # Here +f+ represents a field in some root message, +a+ and +b+
- # fields in the message found in +f+, and +d+ a field found in the
- # message in +f.b+.
+ # Here `f` represents a field in some root message, `a` and `b`
+ # fields in the message found in `f`, and `d` a field found in the
+ # message in `f.b`.
  #
  # Field masks are used to specify a subset of fields that should be
  # returned by a get operation or modified by an update operation.
@@ -85,7 +85,7 @@ module Google
  #
  # If a repeated field is specified for an update operation, the existing
  # repeated values in the target resource will be overwritten by the new values.
- # Note that a repeated field is only allowed in the last position of a +paths+
+ # Note that a repeated field is only allowed in the last position of a `paths`
  # string.
  #
  # If a sub-message is specified in the last position of the field mask for an
@@ -177,7 +177,7 @@ module Google
  # string address = 2;
  # }
  #
- # In proto a field mask for +Profile+ may look as such:
+ # In proto a field mask for `Profile` may look as such:
  #
  # mask {
  # paths: "user.display_name"
@@ -221,7 +221,7 @@ module Google
  #
  # The implementation of any API method which has a FieldMask type field in the
  # request should verify the included field paths, and return an
- # +INVALID_ARGUMENT+ error if any path is duplicated or unmappable.
+ # `INVALID_ARGUMENT` error if any path is duplicated or unmappable.
  # @!attribute [rw] paths
  # @return [Array<String>]
  # The set of field mask paths.
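The path semantics the FieldMask hunks above describe (e.g. `"f.a"`, `"f.b.d"` selecting fields of nested messages) can be sketched in plain Ruby over a nested Hash. `apply_field_mask` is a hypothetical helper written for illustration; it is not part of this gem or of the protobuf library.

```ruby
# Project a nested Hash down to the fields named by a FieldMask-style
# list of dotted paths, e.g. ["f.a", "f.b.d"].
def apply_field_mask(hash, paths)
  paths.each_with_object({}) do |path, out|
    keys = path.split(".")
    # Walk down to the value named by this path; skip paths that do not resolve.
    value = keys.reduce(hash) { |h, k| h.is_a?(Hash) ? h[k] : nil }
    next if value.nil?
    # Rebuild the nested structure for this path in the output.
    *parents, leaf = keys
    target = parents.reduce(out) { |h, k| h[k] ||= {} }
    target[leaf] = value
  end
end

doc = { "f" => { "a" => 1, "b" => { "c" => 2, "d" => 3 } } }
apply_field_mask(doc, ["f.a", "f.b.d"])
# => { "f" => { "a" => 1, "b" => { "d" => 3 } } }
```

This mirrors a field mask applied to a get operation: only the named leaves survive, and `f.b.c` is dropped because no path selects it.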
@@ -15,25 +15,25 @@

  module Google
  module Protobuf
- # +Struct+ represents a structured data value, consisting of fields
- # which map to dynamically typed values. In some languages, +Struct+
+ # `Struct` represents a structured data value, consisting of fields
+ # which map to dynamically typed values. In some languages, `Struct`
  # might be supported by a native representation. For example, in
  # scripting languages like JS a struct is represented as an
  # object. The details of that representation are described together
  # with the proto support for the language.
  #
- # The JSON representation for +Struct+ is JSON object.
+ # The JSON representation for `Struct` is JSON object.
  # @!attribute [rw] fields
  # @return [Hash{String => Google::Protobuf::Value}]
  # Unordered map of dynamically typed values.
  class Struct; end

- # +Value+ represents a dynamically typed value which can be either
+ # `Value` represents a dynamically typed value which can be either
  # null, a number, a string, a boolean, a recursive struct value, or a
  # list of values. A producer of value is expected to set one of that
  # variants, absence of any variant indicates an error.
  #
- # The JSON representation for +Value+ is JSON value.
+ # The JSON representation for `Value` is JSON value.
  # @!attribute [rw] null_value
  # @return [Google::Protobuf::NullValue]
  # Represents a null value.
@@ -51,21 +51,21 @@ module Google
  # Represents a structured value.
  # @!attribute [rw] list_value
  # @return [Google::Protobuf::ListValue]
- # Represents a repeated +Value+.
+ # Represents a repeated `Value`.
  class Value; end

- # +ListValue+ is a wrapper around a repeated field of values.
+ # `ListValue` is a wrapper around a repeated field of values.
  #
- # The JSON representation for +ListValue+ is JSON array.
+ # The JSON representation for `ListValue` is JSON array.
  # @!attribute [rw] values
  # @return [Array<Google::Protobuf::Value>]
  # Repeated field of dynamically typed values.
  class ListValue; end

- # +NullValue+ is a singleton enumeration to represent the null value for the
- # +Value+ type union.
+ # `NullValue` is a singleton enumeration to represent the null value for the
+ # `Value` type union.
  #
- # The JSON representation for +NullValue+ is JSON +null+.
+ # The JSON representation for `NullValue` is JSON `null`.
  module NullValue
  # Null value.
  NULL_VALUE = 0
@@ -29,13 +29,13 @@ module Google
  #
  # = Examples
  #
- # Example 1: Compute Timestamp from POSIX +time()+.
+ # Example 1: Compute Timestamp from POSIX `time()`.
  #
  # Timestamp timestamp;
  # timestamp.set_seconds(time(NULL));
  # timestamp.set_nanos(0);
  #
- # Example 2: Compute Timestamp from POSIX +gettimeofday()+.
+ # Example 2: Compute Timestamp from POSIX `gettimeofday()`.
  #
  # struct timeval tv;
  # gettimeofday(&tv, NULL);
@@ -44,7 +44,7 @@ module Google
  # timestamp.set_seconds(tv.tv_sec);
  # timestamp.set_nanos(tv.tv_usec * 1000);
  #
- # Example 3: Compute Timestamp from Win32 +GetSystemTimeAsFileTime()+.
+ # Example 3: Compute Timestamp from Win32 `GetSystemTimeAsFileTime()`.
  #
  # FILETIME ft;
  # GetSystemTimeAsFileTime(&ft);
@@ -56,7 +56,7 @@ module Google
  # timestamp.set_seconds((INT64) ((ticks / 10000000) - 11644473600LL));
  # timestamp.set_nanos((INT32) ((ticks % 10000000) * 100));
  #
- # Example 4: Compute Timestamp from Java +System.currentTimeMillis()+.
+ # Example 4: Compute Timestamp from Java `System.currentTimeMillis()`.
  #
  # long millis = System.currentTimeMillis();
  #
@@ -87,10 +87,10 @@ module Google
  #
  # In JavaScript, one can convert a Date object to this format using the
  # standard [toISOString()](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/toISOString]
- # method. In Python, a standard +datetime.datetime+ object can be converted
- # to this format using [+strftime+](https://docs.python.org/2/library/time.html#time.strftime)
+ # method. In Python, a standard `datetime.datetime` object can be converted
+ # to this format using [`strftime`](https://docs.python.org/2/library/time.html#time.strftime)
  # with the time format spec '%Y-%m-%dT%H:%M:%S.%fZ'. Likewise, in Java, one
- # can use the Joda Time's [+ISODateTimeFormat.dateTime()+](
+ # can use the Joda Time's [`ISODateTimeFormat.dateTime()`](
  # http://www.joda.org/joda-time/apidocs/org/joda/time/format/ISODateTimeFormat.html#dateTime--
  # ) to obtain a formatter capable of generating timestamps in this format.
  # @!attribute [rw] seconds
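The Timestamp hunks above give POSIX, Win32, Java, JavaScript, and Python conversions. Since this is a Ruby gem, the analogous Ruby sketch may be useful; it is an illustration with plain Hash fields, not code from this gem or the protobuf library.

```ruby
require "time"

# Derive protobuf-Timestamp-style `seconds`/`nanos` fields from a Ruby Time,
# mirroring the time()/gettimeofday() examples quoted above.
t = Time.utc(2017, 5, 25, 0, 0, 0)
timestamp = { seconds: t.to_i, nanos: t.nsec }

# RFC 3339 rendering, in the spirit of the '%Y-%m-%dT%H:%M:%S.%fZ'
# format spec the documentation describes for the JSON mapping
# (here with nanosecond precision via %9N).
rfc3339 = t.strftime("%Y-%m-%dT%H:%M:%S.%9NZ")
# => "2017-05-25T00:00:00.000000000Z"
```

`Time#to_i` plays the role of POSIX `time()` and `Time#nsec` supplies the sub-second component, so the two fields always describe the same instant.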
@@ -15,7 +15,7 @@

  module Google
  module Rpc
- # The +Status+ type defines a logical error model that is suitable for different
+ # The `Status` type defines a logical error model that is suitable for different
  # programming environments, including REST APIs and RPC APIs. It is used by
  # [gRPC](https://github.com/grpc). The error model is designed to be:
  #
@@ -24,7 +24,7 @@ module Google
  #
  # = Overview
  #
- # The +Status+ message contains three pieces of data: error code, error message,
+ # The `Status` message contains three pieces of data: error code, error message,
  # and error details. The error code should be an enum value of
  # {Google::Rpc::Code}, but it may accept additional error codes if needed. The
  # error message should be a developer-facing English message that helps
@@ -32,40 +32,40 @@ module Google
  # error message is needed, put the localized message in the error details or
  # localize it in the client. The optional error details may contain arbitrary
  # information about the error. There is a predefined set of error detail types
- # in the package +google.rpc+ that can be used for common error conditions.
+ # in the package `google.rpc` that can be used for common error conditions.
  #
  # = Language mapping
  #
- # The +Status+ message is the logical representation of the error model, but it
- # is not necessarily the actual wire format. When the +Status+ message is
+ # The `Status` message is the logical representation of the error model, but it
+ # is not necessarily the actual wire format. When the `Status` message is
  # exposed in different client libraries and different wire protocols, it can be
  # mapped differently. For example, it will likely be mapped to some exceptions
  # in Java, but more likely mapped to some error codes in C.
  #
  # = Other uses
  #
- # The error model and the +Status+ message can be used in a variety of
+ # The error model and the `Status` message can be used in a variety of
  # environments, either with or without APIs, to provide a
  # consistent developer experience across different environments.
  #
  # Example uses of this error model include:
  #
  # * Partial errors. If a service needs to return partial errors to the client,
- # it may embed the +Status+ in the normal response to indicate the partial
+ # it may embed the `Status` in the normal response to indicate the partial
  # errors.
  #
  # * Workflow errors. A typical workflow has multiple steps. Each step may
- # have a +Status+ message for error reporting.
+ # have a `Status` message for error reporting.
  #
  # * Batch operations. If a client uses batch request and batch response, the
- # +Status+ message should be used directly inside batch response, one for
+ # `Status` message should be used directly inside batch response, one for
  # each error sub-response.
  #
  # * Asynchronous operations. If an API call embeds asynchronous operation
  # results in its response, the status of those operations should be
- # represented directly using the +Status+ message.
+ # represented directly using the `Status` message.
  #
- # * Logging. If some API errors are stored in logs, the message +Status+ could
+ # * Logging. If some API errors are stored in logs, the message `Status` could
  # be used directly after any stripping needed for security/privacy reasons.
  # @!attribute [rw] code
  # @return [Integer]
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: google-cloud-bigquery-data_transfer
  version: !ruby/object:Gem::Version
- version: 0.2.2
+ version: 0.2.3
  platform: ruby
  authors:
  - Google LLC
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-09-10 00:00:00.000000000 Z
+ date: 2018-09-20 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: google-gax
@@ -124,12 +124,11 @@ files:
  - lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/timestamp.rb
  - lib/google/cloud/bigquery/data_transfer/v1/doc/google/protobuf/wrappers.rb
  - lib/google/cloud/bigquery/data_transfer/v1/doc/google/rpc/status.rb
- - lib/google/cloud/bigquery/data_transfer/v1/doc/overview.rb
  - lib/google/cloud/bigquery/data_transfer/v1/transfer_pb.rb
  - lib/google/cloud/bigquery/datatransfer/v1/datatransfer_pb.rb
  - lib/google/cloud/bigquery/datatransfer/v1/datatransfer_services_pb.rb
  - lib/google/cloud/bigquery/datatransfer/v1/transfer_pb.rb
- homepage: https://github.com/GoogleCloudPlatform/google-cloud-ruby/tree/master/google-cloud-bigquery-data_transfer
+ homepage: https://github.com/googleapis/google-cloud-ruby/tree/master/google-cloud-bigquery-data_transfer
  licenses:
  - Apache-2.0
  metadata: {}
@@ -1,105 +0,0 @@
- # Copyright 2018 Google LLC
- #
- # Licensed under the Apache License, Version 2.0 (the "License");
- # you may not use this file except in compliance with the License.
- # You may obtain a copy of the License at
- #
- # https://www.apache.org/licenses/LICENSE-2.0
- #
- # Unless required by applicable law or agreed to in writing, software
- # distributed under the License is distributed on an "AS IS" BASIS,
- # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- # See the License for the specific language governing permissions and
- # limitations under the License.
-
-
- module Google
- module Cloud
- module Bigquery
- # rubocop:disable LineLength
-
- ##
- # # Ruby Client for BigQuery Data Transfer API ([Beta](https://github.com/GoogleCloudPlatform/google-cloud-ruby#versioning))
- #
- # [BigQuery Data Transfer API][Product Documentation]:
- # Transfers data from partner SaaS applications to Google BigQuery on a
- # scheduled, managed basis.
- # - [Product Documentation][]
- #
- # ## Quick Start
- # In order to use this library, you first need to go through the following
- # steps:
- #
- # 1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
- # 2. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
- # 3. [Enable the BigQuery Data Transfer API.](https://console.cloud.google.com/apis/library/bigquerydatatransfer.googleapis.com)
- # 4. [Setup Authentication.](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud/master/guides/authentication)
- #
- # ### Installation
- # ```
- # $ gem install google-cloud-bigquery-data_transfer
- # ```
- #
- # ### Preview
- # #### DataTransferServiceClient
- # ```rb
- # require "google/cloud/bigquery/data_transfer"
- #
- # data_transfer_service_client = Google::Cloud::Bigquery::DataTransfer.new
- # formatted_parent = Google::Cloud::Bigquery::DataTransfer::V1::DataTransferServiceClient.project_path(project_id)
- #
- # # Iterate over all results.
- # data_transfer_service_client.list_data_sources(formatted_parent).each do |element|
- # # Process element.
- # end
- #
- # # Or iterate over results one page at a time.
- # data_transfer_service_client.list_data_sources(formatted_parent).each_page do |page|
- # # Process each page at a time.
- # page.each do |element|
- # # Process element.
- # end
- # end
- # ```
- #
- # ### Next Steps
- # - Read the [BigQuery Data Transfer API Product documentation][Product Documentation]
- # to learn more about the product and see How-to Guides.
- # - View this [repository's main README](https://github.com/GoogleCloudPlatform/google-cloud-ruby/blob/master/README.md)
- # to see the full list of Cloud APIs that we cover.
- #
- # [Product Documentation]: https://cloud.google.com/bigquery/transfer/
- #
- # ## Enabling Logging
- #
- # To enable logging for this library, set the logger for the underlying [gRPC](https://github.com/grpc/grpc/tree/master/src/ruby) library.
- # The logger that you set may be a Ruby stdlib [`Logger`](https://ruby-doc.org/stdlib-2.5.0/libdoc/logger/rdoc/Logger.html) as shown below,
- # or a [`Google::Cloud::Logging::Logger`](https://googlecloudplatform.github.io/google-cloud-ruby/#/docs/google-cloud-logging/latest/google/cloud/logging/logger)
- # that will write logs to [Stackdriver Logging](https://cloud.google.com/logging/). See [grpc/logconfig.rb](https://github.com/grpc/grpc/blob/master/src/ruby/lib/grpc/logconfig.rb)
- # and the gRPC [spec_helper.rb](https://github.com/grpc/grpc/blob/master/src/ruby/spec/spec_helper.rb) for additional information.
-
- # Configuring a Ruby stdlib logger:
- #
- # ```ruby
- # require "logger"
- #
- # module MyLogger
- # LOGGER = Logger.new $stderr, level: Logger::WARN
- # def logger
- # LOGGER
- # end
- # end
- #
- # # Define a gRPC module-level logger method before grpc/logconfig.rb loads.
- # module GRPC
- # extend MyLogger
- # end
- # ```
- #
- module DataTransfer
- module V1
- end
- end
- end
- end
- end