aws-sdk-gluedatabrew 1.2.0 → 1.7.0
- checksums.yaml +4 -4
- data/CHANGELOG.md +43 -0
- data/LICENSE.txt +202 -0
- data/VERSION +1 -0
- data/lib/aws-sdk-gluedatabrew.rb +2 -2
- data/lib/aws-sdk-gluedatabrew/client.rb +270 -22
- data/lib/aws-sdk-gluedatabrew/client_api.rb +117 -1
- data/lib/aws-sdk-gluedatabrew/errors.rb +1 -1
- data/lib/aws-sdk-gluedatabrew/resource.rb +1 -1
- data/lib/aws-sdk-gluedatabrew/types.rb +715 -66
- metadata +8 -5
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: b1f0370109214a987494fc4c1269c4afa58a4bebbfdf916669d7fabb291b2e29
+  data.tar.gz: b0c6dca740e6820ff6f75de9c4fe60a1d12ac0a09a301c61d25f94f7644b73b5
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: da2c2891a69d23b22c886a1b98cd25f5e200e075303b4f617e6c191c77a17275c01b5ff1fd56638474deb04b81aea78be7a256df1a8a664c8c1597dfa2138451
+  data.tar.gz: d9e19e5a3858584b3a3afe0cfb5f3ac75ab27e68234189c9dd613c1b5f9a10884d7934f25b19869bed82e20f81904bc8e943052cc857c42edf32140a11d05b98
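The refreshed SHA256/SHA512 sums above describe the gem's internal `metadata.gz` and `data.tar.gz` archives. A downloaded artifact can be checked locally with nothing but Ruby's standard library; this is a sketch, and the file path is a placeholder, not a file this diff ships:

```ruby
require "digest"

# Hexadecimal SHA256 of a file on disk, for comparison against the
# published sums above. Any path passed in is caller-supplied.
def sha256_hex(path)
  Digest::SHA256.file(path).hexdigest
end

# The same digest class also works on in-memory strings:
empty_digest = Digest::SHA256.hexdigest("")
```

A mismatch between the computed digest and the published value indicates a corrupted or tampered download.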
data/CHANGELOG.md
ADDED
@@ -0,0 +1,43 @@
+Unreleased Changes
+------------------
+
+1.7.0 (2021-03-30)
+------------------
+
+* Feature - This SDK release adds two new dataset features: 1) support for specifying a database connection as a dataset input 2) support for dynamic datasets that accept configurable parameters in S3 path.
+
+1.6.0 (2021-03-10)
+------------------
+
+* Feature - Code Generated Changes, see `./build_tools` or `aws-sdk-core`'s CHANGELOG.md for details.
+
+1.5.0 (2021-02-25)
+------------------
+
+* Feature - This SDK release adds two new dataset features: 1) support for specifying the file format for a dataset, and 2) support for specifying whether the first row of a CSV or Excel file contains a header.
+
+1.4.0 (2021-02-11)
+------------------
+
+* Feature - This release adds support for profile job sampling, which determines the number of rows on which the profile job will be executed.
+
+1.3.0 (2021-02-03)
+------------------
+
+* Feature - This release adds the DescribeJobRun API to allow customers retrieve details of a given job run
+
+1.2.0 (2021-02-02)
+------------------
+
+* Feature - Code Generated Changes, see `./build_tools` or `aws-sdk-core`'s CHANGELOG.md for details.
+
+1.1.0 (2021-01-28)
+------------------
+
+* Feature - This SDK release adds support for specifying a custom delimiter for input CSV datasets and for CSV job outputs.
+
+1.0.0 (2020-11-11)
+------------------
+
+* Feature - Initial release of `aws-sdk-gluedatabrew`.
+
data/LICENSE.txt
ADDED
@@ -0,0 +1,202 @@
+
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+
+      "Object" form shall mean any form resulting from mechanical
+      transformation or translation of a Source form, including but
+      not limited to compiled object code, generated documentation,
+      and conversions to other media types.
+
+      "Work" shall mean the work of authorship, whether in Source or
+      Object form, made available under the License, as indicated by a
+      copyright notice that is included in or attached to the work
+      (an example is provided in the Appendix below).
+
+      "Derivative Works" shall mean any work, whether in Source or Object
+      form, that is based on (or derived from) the Work and for which the
+      editorial revisions, annotations, elaborations, or other modifications
+      represent, as a whole, an original work of authorship. For the purposes
+      of this License, Derivative Works shall not include works that remain
+      separable from, or merely link (or bind by name) to the interfaces of,
+      the Work and Derivative Works thereof.
+
+      "Contribution" shall mean any work of authorship, including
+      the original version of the Work and any modifications or additions
+      to that Work or Derivative Works thereof, that is intentionally
+      submitted to Licensor for inclusion in the Work by the copyright owner
+      or by an individual or Legal Entity authorized to submit on behalf of
+      the copyright owner. For the purposes of this definition, "submitted"
+      means any form of electronic, verbal, or written communication sent
+      to the Licensor or its representatives, including but not limited to
+      communication on electronic mailing lists, source code control systems,
+      and issue tracking systems that are managed by, or on behalf of, the
+      Licensor for the purpose of discussing and improving the Work, but
+      excluding communication that is conspicuously marked or otherwise
+      designated in writing by the copyright owner as "Not a Contribution."
+
+      "Contributor" shall mean Licensor and any individual or Legal Entity
+      on behalf of whom a Contribution has been received by Licensor and
+      subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      copyright license to reproduce, prepare Derivative Works of,
+      publicly display, publicly perform, sublicense, and distribute the
+      Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      (except as stated in this section) patent license to make, have made,
+      use, offer to sell, sell, import, and otherwise transfer the Work,
+      where such license applies only to those patent claims licensable
+      by such Contributor that are necessarily infringed by their
+      Contribution(s) alone or by combination of their Contribution(s)
+      with the Work to which such Contribution(s) was submitted. If You
+      institute patent litigation against any entity (including a
+      cross-claim or counterclaim in a lawsuit) alleging that the Work
+      or a Contribution incorporated within the Work constitutes direct
+      or contributory patent infringement, then any patent licenses
+      granted to You under this License for that Work shall terminate
+      as of the date such litigation is filed.
+
+   4. Redistribution. You may reproduce and distribute copies of the
+      Work or Derivative Works thereof in any medium, with or without
+      modifications, and in Source or Object form, provided that You
+      meet the following conditions:
+
+      (a) You must give any other recipients of the Work or
+          Derivative Works a copy of this License; and
+
+      (b) You must cause any modified files to carry prominent notices
+          stating that You changed the files; and
+
+      (c) You must retain, in the Source form of any Derivative Works
+          that You distribute, all copyright, patent, trademark, and
+          attribution notices from the Source form of the Work,
+          excluding those notices that do not pertain to any part of
+          the Derivative Works; and
+
+      (d) If the Work includes a "NOTICE" text file as part of its
+          distribution, then any Derivative Works that You distribute must
+          include a readable copy of the attribution notices contained
+          within such NOTICE file, excluding those notices that do not
+          pertain to any part of the Derivative Works, in at least one
+          of the following places: within a NOTICE text file distributed
+          as part of the Derivative Works; within the Source form or
+          documentation, if provided along with the Derivative Works; or,
+          within a display generated by the Derivative Works, if and
+          wherever such third-party notices normally appear. The contents
+          of the NOTICE file are for informational purposes only and
+          do not modify the License. You may add Your own attribution
+          notices within Derivative Works that You distribute, alongside
+          or as an addendum to the NOTICE text from the Work, provided
+          that such additional attribution notices cannot be construed
+          as modifying the License.
+
+      You may add Your own copyright statement to Your modifications and
+      may provide additional or different license terms and conditions
+      for use, reproduction, or distribution of Your modifications, or
+      for any such Derivative Works as a whole, provided Your use,
+      reproduction, and distribution of the Work otherwise complies with
+      the conditions stated in this License.
+
+   5. Submission of Contributions. Unless You explicitly state otherwise,
+      any Contribution intentionally submitted for inclusion in the Work
+      by You to the Licensor shall be under the terms and conditions of
+      this License, without any additional terms or conditions.
+      Notwithstanding the above, nothing herein shall supersede or modify
+      the terms of any separate license agreement you may have executed
+      with Licensor regarding such Contributions.
+
+   6. Trademarks. This License does not grant permission to use the trade
+      names, trademarks, service marks, or product names of the Licensor,
+      except as required for reasonable and customary use in describing the
+      origin of the Work and reproducing the content of the NOTICE file.
+
+   7. Disclaimer of Warranty. Unless required by applicable law or
+      agreed to in writing, Licensor provides the Work (and each
+      Contributor provides its Contributions) on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+      implied, including, without limitation, any warranties or conditions
+      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+      PARTICULAR PURPOSE. You are solely responsible for determining the
+      appropriateness of using or redistributing the Work and assume any
+      risks associated with Your exercise of permissions under this License.
+
+   8. Limitation of Liability. In no event and under no legal theory,
+      whether in tort (including negligence), contract, or otherwise,
+      unless required by applicable law (such as deliberate and grossly
+      negligent acts) or agreed to in writing, shall any Contributor be
+      liable to You for damages, including any direct, indirect, special,
+      incidental, or consequential damages of any character arising as a
+      result of this License or out of the use or inability to use the
+      Work (including but not limited to damages for loss of goodwill,
+      work stoppage, computer failure or malfunction, or any and all
+      other commercial damages or losses), even if such Contributor
+      has been advised of the possibility of such damages.
+
+   9. Accepting Warranty or Additional Liability. While redistributing
+      the Work or Derivative Works thereof, You may choose to offer,
+      and charge a fee for, acceptance of support, warranty, indemnity,
+      or other liability obligations and/or rights consistent with this
+      License. However, in accepting such obligations, You may act only
+      on Your own behalf and on Your sole responsibility, not on behalf
+      of any other Contributor, and only if You agree to indemnify,
+      defend, and hold each Contributor harmless for any liability
+      incurred by, or claims asserted against, such Contributor by reason
+      of your accepting any such warranty or additional liability.
+
+   END OF TERMS AND CONDITIONS
+
+   APPENDIX: How to apply the Apache License to your work.
+
+      To apply the Apache License to your work, attach the following
+      boilerplate notice, with the fields enclosed by brackets "[]"
+      replaced with your own identifying information. (Don't include
+      the brackets!) The text should be enclosed in the appropriate
+      comment syntax for the file format. We also recommend that a
+      file or class name and description of purpose be included on the
+      same "printed page" as the copyright notice for easier
+      identification within third-party archives.
+
+   Copyright [yyyy] [name of copyright owner]
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
data/VERSION
ADDED
@@ -0,0 +1 @@
+1.7.0
data/lib/aws-sdk-gluedatabrew.rb
CHANGED
@@ -3,7 +3,7 @@
 # WARNING ABOUT GENERATED CODE
 #
 # This file is generated. See the contributing guide for more information:
-# https://github.com/aws/aws-sdk-ruby/blob/
+# https://github.com/aws/aws-sdk-ruby/blob/version-3/CONTRIBUTING.md
 #
 # WARNING ABOUT GENERATED CODE
 
@@ -48,6 +48,6 @@ require_relative 'aws-sdk-gluedatabrew/customizations'
 # @!group service
 module Aws::GlueDataBrew
 
-  GEM_VERSION = '1.
+  GEM_VERSION = '1.7.0'
 
 end
data/lib/aws-sdk-gluedatabrew/client.rb
CHANGED
@@ -3,7 +3,7 @@
 # WARNING ABOUT GENERATED CODE
 #
 # This file is generated. See the contributing guide for more information:
-# https://github.com/aws/aws-sdk-ruby/blob/
+# https://github.com/aws/aws-sdk-ruby/blob/version-3/CONTRIBUTING.md
 #
 # WARNING ABOUT GENERATED CODE
 
@@ -335,11 +335,11 @@ module Aws::GlueDataBrew
     #
     # * There is an invalid version identifier in the list of versions.
     #
-    # * The
+    # * The version list is empty.
     #
     # * The version list size exceeds 50.
     #
-    # * The
+    # * The version list contains duplicate entries.
     #
     # The request will complete successfully, but with partial failures, if:
     #
@@ -399,12 +399,21 @@ module Aws::GlueDataBrew
     #   The name of the dataset to be created. Valid characters are
     #   alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.
     #
+    # @option params [String] :format
+    #   The file format of a dataset that is created from an S3 file or
+    #   folder.
+    #
     # @option params [Types::FormatOptions] :format_options
-    #
+    #   Represents a set of options that define the structure of either
+    #   comma-separated value (CSV), Excel, or JSON input.
     #
     # @option params [required, Types::Input] :input
-    #
-    #   Catalog or Amazon S3.
+    #   Represents information on how DataBrew can find data, in either the
+    #   AWS Glue Data Catalog or Amazon S3.
+    #
+    # @option params [Types::PathOptions] :path_options
+    #   A set of options that defines how DataBrew interprets an S3 path of
+    #   the dataset.
     #
     # @option params [Hash<String,String>] :tags
     #   Metadata tags to apply to this dataset.
@@ -417,6 +426,7 @@ module Aws::GlueDataBrew
     #
     #   resp = client.create_dataset({
     #     name: "DatasetName", # required
+    #     format: "CSV", # accepts CSV, JSON, PARQUET, EXCEL
     #     format_options: {
     #       json: {
     #         multi_line: false,
@@ -424,9 +434,11 @@ module Aws::GlueDataBrew
     #       excel: {
     #         sheet_names: ["SheetName"],
     #         sheet_indexes: [1],
+    #         header_row: false,
     #       },
     #       csv: {
     #         delimiter: "Delimiter",
+    #         header_row: false,
     #       },
     #     },
     #     input: { # required
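The `format` key (added in 1.5.0) and the `header_row` flags shown above combine into a plain params hash before any client call is made. A minimal sketch; the dataset name, bucket, and key are illustrative placeholders, not values from this diff:

```ruby
# Params for create_dataset exercising the 1.1.0 custom delimiter and
# the 1.5.0 format/header_row additions. All names are placeholders.
dataset_params = {
  name: "sales-raw",
  format: "CSV", # accepts CSV, JSON, PARQUET, EXCEL
  format_options: {
    csv: {
      delimiter: ";",   # custom input delimiter (added in 1.1.0)
      header_row: true, # first row carries column names (added in 1.5.0)
    },
  },
  input: {
    s3_input_definition: { bucket: "my-databrew-bucket", key: "raw/sales.csv" },
  },
}
```

With a configured `Aws::GlueDataBrew::Client`, this hash would be passed as `client.create_dataset(dataset_params)`.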
@@ -443,6 +455,45 @@ module Aws::GlueDataBrew
     #           key: "Key",
     #         },
     #       },
+    #       database_input_definition: {
+    #         glue_connection_name: "GlueConnectionName", # required
+    #         database_table_name: "DatabaseTableName", # required
+    #         temp_directory: {
+    #           bucket: "Bucket", # required
+    #           key: "Key",
+    #         },
+    #       },
+    #     },
+    #     path_options: {
+    #       last_modified_date_condition: {
+    #         expression: "Expression", # required
+    #         values_map: { # required
+    #           "ValueReference" => "ConditionValue",
+    #         },
+    #       },
+    #       files_limit: {
+    #         max_files: 1, # required
+    #         ordered_by: "LAST_MODIFIED_DATE", # accepts LAST_MODIFIED_DATE
+    #         order: "DESCENDING", # accepts DESCENDING, ASCENDING
+    #       },
+    #       parameters: {
+    #         "PathParameterName" => {
+    #           name: "PathParameterName", # required
+    #           type: "Datetime", # required, accepts Datetime, Number, String
+    #           datetime_options: {
+    #             format: "DatetimeFormat", # required
+    #             timezone_offset: "TimezoneOffset",
+    #             locale_code: "LocaleCode",
+    #           },
+    #           create_column: false,
+    #           filter: {
+    #             expression: "Expression", # required
+    #             values_map: { # required
+    #               "ValueReference" => "ConditionValue",
+    #             },
+    #           },
+    #         },
+    #       },
     #     },
     #     tags: {
     #       "TagKey" => "TagValue",
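The `path_options` block above is what 1.7.0 calls a "dynamic dataset": the S3 path carries parameters that DataBrew resolves at run time. A sketch of one plausible configuration, with placeholder names and formats that are not taken from this diff:

```ruby
# path_options for a dynamic dataset: the S3 key is expected to embed a
# {date} parameter, and only the 5 most recently modified matching files
# are used. All names and formats here are illustrative assumptions.
path_options = {
  files_limit: {
    max_files: 5,
    ordered_by: "LAST_MODIFIED_DATE",
    order: "DESCENDING",
  },
  parameters: {
    "date" => {
      name: "date",
      type: "Datetime",
      datetime_options: { format: "yyyy-MM-dd" },
      create_column: true, # surface the matched value as a dataset column
    },
  },
}
```

This hash would accompany an `input` whose S3 key references the parameter, e.g. `key: "logs/{date}/events.csv"`.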
@@ -474,8 +525,8 @@ module Aws::GlueDataBrew
     # @option params [String] :encryption_mode
     #   The encryption mode for the job, which can be one of the following:
     #
-    #   * `SSE-KMS` -
-    #
+    #   * `SSE-KMS` - `SSE-KMS` - Server-side encryption with AWS KMS-managed
+    #     keys.
     #
     #   * `SSE-S3` - Server-side encryption with keys managed by Amazon S3.
     #
@@ -495,8 +546,8 @@ module Aws::GlueDataBrew
     #   The maximum number of times to retry the job after a job run fails.
     #
     # @option params [required, Types::S3Location] :output_location
-    #
-    #   read input data, or write output from a job.
+    #   Represents an Amazon S3 location (bucket name and object key) where
+    #   DataBrew can read input data, or write output from a job.
     #
     # @option params [required, String] :role_arn
     #   The Amazon Resource Name (ARN) of the AWS Identity and Access
@@ -509,6 +560,12 @@ module Aws::GlueDataBrew
     #   The job's timeout in minutes. A job that attempts to run longer than
     #   this timeout period ends with a status of `TIMEOUT`.
     #
+    # @option params [Types::JobSample] :job_sample
+    #   Sample configuration for profile jobs only. Determines the number of
+    #   rows on which the profile job will be executed. If a JobSample value
+    #   is not provided, the default value will be used. The default value is
+    #   CUSTOM\_ROWS for the mode parameter and 20000 for the size parameter.
+    #
     # @return [Types::CreateProfileJobResponse] Returns a {Seahorse::Client::Response response} object which responds to the following methods:
     #
     # * {Types::CreateProfileJobResponse#name #name} => String
@@ -532,6 +589,10 @@ module Aws::GlueDataBrew
     #       "TagKey" => "TagValue",
     #     },
     #     timeout: 1,
+    #     job_sample: {
+    #       mode: "FULL_DATASET", # accepts FULL_DATASET, CUSTOM_ROWS
+    #       size: 1,
+    #     },
     #   })
     #
     # @example Response structure
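Per the `:job_sample` option docs above, omitting the parameter makes a profile job behave as `CUSTOM_ROWS` with a size of 20000. A hypothetical helper, not part of the SDK, that mirrors that documented default:

```ruby
# Mirrors the documented server-side default for profile jobs: when no
# job_sample is supplied, CUSTOM_ROWS with 20_000 rows applies.
def effective_job_sample(job_sample = nil)
  job_sample || { mode: "CUSTOM_ROWS", size: 20_000 }
end

default_sample = effective_job_sample
full_sample    = effective_job_sample({ mode: "FULL_DATASET" })
```

`full_sample` corresponds to passing `job_sample: { mode: "FULL_DATASET" }` to `create_profile_job`, which profiles every row regardless of size.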
@@ -677,7 +738,7 @@ module Aws::GlueDataBrew
     # @option params [String] :encryption_mode
     #   The encryption mode for the job, which can be one of the following:
     #
-    #   * `SSE-KMS` - Server-side encryption with AWS KMS
+    #   * `SSE-KMS` - Server-side encryption with keys managed by AWS KMS.
     #
     #   * `SSE-S3` - Server-side encryption with keys managed by Amazon S3.
     #
@@ -981,11 +1042,13 @@ module Aws::GlueDataBrew
     # * {Types::DescribeDatasetResponse#created_by #created_by} => String
     # * {Types::DescribeDatasetResponse#create_date #create_date} => Time
     # * {Types::DescribeDatasetResponse#name #name} => String
+    # * {Types::DescribeDatasetResponse#format #format} => String
     # * {Types::DescribeDatasetResponse#format_options #format_options} => Types::FormatOptions
     # * {Types::DescribeDatasetResponse#input #input} => Types::Input
     # * {Types::DescribeDatasetResponse#last_modified_date #last_modified_date} => Time
     # * {Types::DescribeDatasetResponse#last_modified_by #last_modified_by} => String
     # * {Types::DescribeDatasetResponse#source #source} => String
+    # * {Types::DescribeDatasetResponse#path_options #path_options} => Types::PathOptions
     # * {Types::DescribeDatasetResponse#tags #tags} => Hash<String,String>
     # * {Types::DescribeDatasetResponse#resource_arn #resource_arn} => String
     #
@@ -1000,12 +1063,15 @@ module Aws::GlueDataBrew
     #   resp.created_by #=> String
     #   resp.create_date #=> Time
     #   resp.name #=> String
+    #   resp.format #=> String, one of "CSV", "JSON", "PARQUET", "EXCEL"
     #   resp.format_options.json.multi_line #=> Boolean
     #   resp.format_options.excel.sheet_names #=> Array
     #   resp.format_options.excel.sheet_names[0] #=> String
     #   resp.format_options.excel.sheet_indexes #=> Array
     #   resp.format_options.excel.sheet_indexes[0] #=> Integer
+    #   resp.format_options.excel.header_row #=> Boolean
     #   resp.format_options.csv.delimiter #=> String
+    #   resp.format_options.csv.header_row #=> Boolean
     #   resp.input.s3_input_definition.bucket #=> String
     #   resp.input.s3_input_definition.key #=> String
     #   resp.input.data_catalog_input_definition.catalog_id #=> String
@@ -1013,9 +1079,29 @@ module Aws::GlueDataBrew
     #   resp.input.data_catalog_input_definition.table_name #=> String
     #   resp.input.data_catalog_input_definition.temp_directory.bucket #=> String
     #   resp.input.data_catalog_input_definition.temp_directory.key #=> String
+    #   resp.input.database_input_definition.glue_connection_name #=> String
+    #   resp.input.database_input_definition.database_table_name #=> String
+    #   resp.input.database_input_definition.temp_directory.bucket #=> String
+    #   resp.input.database_input_definition.temp_directory.key #=> String
     #   resp.last_modified_date #=> Time
     #   resp.last_modified_by #=> String
-    #   resp.source #=> String, one of "S3", "DATA-CATALOG"
+    #   resp.source #=> String, one of "S3", "DATA-CATALOG", "DATABASE"
+    #   resp.path_options.last_modified_date_condition.expression #=> String
+    #   resp.path_options.last_modified_date_condition.values_map #=> Hash
+    #   resp.path_options.last_modified_date_condition.values_map["ValueReference"] #=> String
+    #   resp.path_options.files_limit.max_files #=> Integer
+    #   resp.path_options.files_limit.ordered_by #=> String, one of "LAST_MODIFIED_DATE"
+    #   resp.path_options.files_limit.order #=> String, one of "DESCENDING", "ASCENDING"
+    #   resp.path_options.parameters #=> Hash
+    #   resp.path_options.parameters["PathParameterName"].name #=> String
+    #   resp.path_options.parameters["PathParameterName"].type #=> String, one of "Datetime", "Number", "String"
+    #   resp.path_options.parameters["PathParameterName"].datetime_options.format #=> String
+    #   resp.path_options.parameters["PathParameterName"].datetime_options.timezone_offset #=> String
+    #   resp.path_options.parameters["PathParameterName"].datetime_options.locale_code #=> String
+    #   resp.path_options.parameters["PathParameterName"].create_column #=> Boolean
+    #   resp.path_options.parameters["PathParameterName"].filter.expression #=> String
+    #   resp.path_options.parameters["PathParameterName"].filter.values_map #=> Hash
+    #   resp.path_options.parameters["PathParameterName"].filter.values_map["ValueReference"] #=> String
     #   resp.tags #=> Hash
     #   resp.tags["TagKey"] #=> String
     #   resp.resource_arn #=> String
@@ -1055,6 +1141,7 @@ module Aws::GlueDataBrew
     # * {Types::DescribeJobResponse#role_arn #role_arn} => String
     # * {Types::DescribeJobResponse#tags #tags} => Hash<String,String>
     # * {Types::DescribeJobResponse#timeout #timeout} => Integer
+    # * {Types::DescribeJobResponse#job_sample #job_sample} => Types::JobSample
     #
     # @example Request syntax with placeholder values
     #
@@ -1093,6 +1180,8 @@ module Aws::GlueDataBrew
     #   resp.tags #=> Hash
     #   resp.tags["TagKey"] #=> String
     #   resp.timeout #=> Integer
+    #   resp.job_sample.mode #=> String, one of "FULL_DATASET", "CUSTOM_ROWS"
+    #   resp.job_sample.size #=> Integer
     #
     # @see http://docs.aws.amazon.com/goto/WebAPI/databrew-2017-07-25/DescribeJob AWS API Documentation
     #
@@ -1103,6 +1192,76 @@ module Aws::GlueDataBrew
|
|
1103
1192
|
req.send_request(options)
|
1104
1193
|
end
|
1105
1194
|
|
1195
|
+
# Represents one run of a DataBrew job.
|
1196
|
+
#
|
1197
|
+
# @option params [required, String] :name
|
1198
|
+
# The name of the job being processed during this run.
|
1199
|
+
#
|
1200
|
+
# @option params [required, String] :run_id
|
1201
|
+
# The unique identifier of the job run.
|
1202
|
+
#
|
1203
|
+
# @return [Types::DescribeJobRunResponse] Returns a {Seahorse::Client::Response response} object which responds to the following methods:
|
1204
|
+
#
|
1205
|
+
# * {Types::DescribeJobRunResponse#attempt #attempt} => Integer
|
1206
|
+
# * {Types::DescribeJobRunResponse#completed_on #completed_on} => Time
|
1207
|
+
# * {Types::DescribeJobRunResponse#dataset_name #dataset_name} => String
|
1208
|
+
# * {Types::DescribeJobRunResponse#error_message #error_message} => String
|
1209
|
+
# * {Types::DescribeJobRunResponse#execution_time #execution_time} => Integer
|
1210
|
+
# * {Types::DescribeJobRunResponse#job_name #job_name} => String
|
1211
|
+
# * {Types::DescribeJobRunResponse#run_id #run_id} => String
|
1212
|
+
# * {Types::DescribeJobRunResponse#state #state} => String
|
1213
|
+
# * {Types::DescribeJobRunResponse#log_subscription #log_subscription} => String
|
1214
|
+
# * {Types::DescribeJobRunResponse#log_group_name #log_group_name} => String
|
1215
|
+
# * {Types::DescribeJobRunResponse#outputs #outputs} => Array<Types::Output>
|
1216
|
+
# * {Types::DescribeJobRunResponse#recipe_reference #recipe_reference} => Types::RecipeReference
|
1217
|
+
# * {Types::DescribeJobRunResponse#started_by #started_by} => String
|
1218
|
+
# * {Types::DescribeJobRunResponse#started_on #started_on} => Time
|
1219
|
+
# * {Types::DescribeJobRunResponse#job_sample #job_sample} => Types::JobSample
|
1220
|
+
#
|
+    # @example Request syntax with placeholder values
+    #
+    #   resp = client.describe_job_run({
+    #     name: "JobName", # required
+    #     run_id: "JobRunId", # required
+    #   })
+    #
+    # @example Response structure
+    #
+    #   resp.attempt #=> Integer
+    #   resp.completed_on #=> Time
+    #   resp.dataset_name #=> String
+    #   resp.error_message #=> String
+    #   resp.execution_time #=> Integer
+    #   resp.job_name #=> String
+    #   resp.run_id #=> String
+    #   resp.state #=> String, one of "STARTING", "RUNNING", "STOPPING", "STOPPED", "SUCCEEDED", "FAILED", "TIMEOUT"
+    #   resp.log_subscription #=> String, one of "ENABLE", "DISABLE"
+    #   resp.log_group_name #=> String
+    #   resp.outputs #=> Array
+    #   resp.outputs[0].compression_format #=> String, one of "GZIP", "LZ4", "SNAPPY", "BZIP2", "DEFLATE", "LZO", "BROTLI", "ZSTD", "ZLIB"
+    #   resp.outputs[0].format #=> String, one of "CSV", "JSON", "PARQUET", "GLUEPARQUET", "AVRO", "ORC", "XML"
+    #   resp.outputs[0].partition_columns #=> Array
+    #   resp.outputs[0].partition_columns[0] #=> String
+    #   resp.outputs[0].location.bucket #=> String
+    #   resp.outputs[0].location.key #=> String
+    #   resp.outputs[0].overwrite #=> Boolean
+    #   resp.outputs[0].format_options.csv.delimiter #=> String
+    #   resp.recipe_reference.name #=> String
+    #   resp.recipe_reference.recipe_version #=> String
+    #   resp.started_by #=> String
+    #   resp.started_on #=> Time
+    #   resp.job_sample.mode #=> String, one of "FULL_DATASET", "CUSTOM_ROWS"
+    #   resp.job_sample.size #=> Integer
+    #
+    # @see http://docs.aws.amazon.com/goto/WebAPI/databrew-2017-07-25/DescribeJobRun AWS API Documentation
+    #
+    # @overload describe_job_run(params = {})
+    # @param [Hash] params ({})
+    def describe_job_run(params = {}, options = {})
+      req = build_request(:describe_job_run, params)
+      req.send_request(options)
+    end
+
     # Returns the definition of a specific DataBrew project.
     #
     # @option params [required, String] :name
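The new `describe_job_run` operation above returns a `state` field that moves from `STARTING`/`RUNNING` to a terminal value. A minimal sketch of a poll loop over that state, with the client call injected as a lambda (`fetch` stands in for `client.describe_job_run(name: ..., run_id: ...)`) so the loop can run without AWS credentials; the helper is hypothetical, not part of the SDK:

```ruby
# Terminal states as documented for DescribeJobRun's `state` field.
TERMINAL_STATES = %w[STOPPED SUCCEEDED FAILED TIMEOUT].freeze

# Polls `fetch` (a stand-in for client.describe_job_run) until the run
# reaches a terminal state or the attempt budget is exhausted.
def wait_for_job_run(fetch, max_attempts: 10)
  max_attempts.times do
    state = fetch.call[:state]
    return state if TERMINAL_STATES.include?(state)
    # A real caller would sleep between polls; omitted to keep the sketch short.
  end
  nil
end
```

A real caller would also rescue the SDK's throttling errors around each poll.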
@@ -1301,12 +1460,15 @@ module Aws::GlueDataBrew
     #   resp.datasets[0].created_by #=> String
     #   resp.datasets[0].create_date #=> Time
     #   resp.datasets[0].name #=> String
+    #   resp.datasets[0].format #=> String, one of "CSV", "JSON", "PARQUET", "EXCEL"
     #   resp.datasets[0].format_options.json.multi_line #=> Boolean
     #   resp.datasets[0].format_options.excel.sheet_names #=> Array
     #   resp.datasets[0].format_options.excel.sheet_names[0] #=> String
     #   resp.datasets[0].format_options.excel.sheet_indexes #=> Array
     #   resp.datasets[0].format_options.excel.sheet_indexes[0] #=> Integer
+    #   resp.datasets[0].format_options.excel.header_row #=> Boolean
     #   resp.datasets[0].format_options.csv.delimiter #=> String
+    #   resp.datasets[0].format_options.csv.header_row #=> Boolean
     #   resp.datasets[0].input.s3_input_definition.bucket #=> String
     #   resp.datasets[0].input.s3_input_definition.key #=> String
     #   resp.datasets[0].input.data_catalog_input_definition.catalog_id #=> String
@@ -1314,9 +1476,29 @@ module Aws::GlueDataBrew
     #   resp.datasets[0].input.data_catalog_input_definition.table_name #=> String
     #   resp.datasets[0].input.data_catalog_input_definition.temp_directory.bucket #=> String
     #   resp.datasets[0].input.data_catalog_input_definition.temp_directory.key #=> String
+    #   resp.datasets[0].input.database_input_definition.glue_connection_name #=> String
+    #   resp.datasets[0].input.database_input_definition.database_table_name #=> String
+    #   resp.datasets[0].input.database_input_definition.temp_directory.bucket #=> String
+    #   resp.datasets[0].input.database_input_definition.temp_directory.key #=> String
     #   resp.datasets[0].last_modified_date #=> Time
     #   resp.datasets[0].last_modified_by #=> String
-    #   resp.datasets[0].source #=> String, one of "S3", "DATA-CATALOG"
+    #   resp.datasets[0].source #=> String, one of "S3", "DATA-CATALOG", "DATABASE"
+    #   resp.datasets[0].path_options.last_modified_date_condition.expression #=> String
+    #   resp.datasets[0].path_options.last_modified_date_condition.values_map #=> Hash
+    #   resp.datasets[0].path_options.last_modified_date_condition.values_map["ValueReference"] #=> String
+    #   resp.datasets[0].path_options.files_limit.max_files #=> Integer
+    #   resp.datasets[0].path_options.files_limit.ordered_by #=> String, one of "LAST_MODIFIED_DATE"
+    #   resp.datasets[0].path_options.files_limit.order #=> String, one of "DESCENDING", "ASCENDING"
+    #   resp.datasets[0].path_options.parameters #=> Hash
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].name #=> String
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].type #=> String, one of "Datetime", "Number", "String"
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].datetime_options.format #=> String
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].datetime_options.timezone_offset #=> String
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].datetime_options.locale_code #=> String
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].create_column #=> Boolean
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].filter.expression #=> String
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].filter.values_map #=> Hash
+    #   resp.datasets[0].path_options.parameters["PathParameterName"].filter.values_map["ValueReference"] #=> String
     #   resp.datasets[0].tags #=> Hash
     #   resp.datasets[0].tags["TagKey"] #=> String
     #   resp.datasets[0].resource_arn #=> String
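The `path_options` fields above carry the dynamic-dataset feature from the 1.7.0 changelog entry. A hedged sketch of the corresponding request hash for a dataset call such as `update_dataset` — the dataset name, bucket, key, and `{date}` parameter name are placeholders, and `"yyyy-MM-dd"` is an assumption about DataBrew's `DatetimeFormat` strings:

```ruby
# Illustrative params (plain Ruby hash, no API call made): a dynamic S3 path
# with a Datetime parameter, limited to the most recently modified match.
params = {
  name: "my-dynamic-dataset",           # placeholder dataset name
  format: "CSV",
  input: {
    s3_input_definition: {
      bucket: "my-bucket",              # placeholder bucket
      key: "exports/{date}/data.csv",   # {date} is resolved via path_options
    },
  },
  path_options: {
    parameters: {
      "date" => {
        name: "date",
        type: "Datetime",
        datetime_options: { format: "yyyy-MM-dd" }, # assumed format string
        create_column: true,   # surface the resolved value as a column
      },
    },
    files_limit: {
      max_files: 1,
      ordered_by: "LAST_MODIFIED_DATE",
      order: "DESCENDING",
    },
  },
}
```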
@@ -1384,6 +1566,8 @@ module Aws::GlueDataBrew
     #   resp.job_runs[0].recipe_reference.recipe_version #=> String
     #   resp.job_runs[0].started_by #=> String
     #   resp.job_runs[0].started_on #=> Time
+    #   resp.job_runs[0].job_sample.mode #=> String, one of "FULL_DATASET", "CUSTOM_ROWS"
+    #   resp.job_runs[0].job_sample.size #=> Integer
     #   resp.next_token #=> String
     #
     # @see http://docs.aws.amazon.com/goto/WebAPI/databrew-2017-07-25/ListJobRuns AWS API Documentation
@@ -1463,6 +1647,8 @@ module Aws::GlueDataBrew
     #   resp.jobs[0].timeout #=> Integer
     #   resp.jobs[0].tags #=> Hash
     #   resp.jobs[0].tags["TagKey"] #=> String
+    #   resp.jobs[0].job_sample.mode #=> String, one of "FULL_DATASET", "CUSTOM_ROWS"
+    #   resp.jobs[0].job_sample.size #=> Integer
     #   resp.next_token #=> String
     #
     # @see http://docs.aws.amazon.com/goto/WebAPI/databrew-2017-07-25/ListJobs AWS API Documentation
@@ -1795,7 +1981,7 @@ module Aws::GlueDataBrew
     #   and ready for work. The action will be performed on this session.
     #
     # @option params [Types::ViewFrame] :view_frame
-    #   Represents the data being
+    #   Represents the data being transformed during an action.
     #
     # @return [Types::SendProjectSessionActionResponse] Returns a {Seahorse::Client::Response response} object which responds to the following methods:
     #
@@ -2006,12 +2192,21 @@ module Aws::GlueDataBrew
     # @option params [required, String] :name
     #   The name of the dataset to be updated.
     #
+    # @option params [String] :format
+    #   The file format of a dataset that is created from an S3 file or
+    #   folder.
+    #
     # @option params [Types::FormatOptions] :format_options
-    #
+    #   Represents a set of options that define the structure of either
+    #   comma-separated value (CSV), Excel, or JSON input.
     #
     # @option params [required, Types::Input] :input
-    #
-    #   Catalog or Amazon S3.
+    #   Represents information on how DataBrew can find data, in either the
+    #   AWS Glue Data Catalog or Amazon S3.
+    #
+    # @option params [Types::PathOptions] :path_options
+    #   A set of options that defines how DataBrew interprets an S3 path of
+    #   the dataset.
     #
     # @return [Types::UpdateDatasetResponse] Returns a {Seahorse::Client::Response response} object which responds to the following methods:
     #
@@ -2021,6 +2216,7 @@ module Aws::GlueDataBrew
     #
     #   resp = client.update_dataset({
     #     name: "DatasetName", # required
+    #     format: "CSV", # accepts CSV, JSON, PARQUET, EXCEL
     #     format_options: {
     #       json: {
     #         multi_line: false,
@@ -2028,9 +2224,11 @@ module Aws::GlueDataBrew
     #       excel: {
     #         sheet_names: ["SheetName"],
     #         sheet_indexes: [1],
+    #         header_row: false,
     #       },
     #       csv: {
     #         delimiter: "Delimiter",
+    #         header_row: false,
     #       },
     #     },
     #     input: { # required
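The new `header_row` flag (on both the CSV and Excel options above) controls whether the first row supplies column names. A small stand-alone illustration of the two behaviors using Ruby's stdlib `CSV`; the `_c0`-style fallback column names are hypothetical, not DataBrew's actual defaults:

```ruby
require "csv"

# Parses CSV text the way header_row: true/false is documented to behave:
# true  -> the first row becomes the column names;
# false -> every row is data, under generated column names.
def parse_with_header_option(text, header_row:)
  if header_row
    CSV.parse(text, headers: true).map(&:to_h)
  else
    CSV.parse(text).map do |row|
      row.each_with_index.map { |value, i| ["_c#{i}", value] }.to_h
    end
  end
end
```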
@@ -2047,6 +2245,45 @@ module Aws::GlueDataBrew
     #           key: "Key",
     #         },
     #       },
+    #       database_input_definition: {
+    #         glue_connection_name: "GlueConnectionName", # required
+    #         database_table_name: "DatabaseTableName", # required
+    #         temp_directory: {
+    #           bucket: "Bucket", # required
+    #           key: "Key",
+    #         },
+    #       },
+    #     },
+    #     path_options: {
+    #       last_modified_date_condition: {
+    #         expression: "Expression", # required
+    #         values_map: { # required
+    #           "ValueReference" => "ConditionValue",
+    #         },
+    #       },
+    #       files_limit: {
+    #         max_files: 1, # required
+    #         ordered_by: "LAST_MODIFIED_DATE", # accepts LAST_MODIFIED_DATE
+    #         order: "DESCENDING", # accepts DESCENDING, ASCENDING
+    #       },
+    #       parameters: {
+    #         "PathParameterName" => {
+    #           name: "PathParameterName", # required
+    #           type: "Datetime", # required, accepts Datetime, Number, String
+    #           datetime_options: {
+    #             format: "DatetimeFormat", # required
+    #             timezone_offset: "TimezoneOffset",
+    #             locale_code: "LocaleCode",
+    #           },
+    #           create_column: false,
+    #           filter: {
+    #             expression: "Expression", # required
+    #             values_map: { # required
+    #               "ValueReference" => "ConditionValue",
+    #             },
+    #           },
+    #         },
+    #       },
     #     },
     #   })
     #
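To make the `Datetime` path parameter in the request syntax above concrete: a hypothetical sketch of how such a parameter substitutes a formatted timestamp into an S3 key template. Ruby `strftime` directives stand in for DataBrew's `DatetimeFormat` strings, and both the `{name}` template syntax and the helper itself are illustrative only:

```ruby
# Replaces {name} placeholders in an S3 key template using the given path
# parameters; Datetime parameters are formatted from `now`.
def resolve_s3_key(template, parameters, now: Time.now.utc)
  template.gsub(/\{(\w+)\}/) do
    param = parameters.fetch(Regexp.last_match(1))
    param[:type] == "Datetime" ? now.strftime(param[:datetime_format]) : param[:value].to_s
  end
end
```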
@@ -2072,7 +2309,7 @@ module Aws::GlueDataBrew
     # @option params [String] :encryption_mode
     #   The encryption mode for the job, which can be one of the following:
     #
-    #   * `SSE-KMS` - Server-side encryption with AWS KMS
+    #   * `SSE-KMS` - Server-side encryption with keys managed by AWS KMS.
     #
     #   * `SSE-S3` - Server-side encryption with keys managed by Amazon S3.
     #
@@ -2091,8 +2328,8 @@ module Aws::GlueDataBrew
     #   The maximum number of times to retry the job after a job run fails.
     #
     # @option params [required, Types::S3Location] :output_location
-    #
-    #   read input data, or write output from a job.
+    #   Represents an Amazon S3 location (bucket name and object key) where
+    #   DataBrew can read input data, or write output from a job.
     #
     # @option params [required, String] :role_arn
     #   The Amazon Resource Name (ARN) of the AWS Identity and Access
@@ -2102,6 +2339,13 @@ module Aws::GlueDataBrew
     #   The job's timeout in minutes. A job that attempts to run longer than
     #   this timeout period ends with a status of `TIMEOUT`.
     #
+    # @option params [Types::JobSample] :job_sample
+    #   Sample configuration for Profile Jobs only. Determines the number of
+    #   rows on which the Profile job will be executed. If a JobSample value
+    #   is not provided for profile jobs, the default value will be used. The
+    #   default value is CUSTOM\_ROWS for the mode parameter and 20000 for the
+    #   size parameter.
+    #
     # @return [Types::UpdateProfileJobResponse] Returns a {Seahorse::Client::Response response} object which responds to the following methods:
     #
     # * {Types::UpdateProfileJobResponse#name #name} => String
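The `:job_sample` defaulting described above (CUSTOM_ROWS / 20000 when omitted) can be modeled as a plain helper. This is a hypothetical sketch of the documented behavior, not SDK code, and the assumption that an explicit partial sample is merged over the defaults is mine:

```ruby
# Default applied when a profile job runs without an explicit JobSample.
DEFAULT_JOB_SAMPLE = { mode: "CUSTOM_ROWS", size: 20_000 }.freeze

# Returns the sample configuration a profile job would effectively run with.
def effective_job_sample(job_sample = nil)
  return DEFAULT_JOB_SAMPLE if job_sample.nil?
  # A row count is only meaningful for CUSTOM_ROWS; FULL_DATASET profiles every row.
  return { mode: "FULL_DATASET" } if job_sample[:mode] == "FULL_DATASET"
  DEFAULT_JOB_SAMPLE.merge(job_sample)
end
```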
@@ -2121,6 +2365,10 @@ module Aws::GlueDataBrew
     #     },
     #     role_arn: "Arn", # required
     #     timeout: 1,
+    #     job_sample: {
+    #       mode: "FULL_DATASET", # accepts FULL_DATASET, CUSTOM_ROWS
+    #       size: 1,
+    #     },
     #   })
     #
     # @example Response structure
@@ -2242,7 +2490,7 @@ module Aws::GlueDataBrew
     # @option params [String] :encryption_mode
     #   The encryption mode for the job, which can be one of the following:
     #
-    #   * `SSE-KMS` - Server-side encryption with AWS KMS
+    #   * `SSE-KMS` - Server-side encryption with keys managed by AWS KMS.
     #
     #   * `SSE-S3` - Server-side encryption with keys managed by Amazon S3.
     #
@@ -2373,7 +2621,7 @@ module Aws::GlueDataBrew
         params: params,
         config: config)
       context[:gem_name] = 'aws-sdk-gluedatabrew'
-      context[:gem_version] = '1.2.0'
+      context[:gem_version] = '1.7.0'
       Seahorse::Client::Request.new(handlers, context)
     end
