sagemaker-core 1.0.29__py3-none-any.whl → 1.0.30__py3-none-any.whl

This diff shows the contents of two publicly released versions of the package as they appear in their public registry. It is provided for informational purposes only.

Potentially problematic release: this version of sagemaker-core has been flagged as possibly problematic; see the registry's advisory page for details.

@@ -12,7 +12,7 @@
 # language governing permissions and limitations under the License.
 import datetime
 
-from pydantic import BaseModel, ConfigDict
+from pydantic import BaseModel, ConfigDict, Field
 from typing import List, Dict, Optional, Any, Union
 from sagemaker_core.main.utils import Unassigned
 
@@ -1013,7 +1013,7 @@ class StoppingCondition(Base):
     ----------------------
     max_runtime_in_seconds: The maximum length of time, in seconds, that a training or compilation job can run before it is stopped. For compilation jobs, if the job does not complete during this time, a TimeOut error is generated. We recommend starting with 900 seconds and increasing as necessary based on your model. For all other jobs, if the job does not complete during this time, SageMaker ends the job. When RetryStrategy is specified in the job request, MaxRuntimeInSeconds specifies the maximum time for all of the attempts in total, not each individual attempt. The default value is 1 day. The maximum value is 28 days. The maximum time that a TrainingJob can run in total, including any time spent publishing metrics or archiving and uploading models after it has been stopped, is 30 days.
     max_wait_time_in_seconds: The maximum length of time, in seconds, that a managed Spot training job has to complete. It is the amount of time spent waiting for Spot capacity plus the amount of time the job can run. It must be equal to or greater than MaxRuntimeInSeconds. If the job does not complete during this time, SageMaker ends the job. When RetryStrategy is specified in the job request, MaxWaitTimeInSeconds specifies the maximum time for all of the attempts in total, not each individual attempt.
-    max_pending_time_in_seconds: The maximum length of time, in seconds, that a training or compilation job can be pending before it is stopped.
+    max_pending_time_in_seconds: The maximum length of time, in seconds, that a training or compilation job can be pending before it is stopped. When working with training jobs that use capacity from training plans, not all Pending job states count against the MaxPendingTimeInSeconds limit. The following scenarios do not increment the MaxPendingTimeInSeconds counter: The plan is in a Scheduled state: Jobs queued (in Pending status) before a plan's start date (waiting for scheduled start time) Between capacity reservations: Jobs temporarily back to Pending status between two capacity reservation periods MaxPendingTimeInSeconds only increments when jobs are actively waiting for capacity in an Active plan.
     """
 
     max_runtime_in_seconds: Optional[int] = Unassigned()
@@ -2543,12 +2543,12 @@ class MonitoringDatasetFormat(Base):
     Attributes
     ----------------------
     csv: The CSV dataset used in the monitoring job.
-    json: The JSON dataset used in the monitoring job
+    json_format: The JSON dataset used in the monitoring job
     parquet: The Parquet dataset used in the monitoring job
     """
 
     csv: Optional[MonitoringCsvDatasetFormat] = Unassigned()
-    json: Optional[MonitoringJsonDatasetFormat] = Unassigned()
+    json_format: Optional[MonitoringJsonDatasetFormat] = Field(default=Unassigned(), alias="json")
    parquet: Optional[MonitoringParquetDatasetFormat] = Unassigned()
 
 
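The rename exists because a field called json shadows Pydantic's own BaseModel.json attribute (the comments later in this diff point at aws/sagemaker-python-sdk#4944); the Field alias keeps the wire key unchanged. A minimal sketch of the behaviour, not the generated class: the dict-typed members and the populate_by_name setting are assumptions for illustration only.

    from typing import Optional
    from pydantic import BaseModel, ConfigDict, Field

    class MonitoringDatasetFormatSketch(BaseModel):
        model_config = ConfigDict(populate_by_name=True)  # assumption: accept both spellings
        csv: Optional[dict] = None
        json_format: Optional[dict] = Field(default=None, alias="json")
        parquet: Optional[dict] = None

    fmt = MonitoringDatasetFormatSketch(**{"json": {"line": True}})  # populate via the API's "json" key
    print(fmt.json_format)                # {'line': True}
    print(fmt.model_dump(by_alias=True))  # serializes back under the "json" key for the service
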
@@ -4877,7 +4877,7 @@ class ProductionVariant(Base):
     enable_ssm_access: You can use this parameter to turn on native Amazon Web Services Systems Manager (SSM) access for a production variant behind an endpoint. By default, SSM access is disabled for all production variants behind an endpoint. You can turn on or turn off SSM access for a production variant behind an existing endpoint by creating a new endpoint configuration and calling UpdateEndpoint.
     managed_instance_scaling: Settings that control the range in the number of instances that the endpoint provisions as it scales up or down to accommodate traffic.
     routing_config: Settings that control how the endpoint routes incoming traffic to the instances that the endpoint hosts.
-    inference_ami_version: Specifies an option from a collection of preconfigured Amazon Machine Image (AMI) images. Each image is configured by Amazon Web Services with a set of software and driver versions. Amazon Web Services optimizes these configurations for different machine learning workloads. By selecting an AMI version, you can ensure that your inference environment is compatible with specific software requirements, such as CUDA driver versions, Linux kernel versions, or Amazon Web Services Neuron driver versions. The AMI version names, and their configurations, are the following: al2-ami-sagemaker-inference-gpu-2 Accelerator: GPU NVIDIA driver version: 535 CUDA version: 12.2 al2-ami-sagemaker-inference-gpu-2-1 Accelerator: GPU NVIDIA driver version: 535 CUDA version: 12.2 NVIDIA Container Toolkit with disabled CUDA-compat mounting al2-ami-sagemaker-inference-gpu-3-1 Accelerator: GPU NVIDIA driver version: 550 CUDA version: 12.4 NVIDIA Container Toolkit with disabled CUDA-compat mounting
+    inference_ami_version: Specifies an option from a collection of preconfigured Amazon Machine Image (AMI) images. Each image is configured by Amazon Web Services with a set of software and driver versions. Amazon Web Services optimizes these configurations for different machine learning workloads. By selecting an AMI version, you can ensure that your inference environment is compatible with specific software requirements, such as CUDA driver versions, Linux kernel versions, or Amazon Web Services Neuron driver versions. The AMI version names, and their configurations, are the following: al2-ami-sagemaker-inference-gpu-2 Accelerator: GPU NVIDIA driver version: 535 CUDA version: 12.2 al2-ami-sagemaker-inference-gpu-2-1 Accelerator: GPU NVIDIA driver version: 535 CUDA version: 12.2 NVIDIA Container Toolkit with disabled CUDA-compat mounting al2-ami-sagemaker-inference-gpu-3-1 Accelerator: GPU NVIDIA driver version: 550 CUDA version: 12.4 NVIDIA Container Toolkit with disabled CUDA-compat mounting al2-ami-sagemaker-inference-neuron-2 Accelerator: Inferentia2 and Trainium Neuron driver version: 2.19
     """
 
     variant_name: str
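Functionally, only the al2-ami-sagemaker-inference-neuron-2 option (Inferentia2/Trainium, Neuron driver 2.19) is new in this docstring. A hedged usage sketch, assuming variant_name is the only required member of the generated shape (as the class body above suggests); the variant name itself is illustrative:

    from sagemaker_core.main.shapes import ProductionVariant

    variant = ProductionVariant(
        variant_name="AllTraffic",
        inference_ami_version="al2-ami-sagemaker-inference-neuron-2",  # value taken from the docstring above
    )
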
@@ -10680,6 +10680,7 @@ class ModelPackageSummary(Base):
     creation_time: A timestamp that shows when the model package was created.
     model_package_status: The overall status of the model package.
     model_approval_status: The approval status of the model. This can be one of the following values. APPROVED - The model is approved REJECTED - The model is rejected. PENDING_MANUAL_APPROVAL - The model is waiting for manual approval.
+    model_life_cycle
     """
 
     model_package_arn: str
@@ -10690,6 +10691,7 @@ class ModelPackageSummary(Base):
     model_package_version: Optional[int] = Unassigned()
     model_package_description: Optional[str] = Unassigned()
     model_approval_status: Optional[str] = Unassigned()
+    model_life_cycle: Optional[ModelLifeCycle] = Unassigned()
 
 
 class ModelSummary(Base):
@@ -182,8 +182,11 @@ def get_textual_rich_logger(name: str, log_level: str = "INFO") -> logging.Logger:
     """
     enable_textual_rich_console_and_traceback()
     handler = get_rich_handler()
-    logging.basicConfig(level=getattr(logging, log_level), handlers=[handler])
     logger = logging.getLogger(name)
+    for handler in logger.handlers:
+        logger.removeHandler(handler)
+    logger.addHandler(handler)
+    logger.setLevel(getattr(logging, log_level))
 
     return logger
 
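The net effect is that the rich handler is attached to the named logger instead of being installed globally through logging.basicConfig, which reconfigures the root logger and therefore every other library's logging. A rough illustration of the same idea, not the library code (it copies the handler list before removing and uses rich.logging.RichHandler directly):

    import logging
    from rich.logging import RichHandler

    def get_scoped_logger(name: str, log_level: str = "INFO") -> logging.Logger:
        handler = RichHandler()
        logger = logging.getLogger(name)
        for existing in list(logger.handlers):  # drop stale handlers so repeated calls don't stack output
            logger.removeHandler(existing)
        logger.addHandler(handler)
        logger.setLevel(getattr(logging, log_level))
        return logger

    log = get_scoped_logger("sagemaker_core", "DEBUG")
    log.debug("only this logger is reconfigured")
    # the root logger and other libraries' loggers keep whatever configuration they already had
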
@@ -97,3 +97,5 @@ SHAPES_CODEGEN_FILE_NAME = "shapes.py"
 CONFIG_SCHEMA_FILE_NAME = "config_schema.py"
 
 API_COVERAGE_JSON_FILE_PATH = os.getcwd() + "/src/sagemaker_core/tools/api_coverage.json"
+
+SHAPES_WITH_JSON_FIELD_ALIAS = ["MonitoringDatasetFormat"]  # Shapes with field name with "json"
@@ -177,7 +177,7 @@ class ResourcesCodeGen:
             "import datetime",
             "import time",
             "import functools",
-            "from pydantic import validate_call",
+            "from pydantic import validate_call, ConfigDict, BaseModel",
             "from typing import Dict, List, Literal, Optional, Union, Any\n"
             "from boto3.session import Session",
             "from rich.console import Group",
@@ -192,8 +192,8 @@ class ResourcesCodeGen:
             "snake_to_pascal, pascal_to_snake, is_not_primitive, is_not_str_dict, is_primitive_list, serialize",
             "from sagemaker_core.main.intelligent_defaults_helper import load_default_configs_for_resource_name, get_config_value",
             "from sagemaker_core.main.logs import MultiLogStreamHandler",
-            "from sagemaker_core.main.shapes import *",
             "from sagemaker_core.main.exceptions import *",
+            "import sagemaker_core.main.shapes as shapes",
         ]
 
         formated_imports = "\n".join(imports)
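The generated resources module now keeps the shapes behind a module prefix instead of star-importing them. A small, self-contained sketch (throwaway in-memory modules, not the real ones) of the name shadowing that the prefix avoids:

    import types

    resources = types.ModuleType("resources")
    shapes = types.ModuleType("shapes")
    exec("class Endpoint:\n    kind = 'resource'", resources.__dict__)
    exec("class Endpoint:\n    kind = 'shape'", shapes.__dict__)

    ns = {}
    ns.update(vars(resources))  # behaves like "from resources import *"
    ns.update(vars(shapes))     # behaves like "from shapes import *": same name, silent shadowing
    print(ns["Endpoint"].kind)  # 'shape' -- the resource class was replaced

    # With the module kept behind a prefix, both stay addressable, and generated
    # annotations can say which one they mean, e.g. "shapes.Endpoint".
    print(resources.Endpoint.kind, shapes.Endpoint.kind)  # resource shape
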
@@ -1388,9 +1388,12 @@ class ResourcesCodeGen:
             return_type_conversion = "cls"
             return_string = f"Returns:\n" f" {method.resource_name}\n"
         else:
-            return_type = f"Optional[{method.return_type}]"
             return_type_conversion = method.return_type
-            return_string = f"Returns:\n" f" {method.return_type}\n"
+            if method.resource_name != method.return_type and method.return_type in self.shapes:
+                return_type_conversion = f"shapes.{method.return_type}"
+            return_type = f"Optional[{return_type_conversion}]"
+
+            return_string = f"Returns:\n" f" {return_type_conversion}\n"
         operation_output_shape = operation_metadata["output"]["shape"]
         deserialize_response = DESERIALIZE_RESPONSE_TEMPLATE.format(
             operation_output_shape=operation_output_shape,
@@ -1471,10 +1474,14 @@
         method_args += add_indent("session: Optional[Session] = None,\n", 4)
         method_args += add_indent("region: Optional[str] = None,", 4)
 
+        iterator_return_type = method.return_type
+        if method.resource_name != method.return_type and method.return_type in self.shapes:
+            iterator_return_type = f"shapes.{method.return_type}"
+
         if method.return_type == method.resource_name:
-            return_type = f'ResourceIterator["{method.resource_name}"]'
+            method_return_type = f'ResourceIterator["{method.resource_name}"]'
         else:
-            return_type = f"ResourceIterator[{method.return_type}]"
+            method_return_type = f"ResourceIterator[{iterator_return_type}]"
         return_string = f"Returns:\n" f" Iterator for listed {method.return_type}.\n"
 
         get_list_operation_output_shape = operation_metadata["output"]["shape"]
@@ -1497,7 +1504,7 @@
             f"list_method='{list_method}'",
             f"summaries_key='{summaries_key}'",
             f"summary_name='{summary_name}'",
-            f"resource_cls={method.return_type}",
+            f"resource_cls={iterator_return_type}",
             "list_method_kwargs=operation_input_args",
         ]
 
@@ -1527,7 +1534,7 @@
             decorator=decorator,
             method_name=method.method_name,
             method_args=method_args,
-            return_type=return_type,
+            return_type=method_return_type,
             serialize_operation_input=serialize_operation_input,
             initialize_client=initialize_client,
             call_operation_api="",
@@ -1901,6 +1908,8 @@
             new_val = value.split("=")[0].strip()
             if new_val.startswith("Optional"):
                 new_val = new_val.replace("Optional[", "")[:-1]
+            if new_val.startswith("shapes."):
+                new_val = new_val.replace("shapes.", "")
             cleaned_class_attributes[key] = new_val
         return cleaned_class_attributes
 
@@ -22,6 +22,7 @@ from sagemaker_core.tools.constants import (
     LICENCES_STRING,
     GENERATED_CLASSES_LOCATION,
     SHAPES_CODEGEN_FILE_NAME,
+    SHAPES_WITH_JSON_FIELD_ALIAS,
 )
 from sagemaker_core.tools.shapes_extractor import ShapesExtractor
 from sagemaker_core.main.utils import (
@@ -148,7 +149,7 @@ class ShapesCodeGen:
         """
         class_name = shape
         init_data = self.shapes_extractor.generate_data_shape_string_body(
-            shape, self.resources_plan
+            shape, self.resources_plan, add_shapes_prefix=False
         )
         try:
             data_class_members = add_indent(init_data, 4)
@@ -180,7 +181,13 @@
 
         if "members" in shape_dict:
             for member, member_attributes in shape_dict["members"].items():
-                docstring += f"\n{convert_to_snake_case(member)}"
+                # Add alias if field name is json, to address the Bug: https://github.com/aws/sagemaker-python-sdk/issues/4944
+                if shape in SHAPES_WITH_JSON_FIELD_ALIAS and member == "Json":
+                    updated_member = "JsonFormat"
+                    docstring += f"\n{convert_to_snake_case(updated_member)}"
+                else:
+                    docstring += f"\n{convert_to_snake_case(member)}"
+
                 if "documentation" in member_attributes:
                     docstring += f": {member_attributes['documentation']}"
 
@@ -204,7 +211,7 @@
         """
         imports = "import datetime\n"
         imports += "\n"
-        imports += "from pydantic import BaseModel, ConfigDict\n"
+        imports += "from pydantic import BaseModel, ConfigDict, Field\n"
         imports += "from typing import List, Dict, Optional, Any, Union\n"
         imports += "from sagemaker_core.main.utils import Unassigned"
         imports += "\n"
@@ -16,7 +16,11 @@ import pprint
 from functools import lru_cache
 from typing import Optional, Any
 
-from sagemaker_core.tools.constants import BASIC_JSON_TYPES_TO_PYTHON_TYPES, SHAPE_DAG_FILE_PATH
+from sagemaker_core.tools.constants import (
+    BASIC_JSON_TYPES_TO_PYTHON_TYPES,
+    SHAPE_DAG_FILE_PATH,
+    SHAPES_WITH_JSON_FIELD_ALIAS,
+)
 from sagemaker_core.main.utils import (
     reformat_file_with_black,
     convert_to_snake_case,
@@ -99,6 +103,11 @@ class ShapesExtractor:
             _dag[shape] = {"type": "structure", "members": []}
             for member, member_attrs in shape_data["members"].items():
                 shape_node_member = {"name": member, "shape": member_attrs["shape"]}
+                # Add alias if field name is json, to address the Bug: https://github.com/aws/sagemaker-python-sdk/issues/4944
+                if shape in SHAPES_WITH_JSON_FIELD_ALIAS and member == "Json":
+                    shape_node_member["name"] = "JsonFormat"
+                    shape_node_member["alias"] = "json"
+
                 member_shape_dict = _all_shapes[member_attrs["shape"]]
                 shape_node_member["type"] = member_shape_dict["type"]
                 _dag[shape]["members"].append(shape_node_member)
@@ -117,19 +126,21 @@
                 _dag[shape]["value_type"] = _all_shapes[_map_value_shape]["type"]
         return _dag
 
-    def _evaluate_list_type(self, member_shape):
+    def _evaluate_list_type(self, member_shape, add_shapes_prefix=True):
         list_shape_name = member_shape["member"]["shape"]
         list_shape_member = self.combined_shapes[list_shape_name]
         list_shape_type = list_shape_member["type"]
         if list_shape_type == "list":
-            member_type = f"List[{self._evaluate_list_type(list_shape_member)}]"
+            member_type = f"List[{self._evaluate_list_type(list_shape_member, add_shapes_prefix)}]"
         elif list_shape_type == "map":
-            member_type = f"List[{self._evaluate_map_type(list_shape_member)}]"
+            member_type = f"List[{self._evaluate_map_type(list_shape_member, add_shapes_prefix)}]"
         elif list_shape_type == "structure":
             # handling an edge case of nested structure
             if list_shape_name == "SearchExpression":
                 member_type = f"List['{list_shape_name}']"
             else:
+                if add_shapes_prefix:
+                    list_shape_name = f"shapes.{list_shape_name}"
                 member_type = f"List[{list_shape_name}]"
         elif list_shape_type in BASIC_JSON_TYPES_TO_PYTHON_TYPES.keys():
             member_type = f"List[{BASIC_JSON_TYPES_TO_PYTHON_TYPES[list_shape_type]}]"
@@ -139,7 +150,7 @@
             )
         return member_type
 
-    def _evaluate_map_type(self, member_shape):
+    def _evaluate_map_type(self, member_shape, add_shapes_prefix=True):
         map_key_shape_name = member_shape["key"]["shape"]
         map_value_shape_name = member_shape["value"]["shape"]
         map_key_shape = self.combined_shapes[map_key_shape_name]
@@ -152,6 +163,8 @@
                 "Unhandled map shape key type encountered, needs extra logic to handle this"
             )
         if map_value_shape_type == "structure":
+            if add_shapes_prefix:
+                map_value_shape_name = f"shapes.{map_value_shape_name}"
             member_type = (
                 f"Dict[{BASIC_JSON_TYPES_TO_PYTHON_TYPES[map_key_shape_type]}, "
                 f"{map_value_shape_name}]"
@@ -159,12 +172,12 @@
         elif map_value_shape_type == "list":
             member_type = (
                 f"Dict[{BASIC_JSON_TYPES_TO_PYTHON_TYPES[map_key_shape_type]}, "
-                f"{self._evaluate_list_type(map_value_shape)}]"
+                f"{self._evaluate_list_type(map_value_shape, add_shapes_prefix)}]"
             )
         elif map_value_shape_type == "map":
             member_type = (
                 f"Dict[{BASIC_JSON_TYPES_TO_PYTHON_TYPES[map_key_shape_type]}, "
-                f"{self._evaluate_map_type(map_value_shape)}]"
+                f"{self._evaluate_map_type(map_value_shape, add_shapes_prefix)}]"
             )
         else:
             member_type = (
@@ -174,9 +187,13 @@
         return member_type
 
     def generate_data_shape_members_and_string_body(
-        self, shape, resource_plan: Optional[Any] = None, required_override=()
+        self,
+        shape,
+        resource_plan: Optional[Any] = None,
+        required_override=(),
+        add_shapes_prefix=True,
     ):
-        shape_members = self.generate_shape_members(shape, required_override)
+        shape_members = self.generate_shape_members(shape, required_override, add_shapes_prefix)
         resource_names = None
         if resource_plan is not None:
             resource_names = [row["resource_name"] for _, row in resource_plan.iterrows()]
@@ -199,18 +216,22 @@
             init_data_body += f"{attr}: {value}\n"
         return shape_members, init_data_body
 
-    def generate_data_shape_string_body(self, shape, resource_plan, required_override=()):
+    def generate_data_shape_string_body(
+        self, shape, resource_plan, required_override=(), add_shapes_prefix=True
+    ):
         return self.generate_data_shape_members_and_string_body(
-            shape, resource_plan, required_override
+            shape, resource_plan, required_override, add_shapes_prefix
         )[1]
 
-    def generate_data_shape_members(self, shape, resource_plan, required_override=()):
+    def generate_data_shape_members(
+        self, shape, resource_plan, required_override=(), add_shapes_prefix=True
+    ):
         return self.generate_data_shape_members_and_string_body(
-            shape, resource_plan, required_override
+            shape, resource_plan, required_override, add_shapes_prefix
         )[0]
 
     @lru_cache
-    def generate_shape_members(self, shape, required_override=()):
+    def generate_shape_members(self, shape, required_override=(), add_shapes_prefix=True):
         shape_dict = self.combined_shapes[shape]
         members = shape_dict["members"]
         required_args = list(required_override) or shape_dict.get("required", [])
@@ -218,29 +239,48 @@
         # bring the required members in front
         ordered_members = {key: members[key] for key in required_args if key in members}
         ordered_members.update(members)
+        field_aliases = {}
+
         for member_name, member_attrs in ordered_members.items():
             member_shape_name = member_attrs["shape"]
             if self.combined_shapes[member_shape_name]:
                 member_shape = self.combined_shapes[member_shape_name]
                 member_shape_type = member_shape["type"]
                 if member_shape_type == "structure":
-                    member_type = member_shape_name
+                    if add_shapes_prefix:
+                        member_shape_name = f"shapes.{member_shape_name}"
+                    member_type = f"{member_shape_name}"
                 elif member_shape_type == "list":
-                    member_type = self._evaluate_list_type(member_shape)
+                    member_type = self._evaluate_list_type(member_shape, add_shapes_prefix)
                 elif member_shape_type == "map":
-                    member_type = self._evaluate_map_type(member_shape)
+                    member_type = self._evaluate_map_type(member_shape, add_shapes_prefix)
                 else:
                     # Shape is a simple type like string
                     member_type = BASIC_JSON_TYPES_TO_PYTHON_TYPES[member_shape_type]
             else:
                 raise Exception("The Shape definition mush exist. The Json Data might be corrupt")
-            member_name_snake_case = convert_to_snake_case(member_name)
-            if member_name in required_args:
-                init_data_body[f"{member_name_snake_case}"] = f"{member_type}"
-            else:
-                init_data_body[f"{member_name_snake_case}"] = (
-                    f"Optional[{member_type}] = Unassigned()"
+
+            is_required = member_name in required_args
+            # Add alias if field name is json, to address the Bug: https://github.com/aws/sagemaker-python-sdk/issues/4944
+            if shape in SHAPES_WITH_JSON_FIELD_ALIAS and member_name == "Json":
+                updated_member_name_snake_case = "json_format"
+                field_aliases[updated_member_name_snake_case] = "json"
+                init_data_body[f"{updated_member_name_snake_case}"] = (
+                    (
+                        f"{member_type} = Field(alias='{field_aliases[updated_member_name_snake_case]}')"
+                    )
+                    if is_required
+                    else f"Optional[{member_type}] = Field(default=Unassigned(), alias='json')"
                 )
+            else:
+                member_name_snake_case = convert_to_snake_case(member_name)
+                if is_required:
+                    init_data_body[f"{member_name_snake_case}"] = f"{member_type}"
+                else:
+                    init_data_body[f"{member_name_snake_case}"] = (
+                        f"Optional[{member_type}] = Unassigned()"
+                    )
+
         return init_data_body
 
     @lru_cache
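Taken together, the branches above emit three flavours of field declaration. A toy restatement, not the extractor itself (the names and types passed in are illustrative), that prints the kinds of line it can produce:

    from typing import Optional

    def render_member(name: str, member_type: str, required: bool, alias: Optional[str] = None) -> str:
        if alias is not None:
            if required:
                return f"{name}: {member_type} = Field(alias='{alias}')"
            return f"{name}: Optional[{member_type}] = Field(default=Unassigned(), alias='{alias}')"
        if required:
            return f"{name}: {member_type}"
        return f"{name}: Optional[{member_type}] = Unassigned()"

    print(render_member("json_format", "MonitoringJsonDatasetFormat", False, alias="json"))
    print(render_member("endpoint_config", "shapes.EndpointConfig", False))  # prefixed form when add_shapes_prefix=True
    print(render_member("variant_name", "str", True))
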
@@ -75,7 +75,7 @@ def create(
 ) -> Optional["{resource_name}"]:
     {docstring}
     logger.info("Creating {resource_lower} resource.")
-    client =Base.get_sagemaker_client(session=session, region_name=region, service_name='{service_name}')
+    client = Base.get_sagemaker_client(session=session, region_name=region, service_name='{service_name}')
 
     operation_input_args = {{
 {operation_input_args}
@@ -623,7 +623,7 @@ class Base(BaseModel):
                     configurable_attribute, resource_defaults, global_defaults
                 ):
                     resource_name = snake_to_pascal(configurable_attribute)
-                    class_object = globals()[resource_name]
+                    class_object = getattr(shapes, resource_name, None) or globals().get(resource_name)
                     kwargs[configurable_attribute] = class_object(**config_value)
         except BaseException as e:
             logger.debug("Could not load Default Configs. Continuing.", exc_info=True)
@@ -669,7 +669,9 @@ class Base(BaseModel):
     @staticmethod
     def _get_chained_attribute(item_value: Any):
         resource_name = type(item_value).__name__
-        class_object = globals()[resource_name]
+        class_object = globals().get(resource_name) or getattr(shapes, resource_name, None)
+        if class_object is None:
+            return item_value
         return class_object(**Base.populate_chained_attributes(
             resource_name=resource_name,
             operation_input_args=vars(item_value)
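Both lookups now tolerate classes that live in the shapes module rather than in the resources module's own globals, and _get_chained_attribute additionally returns the value untouched instead of raising KeyError when neither lookup matches. A rough, self-contained sketch of that globals-then-shapes order (the in-memory module and class are stand-ins, not sagemaker-core objects):

    import types

    shapes = types.ModuleType("shapes")
    exec("class VpcConfig:\n    pass", shapes.__dict__)

    def resolve_class(resource_name: str):
        # own module first, then the shapes module, then give up gracefully
        return globals().get(resource_name) or getattr(shapes, resource_name, None)

    print(resolve_class("VpcConfig") is shapes.VpcConfig)  # True: found via the shapes fallback
    print(resolve_class("NotAShape"))                      # None: the caller keeps the raw value
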
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: sagemaker-core
-Version: 1.0.29
+Version: 1.0.30
 Summary: An python package for sagemaker core functionalities
 Author-email: AWS <sagemaker-interests@amazon.com>
 Project-URL: Repository, https://github.com/aws/sagemaker-core.git
@@ -1,5 +1,5 @@
 sagemaker_core/__init__.py,sha256=kM7971seaG8p8wFv2T0kOrFLRBhjOpjAHo3DGCaqRHc,126
-sagemaker_core/_version.py,sha256=t-XEKT8_pvKkl5z_UrfcIU256c_NxaI16qGWMuNVpbM,321
+sagemaker_core/_version.py,sha256=4UH5LevZBa3Kl1OiDbkaeIzXUQNSYVFrc-2su2TCco4,86
 sagemaker_core/helper/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
 sagemaker_core/helper/session_helper.py,sha256=GO1UJgpN1L9a25nYlVb-KWk4KvmFzVkLqFMqw-VaI4c,33126
 sagemaker_core/main/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
@@ -7,29 +7,29 @@ sagemaker_core/main/config_schema.py,sha256=lBwIm5CT_dSXeW5i6cgRzbiLDjs0qtH1FrM7
 sagemaker_core/main/exceptions.py,sha256=87DUlrmHxaWoiYNlpNY9ixxFMPRk_dIGPsA2e_xdVwQ,5602
 sagemaker_core/main/intelligent_defaults_helper.py,sha256=5SDM6UavZtp-k5LhqRL7GRIDgzFB5UsC_p7YuiSPK9A,8334
 sagemaker_core/main/logs.py,sha256=yfEH7uP91nbE1lefymOlBr81ziBzsDSIOF2Qyd54FJE,6241
-sagemaker_core/main/resources.py,sha256=ezCu5Hqw9OYu8-rLrC6jJPZYEaUAghJ5rxcsb1XkXRw,1413476
-sagemaker_core/main/shapes.py,sha256=tpq9dp4X1aKDOGNQwt3pslrMMAetQfxJv-oLY-ovYvc,731767
+sagemaker_core/main/resources.py,sha256=xWTgxtjjkxJJfCPbBetDkIp6gFAoOB49dgRYciKab0E,1418508
+sagemaker_core/main/shapes.py,sha256=HmG353jkAGcppGVxknZio_OMktb_uubY2sMpXivmlO8,732587
 sagemaker_core/main/user_agent.py,sha256=BPYDAfDd70ObP-VAjl7aDHALHyGknkpRP21ktVr_LDw,2744
-sagemaker_core/main/utils.py,sha256=LCFDM6oxf6_e1i-_Dgtkm3ehl7YfoEpJ2kTTFTL6iOU,18471
+sagemaker_core/main/utils.py,sha256=qTGJDcZwrAQSsdyg8A78x4PKU4Wu1rY6Cn3OIbIspaA,18546
 sagemaker_core/main/code_injection/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
 sagemaker_core/main/code_injection/base.py,sha256=11_Jif0nOzfbLGlXaacKf-wcizzfS64U0OSZGoVffFU,1733
 sagemaker_core/main/code_injection/codec.py,sha256=nA51E9iNWHyKou1G23rKSRL4WitdkFRbMuFkyrGHzKU,8428
 sagemaker_core/main/code_injection/constants.py,sha256=2ICExGge8vAWx7lSTW0JGh-bH1korkvpOpDu5M63eI4,980
-sagemaker_core/main/code_injection/shape_dag.py,sha256=YE-5PMUXhjByNNt0ymf6jvYogRBiPpmc2as2-lAb3Xs,699939
+sagemaker_core/main/code_injection/shape_dag.py,sha256=HufpRdLVicfx8fQVKlJqp-pkJ_v3rjmUQtcZhIktWxQ,700129
 sagemaker_core/resources/__init__.py,sha256=EAYTFMN-nPjnPjjBbhIUeaL67FLKNPd7qbcbl9VIrws,31
 sagemaker_core/shapes/__init__.py,sha256=RnbIu9eTxKt-DNsOFJabrWIgrrtS9_SdAozP9JBl_ic,28
 sagemaker_core/tools/__init__.py,sha256=xX79JImxCVzrWMnjgntLCve2G5I-R4pRar5s20kT9Rs,56
 sagemaker_core/tools/codegen.py,sha256=mKWVi2pWnPxyIoWUEPYjEc9Gw7D9bCOrHqa00yzIZ1o,2005
-sagemaker_core/tools/constants.py,sha256=a2WjUDK7gzxgilZs99vp30qh4kQ-y6JKhrwwqVAA12o,3385
+sagemaker_core/tools/constants.py,sha256=XEwsUJ4w952mpnk-K0TS7R2uJhZyVPjcR47nrzgVXtg,3483
 sagemaker_core/tools/data_extractor.py,sha256=pNfmTA0NUA96IgfLrla7a36Qjc1NljbwgZYaOhouKqQ,2113
 sagemaker_core/tools/method.py,sha256=Ud2YeH2SPkj7xtIxBuUdRfQwCmovMUzGGkcIvzhpQeQ,805
-sagemaker_core/tools/resources_codegen.py,sha256=bnufJgmVUcIGyMifZKbtwaO_bG_0bN2_A7CYq629pNc,84550
+sagemaker_core/tools/resources_codegen.py,sha256=wz9HU_crbHeZUp5ylMqgfWbHCCgVLfD31hFBGBSuW_A,85100
 sagemaker_core/tools/resources_extractor.py,sha256=hN61ehZbPnhFW-2FIVDi7NsEz4rLvGr-WoglHQGfrug,14523
-sagemaker_core/tools/shapes_codegen.py,sha256=_ve959bwH8usZ6dPlpXxi2on9t0hLpcmhRWnaWHCWMQ,11745
-sagemaker_core/tools/shapes_extractor.py,sha256=GFy55JHGW0V8cwN5SL5gGLUxihLGswueyh5iNCn1sYk,12310
-sagemaker_core/tools/templates.py,sha256=yX2RQKeClgYwKS5Qu_mDpnWJIBCuj0yELrdm95aiTpk,23262
-sagemaker_core-1.0.29.dist-info/licenses/LICENSE,sha256=xx0jnfkXJvxRnG63LTGOxlggYnIysveWIZ6H3PNdCrQ,11357
-sagemaker_core-1.0.29.dist-info/METADATA,sha256=7qm0ZhVK2oLeFUK5pAu1iaiMARQnCS3AkN3G2JCvhbU,4885
-sagemaker_core-1.0.29.dist-info/WHEEL,sha256=CmyFI0kx5cdEMTLiONQRbGQwjIoR1aIYB7eCAQ4KPJ0,91
-sagemaker_core-1.0.29.dist-info/top_level.txt,sha256=R3GAZZ1zC5JxqdE_0x2Lu_WYi2Xfke7VsiP3L5zngfA,15
-sagemaker_core-1.0.29.dist-info/RECORD,,
+sagemaker_core/tools/shapes_codegen.py,sha256=4lsePZpjk7M6RpJs5yar_m4z5MzwGHFrvCkdS_-R12c,12172
+sagemaker_core/tools/shapes_extractor.py,sha256=vxVKjXD3lmjrkoKiexjUnOt8ITbFxQSeiDtx7P6Qtkw,14226
+sagemaker_core/tools/templates.py,sha256=vIgRWConRGAQ-Mri6LwfkArqWHlL3KXcvbbYa-t_MV4,23414
+sagemaker_core-1.0.30.dist-info/licenses/LICENSE,sha256=xx0jnfkXJvxRnG63LTGOxlggYnIysveWIZ6H3PNdCrQ,11357
+sagemaker_core-1.0.30.dist-info/METADATA,sha256=_QIjOSMM3F06uAMrzR0ClBuj58lMeCryhrpnRsd4C2w,4885
+sagemaker_core-1.0.30.dist-info/WHEEL,sha256=ooBFpIzZCPdw3uqIQsOo4qqbA4ZRPxHnOH7peeONza0,91
+sagemaker_core-1.0.30.dist-info/top_level.txt,sha256=R3GAZZ1zC5JxqdE_0x2Lu_WYi2Xfke7VsiP3L5zngfA,15
+sagemaker_core-1.0.30.dist-info/RECORD,,
@@ -1,5 +1,5 @@
 Wheel-Version: 1.0
-Generator: setuptools (78.1.0)
+Generator: setuptools (80.0.1)
 Root-Is-Purelib: true
 Tag: py3-none-any
 