| Package | Description |
|---|---|
| com.amazonaws.services.sagemaker.model | |
Methods in com.amazonaws.services.sagemaker.model that return InferenceComponentComputeResourceRequirements:

| Modifier and Type | Method and Description |
|---|---|
| `InferenceComponentComputeResourceRequirements` | `InferenceComponentComputeResourceRequirements.clone()` |
| `InferenceComponentComputeResourceRequirements` | `InferenceComponentSpecificationSummary.getComputeResourceRequirements()`<br>The compute resources allocated to run the model assigned to the inference component. |
| `InferenceComponentComputeResourceRequirements` | `InferenceComponentSpecification.getComputeResourceRequirements()`<br>The compute resources allocated to run the model assigned to the inference component. |
| `InferenceComponentComputeResourceRequirements` | `InferenceComponentComputeResourceRequirements.withMaxMemoryRequiredInMb(Integer maxMemoryRequiredInMb)`<br>The maximum MB of memory to allocate to run a model that you assign to an inference component. |
| `InferenceComponentComputeResourceRequirements` | `InferenceComponentComputeResourceRequirements.withMinMemoryRequiredInMb(Integer minMemoryRequiredInMb)`<br>The minimum MB of memory to allocate to run a model that you assign to an inference component. |
| `InferenceComponentComputeResourceRequirements` | `InferenceComponentComputeResourceRequirements.withNumberOfAcceleratorDevicesRequired(Float numberOfAcceleratorDevicesRequired)`<br>The number of accelerators to allocate to run a model that you assign to an inference component. |
| `InferenceComponentComputeResourceRequirements` | `InferenceComponentComputeResourceRequirements.withNumberOfCpuCoresRequired(Float numberOfCpuCoresRequired)`<br>The number of CPU cores to allocate to run a model that you assign to an inference component. |
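As a sketch of how the fluent `with*` methods above compose (assuming the AWS SDK for Java v1 `aws-java-sdk-sagemaker` dependency is on the classpath; the resource values shown are hypothetical):

```java
import com.amazonaws.services.sagemaker.model.InferenceComponentComputeResourceRequirements;

public class ComputeResourceRequirementsExample {
    public static void main(String[] args) {
        // Each with* method returns the same requirements object,
        // so the calls chain into a single expression.
        // Values are illustrative: 2 CPU cores, 1 accelerator device,
        // and a 1024-4096 MB memory range for the hosted model.
        InferenceComponentComputeResourceRequirements requirements =
                new InferenceComponentComputeResourceRequirements()
                        .withNumberOfCpuCoresRequired(2.0f)
                        .withNumberOfAcceleratorDevicesRequired(1.0f)
                        .withMinMemoryRequiredInMb(1024)
                        .withMaxMemoryRequiredInMb(4096);

        System.out.println(requirements);
    }
}
```

Note that `NumberOfCpuCoresRequired` and `NumberOfAcceleratorDevicesRequired` are `Float`, which allows fractional allocations, while the memory bounds are whole `Integer` megabytes.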
Methods in com.amazonaws.services.sagemaker.model with parameters of type InferenceComponentComputeResourceRequirements:

| Modifier and Type | Method and Description |
|---|---|
| `void` | `InferenceComponentSpecificationSummary.setComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)`<br>The compute resources allocated to run the model assigned to the inference component. |
| `void` | `InferenceComponentSpecification.setComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)`<br>The compute resources allocated to run the model assigned to the inference component. |
| `InferenceComponentSpecificationSummary` | `InferenceComponentSpecificationSummary.withComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)`<br>The compute resources allocated to run the model assigned to the inference component. |
| `InferenceComponentSpecification` | `InferenceComponentSpecification.withComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)`<br>The compute resources allocated to run the model assigned to the inference component. |
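A minimal sketch of both ways to attach the requirements to a specification, assuming the AWS SDK for Java v1 is on the classpath (the model name is hypothetical):

```java
import com.amazonaws.services.sagemaker.model.InferenceComponentComputeResourceRequirements;
import com.amazonaws.services.sagemaker.model.InferenceComponentSpecification;

public class InferenceComponentSpecificationExample {
    public static void main(String[] args) {
        InferenceComponentComputeResourceRequirements requirements =
                new InferenceComponentComputeResourceRequirements()
                        .withMinMemoryRequiredInMb(1024)
                        .withMaxMemoryRequiredInMb(4096);

        // Fluent style: withComputeResourceRequirements returns the
        // specification itself, so it chains with other with* calls.
        InferenceComponentSpecification spec =
                new InferenceComponentSpecification()
                        .withModelName("my-model") // hypothetical model name
                        .withComputeResourceRequirements(requirements);

        // JavaBeans style: setComputeResourceRequirements returns void
        // and mutates the specification in place.
        InferenceComponentSpecification spec2 = new InferenceComponentSpecification();
        spec2.setComputeResourceRequirements(requirements);

        System.out.println(spec);
        System.out.println(spec2);
    }
}
```

The same `set`/`with` pair exists on `InferenceComponentSpecificationSummary`; the only difference is the receiver type.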
Copyright © 2024. All rights reserved.