| Package | Description |
|---|---|
| com.amazonaws.services.sagemaker.model | |
| Modifier and Type | Method and Description |
|---|---|
| `ContainerDefinition` | `ContainerDefinition.addEnvironmentEntry(String key, String value)`<br>Add a single Environment entry |
| `ContainerDefinition` | `ContainerDefinition.clearEnvironmentEntries()`<br>Removes all the entries added into Environment. |
| `ContainerDefinition` | `ContainerDefinition.clone()` |
| `ContainerDefinition` | `CreateModelRequest.getPrimaryContainer()`<br>The location of the primary docker image containing inference code, associated artifacts, and custom environment map that the inference code uses when the model is deployed for predictions. |
| `ContainerDefinition` | `DescribeModelResult.getPrimaryContainer()`<br>The location of the primary inference code, associated artifacts, and custom environment map that the inference code uses when it is deployed in production. |
| `ContainerDefinition` | `ContainerDefinition.withContainerHostname(String containerHostname)`<br>This parameter is ignored for models that contain only a `PrimaryContainer`. |
| `ContainerDefinition` | `ContainerDefinition.withEnvironment(Map<String,String> environment)`<br>The environment variables to set in the Docker container. |
| `ContainerDefinition` | `ContainerDefinition.withImage(String image)`<br>The path where inference code is stored. |
| `ContainerDefinition` | `ContainerDefinition.withImageConfig(ImageConfig imageConfig)`<br>Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). |
| `ContainerDefinition` | `ContainerDefinition.withInferenceSpecificationName(String inferenceSpecificationName)`<br>The inference specification name in the model package version. |
| `ContainerDefinition` | `ContainerDefinition.withMode(ContainerMode mode)`<br>Whether the container hosts a single model or multiple models. |
| `ContainerDefinition` | `ContainerDefinition.withMode(String mode)`<br>Whether the container hosts a single model or multiple models. |
| `ContainerDefinition` | `ContainerDefinition.withModelDataUrl(String modelDataUrl)`<br>The S3 path where the model artifacts, which result from model training, are stored. |
| `ContainerDefinition` | `ContainerDefinition.withModelPackageName(String modelPackageName)`<br>The name or Amazon Resource Name (ARN) of the model package to use to create the model. |
| `ContainerDefinition` | `ContainerDefinition.withMultiModelConfig(MultiModelConfig multiModelConfig)`<br>Specifies additional configuration for multi-model endpoints. |
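The fluent `with*` setters above all return the same `ContainerDefinition`, so a definition can be built in one chained expression. A minimal sketch, assuming the AWS SDK for Java v1 is on the classpath; the ECR image URI and S3 path are placeholder values:

```java
import com.amazonaws.services.sagemaker.model.ContainerDefinition;
import com.amazonaws.services.sagemaker.model.ContainerMode;

public class ContainerDefinitionExample {
    public static void main(String[] args) {
        // Placeholder image URI and model data path; substitute your own.
        ContainerDefinition container = new ContainerDefinition()
                .withImage("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference:latest")
                .withModelDataUrl("s3://my-bucket/model/model.tar.gz")
                .withMode(ContainerMode.SingleModel);

        // addEnvironmentEntry appends one key/value pair at a time to the
        // Environment map; clearEnvironmentEntries() removes them all again.
        container.addEnvironmentEntry("SAGEMAKER_PROGRAM", "inference.py")
                 .addEnvironmentEntry("LOG_LEVEL", "INFO");

        System.out.println(container.getEnvironment().size());
    }
}
```

Note that `addEnvironmentEntry` builds the map incrementally, so mixing it with `withEnvironment(Map)` requires a mutable map.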
| Modifier and Type | Method and Description |
|---|---|
| `List<ContainerDefinition>` | `CreateModelRequest.getContainers()`<br>Specifies the containers in the inference pipeline. |
| `List<ContainerDefinition>` | `DescribeModelResult.getContainers()`<br>The containers in the inference pipeline. |
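The two `getContainers()` accessors distinguish a pipeline model (multiple containers) from a single-container model, which populates `getPrimaryContainer()` instead. A sketch of inspecting a `DescribeModelResult`; the result is built locally with placeholder images purely for illustration, since in practice it comes back from a `describeModel` call:

```java
import java.util.List;
import com.amazonaws.services.sagemaker.model.ContainerDefinition;
import com.amazonaws.services.sagemaker.model.DescribeModelResult;

public class PipelineInspection {
    public static void main(String[] args) {
        // Built locally for illustration; normally returned by
        // AmazonSageMaker.describeModel. Image names are placeholders.
        DescribeModelResult result = new DescribeModelResult()
                .withContainers(
                        new ContainerDefinition().withImage("preprocess-image"),
                        new ContainerDefinition().withImage("predict-image"));

        List<ContainerDefinition> containers = result.getContainers();
        boolean isPipeline = containers != null && containers.size() > 1;
        System.out.println(isPipeline);
    }
}
```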
| Modifier and Type | Method and Description |
|---|---|
| `void` | `CreateModelRequest.setPrimaryContainer(ContainerDefinition primaryContainer)`<br>The location of the primary docker image containing inference code, associated artifacts, and custom environment map that the inference code uses when the model is deployed for predictions. |
| `void` | `DescribeModelResult.setPrimaryContainer(ContainerDefinition primaryContainer)`<br>The location of the primary inference code, associated artifacts, and custom environment map that the inference code uses when it is deployed in production. |
| `CreateModelRequest` | `CreateModelRequest.withContainers(ContainerDefinition... containers)`<br>Specifies the containers in the inference pipeline. |
| `DescribeModelResult` | `DescribeModelResult.withContainers(ContainerDefinition... containers)`<br>The containers in the inference pipeline. |
| `CreateModelRequest` | `CreateModelRequest.withPrimaryContainer(ContainerDefinition primaryContainer)`<br>The location of the primary docker image containing inference code, associated artifacts, and custom environment map that the inference code uses when the model is deployed for predictions. |
| `DescribeModelResult` | `DescribeModelResult.withPrimaryContainer(ContainerDefinition primaryContainer)`<br>The location of the primary inference code, associated artifacts, and custom environment map that the inference code uses when it is deployed in production. |
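For a single-container model, a `ContainerDefinition` is attached to the request through `withPrimaryContainer`. A minimal sketch; the model name, role ARN, image URI, and S3 path are all placeholder values, and the request is only constructed, not sent:

```java
import com.amazonaws.services.sagemaker.model.ContainerDefinition;
import com.amazonaws.services.sagemaker.model.CreateModelRequest;

public class CreateModelExample {
    public static void main(String[] args) {
        // Placeholder name, role ARN, image, and model data path.
        CreateModelRequest request = new CreateModelRequest()
                .withModelName("my-model")
                .withExecutionRoleArn("arn:aws:iam::123456789012:role/SageMakerRole")
                .withPrimaryContainer(new ContainerDefinition()
                        .withImage("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference:latest")
                        .withModelDataUrl("s3://my-bucket/model/model.tar.gz"));

        // The request would be passed to AmazonSageMaker.createModel(request).
        System.out.println(request.getPrimaryContainer().getModelDataUrl());
    }
}
```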
| Modifier and Type | Method and Description |
|---|---|
| `void` | `CreateModelRequest.setContainers(Collection<ContainerDefinition> containers)`<br>Specifies the containers in the inference pipeline. |
| `void` | `DescribeModelResult.setContainers(Collection<ContainerDefinition> containers)`<br>The containers in the inference pipeline. |
| `CreateModelRequest` | `CreateModelRequest.withContainers(Collection<ContainerDefinition> containers)`<br>Specifies the containers in the inference pipeline. |
| `DescribeModelResult` | `DescribeModelResult.withContainers(Collection<ContainerDefinition> containers)`<br>The containers in the inference pipeline. |
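The `Collection` overload of `withContainers` replaces any previously set container list, whereas the varargs overload listed earlier appends to it, following the usual SDK v1 convention. A sketch of building an inference-pipeline request from a list; image names and the model name are placeholders:

```java
import java.util.Arrays;
import java.util.List;
import com.amazonaws.services.sagemaker.model.ContainerDefinition;
import com.amazonaws.services.sagemaker.model.CreateModelRequest;

public class PipelineRequestExample {
    public static void main(String[] args) {
        // Ordered pipeline stages; each image name is a placeholder.
        List<ContainerDefinition> pipeline = Arrays.asList(
                new ContainerDefinition().withImage("featurize-image"),
                new ContainerDefinition().withImage("predict-image"));

        // The Collection overload overwrites any containers set previously.
        CreateModelRequest request = new CreateModelRequest()
                .withModelName("my-pipeline-model")
                .withContainers(pipeline);

        System.out.println(request.getContainers().size());
    }
}
```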
Copyright © 2022. All rights reserved.