Packages that use TransformInput:

| Package | Description |
|---|---|
| `com.amazonaws.services.sagemaker.model` | |
Methods in `com.amazonaws.services.sagemaker.model` that return `TransformInput`:

| Modifier and Type | Method and Description |
|---|---|
| `TransformInput` | `TransformInput.clone()` |
| `TransformInput` | `CreateTransformJobRequest.getTransformInput()` Describes the input source and the way the transform job consumes it. |
| `TransformInput` | `DescribeTransformJobResult.getTransformInput()` Describes the dataset to be transformed and the Amazon S3 location where it is stored. |
| `TransformInput` | `TransformInput.withCompressionType(CompressionType compressionType)` Compressing data helps save on storage space. |
| `TransformInput` | `TransformInput.withCompressionType(String compressionType)` Compressing data helps save on storage space. |
| `TransformInput` | `TransformInput.withContentType(String contentType)` The multipurpose internet mail extension (MIME) type of the data. |
| `TransformInput` | `TransformInput.withDataSource(TransformDataSource dataSource)` Describes the location of the channel data, meaning the S3 location of the input data that the model can consume. |
| `TransformInput` | `TransformInput.withSplitType(SplitType splitType)` The method to use to split the transform job's data into smaller batches. |
| `TransformInput` | `TransformInput.withSplitType(String splitType)` The method to use to split the transform job's data into smaller batches. |
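The `with*` methods above follow the SDK's fluent builder pattern, so a complete input configuration can be chained in one expression. A minimal sketch, assuming the AWS SDK for Java 1.x is on the classpath; the bucket name and key prefix are placeholders, not real resources:

```java
import com.amazonaws.services.sagemaker.model.CompressionType;
import com.amazonaws.services.sagemaker.model.SplitType;
import com.amazonaws.services.sagemaker.model.TransformDataSource;
import com.amazonaws.services.sagemaker.model.TransformInput;
import com.amazonaws.services.sagemaker.model.TransformS3DataSource;

public class TransformInputExample {
    public static void main(String[] args) {
        // Point the data source at the S3 prefix holding the input records
        // (placeholder bucket/prefix for illustration).
        TransformDataSource dataSource = new TransformDataSource()
                .withS3DataSource(new TransformS3DataSource()
                        .withS3DataType("S3Prefix")
                        .withS3Uri("s3://example-bucket/batch-input/"));

        // Chain the builder methods listed in the table above.
        TransformInput input = new TransformInput()
                .withDataSource(dataSource)
                .withContentType("text/csv")               // MIME type of the input data
                .withCompressionType(CompressionType.Gzip) // input objects are gzip-compressed
                .withSplitType(SplitType.Line);            // split records on newlines

        System.out.println(input);
    }
}
```

Each `withSplitType` and `withCompressionType` overload also accepts a plain `String`, which is convenient when the value comes from configuration rather than code.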
Methods in `com.amazonaws.services.sagemaker.model` with parameters of type `TransformInput`:

| Modifier and Type | Method and Description |
|---|---|
| `void` | `CreateTransformJobRequest.setTransformInput(TransformInput transformInput)` Describes the input source and the way the transform job consumes it. |
| `void` | `DescribeTransformJobResult.setTransformInput(TransformInput transformInput)` Describes the dataset to be transformed and the Amazon S3 location where it is stored. |
| `CreateTransformJobRequest` | `CreateTransformJobRequest.withTransformInput(TransformInput transformInput)` Describes the input source and the way the transform job consumes it. |
| `DescribeTransformJobResult` | `DescribeTransformJobResult.withTransformInput(TransformInput transformInput)` Describes the dataset to be transformed and the Amazon S3 location where it is stored. |
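A `TransformInput` is consumed by passing it to `CreateTransformJobRequest` when starting a batch transform job. A hedged sketch, assuming the AWS SDK for Java 1.x and default credentials; the job name, model name, buckets, and instance type are placeholder assumptions:

```java
import com.amazonaws.services.sagemaker.AmazonSageMaker;
import com.amazonaws.services.sagemaker.AmazonSageMakerClientBuilder;
import com.amazonaws.services.sagemaker.model.CreateTransformJobRequest;
import com.amazonaws.services.sagemaker.model.TransformDataSource;
import com.amazonaws.services.sagemaker.model.TransformInput;
import com.amazonaws.services.sagemaker.model.TransformInstanceType;
import com.amazonaws.services.sagemaker.model.TransformOutput;
import com.amazonaws.services.sagemaker.model.TransformResources;
import com.amazonaws.services.sagemaker.model.TransformS3DataSource;

public class CreateTransformJobExample {
    public static void main(String[] args) {
        // Input: where the data to transform lives and how to read it.
        TransformInput input = new TransformInput()
                .withDataSource(new TransformDataSource()
                        .withS3DataSource(new TransformS3DataSource()
                                .withS3DataType("S3Prefix")
                                .withS3Uri("s3://example-bucket/batch-input/")))
                .withContentType("text/csv");

        // Wire the input into the request via withTransformInput (table above);
        // setTransformInput(input) is the equivalent void setter.
        CreateTransformJobRequest request = new CreateTransformJobRequest()
                .withTransformJobName("example-transform-job") // placeholder
                .withModelName("example-model")                // placeholder
                .withTransformInput(input)
                .withTransformOutput(new TransformOutput()
                        .withS3OutputPath("s3://example-bucket/batch-output/"))
                .withTransformResources(new TransformResources()
                        .withInstanceType(TransformInstanceType.MlM4Xlarge)
                        .withInstanceCount(1));

        AmazonSageMaker client = AmazonSageMakerClientBuilder.defaultClient();
        client.createTransformJob(request);
    }
}
```

`DescribeTransformJobResult.getTransformInput()` returns the same structure when you later inspect the job, which is useful for verifying what a running or completed job was actually configured with.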
Copyright © 2018. All rights reserved.