| Modifier and Type | Method | Description |
| --- | --- | --- |
| static CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.builder() | |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.clientRequestToken(String clientRequestToken) | A unique, case-sensitive identifier to ensure that the API request completes no more than one time. |
| default CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.inputDataConfig(Consumer<ModelInvocationJobInputDataConfig.Builder> inputDataConfig) | Details about the location of the input to the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.inputDataConfig(ModelInvocationJobInputDataConfig inputDataConfig) | Details about the location of the input to the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.jobName(String jobName) | A name to give the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.modelId(String modelId) | The unique identifier of the foundation model to use for the batch inference job. |
| default CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.outputDataConfig(Consumer<ModelInvocationJobOutputDataConfig.Builder> outputDataConfig) | Details about the location of the output of the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.outputDataConfig(ModelInvocationJobOutputDataConfig outputDataConfig) | Details about the location of the output of the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.overrideConfiguration(Consumer<AwsRequestOverrideConfiguration.Builder> builderConsumer) | |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.overrideConfiguration(AwsRequestOverrideConfiguration overrideConfiguration) | |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.roleArn(String roleArn) | The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.tags(Collection<Tag> tags) | Any tags to associate with the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.tags(Consumer<Tag.Builder>... tags) | Any tags to associate with the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.tags(Tag... tags) | Any tags to associate with the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.timeoutDurationInHours(Integer timeoutDurationInHours) | The number of hours after which to force the batch inference job to time out. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.toBuilder() | |
| default CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.vpcConfig(Consumer<VpcConfig.Builder> vpcConfig) | The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. |
| CreateModelInvocationJobRequest.Builder | CreateModelInvocationJobRequest.Builder.vpcConfig(VpcConfig vpcConfig) | The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. |
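Taken together, these builder methods compose a batch inference request. Below is a minimal sketch of assembling a `CreateModelInvocationJobRequest` using the `Consumer`-style overloads for the input and output data configs; the job name, model ID, role ARN, and S3 URIs are placeholder assumptions for illustration, not values from this reference, and the nested `s3InputDataConfig`/`s3OutputDataConfig` builders are assumed from the corresponding config types.

```java
import java.util.UUID;

import software.amazon.awssdk.services.bedrock.model.CreateModelInvocationJobRequest;

public class CreateBatchJobExample {
    public static void main(String[] args) {
        CreateModelInvocationJobRequest request = CreateModelInvocationJobRequest.builder()
                // Placeholder identifiers; substitute your own values.
                .jobName("nightly-batch-job")
                .modelId("anthropic.claude-3-haiku-20240307-v1:0")
                .roleArn("arn:aws:iam::111122223333:role/BedrockBatchRole")
                // Idempotency token so retries complete the request at most once.
                .clientRequestToken(UUID.randomUUID().toString())
                // Consumer-style overloads build the nested configs inline.
                .inputDataConfig(in -> in.s3InputDataConfig(
                        s3 -> s3.s3Uri("s3://example-bucket/batch-input/")))
                .outputDataConfig(out -> out.s3OutputDataConfig(
                        s3 -> s3.s3Uri("s3://example-bucket/batch-output/")))
                // Force the job to time out after 24 hours.
                .timeoutDurationInHours(24)
                .build();

        System.out.println(request.jobName());
    }
}
```

The request object is then passed to the service client's `createModelInvocationJob` operation; `toBuilder()` can later recover a mutable copy of an existing request if a variant needs to be submitted.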