static GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.builder() |
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.clientRequestToken(String clientRequestToken) |
A unique, case-sensitive identifier to ensure that the API request completes no more than once.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.endTime(Instant endTime) |
The time at which the batch inference job ended.
|
default GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.inputDataConfig(Consumer<ModelInvocationJobInputDataConfig.Builder> inputDataConfig) |
Details about the location of the input to the batch inference job.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.inputDataConfig(ModelInvocationJobInputDataConfig inputDataConfig) |
Details about the location of the input to the batch inference job.
|
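The paired `inputDataConfig` methods above illustrate the SDK's convention of offering both a direct setter and a `Consumer<Builder>` overload: the `Consumer` variant is a `default` method that creates the nested builder internally, passes it to the caller's lambda, builds it, and delegates to the direct setter. A minimal, self-contained sketch of that pattern (the classes here are simplified stand-ins for illustration, not the real SDK types):

```java
import java.util.function.Consumer;

// Hypothetical stand-in for a nested config type with its own builder.
class InputDataConfig {
    final String s3Uri;
    private InputDataConfig(String s3Uri) { this.s3Uri = s3Uri; }

    static Builder builder() { return new Builder(); }

    static class Builder {
        private String s3Uri;
        Builder s3Uri(String s3Uri) { this.s3Uri = s3Uri; return this; }
        InputDataConfig build() { return new InputDataConfig(s3Uri); }
    }
}

class ResponseBuilder {
    String inputS3Uri;

    // Direct overload: caller supplies an already-built config object.
    ResponseBuilder inputDataConfig(InputDataConfig config) {
        this.inputS3Uri = config.s3Uri;
        return this;
    }

    // Consumer overload: the nested builder is created here, mutated by the
    // caller's lambda, built, and handed to the direct overload. This mirrors
    // the default-method pattern behind the SDK's Consumer<Builder> variants.
    ResponseBuilder inputDataConfig(Consumer<InputDataConfig.Builder> consumer) {
        InputDataConfig.Builder b = InputDataConfig.builder();
        consumer.accept(b);
        return inputDataConfig(b.build());
    }
}

public class ConsumerOverloadDemo {
    public static String viaConsumer() {
        return new ResponseBuilder()
                .inputDataConfig(b -> b.s3Uri("s3://my-bucket/input/"))
                .inputS3Uri;
    }
}
```

The `Consumer` overload saves the caller from naming the nested builder type explicitly, which keeps deeply nested request construction readable.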
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.jobArn(String jobArn) |
The Amazon Resource Name (ARN) of the batch inference job.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.jobExpirationTime(Instant jobExpirationTime) |
The time at which the batch inference job will time out, or the time at which it timed out.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.jobName(String jobName) |
The name of the batch inference job.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.lastModifiedTime(Instant lastModifiedTime) |
The time at which the batch inference job was last modified.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.message(String message) |
If the batch inference job failed, this field contains a message describing why the job failed.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.modelId(String modelId) |
The unique identifier of the foundation model used for model inference.
|
default GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.outputDataConfig(Consumer<ModelInvocationJobOutputDataConfig.Builder> outputDataConfig) |
Details about the location of the output of the batch inference job.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.outputDataConfig(ModelInvocationJobOutputDataConfig outputDataConfig) |
Details about the location of the output of the batch inference job.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.roleArn(String roleArn) |
The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.status(String status) |
The status of the batch inference job.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.status(ModelInvocationJobStatus status) |
The status of the batch inference job.
|
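The two `status` setters follow another common SDK convention: the enum overload delegates to the `String` overload, so both store the same canonical value while the enum variant gives compile-time safety. A small sketch of that delegation (the enum constants here are assumptions for illustration, not the SDK's actual `ModelInvocationJobStatus` values):

```java
// Hypothetical status enum; the real SDK enum has its own constants.
enum JobStatus { SUBMITTED, IN_PROGRESS, COMPLETED, FAILED }

class StatusBuilder {
    String status;

    // Enum overload delegates to the String overload, so both code paths
    // end up storing the same canonical string representation.
    StatusBuilder status(JobStatus status) { return status(status.toString()); }

    StatusBuilder status(String status) { this.status = status; return this; }
}

public class StatusDemo {
    public static String fromEnum() {
        return new StatusBuilder().status(JobStatus.COMPLETED).status;
    }
}
```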
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.submitTime(Instant submitTime) |
The time at which the batch inference job was submitted.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.timeoutDurationInHours(Integer timeoutDurationInHours) |
The number of hours after which the batch inference job is set to time out.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.toBuilder() |
|
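`toBuilder()` supports the copy-and-modify round trip: it seeds a new builder with the immutable response's current values, so a caller can override a single field and rebuild. A self-contained sketch under simplified assumptions (two fields standing in for the full response):

```java
// Hypothetical minimal response type illustrating the toBuilder()
// round trip; not the real GetModelInvocationJobResponse class.
class JobResponse {
    final String jobName;
    final String status;

    private JobResponse(Builder b) {
        this.jobName = b.jobName;
        this.status = b.status;
    }

    static Builder builder() { return new Builder(); }

    // Seed a fresh builder with this object's current field values,
    // so one field can be changed without restating the rest.
    Builder toBuilder() { return new Builder().jobName(jobName).status(status); }

    static class Builder {
        private String jobName;
        private String status;
        Builder jobName(String v) { this.jobName = v; return this; }
        Builder status(String v) { this.status = v; return this; }
        JobResponse build() { return new JobResponse(this); }
    }
}

public class ToBuilderDemo {
    public static JobResponse modified() {
        JobResponse original = JobResponse.builder()
                .jobName("nightly-batch")
                .status("InProgress")
                .build();
        // Copy every field, override only the status.
        return original.toBuilder().status("Completed").build();
    }
}
```

Because the response object itself is immutable, `toBuilder()` is the idiomatic way to derive a modified copy rather than mutating in place.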
default GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.vpcConfig(Consumer<VpcConfig.Builder> vpcConfig) |
The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job.
|
GetModelInvocationJobResponse.Builder |
GetModelInvocationJobResponse.Builder.vpcConfig(VpcConfig vpcConfig) |
The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job.
|