Interface ModelInvocationJobSummary.Builder
-
- All Superinterfaces:
Buildable, CopyableBuilder<ModelInvocationJobSummary.Builder,ModelInvocationJobSummary>, SdkBuilder<ModelInvocationJobSummary.Builder,ModelInvocationJobSummary>, SdkPojo
- Enclosing class:
- ModelInvocationJobSummary
@Mutable @NotThreadSafe public static interface ModelInvocationJobSummary.Builder extends SdkPojo, CopyableBuilder<ModelInvocationJobSummary.Builder,ModelInvocationJobSummary>
-
-
Method Summary
All methods are instance methods; those marked default are default methods, the rest are abstract.
- ModelInvocationJobSummary.Builder clientRequestToken(String clientRequestToken) - A unique, case-sensitive identifier to ensure that the API request completes no more than one time.
- ModelInvocationJobSummary.Builder endTime(Instant endTime) - The time at which the batch inference job ended.
- default ModelInvocationJobSummary.Builder inputDataConfig(Consumer<ModelInvocationJobInputDataConfig.Builder> inputDataConfig) - Details about the location of the input to the batch inference job.
- ModelInvocationJobSummary.Builder inputDataConfig(ModelInvocationJobInputDataConfig inputDataConfig) - Details about the location of the input to the batch inference job.
- ModelInvocationJobSummary.Builder jobArn(String jobArn) - The Amazon Resource Name (ARN) of the batch inference job.
- ModelInvocationJobSummary.Builder jobExpirationTime(Instant jobExpirationTime) - The time at which the batch inference job times out or timed out.
- ModelInvocationJobSummary.Builder jobName(String jobName) - The name of the batch inference job.
- ModelInvocationJobSummary.Builder lastModifiedTime(Instant lastModifiedTime) - The time at which the batch inference job was last modified.
- ModelInvocationJobSummary.Builder message(String message) - If the batch inference job failed, this field contains a message describing why the job failed.
- ModelInvocationJobSummary.Builder modelId(String modelId) - The unique identifier of the foundation model used for model inference.
- default ModelInvocationJobSummary.Builder outputDataConfig(Consumer<ModelInvocationJobOutputDataConfig.Builder> outputDataConfig) - Details about the location of the output of the batch inference job.
- ModelInvocationJobSummary.Builder outputDataConfig(ModelInvocationJobOutputDataConfig outputDataConfig) - Details about the location of the output of the batch inference job.
- ModelInvocationJobSummary.Builder roleArn(String roleArn) - The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference.
- ModelInvocationJobSummary.Builder status(String status) - The status of the batch inference job.
- ModelInvocationJobSummary.Builder status(ModelInvocationJobStatus status) - The status of the batch inference job.
- ModelInvocationJobSummary.Builder submitTime(Instant submitTime) - The time at which the batch inference job was submitted.
- ModelInvocationJobSummary.Builder timeoutDurationInHours(Integer timeoutDurationInHours) - The number of hours after which the batch inference job was set to time out.
- default ModelInvocationJobSummary.Builder vpcConfig(Consumer<VpcConfig.Builder> vpcConfig) - The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job.
- ModelInvocationJobSummary.Builder vpcConfig(VpcConfig vpcConfig) - The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job.
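Every setter in the summary above returns the builder itself, which is what makes fluent, chained construction work. The sketch below is a minimal, self-contained stand-in for that pattern; `Summary` and its `Builder` are illustrative mocks with two fields, not the SDK's `ModelInvocationJobSummary`, and the job name is hypothetical:

```java
import java.time.Instant;

// Illustrative stand-in for the SDK's immutable summary type; the real
// ModelInvocationJobSummary has many more fields. Only the chaining
// mechanics are shown here.
final class Summary {
    private final String jobName;
    private final Instant submitTime;

    private Summary(Builder b) {
        this.jobName = b.jobName;
        this.submitTime = b.submitTime;
    }

    String jobName() { return jobName; }
    Instant submitTime() { return submitTime; }

    static Builder builder() { return new Builder(); }

    static final class Builder {
        private String jobName;
        private Instant submitTime;

        // Each setter returns this, so calls can be chained.
        Builder jobName(String jobName) { this.jobName = jobName; return this; }
        Builder submitTime(Instant submitTime) { this.submitTime = submitTime; return this; }
        Summary build() { return new Summary(this); }
    }
}

public class FluentBuilderDemo {
    public static void main(String[] args) {
        Summary s = Summary.builder()
                .jobName("nightly-batch")                          // hypothetical name
                .submitTime(Instant.parse("2024-01-01T00:00:00Z"))
                .build();
        System.out.println(s.jobName());
    }
}
```

The built object is immutable; changing a field means going back through a builder (the SDK's CopyableBuilder supports this round trip via copy()).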
Methods inherited from interface software.amazon.awssdk.utils.builder.CopyableBuilder
copy
-
Methods inherited from interface software.amazon.awssdk.utils.builder.SdkBuilder
applyMutation, build
-
Methods inherited from interface software.amazon.awssdk.core.SdkPojo
equalsBySdkFields, sdkFieldNameToField, sdkFields
-
Method Detail
-
jobArn
ModelInvocationJobSummary.Builder jobArn(String jobArn)
The Amazon Resource Name (ARN) of the batch inference job.
- Parameters:
jobArn - The Amazon Resource Name (ARN) of the batch inference job.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
jobName
ModelInvocationJobSummary.Builder jobName(String jobName)
The name of the batch inference job.
- Parameters:
jobName - The name of the batch inference job.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
modelId
ModelInvocationJobSummary.Builder modelId(String modelId)
The unique identifier of the foundation model used for model inference.
- Parameters:
modelId - The unique identifier of the foundation model used for model inference.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
clientRequestToken
ModelInvocationJobSummary.Builder clientRequestToken(String clientRequestToken)
A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
- Parameters:
clientRequestToken - A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
roleArn
ModelInvocationJobSummary.Builder roleArn(String roleArn)
The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.
- Parameters:
roleArn - The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
status
ModelInvocationJobSummary.Builder status(String status)
The status of the batch inference job.
The following statuses are possible:
- Submitted – This job has been submitted to a queue for validation.
- Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
  - Your IAM service role has access to the Amazon S3 buckets containing your files.
  - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
  - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
- Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
- Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
- InProgress – This job has begun. You can start viewing the results in the output S3 location.
- Completed – This job has successfully completed. View the output files in the output S3 location.
- PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
- Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
- Stopped – This job was stopped by a user.
- Stopping – This job is being stopped by a user.
- Parameters:
status - The status of the batch inference job. The following statuses are possible:
- Submitted – This job has been submitted to a queue for validation.
- Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
  - Your IAM service role has access to the Amazon S3 buckets containing your files.
  - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
  - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
- Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
- Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
- InProgress – This job has begun. You can start viewing the results in the output S3 location.
- Completed – This job has successfully completed. View the output files in the output S3 location.
- PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
- Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
- Stopped – This job was stopped by a user.
- Stopping – This job is being stopped by a user.
-
- Returns:
- Returns a reference to this object so that method calls can be chained together.
- See Also:
ModelInvocationJobStatus
-
-
status
ModelInvocationJobSummary.Builder status(ModelInvocationJobStatus status)
The status of the batch inference job.
The following statuses are possible:
- Submitted – This job has been submitted to a queue for validation.
- Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
  - Your IAM service role has access to the Amazon S3 buckets containing your files.
  - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
  - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
- Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
- Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
- InProgress – This job has begun. You can start viewing the results in the output S3 location.
- Completed – This job has successfully completed. View the output files in the output S3 location.
- PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
- Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
- Stopped – This job was stopped by a user.
- Stopping – This job is being stopped by a user.
- Parameters:
status - The status of the batch inference job. The following statuses are possible:
- Submitted – This job has been submitted to a queue for validation.
- Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
  - Your IAM service role has access to the Amazon S3 buckets containing your files.
  - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
  - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
- Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
- Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
- InProgress – This job has begun. You can start viewing the results in the output S3 location.
- Completed – This job has successfully completed. View the output files in the output S3 location.
- PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
- Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
- Stopped – This job was stopped by a user.
- Stopping – This job is being stopped by a user.
-
- Returns:
- Returns a reference to this object so that method calls can be chained together.
- See Also:
ModelInvocationJobStatus
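The reason for the string/enum overload pair: the wire format is a string, while the SDK models the known values as an enum. Typically the enum overload just delegates to the string setter, which keeps status values a newer service revision might return representable even before the enum knows about them. An illustrative sketch of that idiom (JobStatus and StatusHolder are stand-ins, not the SDK's ModelInvocationJobStatus or Builder):

```java
// Stand-in enum mapping constants to their wire-format strings.
enum JobStatus {
    SUBMITTED("Submitted"), IN_PROGRESS("InProgress"), COMPLETED("Completed");

    private final String wireValue;
    JobStatus(String wireValue) { this.wireValue = wireValue; }
    @Override public String toString() { return wireValue; }
}

class StatusHolder {
    private String status;

    // Raw-string setter: also accepts values the enum does not know about yet.
    StatusHolder status(String status) { this.status = status; return this; }

    // Enum overload simply delegates to the string setter.
    StatusHolder status(JobStatus status) { return status(status.toString()); }

    String statusAsString() { return status; }
}
```

Prefer the enum overload when the value is known at compile time; the string overload covers values received from the service verbatim.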
-
-
message
ModelInvocationJobSummary.Builder message(String message)
If the batch inference job failed, this field contains a message describing why the job failed.
- Parameters:
message - If the batch inference job failed, this field contains a message describing why the job failed.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
submitTime
ModelInvocationJobSummary.Builder submitTime(Instant submitTime)
The time at which the batch inference job was submitted.
- Parameters:
submitTime - The time at which the batch inference job was submitted.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
lastModifiedTime
ModelInvocationJobSummary.Builder lastModifiedTime(Instant lastModifiedTime)
The time at which the batch inference job was last modified.
- Parameters:
lastModifiedTime - The time at which the batch inference job was last modified.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
endTime
ModelInvocationJobSummary.Builder endTime(Instant endTime)
The time at which the batch inference job ended.
- Parameters:
endTime - The time at which the batch inference job ended.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
inputDataConfig
ModelInvocationJobSummary.Builder inputDataConfig(ModelInvocationJobInputDataConfig inputDataConfig)
Details about the location of the input to the batch inference job.
- Parameters:
inputDataConfig - Details about the location of the input to the batch inference job.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
inputDataConfig
default ModelInvocationJobSummary.Builder inputDataConfig(Consumer<ModelInvocationJobInputDataConfig.Builder> inputDataConfig)
Details about the location of the input to the batch inference job.
This is a convenience method that creates an instance of the ModelInvocationJobInputDataConfig.Builder, avoiding the need to create one manually via ModelInvocationJobInputDataConfig.builder(). When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to inputDataConfig(ModelInvocationJobInputDataConfig).
- Parameters:
inputDataConfig - a consumer that will call methods on ModelInvocationJobInputDataConfig.Builder
- Returns:
- Returns a reference to this object so that method calls can be chained together.
- See Also:
inputDataConfig(ModelInvocationJobInputDataConfig)
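The Consumer overload is a thin default method over the non-consumer variant: it creates the nested builder, lets the caller's lambda mutate it, builds immediately, and delegates. A minimal self-contained sketch of how such an overload is typically wired (InputConfig and JobBuilder are illustrative stand-ins, not the SDK types):

```java
import java.util.function.Consumer;

// Stand-in for a nested config type like ModelInvocationJobInputDataConfig.
final class InputConfig {
    private final String s3Uri;
    private InputConfig(Builder b) { this.s3Uri = b.s3Uri; }
    String s3Uri() { return s3Uri; }
    static Builder builder() { return new Builder(); }

    static final class Builder {
        private String s3Uri;
        Builder s3Uri(String s3Uri) { this.s3Uri = s3Uri; return this; }
        InputConfig build() { return new InputConfig(this); }
    }
}

interface JobBuilder {
    JobBuilder inputDataConfig(InputConfig config);

    // Convenience overload: hand the caller a fresh nested builder, let the
    // Consumer mutate it, call build() immediately, and delegate the result
    // to the plain setter.
    default JobBuilder inputDataConfig(Consumer<InputConfig.Builder> consumer) {
        InputConfig.Builder b = InputConfig.builder();
        consumer.accept(b);
        return inputDataConfig(b.build());
    }
}
```

A caller can then write builder.inputDataConfig(c -> c.s3Uri("s3://my-bucket/input/")) instead of constructing and building the nested object by hand; the outputDataConfig and vpcConfig consumer overloads follow the same shape.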
-
outputDataConfig
ModelInvocationJobSummary.Builder outputDataConfig(ModelInvocationJobOutputDataConfig outputDataConfig)
Details about the location of the output of the batch inference job.
- Parameters:
outputDataConfig - Details about the location of the output of the batch inference job.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
outputDataConfig
default ModelInvocationJobSummary.Builder outputDataConfig(Consumer<ModelInvocationJobOutputDataConfig.Builder> outputDataConfig)
Details about the location of the output of the batch inference job.
This is a convenience method that creates an instance of the ModelInvocationJobOutputDataConfig.Builder, avoiding the need to create one manually via ModelInvocationJobOutputDataConfig.builder(). When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to outputDataConfig(ModelInvocationJobOutputDataConfig).
- Parameters:
outputDataConfig - a consumer that will call methods on ModelInvocationJobOutputDataConfig.Builder
- Returns:
- Returns a reference to this object so that method calls can be chained together.
- See Also:
outputDataConfig(ModelInvocationJobOutputDataConfig)
-
vpcConfig
ModelInvocationJobSummary.Builder vpcConfig(VpcConfig vpcConfig)
The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.
- Parameters:
vpcConfig - The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
vpcConfig
default ModelInvocationJobSummary.Builder vpcConfig(Consumer<VpcConfig.Builder> vpcConfig)
The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.
This is a convenience method that creates an instance of the VpcConfig.Builder, avoiding the need to create one manually via VpcConfig.builder(). When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to vpcConfig(VpcConfig).
- Parameters:
vpcConfig - a consumer that will call methods on VpcConfig.Builder
- Returns:
- Returns a reference to this object so that method calls can be chained together.
- See Also:
vpcConfig(VpcConfig)
-
timeoutDurationInHours
ModelInvocationJobSummary.Builder timeoutDurationInHours(Integer timeoutDurationInHours)
The number of hours after which the batch inference job was set to time out.
- Parameters:
timeoutDurationInHours - The number of hours after which the batch inference job was set to time out.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
jobExpirationTime
ModelInvocationJobSummary.Builder jobExpirationTime(Instant jobExpirationTime)
The time at which the batch inference job times out or timed out.
- Parameters:
jobExpirationTime - The time at which the batch inference job times out or timed out.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-