Interface ModelInvocationJobSummary.Builder

    • Method Detail

      • jobArn

        ModelInvocationJobSummary.Builder jobArn​(String jobArn)

        The Amazon Resource Name (ARN) of the batch inference job.

        Parameters:
        jobArn - The Amazon Resource Name (ARN) of the batch inference job.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
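As a hedged illustration of working with the value set here, a job ARN can be split into its components with plain string handling. The `model-invocation-job` resource format shown below is an assumption based on the general Bedrock ARN pattern, not something this reference guarantees:

```java
public class JobArnParser {
    // Extracts the resource ID from a batch inference job ARN.
    // Assumed format (hypothetical example, not guaranteed by this reference):
    //   arn:aws:bedrock:us-east-1:123456789012:model-invocation-job/abc123
    public static String jobIdFromArn(String jobArn) {
        int slash = jobArn.lastIndexOf('/');
        if (slash < 0 || slash == jobArn.length() - 1) {
            throw new IllegalArgumentException("Unexpected ARN format: " + jobArn);
        }
        return jobArn.substring(slash + 1);
    }

    public static void main(String[] args) {
        String arn = "arn:aws:bedrock:us-east-1:123456789012:model-invocation-job/abc123";
        System.out.println(jobIdFromArn(arn)); // abc123
    }
}
```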
      • jobName

        ModelInvocationJobSummary.Builder jobName​(String jobName)

        The name of the batch inference job.

        Parameters:
        jobName - The name of the batch inference job.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • modelId

        ModelInvocationJobSummary.Builder modelId​(String modelId)

        The unique identifier of the foundation model used for model inference.

        Parameters:
        modelId - The unique identifier of the foundation model used for model inference.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • clientRequestToken

        ModelInvocationJobSummary.Builder clientRequestToken​(String clientRequestToken)

        A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.

        Parameters:
        clientRequestToken - A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
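A common way to satisfy the uniqueness requirement is a random UUID. This sketch assumes the token is generated once per logical request and reused on retries, so that Amazon Bedrock can recognize the retry as a duplicate:

```java
import java.util.UUID;

public class TokenExample {
    // Generate one token per logical request; reuse the same token when
    // retrying, so Amazon Bedrock can detect and ignore the duplicate.
    public static String newClientRequestToken() {
        return UUID.randomUUID().toString();
    }

    public static void main(String[] args) {
        String token = newClientRequestToken();
        System.out.println(token.length()); // 36 (canonical UUID string length)
    }
}
```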
      • roleArn

        ModelInvocationJobSummary.Builder roleArn​(String roleArn)

        The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.

        Parameters:
        roleArn - The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • status

        ModelInvocationJobSummary.Builder status​(String status)

        The status of the batch inference job.

        The following statuses are possible:

        • Submitted – This job has been submitted to a queue for validation.

        • Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:

          • Your IAM service role has access to the Amazon S3 buckets containing your files.

          • Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.

          • Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.

        • Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.

        • Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.

        • InProgress – This job has begun. You can start viewing the results in the output S3 location.

        • Completed – This job has successfully completed. View the output files in the output S3 location.

        • PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.

        • Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.

        • Stopped – This job was stopped by a user.

        • Stopping – This job is being stopped by a user.

        Parameters:
        status - The status of the batch inference job, as described above.

        Returns:
        Returns a reference to this object so that method calls can be chained together.
        See Also:
        ModelInvocationJobStatus
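Based on the statuses listed above, a caller polling job summaries might treat the following statuses as terminal (no further state changes expected). This helper is a sketch derived from the documented list, not part of the SDK:

```java
import java.util.Set;

public class StatusCheck {
    // Statuses after which the job will not change state again,
    // per the status descriptions documented above. Expired is terminal
    // because the documentation says to submit a new job request.
    private static final Set<String> TERMINAL =
            Set.of("Completed", "PartiallyCompleted", "Failed", "Stopped", "Expired");

    public static boolean isTerminal(String status) {
        return TERMINAL.contains(status);
    }

    public static void main(String[] args) {
        System.out.println(isTerminal("InProgress")); // false
        System.out.println(isTerminal("Completed"));  // true
    }
}
```

A polling loop would keep sleeping and re-fetching the summary while `isTerminal` returns false, then inspect `message()` if the final status is `Failed`.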
      • status

        ModelInvocationJobSummary.Builder status​(ModelInvocationJobStatus status)

        The status of the batch inference job.

        The following statuses are possible:

        • Submitted – This job has been submitted to a queue for validation.

        • Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:

          • Your IAM service role has access to the Amazon S3 buckets containing your files.

          • Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.

          • Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.

        • Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.

        • Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.

        • InProgress – This job has begun. You can start viewing the results in the output S3 location.

        • Completed – This job has successfully completed. View the output files in the output S3 location.

        • PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.

        • Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.

        • Stopped – This job was stopped by a user.

        • Stopping – This job is being stopped by a user.

        Parameters:
        status - The status of the batch inference job, as described above.

        Returns:
        Returns a reference to this object so that method calls can be chained together.
        See Also:
        ModelInvocationJobStatus
      • message

        ModelInvocationJobSummary.Builder message​(String message)

        If the batch inference job failed, this field contains a message describing why the job failed.

        Parameters:
        message - If the batch inference job failed, this field contains a message describing why the job failed.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • submitTime

        ModelInvocationJobSummary.Builder submitTime​(Instant submitTime)

        The time at which the batch inference job was submitted.

        Parameters:
        submitTime - The time at which the batch inference job was submitted.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • lastModifiedTime

        ModelInvocationJobSummary.Builder lastModifiedTime​(Instant lastModifiedTime)

        The time at which the batch inference job was last modified.

        Parameters:
        lastModifiedTime - The time at which the batch inference job was last modified.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • endTime

        ModelInvocationJobSummary.Builder endTime​(Instant endTime)

        The time at which the batch inference job ended.

        Parameters:
        endTime - The time at which the batch inference job ended.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • inputDataConfig

        ModelInvocationJobSummary.Builder inputDataConfig​(ModelInvocationJobInputDataConfig inputDataConfig)

        Details about the location of the input to the batch inference job.

        Parameters:
        inputDataConfig - Details about the location of the input to the batch inference job.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • outputDataConfig

        ModelInvocationJobSummary.Builder outputDataConfig​(ModelInvocationJobOutputDataConfig outputDataConfig)

        Details about the location of the output of the batch inference job.

        Parameters:
        outputDataConfig - Details about the location of the output of the batch inference job.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • timeoutDurationInHours

        ModelInvocationJobSummary.Builder timeoutDurationInHours​(Integer timeoutDurationInHours)

        The number of hours after which the batch inference job was set to time out.

        Parameters:
        timeoutDurationInHours - The number of hours after which the batch inference job was set to time out.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
      • jobExpirationTime

        ModelInvocationJobSummary.Builder jobExpirationTime​(Instant jobExpirationTime)

        The time at which the batch inference job is set to time out, or the time at which it timed out.

        Parameters:
        jobExpirationTime - The time at which the batch inference job is set to time out, or the time at which it timed out.
        Returns:
        Returns a reference to this object so that method calls can be chained together.
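The relationship between submitTime, timeoutDurationInHours, and jobExpirationTime can be sketched with java.time. That the expiration time equals the submission time plus the timeout duration is an assumption for illustration; this reference does not state how Bedrock computes it:

```java
import java.time.Duration;
import java.time.Instant;

public class ExpirationSketch {
    // Hypothetical: expiration time = submission time + timeout duration.
    public static Instant expirationTime(Instant submitTime, int timeoutDurationInHours) {
        return submitTime.plus(Duration.ofHours(timeoutDurationInHours));
    }

    public static void main(String[] args) {
        Instant submitted = Instant.parse("2024-01-01T00:00:00Z");
        Instant expires = expirationTime(submitted, 24);
        System.out.println(expires); // 2024-01-02T00:00:00Z
    }
}
```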