Interface PromptModelInferenceConfiguration.Builder
- All Superinterfaces:
- Buildable, CopyableBuilder<PromptModelInferenceConfiguration.Builder,PromptModelInferenceConfiguration>, SdkBuilder<PromptModelInferenceConfiguration.Builder,PromptModelInferenceConfiguration>, SdkPojo
- Enclosing class:
- PromptModelInferenceConfiguration
public static interface PromptModelInferenceConfiguration.Builder extends SdkPojo, CopyableBuilder<PromptModelInferenceConfiguration.Builder,PromptModelInferenceConfiguration>
Method Summary

Modifier and Type | Method | Description
PromptModelInferenceConfiguration.Builder | maxTokens(Integer maxTokens) | The maximum number of tokens to return in the response.
PromptModelInferenceConfiguration.Builder | stopSequences(String... stopSequences) | A list of strings that define sequences after which the model will stop generating.
PromptModelInferenceConfiguration.Builder | stopSequences(Collection<String> stopSequences) | A list of strings that define sequences after which the model will stop generating.
PromptModelInferenceConfiguration.Builder | temperature(Float temperature) | Controls the randomness of the response.
PromptModelInferenceConfiguration.Builder | topP(Float topP) | The percentage of most-likely candidates that the model considers for the next token.
-
Methods inherited from interface software.amazon.awssdk.utils.builder.CopyableBuilder
copy
-
Methods inherited from interface software.amazon.awssdk.utils.builder.SdkBuilder
applyMutation, build
-
Methods inherited from interface software.amazon.awssdk.core.SdkPojo
equalsBySdkFields, sdkFields
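The inherited SdkBuilder and CopyableBuilder methods combine with the setters documented below. A minimal sketch of how they fit together, assuming this class lives in software.amazon.awssdk.services.bedrockagent.model and follows the usual SDK POJO pattern of a static builder() factory and a toBuilder() copy method (both assumptions not stated on this page):

```java
import software.amazon.awssdk.services.bedrockagent.model.PromptModelInferenceConfiguration;

public class InheritedBuilderMethodsSketch {
    public static void main(String[] args) {
        // build() comes from the inherited SdkBuilder interface.
        PromptModelInferenceConfiguration base = PromptModelInferenceConfiguration.builder()
                .maxTokens(256)
                .temperature(0.7f)
                .build();

        // toBuilder() yields a builder pre-populated with base's values
        // (the CopyableBuilder side); applyMutation() applies a lambda
        // without breaking the fluent chain.
        PromptModelInferenceConfiguration tweaked = base.toBuilder()
                .applyMutation(b -> b.temperature(0.1f))
                .build();
    }
}
```

Because the built object is immutable, toBuilder() plus build() is the idiomatic way to derive a variant of an existing configuration.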
-
Method Detail
-
maxTokens
PromptModelInferenceConfiguration.Builder maxTokens(Integer maxTokens)
The maximum number of tokens to return in the response.
- Parameters:
maxTokens - The maximum number of tokens to return in the response.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
stopSequences
PromptModelInferenceConfiguration.Builder stopSequences(Collection<String> stopSequences)
A list of strings that define sequences after which the model will stop generating.
- Parameters:
stopSequences - A list of strings that define sequences after which the model will stop generating.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
stopSequences
PromptModelInferenceConfiguration.Builder stopSequences(String... stopSequences)
A list of strings that define sequences after which the model will stop generating.
- Parameters:
stopSequences - A list of strings that define sequences after which the model will stop generating.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
temperature
PromptModelInferenceConfiguration.Builder temperature(Float temperature)
Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
- Parameters:
temperature - Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
-
topP
PromptModelInferenceConfiguration.Builder topP(Float topP)
The percentage of most-likely candidates that the model considers for the next token.
- Parameters:
topP - The percentage of most-likely candidates that the model considers for the next token.
- Returns:
- Returns a reference to this object so that method calls can be chained together.
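Putting the methods above together, a minimal usage sketch under the same assumption about the package location; all parameter values here are illustrative, not recommendations:

```java
import software.amazon.awssdk.services.bedrockagent.model.PromptModelInferenceConfiguration;

public class BuilderUsageSketch {
    public static void main(String[] args) {
        PromptModelInferenceConfiguration config = PromptModelInferenceConfiguration.builder()
                .maxTokens(512)                        // cap the response length
                .temperature(0.2f)                     // lower = more predictable output
                .topP(0.9f)                            // nucleus-sampling cutoff
                .stopSequences("Human:", "Assistant:") // varargs overload
                .build();
    }
}
```

The Collection<String> overload of stopSequences accepts any collection (e.g. List.of("Human:")); as is usual for SDK builders, each call replaces the previously set list rather than appending to it.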