Uses of Interface
software.amazon.awssdk.services.bedrock.model.TextInferenceConfig.Builder
Packages that use TextInferenceConfig.Builder

  Package: software.amazon.awssdk.services.bedrock.model
Uses of TextInferenceConfig.Builder in software.amazon.awssdk.services.bedrock.model
Methods in software.amazon.awssdk.services.bedrock.model that return TextInferenceConfig.Builder

  static TextInferenceConfig.Builder  TextInferenceConfig.builder()

  TextInferenceConfig.Builder  TextInferenceConfig.Builder.maxTokens(Integer maxTokens)
      The maximum number of tokens to generate in the output text.

  TextInferenceConfig.Builder  TextInferenceConfig.Builder.stopSequences(String... stopSequences)
      A list of sequences of characters that, if generated, will cause the model to stop generating further tokens.

  TextInferenceConfig.Builder  TextInferenceConfig.Builder.stopSequences(Collection<String> stopSequences)
      A list of sequences of characters that, if generated, will cause the model to stop generating further tokens.

  TextInferenceConfig.Builder  TextInferenceConfig.Builder.temperature(Float temperature)
      Controls the randomness of text generated by the language model, influencing how much the model sticks to the most predictable next words versus exploring more surprising options.

  TextInferenceConfig.Builder  TextInferenceConfig.toBuilder()

  TextInferenceConfig.Builder  TextInferenceConfig.Builder.topP(Float topP)
      A probability distribution threshold which controls what the model considers for the set of possible next tokens.

Methods in software.amazon.awssdk.services.bedrock.model that return types with arguments of type TextInferenceConfig.Builder

  static Class<? extends TextInferenceConfig.Builder>  TextInferenceConfig.serializableBuilderClass()

Method parameters in software.amazon.awssdk.services.bedrock.model with type arguments of type TextInferenceConfig.Builder

  default KbInferenceConfig.Builder  KbInferenceConfig.Builder.textInferenceConfig(Consumer<TextInferenceConfig.Builder> textInferenceConfig)
      Contains configuration details for text generation using a language model via the RetrieveAndGenerate function.
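The methods listed above compose in the SDK's usual builder style. A minimal sketch (assumes the AWS SDK for Java v2 Bedrock module is on the classpath; the class name and all parameter values are illustrative, not part of this reference):

```java
import software.amazon.awssdk.services.bedrock.model.KbInferenceConfig;
import software.amazon.awssdk.services.bedrock.model.TextInferenceConfig;

public class TextInferenceConfigExample {
    public static void main(String[] args) {
        // Obtain a builder from the static builder() method and set fields fluently.
        TextInferenceConfig textConfig = TextInferenceConfig.builder()
                .maxTokens(512)                // cap on generated output tokens
                .temperature(0.7f)             // higher values increase randomness
                .topP(0.9f)                    // probability threshold for candidate tokens
                .stopSequences("END", "STOP")  // varargs overload; a Collection<String> overload also exists
                .build();

        // toBuilder() copies an existing config so individual fields can be changed.
        TextInferenceConfig cooler = textConfig.toBuilder()
                .temperature(0.2f)
                .build();

        // The Consumer<TextInferenceConfig.Builder> parameter lets a parent builder
        // configure the nested object without an explicit builder() call.
        KbInferenceConfig kbConfig = KbInferenceConfig.builder()
                .textInferenceConfig(t -> t.maxTokens(512).temperature(0.7f))
                .build();

        System.out.println(textConfig);
        System.out.println(cooler);
        System.out.println(kbConfig);
    }
}
```

The `Consumer`-accepting overload is the SDK's convention for nested configuration; it creates the inner builder, applies the lambda, and calls `build()` internally.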