| Package | Description |
|---|---|
| com.volcengine.model.maas.api | |
| Modifier and Type | Method and Description |
|---|---|
| Api.Parameters.Builder | Api.Parameters.Builder.addAllStop(Iterable<String> values)<br>repeated string stop = 13; |
| Api.Parameters.Builder | Api.Parameters.Builder.addRepeatedField(com.google.protobuf.Descriptors.FieldDescriptor field, Object value) |
| Api.Parameters.Builder | Api.Parameters.Builder.addStop(String value)<br>repeated string stop = 13; |
| Api.Parameters.Builder | Api.Parameters.Builder.addStopBytes(com.google.protobuf.ByteString value)<br>repeated string stop = 13; |
| Api.Parameters.Builder | Api.Parameters.Builder.clear() |
| Api.Parameters.Builder | Api.Parameters.Builder.clearDoSample()<br>Whether or not to use sampling; uses greedy decoding otherwise. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearField(com.google.protobuf.Descriptors.FieldDescriptor field) |
| Api.Parameters.Builder | Api.Parameters.Builder.clearFrequencyPenalty()<br>Number between -2.0 and 2.0. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearLogprobs()<br>optional int64 logprobs = 12; |
| Api.Parameters.Builder | Api.Parameters.Builder.clearMaxNewTokens()<br>The maximum number of tokens to generate, ignoring the number of tokens in the prompt. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearMaxPromptTokens()<br>The maximum number of prompt tokens; if the prompt exceeds this limit, it is truncated to prompt[-max_prompt_tokens:]. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearMaxTokens()<br>The maximum number of tokens to generate in the chat completion. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearMinNewTokens()<br>The minimum number of tokens to generate. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearOneof(com.google.protobuf.Descriptors.OneofDescriptor oneof) |
| Api.Parameters.Builder | Api.Parameters.Builder.clearPresencePenalty()<br>Number between -2.0 and 2.0. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearRepetitionPenalty()<br>The parameter for repetition penalty, in [1.0, 2.0]. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearStop()<br>repeated string stop = 13; |
| Api.Parameters.Builder | Api.Parameters.Builder.clearTemperature()<br>Exponentially scales the output probability distribution. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearTopK()<br>The number of highest-probability vocabulary tokens to keep for top-k filtering. |
| Api.Parameters.Builder | Api.Parameters.Builder.clearTopP()<br>An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. |
| Api.Parameters.Builder | Api.Parameters.Builder.clone() |
| Api.Parameters.Builder | Api.ChatReq.Builder.getParametersBuilder()<br>API-specific parameters. |
| Api.Parameters.Builder | Api.Parameters.Builder.mergeFrom(Api.Parameters other) |
| Api.Parameters.Builder | Api.Parameters.Builder.mergeFrom(com.google.protobuf.CodedInputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry) |
| Api.Parameters.Builder | Api.Parameters.Builder.mergeFrom(com.google.protobuf.Message other) |
| Api.Parameters.Builder | Api.Parameters.Builder.mergeUnknownFields(com.google.protobuf.UnknownFieldSet unknownFields) |
| static Api.Parameters.Builder | Api.Parameters.newBuilder() |
| static Api.Parameters.Builder | Api.Parameters.newBuilder(Api.Parameters prototype) |
| Api.Parameters.Builder | Api.Parameters.newBuilderForType() |
| protected Api.Parameters.Builder | Api.Parameters.newBuilderForType(com.google.protobuf.GeneratedMessageV3.BuilderParent parent) |
| Api.Parameters.Builder | Api.Parameters.Builder.setDoSample(boolean value)<br>Whether or not to use sampling; uses greedy decoding otherwise. |
| Api.Parameters.Builder | Api.Parameters.Builder.setField(com.google.protobuf.Descriptors.FieldDescriptor field, Object value) |
| Api.Parameters.Builder | Api.Parameters.Builder.setFrequencyPenalty(float value)<br>Number between -2.0 and 2.0. |
| Api.Parameters.Builder | Api.Parameters.Builder.setLogprobs(long value)<br>optional int64 logprobs = 12; |
| Api.Parameters.Builder | Api.Parameters.Builder.setMaxNewTokens(long value)<br>The maximum number of tokens to generate, ignoring the number of tokens in the prompt. |
| Api.Parameters.Builder | Api.Parameters.Builder.setMaxPromptTokens(long value)<br>The maximum number of prompt tokens; if the prompt exceeds this limit, it is truncated to prompt[-max_prompt_tokens:]. |
| Api.Parameters.Builder | Api.Parameters.Builder.setMaxTokens(long value)<br>The maximum number of tokens to generate in the chat completion. |
| Api.Parameters.Builder | Api.Parameters.Builder.setMinNewTokens(long value)<br>The minimum number of tokens to generate. |
| Api.Parameters.Builder | Api.Parameters.Builder.setPresencePenalty(float value)<br>Number between -2.0 and 2.0. |
| Api.Parameters.Builder | Api.Parameters.Builder.setRepeatedField(com.google.protobuf.Descriptors.FieldDescriptor field, int index, Object value) |
| Api.Parameters.Builder | Api.Parameters.Builder.setRepetitionPenalty(float value)<br>The parameter for repetition penalty, in [1.0, 2.0]. |
| Api.Parameters.Builder | Api.Parameters.Builder.setStop(int index, String value)<br>repeated string stop = 13; |
| Api.Parameters.Builder | Api.Parameters.Builder.setTemperature(float value)<br>Exponentially scales the output probability distribution. |
| Api.Parameters.Builder | Api.Parameters.Builder.setTopK(long value)<br>The number of highest-probability vocabulary tokens to keep for top-k filtering. |
| Api.Parameters.Builder | Api.Parameters.Builder.setTopP(float value)<br>An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. |
| Api.Parameters.Builder | Api.Parameters.Builder.setUnknownFields(com.google.protobuf.UnknownFieldSet unknownFields) |
| Api.Parameters.Builder | Api.Parameters.toBuilder() |
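The methods above follow the standard protobuf generated-builder pattern: each setter returns the builder, so parameters are assembled by chaining calls. A minimal sketch, assuming the Volcengine MaaS Java SDK jar is on the classpath; the field values are purely illustrative, not recommendations:

```java
import com.volcengine.model.maas.api.Api;

public class ParametersExample {
    public static void main(String[] args) {
        // Assemble sampling parameters via the generated protobuf builder.
        Api.Parameters params = Api.Parameters.newBuilder()
                .setDoSample(true)       // use sampling rather than greedy decoding
                .setTemperature(0.8f)    // exponentially scales the output distribution
                .setTopP(0.9f)           // nucleus sampling over top_p probability mass
                .setMaxNewTokens(512L)   // cap on generated tokens, prompt excluded
                .addStop("\n\n")         // repeated string stop = 13
                .build();
        System.out.println(params.getTemperature());
    }
}
```

Fields left unset keep their proto defaults; a clearXxx() call (e.g. clearTemperature()) resets a field back to that default on an existing builder.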
| Modifier and Type | Method and Description |
|---|---|
| Api.ChatReq.Builder | Api.ChatReq.Builder.setParameters(Api.Parameters.Builder builderForValue)<br>API-specific parameters. |
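setParameters accepts the Parameters builder directly, so no explicit build() call is needed for the nested message. A sketch under the same classpath assumption as above; other ChatReq fields (model, messages, and so on) are omitted here:

```java
import com.volcengine.model.maas.api.Api;

public class ChatReqExample {
    public static void main(String[] args) {
        // Attach decoding parameters to a chat request via the nested builder.
        Api.ChatReq req = Api.ChatReq.newBuilder()
                .setParameters(Api.Parameters.newBuilder()
                        .setTemperature(0.7f)
                        .setMaxTokens(256L))
                .build();
        // Alternatively, getParametersBuilder() exposes the nested builder
        // for in-place edits on an existing request builder:
        // reqBuilder.getParametersBuilder().setTopK(40L);
        System.out.println(req.getParameters().getMaxTokens());
    }
}
```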
Copyright © 2025. All rights reserved.