Class PutOpenaiRequest.Builder

java.lang.Object
  co.elastic.clients.util.ObjectBuilderBase
    co.elastic.clients.util.WithJsonObjectBuilderBase<BuilderT>
      co.elastic.clients.elasticsearch._types.RequestBase.AbstractBuilder<PutOpenaiRequest.Builder>
        co.elastic.clients.elasticsearch.inference.PutOpenaiRequest.Builder
- All Implemented Interfaces:
  WithJson<PutOpenaiRequest.Builder>, ObjectBuilder<PutOpenaiRequest>
- Enclosing class:
  PutOpenaiRequest
public static class PutOpenaiRequest.Builder
extends RequestBase.AbstractBuilder<PutOpenaiRequest.Builder>
implements ObjectBuilder<PutOpenaiRequest>
Builder for PutOpenaiRequest.

Constructor Summary

Constructors
Builder()
Method Summary

PutOpenaiRequest build()
  Builds a PutOpenaiRequest.
final PutOpenaiRequest.Builder chunkingSettings(InferenceChunkingSettings value)
  The chunking configuration object.
final PutOpenaiRequest.Builder chunkingSettings(Function<InferenceChunkingSettings.Builder, ObjectBuilder<InferenceChunkingSettings>> fn)
  The chunking configuration object.
final PutOpenaiRequest.Builder openaiInferenceId(String value)
  Required - The unique identifier of the inference endpoint.
protected PutOpenaiRequest.Builder self()
final PutOpenaiRequest.Builder service(OpenAIServiceType value)
  Required - The type of service supported for the specified task type.
final PutOpenaiRequest.Builder serviceSettings(OpenAIServiceSettings value)
  Required - Settings used to install the inference model.
final PutOpenaiRequest.Builder serviceSettings(Function<OpenAIServiceSettings.Builder, ObjectBuilder<OpenAIServiceSettings>> fn)
  Required - Settings used to install the inference model.
final PutOpenaiRequest.Builder taskSettings(OpenAITaskSettings value)
  Settings to configure the inference task.
final PutOpenaiRequest.Builder taskSettings(Function<OpenAITaskSettings.Builder, ObjectBuilder<OpenAITaskSettings>> fn)
  Settings to configure the inference task.
final PutOpenaiRequest.Builder taskType(OpenAITaskType value)
  Required - The type of the inference task that the model will perform.

Methods inherited from class co.elastic.clients.util.WithJsonObjectBuilderBase
withJson

Methods inherited from class co.elastic.clients.util.ObjectBuilderBase
_checkSingleUse, _listAdd, _listAddAll, _mapPut, _mapPutAll
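The methods above follow the standard fluent-builder pattern: each setter returns the builder, and build() produces the immutable request. Below is a minimal, self-contained sketch of that pattern using hypothetical stand-in types (PutRequest, the simplified Builder, and String-valued settings are illustrations, NOT the real client classes; the real builder takes typed settings objects as shown in the summary above):

```java
import java.util.Objects;

public class BuilderSketch {
    // Hypothetical stand-in for the built request object.
    record PutRequest(String inferenceId, String service, String serviceSettings) {}

    // Simplified stand-in mirroring the builder's fluent setters.
    static class Builder {
        private String inferenceId;
        private String service;
        private String serviceSettings;

        Builder openaiInferenceId(String v) { this.inferenceId = v; return this; }
        Builder service(String v) { this.service = v; return this; }
        Builder serviceSettings(String v) { this.serviceSettings = v; return this; }

        // build() rejects missing required fields, as the real builder's
        // NullPointerException contract documents.
        PutRequest build() {
            Objects.requireNonNull(inferenceId, "openai_inference_id is required");
            Objects.requireNonNull(service, "service is required");
            Objects.requireNonNull(serviceSettings, "service_settings is required");
            return new PutRequest(inferenceId, service, serviceSettings);
        }
    }

    public static void main(String[] args) {
        // Fluent usage: chain setters, then build.
        PutRequest req = new BuilderSketch.Builder()
            .openaiInferenceId("my-openai-endpoint")
            .service("openai")
            .serviceSettings("{\"model_id\":\"text-embedding-3-small\"}")
            .build();
        System.out.println(req.inferenceId());
    }
}
```

Because every setter returns `this`, required and optional fields can be set in any order; validation is deferred to build().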
Constructor Details

Builder
public Builder()
Method Details

chunkingSettings
public final PutOpenaiRequest.Builder chunkingSettings(InferenceChunkingSettings value)
The chunking configuration object.
API name: chunking_settings

chunkingSettings
public final PutOpenaiRequest.Builder chunkingSettings(Function<InferenceChunkingSettings.Builder, ObjectBuilder<InferenceChunkingSettings>> fn)
The chunking configuration object.
API name: chunking_settings
openaiInferenceId
public final PutOpenaiRequest.Builder openaiInferenceId(String value)
Required - The unique identifier of the inference endpoint.
API name: openai_inference_id

service
public final PutOpenaiRequest.Builder service(OpenAIServiceType value)
Required - The type of service supported for the specified task type. In this case, openai.
API name: service
serviceSettings
public final PutOpenaiRequest.Builder serviceSettings(OpenAIServiceSettings value)
Required - Settings used to install the inference model. These settings are specific to the openai service.
API name: service_settings

serviceSettings
public final PutOpenaiRequest.Builder serviceSettings(Function<OpenAIServiceSettings.Builder, ObjectBuilder<OpenAIServiceSettings>> fn)
Required - Settings used to install the inference model. These settings are specific to the openai service.
API name: service_settings
taskSettings
public final PutOpenaiRequest.Builder taskSettings(OpenAITaskSettings value)
Settings to configure the inference task. These settings are specific to the task type you specified.
API name: task_settings

taskSettings
public final PutOpenaiRequest.Builder taskSettings(Function<OpenAITaskSettings.Builder, ObjectBuilder<OpenAITaskSettings>> fn)
Settings to configure the inference task. These settings are specific to the task type you specified.
API name: task_settings
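Each settings property has two overloads: one taking a built value and one taking a Function that receives a fresh sub-builder. The sketch below illustrates how such a lambda overload typically delegates to the value overload; TaskSettings, TaskSettingsBuilder, RequestBuilder, and the `user` property are hypothetical stand-ins, not the real client classes (the real overload's lambda returns an ObjectBuilder rather than the sub-builder itself):

```java
import java.util.function.Function;

public class LambdaSetterSketch {
    // Hypothetical immutable sub-object and its builder.
    record TaskSettings(String user) {}

    static class TaskSettingsBuilder {
        private String user;
        TaskSettingsBuilder user(String v) { this.user = v; return this; }
        TaskSettings build() { return new TaskSettings(user); }
    }

    static class RequestBuilder {
        TaskSettings taskSettings;

        // Direct-value overload.
        RequestBuilder taskSettings(TaskSettings v) { this.taskSettings = v; return this; }

        // Lambda overload: hands the caller a fresh sub-builder, builds the
        // configured result, and delegates to the direct-value overload.
        RequestBuilder taskSettings(Function<TaskSettingsBuilder, TaskSettingsBuilder> fn) {
            return taskSettings(fn.apply(new TaskSettingsBuilder()).build());
        }
    }

    public static void main(String[] args) {
        // The lambda form avoids naming the sub-builder type at the call site.
        RequestBuilder b = new RequestBuilder().taskSettings(t -> t.user("my-user"));
        System.out.println(b.taskSettings.user());
    }
}
```

The lambda form keeps nested configuration inline and readable, which is why the client offers it alongside every direct-value setter.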
taskType
public final PutOpenaiRequest.Builder taskType(OpenAITaskType value)
Required - The type of the inference task that the model will perform. NOTE: The chat_completion task type only supports streaming and only through the _stream API.
API name: task_type
self
protected PutOpenaiRequest.Builder self()
- Specified by:
  self in class RequestBase.AbstractBuilder<PutOpenaiRequest.Builder>

build
public PutOpenaiRequest build()
Builds a PutOpenaiRequest.
- Specified by:
  build in interface ObjectBuilder<PutOpenaiRequest>
- Throws:
  NullPointerException - if some of the required fields are null.