public class AzureOpenAiStreamingChatModel extends Object implements dev.langchain4j.model.chat.StreamingChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
The model's response is streamed token by token and should be handled with a StreamingResponseHandler.
Mandatory parameters for initialization are: baseUrl, apiVersion, and apiKey.
There are two primary authentication methods to access Azure OpenAI:
1. API Key Authentication: For this type of authentication, HTTP requests must include the API Key in the "api-key" HTTP header.
2. Azure Active Directory Authentication: For this type of authentication, HTTP requests must include the authentication/access token in the "Authorization" HTTP header.
Please note that currently only API Key authentication is supported by this class; the second authentication option will be supported later.
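A minimal construction sketch using the builder is shown below. The endpoint, API version, and key are placeholder values, and the builder setter names are an assumption based on the constructor parameters listed in this page (the class package is assumed to be dev.langchain4j.model.azure); optional parameters such as temperature and timeout can be omitted.

```java
import dev.langchain4j.model.azure.AzureOpenAiStreamingChatModel; // assumed package

import java.time.Duration;

public class BuildModelExample {

    public static void main(String[] args) {
        // Placeholder values; replace with your Azure OpenAI resource details.
        // Builder setters are assumed to mirror the constructor parameters shown above.
        AzureOpenAiStreamingChatModel model = AzureOpenAiStreamingChatModel.builder()
                .baseUrl("https://your-resource.openai.azure.com/openai/deployments/your-deployment")
                .apiVersion("2023-05-15")
                .apiKey(System.getenv("AZURE_OPENAI_KEY"))
                .temperature(0.7)
                .timeout(Duration.ofSeconds(60))
                .logRequests(true)
                .logResponses(true)
                .build();
    }
}
```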
| Modifier and Type | Class and Description |
|---|---|
| static class | AzureOpenAiStreamingChatModel.Builder |
| Constructor and Description |
|---|
| AzureOpenAiStreamingChatModel(String baseUrl, String apiVersion, String apiKey, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Proxy proxy, Boolean logRequests, Boolean logResponses) |
| Modifier and Type | Method and Description |
|---|---|
| static AzureOpenAiStreamingChatModel.Builder | builder() |
| int | estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages) |
| void | generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler) |
| void | generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler) |
| void | generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler) |
public AzureOpenAiStreamingChatModel(String baseUrl, String apiVersion, String apiKey, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Proxy proxy, Boolean logRequests, Boolean logResponses)
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel

public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel

public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel

public int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)
estimateTokenCount in interface dev.langchain4j.model.chat.TokenCountEstimator

public static AzureOpenAiStreamingChatModel.Builder builder()
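A hedged usage sketch of the streaming generate method follows. The model variable is assumed to have been built as in the earlier sketch, and the StreamingResponseHandler callback set shown here (onNext, onComplete, onError) is an assumption that may vary between library versions; check the StreamingResponseHandler interface of your installed version.

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.output.Response;

import java.util.List;

public class StreamingGenerateExample {

    static void streamJoke(dev.langchain4j.model.azure.AzureOpenAiStreamingChatModel model) {
        List<ChatMessage> messages = List.of(UserMessage.from("Tell me a joke about streaming."));

        // Each generated token is delivered through onNext; the exact callback
        // signatures below are assumptions and may differ by library version.
        model.generate(messages, new StreamingResponseHandler<AiMessage>() {

            @Override
            public void onNext(String token) {
                System.out.print(token);
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                System.out.println("\nComplete: " + response.content().text());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });

        // Token accounting before sending a request, via the TokenCountEstimator interface.
        int tokens = model.estimateTokenCount(messages);
        System.out.println("Estimated prompt tokens: " + tokens);
    }
}
```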