public class AzureOpenAiChatModel extends Object implements dev.langchain4j.model.chat.ChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
Mandatory parameters for initialization are baseUrl, apiVersion, and apiKey.
There are two primary authentication methods to access Azure OpenAI:
1. API Key Authentication: For this type of authentication, HTTP requests must include the API Key in the "api-key" HTTP header.
2. Azure Active Directory Authentication: For this type of authentication, HTTP requests must include the authentication/access token in the "Authorization" HTTP header.
Please note that, currently, only API Key authentication is supported by this class; the second authentication option will be supported later.
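A minimal usage sketch, assuming the builder's setter names mirror the constructor parameters listed below (e.g. baseUrl, apiVersion, apiKey); the endpoint URL and API version shown are placeholders, not values confirmed by this page:

```java
import java.util.List;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.output.Response;

public class AzureOpenAiChatModelExample {

    public static void main(String[] args) {
        // Builder setter names are assumed to match the constructor
        // parameters; consult AzureOpenAiChatModel.Builder for the exact API.
        AzureOpenAiChatModel model = AzureOpenAiChatModel.builder()
                .baseUrl("https://your-resource.openai.azure.com/")   // placeholder endpoint
                .apiVersion("2023-05-15")                             // placeholder API version
                .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))        // sent in the "api-key" header
                .build();

        // generate(List<ChatMessage>) is part of the ChatLanguageModel interface.
        List<ChatMessage> messages = List.of(UserMessage.from("Hello!"));
        Response<AiMessage> response = model.generate(messages);
        System.out.println(response.content().text());
    }
}
```

Because only API Key authentication is supported, the key is the value placed in the "api-key" HTTP header on each request.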
| Modifier and Type | Class and Description |
|---|---|
| static class | AzureOpenAiChatModel.Builder |
| Constructor and Description |
|---|
| AzureOpenAiChatModel(String baseUrl, String apiVersion, String apiKey, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, Proxy proxy, Boolean logRequests, Boolean logResponses) |
| Modifier and Type | Method and Description |
|---|---|
| static AzureOpenAiChatModel.Builder | builder() |
| int | estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages) |
| dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> | generate(List<dev.langchain4j.data.message.ChatMessage> messages) |
| dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> | generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications) |
| dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> | generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification) |
public AzureOpenAiChatModel(String baseUrl, String apiVersion, String apiKey, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, Proxy proxy, Boolean logRequests, Boolean logResponses)
public dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages)

Specified by: generate in interface dev.langchain4j.model.chat.ChatLanguageModel

public dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications)

Specified by: generate in interface dev.langchain4j.model.chat.ChatLanguageModel

public dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification)

Specified by: generate in interface dev.langchain4j.model.chat.ChatLanguageModel

public int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)

Specified by: estimateTokenCount in interface dev.langchain4j.model.chat.TokenCountEstimator

public static AzureOpenAiChatModel.Builder builder()
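A sketch of calling the methods above together, using a model instance built elsewhere; the message content is illustrative only, and SystemMessage/UserMessage factory methods are assumed from the dev.langchain4j.data.message package named in the signatures:

```java
import java.util.List;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.output.Response;

public class GenerateExample {

    // 'model' is assumed to have been configured via AzureOpenAiChatModel.builder().
    static void chat(AzureOpenAiChatModel model) {
        List<ChatMessage> messages = List.of(
                SystemMessage.from("You are a helpful assistant."),
                UserMessage.from("What is the capital of Germany?"));

        // estimateTokenCount comes from the TokenCountEstimator interface
        // and can be used to check prompt size before sending a request.
        int promptTokens = model.estimateTokenCount(messages);
        System.out.println("Estimated prompt tokens: " + promptTokens);

        // generate comes from the ChatLanguageModel interface.
        Response<AiMessage> response = model.generate(messages);
        System.out.println(response.content().text());
    }
}
```

The two-argument generate overloads additionally accept one or more dev.langchain4j.agent.tool.ToolSpecification instances, letting the model request a tool invocation instead of (or alongside) a plain text reply.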
Copyright © 2023. All rights reserved.