AzureOpenAI Provider

Below is how you can instantiate Azure OpenAI as a provider.

All feedback functions listed in the base LLMProvider class can be run with the AzureOpenAI provider.

Warning

Azure OpenAI does not support the OpenAI moderation endpoint.

trulens_eval.feedback.provider.openai.AzureOpenAI

Bases: OpenAI

Out-of-the-box feedback functions calling Azure OpenAI APIs. Has the same functionality as the OpenAI out-of-the-box feedback functions, excluding the moderation endpoint, which is not supported by Azure. Please export the following environment variables. These can be retrieved from https://oai.azure.com/.

  • AZURE_OPENAI_ENDPOINT
  • AZURE_OPENAI_API_KEY
  • OPENAI_API_VERSION

The deployment name used below can also be found on the same Azure OpenAI (oai.azure.com) page.
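
A minimal setup sketch for the environment variables listed above; the values shown are placeholders for your own Azure resource, not real settings:

import os

# Set the required Azure OpenAI environment variables before creating the provider.
# Replace the placeholder values with the endpoint, key, and API version from oai.azure.com.
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://<your-resource>.openai.azure.com/"
os.environ["AZURE_OPENAI_API_KEY"] = "<your-api-key>"
os.environ["OPENAI_API_VERSION"] = "<api-version>"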

Example
from trulens_eval.feedback.provider.openai import AzureOpenAI
openai_provider = AzureOpenAI(deployment_name="...")

openai_provider.relevance(
    prompt="Where is Germany?",
    response="Poland is in Europe."
) # low relevance
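
As a sketch of the point above that base LLMProvider feedback functions also run against this provider, the same instance can be reused; the function names here (sentiment, conciseness) are assumed from the base LLMProvider class, not specific to Azure:

# Other feedback functions defined on the base LLMProvider class work the same way.
openai_provider.sentiment(text="The weather is wonderful today!")
openai_provider.conciseness(text="Germany is a country in Central Europe.")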
Parameters

  • deployment_name (str): The name of the deployment.