LiteLLM Provider

Below is how to instantiate LiteLLM as a provider. LiteLLM supports 100+ models from OpenAI, Cohere, Anthropic, HuggingFace, Meta, and more. More information about the available models can be found in the LiteLLM documentation.

All feedback functions listed in the base LLMProvider class can be run with LiteLLM.
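Since the provider inherits the base LLMProvider feedback functions, any of them can be wired into a Feedback object. A minimal sketch, assuming a configured API key for the chosen model; the use of relevance here is just one example of an inherited feedback function:

```python
from trulens_eval import Feedback
from trulens_eval.feedback.provider.litellm import LiteLLM

provider = LiteLLM()

# Build a feedback function from the inherited LLMProvider.relevance,
# evaluated on each record's input and output.
f_relevance = Feedback(provider.relevance).on_input_output()
```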

trulens_eval.feedback.provider.litellm.LiteLLM

Bases: LLMProvider

Out-of-the-box feedback functions calling the LiteLLM API.

Create a LiteLLM provider with out-of-the-box feedback functions.

Example

```python
from trulens_eval.feedback.provider.litellm import LiteLLM
litellm_provider = LiteLLM()
```

Attributes

model_engine instance-attribute

model_engine: str

The LiteLLM completion model. Defaults to gpt-3.5-turbo.
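Any LiteLLM-supported model string can be passed at construction to override the default. A sketch, assuming credentials for the chosen provider are configured in the environment (the model name below is illustrative):

```python
from trulens_eval.feedback.provider.litellm import LiteLLM

# Illustrative: route feedback calls through an Anthropic model
# instead of the default gpt-3.5-turbo.
claude_provider = LiteLLM(model_engine="claude-2")
```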

completion_args class-attribute instance-attribute

completion_args: Dict[str, str] = Field(default_factory=dict)

Additional arguments to pass to litellm.completion as needed for the chosen API.
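Some backends need routing parameters beyond the model name; completion_args carries these through to litellm.completion. A sketch assuming an Azure OpenAI deployment; the deployment name and endpoint are placeholders, not real values:

```python
from trulens_eval.feedback.provider.litellm import LiteLLM

# Placeholder Azure deployment name and endpoint; substitute your own.
azure_provider = LiteLLM(
    model_engine="azure/my-deployment",
    completion_args={
        "api_base": "https://my-resource.openai.azure.com",
    },
)
```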