
๐Ÿฆœ๏ธ๐Ÿ”— LangChain Provider

Below is an example of how to instantiate a LangChain LLM as a feedback provider.

All feedback functions listed in the base LLMProvider class can be run with the LangChain Provider.

Note

The LangChain provider cannot be used in deferred mode because LangChain apps do not serialize consistently.

trulens_eval.feedback.provider.langchain.Langchain

Bases: LLMProvider

Out-of-the-box feedback functions using LangChain LLMs and ChatModels.

Create a LangChain Provider with out of the box feedback functions.

Example

from trulens_eval.feedback.provider.langchain import Langchain
from langchain_community.llms import OpenAI

gpt3_llm = OpenAI(model="gpt-3.5-turbo-instruct")
langchain_provider = Langchain(chain=gpt3_llm)
PARAMETER DESCRIPTION

chain

The LangChain LLM or chat model used to run the feedback functions.

TYPE: Union[BaseLLM, BaseChatModel]
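
Once instantiated, the provider's feedback functions can be called directly or wrapped in a Feedback object for evaluation. The sketch below is a minimal example, assuming the relevance feedback function from the base LLMProvider class and a ChatOpenAI chat model (either a BaseLLM or a BaseChatModel is accepted); adapt the model and feedback function to your setup.

from trulens_eval import Feedback
from trulens_eval.feedback.provider.langchain import Langchain
from langchain_community.chat_models import ChatOpenAI

# A chat model also satisfies Union[BaseLLM, BaseChatModel].
chat_llm = ChatOpenAI(model="gpt-3.5-turbo")
langchain_provider = Langchain(chain=chat_llm)

# Call a base LLMProvider feedback function directly.
score = langchain_provider.relevance(
    prompt="Where is Germany?",
    response="Germany is a country in central Europe."
)

# Or wrap it in a Feedback object to evaluate an app's input and output.
f_relevance = Feedback(langchain_provider.relevance).on_input_output()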