langchain.chains.history_aware_retriever.create_history_aware_retriever
- langchain.chains.history_aware_retriever.create_history_aware_retriever(llm: Runnable[Union[PromptValue, str, Sequence[Union[BaseMessage, Tuple[str, str], str, Dict[str, Any]]]], Union[BaseMessage, str]], retriever: Runnable[str, List[Document]], prompt: BasePromptTemplate) → Runnable[Any, List[Document]]
Create a chain that takes conversation history and returns documents.
If there is no chat_history, the input is passed directly to the retriever. If there is chat_history, the prompt and LLM are used to generate a search query, which is then passed to the retriever.
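A minimal sketch of the two input shapes, assuming a chat_retriever_chain already built with this function (as in the example below); the question and messages here are illustrative:

```python
from langchain_core.messages import AIMessage, HumanMessage

# Empty chat_history: "input" is forwarded to the retriever unchanged.
docs = chat_retriever_chain.invoke(
    {"input": "What is LangChain?", "chat_history": []}
)

# Non-empty chat_history: the prompt and LLM first rewrite the follow-up
# question into a standalone search query, which is then sent to the retriever.
docs = chat_retriever_chain.invoke(
    {
        "input": "How do I install it?",
        "chat_history": [
            HumanMessage(content="What is LangChain?"),
            AIMessage(content="A framework for building LLM applications."),
        ],
    }
)
```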
- Parameters
llm (Runnable[Union[PromptValue, str, Sequence[Union[BaseMessage, Tuple[str, str], str, Dict[str, Any]]]], Union[BaseMessage, str]]) – Language model to use for generating a search query given the chat history.
retriever (Runnable[str, List[Document]]) – RetrieverLike object that takes a string as input and outputs a list of Documents.
prompt (BasePromptTemplate) – The prompt used to generate the search query for the retriever (a compatible prompt is sketched below).
- Returns
An LCEL Runnable. The Runnable input must include an input key and, if there is chat history, a chat_history key. The Runnable output is a list of Documents.
- Return type
Runnable[Any, List[Document]]
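As a sketch of one prompt shape that satisfies the prompt parameter, built by hand instead of pulling langchain-ai/chat-langchain-rephrase from the hub (the instruction wording is illustrative, not the hub prompt's exact text):

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

# The chain passes its whole input dict to the prompt, so the prompt must
# use the "chat_history" and "input" variables.
rephrase_prompt = ChatPromptTemplate.from_messages(
    [
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
        (
            "human",
            "Given the conversation above, rewrite the last question as a "
            "standalone search query. Return only the query.",
        ),
    ]
)
```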
Example
```python
# pip install -U langchain langchain-community

from langchain_community.chat_models import ChatOpenAI
from langchain.chains import create_history_aware_retriever
from langchain import hub

rephrase_prompt = hub.pull("langchain-ai/chat-langchain-rephrase")
llm = ChatOpenAI()
retriever = ...
chat_retriever_chain = create_history_aware_retriever(
    llm, retriever, rephrase_prompt
)

# chat_history is a list of prior messages (empty here)
chat_retriever_chain.invoke({"input": "...", "chat_history": []})
```
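The retriever is left elided above. As a hedged illustration (assuming faiss-cpu and an OpenAI API key are available), any vector store retriever can fill that slot:

```python
# pip install -U faiss-cpu
from langchain_community.embeddings import OpenAIEmbeddings
from langchain_community.vectorstores import FAISS

# Build a tiny in-memory index and expose it as a retriever.
vectorstore = FAISS.from_texts(
    ["LangChain is a framework for building applications with LLMs."],
    embedding=OpenAIEmbeddings(),
)
retriever = vectorstore.as_retriever()
```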