langchain.chains.openai_functions.qa_with_structure.create_qa_with_structure_chain

langchain.chains.openai_functions.qa_with_structure.create_qa_with_structure_chain(llm: BaseLanguageModel, schema: Union[dict, Type[BaseModel]], output_parser: str = 'base', prompt: Optional[Union[PromptTemplate, ChatPromptTemplate]] = None, verbose: bool = False) → LLMChain
Create a question answering chain that returns an answer with sources based on the given schema.

Parameters
  • llm (BaseLanguageModel) – Language model to use for the chain.

  • schema (Union[dict, Type[BaseModel]]) – Pydantic model class or dict schema to use for the output.

  • output_parser (str) – Output parser to use. Should be one of 'pydantic' or 'base'. Defaults to 'base'.

  • prompt (Optional[Union[PromptTemplate, ChatPromptTemplate]]) – Optional prompt to use for the chain.

  • verbose (bool) – Whether to run the chain in verbose mode. Defaults to False.

Return type

LLMChain

Returns

An LLMChain that answers questions in the structure defined by the schema.

Examples using create_qa_with_structure_chain
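A minimal usage sketch, assuming ChatOpenAI as the function-calling model and a hypothetical AnswerWithSources Pydantic model; the default prompt takes context and question as inputs (adjust the pydantic import to your LangChain version):

from typing import List

from langchain.chains.openai_functions.qa_with_structure import (
    create_qa_with_structure_chain,
)
from langchain.chat_models import ChatOpenAI
from langchain.pydantic_v1 import BaseModel, Field


class AnswerWithSources(BaseModel):
    """Hypothetical schema: an answer plus the sources it is based on."""

    answer: str = Field(..., description="The answer to the question")
    sources: List[str] = Field(
        ..., description="Source passages or identifiers supporting the answer"
    )


# Any chat model that supports OpenAI function calling can be used here.
llm = ChatOpenAI(temperature=0)

chain = create_qa_with_structure_chain(llm, AnswerWithSources, output_parser="pydantic")

# The default prompt expects "context" and "question" inputs.
result = chain.run(
    context="The 2023 report states that revenue grew 12% year over year.",
    question="How much did revenue grow?",
)
print(result)  # AnswerWithSources(answer='...', sources=[...])

With output_parser='pydantic' the chain returns an instance of the schema class; with the default 'base' parser it returns the raw function-call arguments instead.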