langchain.agents.openai_functions_agent.base.create_openai_functions_agent¶

langchain.agents.openai_functions_agent.base.create_openai_functions_agent(llm: BaseLanguageModel, tools: Sequence[BaseTool], prompt: ChatPromptTemplate) → Runnable¶

Create an agent that uses OpenAI function calling.

Examples

Creating an agent with no memory

from langchain_community.chat_models import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain import hub

prompt = hub.pull("hwchase17/openai-functions-agent")
model = ChatOpenAI()
tools = ...

agent = create_openai_functions_agent(model, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools)

agent_executor.invoke({"input": "hi"})

# Using with chat history
from langchain_core.messages import AIMessage, HumanMessage
agent_executor.invoke(
    {
        "input": "what's my name?",
        "chat_history": [
            HumanMessage(content="hi! my name is bob"),
            AIMessage(content="Hello Bob! How can I assist you today?"),
        ],
    }
)
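
A minimal sketch of the elided tools = ... above. The get_word_length tool is a hypothetical example; any sequence of BaseTool instances works.

from langchain_core.tools import tool

@tool
def get_word_length(word: str) -> int:
    """Return the number of characters in a word."""
    return len(word)

tools = [get_word_length]
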
Parameters
  • llm – LLM to use as the agent. It should support OpenAI function calling: either an OpenAI model that supports it natively, or a wrapper around another model that adds equivalent support.

  • tools – Tools this agent has access to.

  • prompt – The prompt to use. It must include an input variable named agent_scratchpad (a hand-rolled prompt is sketched below).
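
If you are not pulling a prompt from the hub, a hand-rolled prompt might look like the sketch below (the system message text is an assumption; the agent_scratchpad placeholder is required):

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        MessagesPlaceholder("chat_history", optional=True),
        ("human", "{input}"),
        # filled in by the agent with its tool calls and observations
        MessagesPlaceholder("agent_scratchpad"),
    ]
)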

Returns

A runnable sequence representing an agent. It takes as input the same input variables as the prompt passed in, and it returns either an AgentAction or an AgentFinish as output.
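
Because the returned runnable is just a sequence, it can also be invoked directly, outside of AgentExecutor. A minimal sketch, assuming the agent, tools, and prompt from the example above:

result = agent.invoke(
    {
        "input": "hi",
        # AgentExecutor normally supplies this; when calling the agent directly,
        # pass the (AgentAction, observation) pairs gathered so far.
        "intermediate_steps": [],
    }
)
# result is either an AgentAction (the next tool call to make)
# or an AgentFinish (the final answer)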