langchain.agents.openai_functions_multi_agent.base.OpenAIMultiFunctionsAgent¶
- class langchain.agents.openai_functions_multi_agent.base.OpenAIMultiFunctionsAgent[source]¶
Bases: BaseMultiActionAgent
[Deprecated] An agent driven by OpenAI's function-powered API.
- Parameters
llm – This should be an instance of ChatOpenAI, specifically a model that supports using functions.
tools – The tools this agent has access to.
prompt – The prompt for this agent; it should support agent_scratchpad as one of the variables. For an easy way to construct this prompt, use OpenAIMultiFunctionsAgent.create_prompt(…)
Notes
Deprecated since version 0.1.0: Use create_openai_tools_agent instead.
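A minimal migration sketch, assuming the replacement create_openai_tools_agent helper from langchain.agents and a prompt that already contains an agent_scratchpad messages placeholder; llm, tools, and prompt are placeholders defined elsewhere:
.. code-block:: python

    from langchain.agents import AgentExecutor, create_openai_tools_agent

    # llm, tools, and prompt are assumed to be defined elsewhere;
    # the prompt must include a MessagesPlaceholder named "agent_scratchpad".
    agent = create_openai_tools_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools)
    agent_executor.invoke({"input": "What is the weather in SF?"})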
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
- param llm: BaseLanguageModel [Required]¶
- param prompt: BasePromptTemplate [Required]¶
- async aplan(intermediate_steps: List[Tuple[AgentAction, str]], callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) Union[List[AgentAction], AgentFinish] [source]¶
Given input, decide what to do.
- Parameters
intermediate_steps – Steps the LLM has taken to date, along with observations
**kwargs – User inputs.
- Returns
Actions specifying which tools to use, or an AgentFinish if the agent is done.
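A hedged sketch of calling aplan() directly; normally the AgentExecutor drives this loop, and the "input" keyword is an assumption that must match the prompt's input variable:
.. code-block:: python

    # Hypothetical direct call with no prior steps taken;
    # intermediate_steps grows as tools are run and observed.
    actions = await agent.aplan(intermediate_steps=[], input="Summarize today's news")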
- classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) Model ¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
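As an illustration only (this is standard pydantic v1 behavior, not specific to this class), construct() rebuilds an instance from data you already trust:
.. code-block:: python

    # Skips validation entirely; llm, tools, and prompt are assumed to be
    # pre-validated values of the correct types.
    agent = OpenAIMultiFunctionsAgent.construct(llm=llm, tools=tools, prompt=prompt)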
- copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) Model ¶
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data
deep – set to True to make a deep copy of the model
- Returns
new model instance
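A small hedged example of copy(), again standard pydantic v1 behavior; other_prompt is a placeholder for another BasePromptTemplate:
.. code-block:: python

    # Shallow copy with one field swapped; note that update values are not validated.
    new_agent = agent.copy(update={"prompt": other_prompt})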
- classmethod create_prompt(system_message: Optional[SystemMessage] = SystemMessage(content='You are a helpful AI assistant.'), extra_prompt_messages: Optional[List[BaseMessagePromptTemplate]] = None) BasePromptTemplate [source]¶
Create prompt for this agent.
- Parameters
system_message – Message to use as the system message that will be the first in the prompt.
extra_prompt_messages – Prompt messages that will be placed between the system message and the new human input.
- Returns
A prompt template to pass into this agent.
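A sketch of building a prompt with a custom system message and room for chat history; the import paths assume a recent langchain_core layout:
.. code-block:: python

    from langchain_core.messages import SystemMessage
    from langchain_core.prompts import MessagesPlaceholder

    prompt = OpenAIMultiFunctionsAgent.create_prompt(
        system_message=SystemMessage(content="You are a careful research assistant."),
        extra_prompt_messages=[MessagesPlaceholder(variable_name="chat_history")],
    )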
- dict(**kwargs: Any) Dict ¶
Return dictionary representation of agent.
- classmethod from_llm_and_tools(llm: BaseLanguageModel, tools: Sequence[BaseTool], callback_manager: Optional[BaseCallbackManager] = None, extra_prompt_messages: Optional[List[BaseMessagePromptTemplate]] = None, system_message: Optional[SystemMessage] = SystemMessage(content='You are a helpful AI assistant.'), **kwargs: Any) BaseMultiActionAgent [source]¶
Construct an agent from an LLM and tools.
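A hedged construction sketch; the ChatOpenAI import assumes the langchain_openai partner package, and tools is a placeholder sequence of BaseTool instances:
.. code-block:: python

    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4", temperature=0)  # must be a functions-capable model
    agent = OpenAIMultiFunctionsAgent.from_llm_and_tools(llm=llm, tools=tools)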
- classmethod from_orm(obj: Any) Model ¶
- json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode ¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
- classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- classmethod parse_obj(obj: Any) Model ¶
- classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- plan(intermediate_steps: List[Tuple[AgentAction, str]], callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) Union[List[AgentAction], AgentFinish] [source]¶
Given input, decide what to do.
- Parameters
intermediate_steps – Steps the LLM has taken to date, along with observations
**kwargs – User inputs.
- Returns
Actions specifying which tools to use, or an AgentFinish if the agent is done.
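A synchronous counterpart to the aplan sketch above, with the same assumption about the "input" key:
.. code-block:: python

    # Hypothetical direct call; in practice the AgentExecutor supplies
    # intermediate_steps from previous tool runs.
    actions = agent.plan(intermediate_steps=[], input="Look up the population of Tokyo")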
- return_stopped_response(early_stopping_method: str, intermediate_steps: List[Tuple[AgentAction, str]], **kwargs: Any) AgentFinish ¶
Return response when agent has been stopped due to max iterations.
- save(file_path: Union[Path, str]) None ¶
Save the agent.
- Parameters
file_path – Path to file to save the agent to.
Example:
.. code-block:: python

    # If working with agent executor
    agent.agent.save(file_path="path/agent.yaml")
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny ¶
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode ¶
- tool_run_logging_kwargs() Dict ¶
- classmethod update_forward_refs(**localns: Any) None ¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
- classmethod validate(value: Any) Model ¶
- property functions: List[dict]¶
- property input_keys: List[str]¶
Get input keys. Input refers to user input here.
- property return_values: List[str]¶
Return values of the agent.