langchain_community.callbacks.mlflow_callback.MlflowCallbackHandler¶
- class langchain_community.callbacks.mlflow_callback.MlflowCallbackHandler(name: Optional[str] = 'langchainrun-%', experiment: Optional[str] = 'langchain', tags: Optional[Dict] = None, tracking_uri: Optional[str] = None, run_id: Optional[str] = None, artifacts_dir: str = '')[source]¶
Callback handler that logs metrics and artifacts to an MLflow server.
- Parameters
name (str) – Name of the run.
experiment (str) – Name of the experiment.
tags (dict) – Tags to attach to the run.
tracking_uri (str) – MLflow tracking server URI.
run_id (Optional[str]) –
artifacts_dir (str) –
For each callback method invoked, this handler formats the callback's input with metadata about the state of the LLM run, appends the result both to the corresponding {method}_records list and to the action records, and then logs the records to the MLflow server.
Initialize callback handler.
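A minimal usage sketch (the model class, prompt text, and tracking URI below are illustrative assumptions, not part of this API): attach the handler to an LLM, run it, then flush the logged records to MLflow.

```python
from langchain_community.callbacks import MlflowCallbackHandler
from langchain_openai import OpenAI  # assumes the langchain-openai package is installed

# Point the handler at an MLflow tracking server (example URI).
mlflow_callback = MlflowCallbackHandler(
    name="joke-run",
    experiment="langchain",
    tags={"project": "demo"},
    tracking_uri="http://localhost:5000",
)

# Attach the handler so every LLM event is logged to MLflow.
llm = OpenAI(temperature=0, callbacks=[mlflow_callback])
llm.invoke("Tell me a joke about MLflow.")

# Log any buffered records/artifacts and end the MLflow run.
mlflow_callback.flush_tracker(llm, finish=True)
```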
Attributes
always_verbose
Whether to call verbose callbacks even if verbose is False.
ignore_agent
Whether to ignore agent callbacks.
ignore_chain
Whether to ignore chain callbacks.
ignore_chat_model
Whether to ignore chat model callbacks.
ignore_llm
Whether to ignore LLM callbacks.
ignore_retriever
Whether to ignore retriever callbacks.
ignore_retry
Whether to ignore retry callbacks.
raise_error
run_inline
Methods
__init__([name, experiment, tags, ...])
Initialize callback handler.
flush_tracker([langchain_asset, finish])
get_custom_callback_meta()
on_agent_action(action, **kwargs)
Run on agent action.
on_agent_finish(finish, **kwargs)
Run when agent ends running.
on_chain_end(outputs, **kwargs)
Run when chain ends running.
on_chain_error(error, **kwargs)
Run when chain errors.
on_chain_start(serialized, inputs, **kwargs)
Run when chain starts running.
on_chat_model_start(serialized, messages, *, ...)
Run when a chat model starts running.
on_llm_end(response, **kwargs)
Run when LLM ends running.
on_llm_error(error, **kwargs)
Run when LLM errors.
on_llm_new_token(token, **kwargs)
Run when LLM generates a new token.
on_llm_start(serialized, prompts, **kwargs)
Run when LLM starts.
on_retriever_end(documents, **kwargs)
Run when Retriever ends running.
on_retriever_error(error, **kwargs)
Run when Retriever errors.
on_retriever_start(serialized, query, **kwargs)
Run when Retriever starts running.
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, **kwargs)
Run when text is received.
on_tool_end(output, **kwargs)
Run when tool ends running.
on_tool_error(error, **kwargs)
Run when tool errors.
on_tool_start(serialized, input_str, **kwargs)
Run when tool starts running.
reset_callback_meta()
Reset the callback metadata.
- __init__(name: Optional[str] = 'langchainrun-%', experiment: Optional[str] = 'langchain', tags: Optional[Dict] = None, tracking_uri: Optional[str] = None, run_id: Optional[str] = None, artifacts_dir: str = '') None [source]¶
Initialize callback handler.
- Parameters
name (Optional[str]) –
experiment (Optional[str]) –
tags (Optional[Dict]) –
tracking_uri (Optional[str]) –
run_id (Optional[str]) –
artifacts_dir (str) –
- Return type
None
- flush_tracker(langchain_asset: Any = None, finish: bool = False) None [source]¶
- Parameters
langchain_asset (Any) –
finish (bool) –
- Return type
None
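A hedged sketch of flushing after a chain run; the prompt and chain are illustrative, and `llm` and `mlflow_callback` are assumed to be set up as in the example near the top of this page.

```python
from langchain_core.prompts import PromptTemplate

prompt = PromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | llm

# Callbacks can also be supplied per call through the run config.
chain.invoke({"question": "What is MLflow?"}, config={"callbacks": [mlflow_callback]})

# Pass the LLM as the langchain_asset to log; finish=True also ends the MLflow run.
mlflow_callback.flush_tracker(langchain_asset=llm, finish=True)
```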
- get_custom_callback_meta() Dict[str, Any] ¶
- Return type
Dict[str, Any]
- on_agent_action(action: AgentAction, **kwargs: Any) Any [source]¶
Run on agent action.
- Parameters
action (AgentAction) –
kwargs (Any) –
- Return type
Any
- on_agent_finish(finish: AgentFinish, **kwargs: Any) None [source]¶
Run when agent ends running.
- Parameters
finish (AgentFinish) –
kwargs (Any) –
- Return type
None
- on_chain_end(outputs: Union[Dict[str, Any], str, List[str]], **kwargs: Any) None [source]¶
Run when chain ends running.
- Parameters
outputs (Union[Dict[str, Any], str, List[str]]) –
kwargs (Any) –
- Return type
None
- on_chain_error(error: BaseException, **kwargs: Any) None [source]¶
Run when chain errors.
- Parameters
error (BaseException) –
kwargs (Any) –
- Return type
None
- on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) None [source]¶
Run when chain starts running.
- Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
kwargs (Any) –
- Return type
None
- on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) Any ¶
Run when a chat model starts running.
- ATTENTION: This method is called for chat models. If you’re implementing
a handler for a non-chat model, you should use on_llm_start instead.
- Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
- Return type
Any
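To illustrate the note above, a minimal sketch of a custom handler (a hypothetical class, not part of this module) implementing both start hooks: chat models trigger on_chat_model_start, while completion-style LLMs trigger on_llm_start.

```python
from typing import Any, Dict, List
from uuid import UUID

from langchain_core.callbacks import BaseCallbackHandler
from langchain_core.messages import BaseMessage


class ModelStartLogger(BaseCallbackHandler):
    """Hypothetical handler showing which start hook fires for which model type."""

    def on_llm_start(
        self, serialized: Dict[str, Any], prompts: List[str], **kwargs: Any
    ) -> None:
        # Called for completion-style (non-chat) LLMs.
        print(f"LLM start: {len(prompts)} prompt(s)")

    def on_chat_model_start(
        self,
        serialized: Dict[str, Any],
        messages: List[List[BaseMessage]],
        *,
        run_id: UUID,
        **kwargs: Any,
    ) -> None:
        # Called for chat models instead of on_llm_start.
        print(f"Chat model start: {len(messages)} message batch(es)")
```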
- on_llm_end(response: LLMResult, **kwargs: Any) None [source]¶
Run when LLM ends running.
- Parameters
response (LLMResult) –
kwargs (Any) –
- Return type
None
- on_llm_error(error: BaseException, **kwargs: Any) None [source]¶
Run when LLM errors.
- Parameters
error (BaseException) –
kwargs (Any) –
- Return type
None
- on_llm_new_token(token: str, **kwargs: Any) None [source]¶
Run when LLM generates a new token.
- Parameters
token (str) –
kwargs (Any) –
- Return type
None
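This hook only fires when the wrapped model streams tokens. A small sketch, assuming the `mlflow_callback` constructed in the example near the top of this page and a model class that supports streaming:

```python
# Streaming must be enabled for on_llm_new_token to be invoked.
streaming_llm = OpenAI(streaming=True, temperature=0, callbacks=[mlflow_callback])

for chunk in streaming_llm.stream("Name one MLflow feature."):
    # Each streamed token also reaches mlflow_callback.on_llm_new_token.
    pass
```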
- on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) None [source]¶
Run when LLM starts.
- Parameters
serialized (Dict[str, Any]) –
prompts (List[str]) –
kwargs (Any) –
- Return type
None
- on_retriever_end(documents: Sequence[Document], **kwargs: Any) Any [source]¶
Run when Retriever ends running.
- Parameters
documents (Sequence[Document]) –
kwargs (Any) –
- Return type
Any
- on_retriever_error(error: BaseException, **kwargs: Any) Any [source]¶
Run when Retriever errors.
- Parameters
error (BaseException) –
kwargs (Any) –
- Return type
Any
- on_retriever_start(serialized: Dict[str, Any], query: str, **kwargs: Any) Any [source]¶
Run when Retriever starts running.
- Parameters
serialized (Dict[str, Any]) –
query (str) –
kwargs (Any) –
- Return type
Any
- on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) Any ¶
Run on a retry event.
- Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
- Return type
Any
- on_text(text: str, **kwargs: Any) None [source]¶
Run when text is received.
- Parameters
text (str) –
kwargs (Any) –
- Return type
None
- on_tool_end(output: Any, **kwargs: Any) None [source]¶
Run when tool ends running.
- Parameters
output (Any) –
kwargs (Any) –
- Return type
None
- on_tool_error(error: BaseException, **kwargs: Any) None [source]¶
Run when tool errors.
- Parameters
error (BaseException) –
kwargs (Any) –
- Return type
None
- on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) None [source]¶
Run when tool starts running.
- Parameters
serialized (Dict[str, Any]) –
input_str (str) –
kwargs (Any) –
- Return type
None
- reset_callback_meta() None ¶
Reset the callback metadata.
- Return type
None