langchain_core.callbacks.manager.trace_as_chain_group

langchain_core.callbacks.manager.trace_as_chain_group(group_name: str, callback_manager: Optional[CallbackManager] = None, *, inputs: Optional[Dict[str, Any]] = None, project_name: Optional[str] = None, example_id: Optional[Union[str, UUID]] = None, run_id: Optional[UUID] = None, tags: Optional[List[str]] = None) → Generator[CallbackManagerForChainGroup, None, None][source]

Get a callback manager for a chain group in a context manager. Useful for grouping different calls together as a single run even if they aren't composed in a single chain.

Parameters
  • group_name (str) – The name of the chain group.

  • callback_manager (CallbackManager, optional) – The callback manager to use. Defaults to None.

  • inputs (Dict[str, Any], optional) – The inputs to the chain group. Defaults to None.

  • project_name (str, optional) – The name of the project. Defaults to None.

  • example_id (str or UUID, optional) – The ID of the example. Defaults to None.

  • run_id (UUID, optional) – The ID of the run. Defaults to None.

  • tags (List[str], optional) – The inheritable tags to apply to all runs. Defaults to None.

Note: the LANGCHAIN_TRACING_V2 environment variable must be set to "true" to see the trace in LangSmith.
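For example, the variable can be set from Python before any traced calls are made. A minimal sketch; it assumes LangSmith credentials (LANGCHAIN_API_KEY) are already configured in the environment:

import os

# Enable LangSmith tracing for all subsequent runs in this process.
os.environ["LANGCHAIN_TRACING_V2"] = "true"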

Returns

The callback manager for the chain group.

Return type

CallbackManagerForChainGroup

Example

llm_input = "Foo"
with trace_as_chain_group("group_name", inputs={"input": llm_input}) as manager:
    # Use the callback manager for the chain group
    res = llm.predict(llm_input, callbacks=manager)
    manager.on_chain_end({"output": res})