langchain_core 0.1.31

langchain_core.agents

An Agent is a class that uses an LLM to choose a sequence of actions to take.

In Chains, a sequence of actions is hardcoded. In Agents, a language model is used as a reasoning engine to determine which actions to take and in which order.

Agents select and use Tools and Toolkits for actions.
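The plan/act loop described above can be sketched in plain Python. The AgentAction and AgentFinish names mirror the real classes, but `run_agent` and `toy_plan` below are hypothetical stand-ins for AgentExecutor and an LLM-backed reasoning engine:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple, Union

# Illustrative stand-ins; real code would use langchain_core.agents.AgentAction/AgentFinish.
@dataclass
class AgentAction:
    tool: str
    tool_input: str
    log: str

@dataclass
class AgentFinish:
    return_values: dict
    log: str

def run_agent(plan: Callable[[List[Tuple[AgentAction, str]]], Union[AgentAction, AgentFinish]],
              tools: Dict[str, Callable[[str], str]],
              max_steps: int = 5) -> dict:
    """Minimal executor loop: ask the planner for an action, run the tool, repeat."""
    steps: List[Tuple[AgentAction, str]] = []  # (action, observation) pairs so far
    for _ in range(max_steps):
        decision = plan(steps)
        if isinstance(decision, AgentFinish):
            return decision.return_values
        observation = tools[decision.tool](decision.tool_input)
        steps.append((decision, observation))
    return {"output": "stopped: max steps reached"}

# A toy "reasoning engine" that calls one tool, then finishes.
def toy_plan(steps):
    if not steps:
        return AgentAction(tool="length", tool_input="hello world", log="measure input")
    return AgentFinish(return_values={"output": steps[-1][1]}, log="done")

result = run_agent(toy_plan, {"length": lambda s: str(len(s))})
```

In the real library the planner is an LLM call and the loop also handles parsing errors and callbacks, but the control flow is the same.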

Class hierarchy:

BaseSingleActionAgent --> LLMSingleActionAgent
                          OpenAIFunctionsAgent
                          XMLAgent
                          Agent --> <name>Agent  # Examples: ZeroShotAgent, ChatAgent


BaseMultiActionAgent  --> OpenAIMultiFunctionsAgent

Main helpers:

AgentType, AgentExecutor, AgentOutputParser, AgentExecutorIterator,
AgentAction, AgentFinish, AgentStep

Classes

agents.AgentAction

A full description of an action for an ActionAgent to execute.

agents.AgentActionMessageLog

Override init to support instantiation by position for backward compat.

agents.AgentFinish

The final return value of an ActionAgent.

agents.AgentStep

The result of running an AgentAction.

langchain_core.beta

Some beta features that are not yet ready for production.

Classes

beta.runnables.context.Context()

Context for a runnable.

beta.runnables.context.ContextGet

[Beta] Get a context value.

beta.runnables.context.ContextSet

[Beta] Set a context value.

beta.runnables.context.PrefixContext([prefix])

Context for a runnable with a prefix.

Functions

beta.runnables.context.aconfig_with_context(...)

Asynchronously patch a runnable config with context getters and setters.

beta.runnables.context.config_with_context(...)

Patch a runnable config with context getters and setters.

langchain_core.caches

Warning

Beta Feature!

Cache provides an optional caching layer for LLMs.

Cache is useful for two reasons:

  • It can save you money by reducing the number of API calls you make to the LLM provider if you’re often requesting the same completion multiple times.

  • It can speed up your application by reducing the number of API calls you make to the LLM provider.

Cache directly competes with Memory. See the documentation for pros and cons.
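Conceptually, a cache keys each completion by the prompt and a string identifying the model. A minimal sketch of that interface (`InMemoryCacheSketch` is illustrative, not the real BaseCache, which stores lists of Generation objects rather than plain strings):

```python
from typing import Any, Dict, Optional, Tuple

class InMemoryCacheSketch:
    """Toy cache keyed by (prompt, llm_string), mirroring the
    lookup/update method names of langchain_core's BaseCache."""
    def __init__(self) -> None:
        self._store: Dict[Tuple[str, str], Any] = {}

    def lookup(self, prompt: str, llm_string: str) -> Optional[Any]:
        return self._store.get((prompt, llm_string))

    def update(self, prompt: str, llm_string: str, return_val: Any) -> None:
        self._store[(prompt, llm_string)] = return_val

cache = InMemoryCacheSketch()
cache.update("2+2?", "fake-llm-v1", "4")
hit = cache.lookup("2+2?", "fake-llm-v1")    # same prompt and model: no API call needed
miss = cache.lookup("2+2?", "other-llm-v2")  # different model string: cache miss
```

Keying on the model string as well as the prompt is what prevents a completion from one model being served for another.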

Class hierarchy:

BaseCache --> <name>Cache  # Examples: InMemoryCache, RedisCache, GPTCache

Classes

caches.BaseCache()

Base interface for cache.

langchain_core.callbacks

Callback handlers allow listening to events in LangChain.

Class hierarchy:

BaseCallbackHandler --> <name>CallbackHandler  # Example: AimCallbackHandler
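The pattern is: a handler defines hook methods, and the framework calls them as events fire. A minimal sketch (hook names mirror BaseCallbackHandler, but the real signatures carry more arguments such as serialized metadata and run IDs; `fake_llm_call` is a hypothetical dispatcher):

```python
from typing import List

class LoggingHandler:
    """Toy callback handler that records the events it receives."""
    def __init__(self) -> None:
        self.events = []

    def on_llm_start(self, prompts: List[str]) -> None:
        self.events.append(("llm_start", len(prompts)))

    def on_llm_end(self, response: str) -> None:
        self.events.append(("llm_end", response))

def fake_llm_call(prompts: List[str], handlers: list) -> str:
    """Stand-in for the framework: notify handlers around the model call."""
    for h in handlers:
        h.on_llm_start(prompts)
    response = "ok"  # stand-in for a real completion
    for h in handlers:
        h.on_llm_end(response)
    return response

handler = LoggingHandler()
fake_llm_call(["hello"], [handler])
```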

Classes

callbacks.base.AsyncCallbackHandler()

Async callback handler that handles callbacks from LangChain.

callbacks.base.BaseCallbackHandler()

Base callback handler that handles callbacks from LangChain.

callbacks.base.BaseCallbackManager(handlers)

Base callback manager that handles callbacks from LangChain.

callbacks.base.CallbackManagerMixin()

Mixin for callback manager.

callbacks.base.ChainManagerMixin()

Mixin for chain callbacks.

callbacks.base.LLMManagerMixin()

Mixin for LLM callbacks.

callbacks.base.RetrieverManagerMixin()

Mixin for Retriever callbacks.

callbacks.base.RunManagerMixin()

Mixin for run manager.

callbacks.base.ToolManagerMixin()

Mixin for tool callbacks.

callbacks.manager.AsyncCallbackManager(handlers)

Async callback manager that handles callbacks from LangChain.

callbacks.manager.AsyncCallbackManagerForChainGroup(...)

Async callback manager for the chain group.

callbacks.manager.AsyncCallbackManagerForChainRun(*, ...)

Async callback manager for chain run.

callbacks.manager.AsyncCallbackManagerForLLMRun(*, ...)

Async callback manager for LLM run.

callbacks.manager.AsyncCallbackManagerForRetrieverRun(*, ...)

Async callback manager for retriever run.

callbacks.manager.AsyncCallbackManagerForToolRun(*, ...)

Async callback manager for tool run.

callbacks.manager.AsyncParentRunManager(*, ...)

Async Parent Run Manager.

callbacks.manager.AsyncRunManager(*, run_id, ...)

Async Run Manager.

callbacks.manager.BaseRunManager(*, run_id, ...)

Base class for run manager (a bound callback manager).

callbacks.manager.CallbackManager(handlers)

Callback manager that handles callbacks from LangChain.

callbacks.manager.CallbackManagerForChainGroup(...)

Callback manager for the chain group.

callbacks.manager.CallbackManagerForChainRun(*, ...)

Callback manager for chain run.

callbacks.manager.CallbackManagerForLLMRun(*, ...)

Callback manager for LLM run.

callbacks.manager.CallbackManagerForRetrieverRun(*, ...)

Callback manager for retriever run.

callbacks.manager.CallbackManagerForToolRun(*, ...)

Callback manager for tool run.

callbacks.manager.ParentRunManager(*, ...[, ...])

Sync Parent Run Manager.

callbacks.manager.RunManager(*, run_id, ...)

Sync Run Manager.

callbacks.stdout.StdOutCallbackHandler([color])

Callback Handler that prints to std out.

callbacks.streaming_stdout.StreamingStdOutCallbackHandler()

Callback handler for streaming.

Functions

callbacks.manager.ahandle_event(handlers, ...)

Generic event handler for AsyncCallbackManager.

callbacks.manager.atrace_as_chain_group(...)

Get an async callback manager for a chain group in a context manager.

callbacks.manager.handle_event(handlers, ...)

Generic event handler for CallbackManager.

callbacks.manager.shielded(func)

Ensure that an awaitable method is always shielded from cancellation.

callbacks.manager.trace_as_chain_group(...)

Get a callback manager for a chain group in a context manager.

langchain_core.chat_history

Chat message history stores a history of the message interactions in a chat.

Class hierarchy:

BaseChatMessageHistory --> <name>ChatMessageHistory  # Examples: FileChatMessageHistory, PostgresChatMessageHistory

Main helpers:

AIMessage, HumanMessage, BaseMessage
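A chat message history is essentially an append-only list of typed messages. A minimal sketch (the `add_user_message`/`add_ai_message` names mirror BaseChatMessageHistory, but `Message` here is a hypothetical stand-in for the BaseMessage subclasses):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Message:  # stand-in for HumanMessage / AIMessage
    role: str
    content: str

class InMemoryChatHistory:
    """Toy chat history: append messages, read them back, clear them."""
    def __init__(self) -> None:
        self.messages: List[Message] = []

    def add_user_message(self, text: str) -> None:
        self.messages.append(Message("human", text))

    def add_ai_message(self, text: str) -> None:
        self.messages.append(Message("ai", text))

    def clear(self) -> None:
        self.messages = []

history = InMemoryChatHistory()
history.add_user_message("hi")
history.add_ai_message("hello!")
```

Concrete implementations differ only in where `messages` is persisted (file, database, etc.).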

Classes

chat_history.BaseChatMessageHistory()

Abstract base class for storing chat message history.

langchain_core.chat_sessions

Chat Sessions are a collection of messages and function calls.

Classes

chat_sessions.ChatSession

Chat Session represents a single conversation, channel, or other group of messages.

langchain_core.document_loaders

Classes

document_loaders.base.BaseBlobParser()

Abstract interface for blob parsers.

document_loaders.base.BaseLoader()

Interface for Document Loader.

document_loaders.blob_loaders.Blob

Blob represents raw data by either reference or value.

document_loaders.blob_loaders.BlobLoader()

Abstract interface for blob loaders implementation.

langchain_core.documents

The documents module is a collection of classes that handle documents and their transformations.

Classes

documents.base.Document

Class for storing a piece of text and associated metadata.

documents.compressor.BaseDocumentCompressor

Base class for document compressors.

documents.transformers.BaseDocumentTransformer()

Abstract base class for document transformation systems.

langchain_core.embeddings

Embeddings interface.

Classes

embeddings.Embeddings()

Interface for embedding models.

langchain_core.example_selectors

Example selectors implement logic for selecting examples to include in prompts. This makes it possible to pick the examples that are most relevant to the input.
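The length-based strategy, for instance, greedily keeps examples while the combined word count stays under a budget. A sketch of that idea (a simplification of LengthBasedExampleSelector, which works on formatted example dicts rather than raw strings):

```python
from typing import List

def select_by_length(examples: List[str], input_text: str, max_words: int = 12) -> List[str]:
    """Greedily include examples while input + examples fit in max_words."""
    remaining = max_words - len(input_text.split())
    selected = []
    for ex in examples:
        cost = len(ex.split())
        if cost <= remaining:
            selected.append(ex)
            remaining -= cost
    return selected

examples = ["happy -> sad", "tall -> short", "energetic -> lethargic"]
picked = select_by_length(examples, "big ->", max_words=8)
```

Semantic-similarity selectors replace the word-count heuristic with an embedding distance, but the select-then-format flow is the same.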

Classes

example_selectors.base.BaseExampleSelector()

Interface for selecting examples to include in prompts.

example_selectors.length_based.LengthBasedExampleSelector

Select examples based on length.

example_selectors.semantic_similarity.MaxMarginalRelevanceExampleSelector

ExampleSelector that selects examples based on Max Marginal Relevance.

example_selectors.semantic_similarity.SemanticSimilarityExampleSelector

Example selector that selects examples based on SemanticSimilarity.

Functions

example_selectors.semantic_similarity.sorted_values(values)

Return a list of values in dict sorted by key.

langchain_core.exceptions

Custom exceptions for LangChain.

Classes

exceptions.LangChainException

General LangChain exception.

exceptions.OutputParserException(error[, ...])

Exception that output parsers should raise to signify a parsing error.

exceptions.TracerException

Base class for exceptions in tracers module.

langchain_core.language_models

Language Model is a type of model that can generate text or complete text prompts.

LangChain has two main classes for working with language models:

  • LLM classes provide access to the large language model (LLM) APIs and services.

  • Chat Models are a variation on language models.

Class hierarchy:

BaseLanguageModel --> BaseLLM --> LLM --> <name>  # Examples: AI21, HuggingFaceHub, OpenAI
                  --> BaseChatModel --> <name>    # Examples: ChatOpenAI, ChatGooglePalm

Main helpers:

LLMResult, PromptValue,
CallbackManagerForLLMRun, AsyncCallbackManagerForLLMRun,
CallbackManager, AsyncCallbackManager,
AIMessage, BaseMessage, HumanMessage
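The practical difference between the two model families is the call shape: LLMs map a string to a string, while chat models map a list of messages to a message. A toy illustration (both functions are hypothetical stand-ins, not real model classes):

```python
from typing import List, Tuple

def fake_llm(prompt: str) -> str:
    """String in, string out: the BaseLLM-style interface."""
    return prompt.upper()

def fake_chat_model(messages: List[Tuple[str, str]]) -> Tuple[str, str]:
    """List of (role, content) pairs in, one AI message out:
    the BaseChatModel-style interface."""
    last = messages[-1][1]
    return ("ai", f"echo: {last}")

llm_out = fake_llm("hi")
chat_out = fake_chat_model([("system", "be brief"), ("human", "hi")])
```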

Classes

language_models.base.BaseLanguageModel

Abstract base class for interfacing with language models.

language_models.chat_models.BaseChatModel

Base class for Chat models.

language_models.chat_models.SimpleChatModel

A simplified implementation for a chat model to inherit from.

language_models.llms.BaseLLM

Base LLM abstract interface.

language_models.llms.LLM

Base LLM abstract class.

Functions

language_models.chat_models.agenerate_from_stream(stream)

Async generate from a stream.

language_models.chat_models.generate_from_stream(stream)

Generate from a stream.

language_models.llms.aget_prompts(params, ...)

Get prompts that are already cached.

language_models.llms.aupdate_cache(...)

Update the cache and get the LLM output.

language_models.llms.create_base_retry_decorator(...)

Create a retry decorator for a given LLM and provided list of error types.

language_models.llms.get_prompts(params, prompts)

Get prompts that are already cached.

language_models.llms.update_cache(...)

Update the cache and get the LLM output.

langchain_core.load

The load module helps with serialization and deserialization.
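The key idea behind `loads` with a `secrets_map` is that serialized objects never embed secret values, only placeholders that are resolved at revival time. A sketch of that round trip (the placeholder shape is an assumption modeled on SerializedSecret; `loads_sketch` is illustrative, not the real Reviver):

```python
import json

def dumps_sketch(obj: dict) -> str:
    """Sketch of load.dump.dumps: serialize to a JSON string."""
    return json.dumps(obj)

def loads_sketch(text: str, secrets_map: dict) -> dict:
    """Sketch of load.load.loads: revive the object, resolving
    secret placeholders via secrets_map instead of storing values."""
    def revive(node):
        if isinstance(node, dict):
            if node.get("type") == "secret":
                return secrets_map[node["id"][0]]
            return {k: revive(v) for k, v in node.items()}
        return node
    return revive(json.loads(text))

serialized = '{"api_key": {"lc": 1, "type": "secret", "id": ["OPENAI_API_KEY"]}}'
revived = loads_sketch(serialized, {"OPENAI_API_KEY": "sk-test"})
```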

Classes

load.load.Reviver([secrets_map, ...])

Reviver for JSON objects.

load.serializable.BaseSerialized

Base class for serialized objects.

load.serializable.Serializable

Serializable base class.

load.serializable.SerializedConstructor

Serialized constructor.

load.serializable.SerializedNotImplemented

Serialized not implemented.

load.serializable.SerializedSecret

Serialized secret.

Functions

load.dump.default(obj)

Return a default value for a Serializable object or a SerializedNotImplemented object.

load.dump.dumpd(obj)

Return a json dict representation of an object.

load.dump.dumps(obj, *[, pretty])

Return a json string representation of an object.

load.load.load(obj, *[, secrets_map, ...])

[Beta] Revive a LangChain class from a JSON object.

load.load.loads(text, *[, secrets_map, ...])

[Beta] Revive a LangChain class from a JSON string.

load.serializable.to_json_not_implemented(obj)

Serialize a "not implemented" object.

load.serializable.try_neq_default(value, ...)

Try to determine if a value is different from the default.

langchain_core.memory

Memory maintains Chain state, incorporating context from past runs.

Class hierarchy for Memory:

BaseMemory --> <name>Memory --> <name>Memory  # Examples: BaseChatMemory -> MotorheadMemory

Classes

memory.BaseMemory

Abstract base class for memory in Chains.

langchain_core.messages

Messages are objects used in prompts and chat conversations.

Class hierarchy:

BaseMessage --> SystemMessage, AIMessage, HumanMessage, ChatMessage, FunctionMessage, ToolMessage
            --> BaseMessageChunk --> SystemMessageChunk, AIMessageChunk, HumanMessageChunk, ChatMessageChunk, FunctionMessageChunk, ToolMessageChunk

Main helpers:

ChatPromptTemplate
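The chunk classes exist so that streamed pieces can be concatenated with `+` into a complete message. A minimal sketch of that behavior (`AIMessageChunkSketch` is a hypothetical stand-in for AIMessageChunk):

```python
from dataclasses import dataclass

@dataclass
class AIMessageChunkSketch:
    """Toy message chunk: supports `+` so streamed pieces accumulate."""
    content: str

    def __add__(self, other: "AIMessageChunkSketch") -> "AIMessageChunkSketch":
        return AIMessageChunkSketch(self.content + other.content)

# Simulate a token stream arriving piece by piece.
stream = [AIMessageChunkSketch("Hel"), AIMessageChunkSketch("lo"), AIMessageChunkSketch("!")]
full = stream[0]
for chunk in stream[1:]:
    full = full + chunk
```

The real chunk classes also merge structured fields (e.g. function-call arguments), not just string content.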

Classes

messages.ai.AIMessage

Message from an AI.

messages.ai.AIMessageChunk

Message chunk from an AI.

messages.base.BaseMessage

Base abstract Message class.

messages.base.BaseMessageChunk

Message chunk, which can be concatenated with other Message chunks.

messages.chat.ChatMessage

Message that can be assigned an arbitrary speaker (i.e. role).

messages.chat.ChatMessageChunk

Chat Message chunk.

messages.function.FunctionMessage

Message for passing the result of executing a function back to a model.

messages.function.FunctionMessageChunk

Function Message chunk.

messages.human.HumanMessage

Message from a human.

messages.human.HumanMessageChunk

Human Message chunk.

messages.system.SystemMessage

Message for priming AI behavior, usually passed in as the first of a sequence of input messages.

messages.system.SystemMessageChunk

System Message chunk.

messages.tool.ToolMessage

Message for passing the result of executing a tool back to a model.

messages.tool.ToolMessageChunk

Tool Message chunk.

Functions

messages.base.get_msg_title_repr(title, *[, ...])

Get a title representation for a message.

messages.base.merge_content(first_content, ...)

Merge two message contents.

messages.base.message_to_dict(message)

Convert a Message to a dictionary.

messages.base.messages_to_dict(messages)

Convert a sequence of Messages to a list of dictionaries.

langchain_core.output_parsers

OutputParser classes parse the output of an LLM call.

Class hierarchy:

BaseLLMOutputParser --> BaseOutputParser --> <name>OutputParser  # ListOutputParser, PydanticOutputParser

Main helpers:

Serializable, Generation, PromptValue
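An output parser is just a function from the raw completion text to a structured value. A sketch of the comma-separated-list case (a simplification of CommaSeparatedListOutputParser.parse):

```python
from typing import List

def parse_comma_separated(text: str) -> List[str]:
    """Split a raw LLM completion on commas and strip whitespace."""
    return [part.strip() for part in text.strip().split(",")]

# A model asked for "three colors, comma separated" might return:
parsed = parse_comma_separated("red, green,  blue")
```

The other parsers in this module follow the same contract with more elaborate parsing (JSON, XML, pydantic validation).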

Classes

output_parsers.base.BaseGenerationOutputParser

Base class to parse the output of an LLM call.

output_parsers.base.BaseLLMOutputParser()

Abstract base class for parsing the outputs of a model.

output_parsers.base.BaseOutputParser

Base class to parse the output of an LLM call.

output_parsers.json.JsonOutputParser

Parse the output of an LLM call to a JSON object.

output_parsers.json.SimpleJsonOutputParser

alias of JsonOutputParser

output_parsers.list.CommaSeparatedListOutputParser

Parse the output of an LLM call to a comma-separated list.

output_parsers.list.ListOutputParser

Parse the output of an LLM call to a list.

output_parsers.list.MarkdownListOutputParser

Parse a markdown list.

output_parsers.list.NumberedListOutputParser

Parse a numbered list.

output_parsers.openai_functions.JsonKeyOutputFunctionsParser

Parse an output as an element of the JSON object.

output_parsers.openai_functions.JsonOutputFunctionsParser

Parse an output as a JSON object.

output_parsers.openai_functions.OutputFunctionsParser

Parse an output that is one of a set of values.

output_parsers.openai_functions.PydanticAttrOutputFunctionsParser

Parse an output as an attribute of a pydantic object.

output_parsers.openai_functions.PydanticOutputFunctionsParser

Parse an output as a pydantic object.

output_parsers.openai_tools.JsonOutputKeyToolsParser

Parse tools from OpenAI response.

output_parsers.openai_tools.JsonOutputToolsParser

Parse tools from OpenAI response.

output_parsers.openai_tools.PydanticToolsParser

Parse tools from OpenAI response.

output_parsers.pydantic.PydanticOutputParser

Parse an output using a pydantic model.

output_parsers.string.StrOutputParser

OutputParser that parses an LLMResult into the most likely string.

output_parsers.transform.BaseCumulativeTransformOutputParser

Base class for an output parser that can handle streaming input.

output_parsers.transform.BaseTransformOutputParser

Base class for an output parser that can handle streaming input.

output_parsers.xml.XMLOutputParser

Parse an output using xml format.

Functions

output_parsers.json.parse_and_check_json_markdown(...)

Parse a JSON string from a Markdown string and check that it contains the expected keys.

output_parsers.json.parse_json_markdown(...)

Parse a JSON string from a Markdown string.

output_parsers.json.parse_partial_json(s, *)

Parse a JSON string that may be missing closing braces.

output_parsers.list.droplastn(iter, n)

Drop the last n elements of an iterator.

output_parsers.xml.nested_element(path, elem)

Get nested element from path.

langchain_core.outputs

Output classes are used to represent the output of a language model call and the output of a chat.

Classes

outputs.chat_generation.ChatGeneration

A single chat generation output.

outputs.chat_generation.ChatGenerationChunk

ChatGeneration chunk, which can be concatenated with other ChatGeneration chunks.

outputs.chat_result.ChatResult

Class that contains all results for a single chat model call.

outputs.generation.Generation

A single text generation output.

outputs.generation.GenerationChunk

Generation chunk, which can be concatenated with other Generation chunks.

outputs.llm_result.LLMResult

Class that contains all results for a batched LLM call.

outputs.run_info.RunInfo

Class that contains metadata for a single execution of a Chain or model.

langchain_core.prompt_values

Prompt values for language model prompts.

Prompt values are used to represent different pieces of prompts. They can be used to represent text, images, or chat message pieces.

Classes

prompt_values.ChatPromptValue

Chat prompt value.

prompt_values.ChatPromptValueConcrete

Chat prompt value which explicitly lists out the message types it accepts.

prompt_values.ImagePromptValue

Image prompt value.

prompt_values.ImageURL

Image URL data.

prompt_values.PromptValue

Base abstract class for inputs to any language model.

prompt_values.StringPromptValue

String prompt value.

langchain_core.prompts

A prompt is the input to the model.

A prompt is often constructed from multiple components and prompt values. Prompt classes and functions make constructing and working with prompts easy.
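At its core, a string prompt template is a format string plus validation that all input variables are supplied. A sketch of that behavior using the stdlib (`format_prompt` is illustrative, not the real PromptTemplate.format):

```python
import string

def format_prompt(template: str, **variables) -> str:
    """Substitute {named} variables, failing loudly on missing ones,
    in the spirit of prompts.string.check_valid_template."""
    fields = {name for _, name, _, _ in string.Formatter().parse(template) if name}
    missing = fields - variables.keys()
    if missing:
        raise KeyError(f"missing variables: {sorted(missing)}")
    return template.format(**variables)

text = format_prompt("Tell me a {adjective} joke about {topic}.",
                     adjective="funny", topic="parrots")
```

Chat prompt templates apply the same substitution per message and return a list of messages instead of one string.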

Class hierarchy:

BasePromptTemplate --> PipelinePromptTemplate
                       StringPromptTemplate --> PromptTemplate
                                                FewShotPromptTemplate
                                                FewShotPromptWithTemplates
                       BaseChatPromptTemplate --> AutoGPTPrompt
                                                  ChatPromptTemplate --> AgentScratchPadChatPromptTemplate



BaseMessagePromptTemplate --> MessagesPlaceholder
                              BaseStringMessagePromptTemplate --> ChatMessagePromptTemplate
                                                                  HumanMessagePromptTemplate
                                                                  AIMessagePromptTemplate
                                                                  SystemMessagePromptTemplate

Classes

prompts.base.BasePromptTemplate

Base class for all prompt templates, returning a prompt.

prompts.chat.AIMessagePromptTemplate

AI message prompt template.

prompts.chat.BaseChatPromptTemplate

Base class for chat prompt templates.

prompts.chat.BaseMessagePromptTemplate

Base class for message prompt templates.

prompts.chat.BaseStringMessagePromptTemplate

Base class for message prompt templates that use a string prompt template.

prompts.chat.ChatMessagePromptTemplate

Chat message prompt template.

prompts.chat.ChatPromptTemplate

Prompt template for chat models.

prompts.chat.HumanMessagePromptTemplate

Human message prompt template.

prompts.chat.MessagesPlaceholder

Prompt template that assumes the variable is already a list of messages.

prompts.chat.SystemMessagePromptTemplate

System message prompt template.

prompts.few_shot.FewShotChatMessagePromptTemplate

Chat prompt template that supports few-shot examples.

prompts.few_shot.FewShotPromptTemplate

Prompt template that contains few shot examples.

prompts.few_shot_with_templates.FewShotPromptWithTemplates

Prompt template that contains few shot examples.

prompts.image.ImagePromptTemplate

An image prompt template for a multimodal model.

prompts.pipeline.PipelinePromptTemplate

Prompt template for composing multiple prompt templates together.

prompts.prompt.PromptTemplate

A prompt template for a language model.

prompts.string.StringPromptTemplate

String prompt that exposes the format method, returning a prompt.

Functions

prompts.base.format_document(doc, prompt)

Format a document into a string based on a prompt template.

prompts.loading.load_prompt(path)

Unified method for loading a prompt from LangChainHub or local fs.

prompts.loading.load_prompt_from_config(config)

Load prompt from Config Dict.

prompts.string.check_valid_template(...)

Check that template string is valid.

prompts.string.get_template_variables(...)

Get the variables from the template.

prompts.string.jinja2_formatter(template, ...)

Format a template using jinja2.

prompts.string.validate_jinja2(template, ...)

Validate that the input variables are valid for the template.

langchain_core.retrievers

The Retriever class returns Documents given a text query.

It is more general than a vector store. A retriever does not need to be able to store documents, only to return (or retrieve) them. Vector stores can be used as the backbone of a retriever, but there are other types of retrievers as well.
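To make the contract concrete, here is a toy retriever that needs no vector store at all; it just ranks plain strings by keyword overlap (BaseRetriever subclasses implement this behind `_get_relevant_documents` and return Document objects, so treat this purely as a sketch):

```python
from typing import List

def keyword_retriever(docs: List[str], query: str, k: int = 2) -> List[str]:
    """Rank documents by how many query words they contain; return top k."""
    terms = set(query.lower().split())
    overlap = lambda d: len(terms & set(d.lower().split()))
    scored = sorted(docs, key=lambda d: -overlap(d))
    return [d for d in scored if overlap(d) > 0][:k]

docs = ["cats purr loudly", "dogs bark at cats", "fish swim"]
hits = keyword_retriever(docs, "cats", k=2)
```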

Class hierarchy:

BaseRetriever --> <name>Retriever  # Examples: ArxivRetriever, MergerRetriever

Main helpers:

RetrieverInput, RetrieverOutput, RetrieverLike, RetrieverOutputLike,
Document, Serializable, Callbacks,
CallbackManagerForRetrieverRun, AsyncCallbackManagerForRetrieverRun

Classes

retrievers.BaseRetriever

Abstract base class for a Document retrieval system.

langchain_core.runnables

LangChain Runnable and the LangChain Expression Language (LCEL).

The LangChain Expression Language (LCEL) offers a declarative method to build production-grade programs that harness the power of LLMs.

Programs created using LCEL and LangChain Runnables inherently support synchronous, asynchronous, batch, and streaming operations.

Support for async allows servers hosting LCEL based programs to scale better for higher concurrent loads.

Batch operations allow for processing multiple inputs in parallel.

Streaming of intermediate outputs, as they’re being generated, allows for creating more responsive UX.

This module contains the schema and implementation of the LangChain Runnable primitives.
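The essence of the Runnable protocol is a single `invoke` method plus composition with `|`, from which batching falls out naturally. A minimal sketch (`RunnableSketch` is illustrative; the real Runnable also supports streaming, async variants, and configs):

```python
from typing import Any, Callable, Iterable, List

class RunnableSketch:
    """Toy Runnable: invoke one input, batch many, compose with `|`."""
    def __init__(self, func: Callable[[Any], Any]):
        self.func = func

    def invoke(self, value: Any) -> Any:
        return self.func(value)

    def batch(self, values: Iterable[Any]) -> List[Any]:
        return [self.invoke(v) for v in values]

    def __or__(self, other: "RunnableSketch") -> "RunnableSketch":
        # Output of self feeds the input of other, like an LCEL sequence.
        return RunnableSketch(lambda v: other.invoke(self.invoke(v)))

chain = RunnableSketch(str.strip) | RunnableSketch(str.upper)
out = chain.invoke("  hello ")
outs = chain.batch([" a", "b "])
```

In LCEL, prompts, models, and output parsers all implement this protocol, which is why `prompt | model | parser` works.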

Classes

runnables.base.Runnable()

A unit of work that can be invoked, batched, streamed, transformed and composed.

runnables.base.RunnableBinding

Wrap a Runnable with additional functionality.

runnables.base.RunnableBindingBase

Runnable that delegates calls to another Runnable with a set of kwargs.

runnables.base.RunnableEach

Runnable that delegates calls to another Runnable with each element of the input sequence.

runnables.base.RunnableEachBase

Runnable that delegates calls to another Runnable with each element of the input sequence.

runnables.base.RunnableGenerator(transform)

Runnable that runs a generator function.

runnables.base.RunnableLambda(func[, afunc, ...])

RunnableLambda converts a python callable into a Runnable.

runnables.base.RunnableMap

alias of RunnableParallel

runnables.base.RunnableParallel

Runnable that runs a mapping of Runnables in parallel, and returns a mapping of their outputs.

runnables.base.RunnableSequence

Sequence of Runnables, where the output of each is the input of the next.

runnables.base.RunnableSerializable

Runnable that can be serialized to JSON.

runnables.branch.RunnableBranch

Runnable that selects which branch to run based on a condition.

runnables.config.ContextThreadPoolExecutor([...])

ThreadPoolExecutor that copies the context to the child thread.

runnables.config.EmptyDict

Empty dict type.

runnables.config.RunnableConfig

Configuration for a Runnable.

runnables.configurable.DynamicRunnable

Serializable Runnable that can be dynamically configured.

runnables.configurable.RunnableConfigurableAlternatives

Runnable that can be dynamically configured.

runnables.configurable.RunnableConfigurableFields

Runnable that can be dynamically configured.

runnables.configurable.StrEnum(value[, ...])

String enum.

runnables.fallbacks.RunnableWithFallbacks

Runnable that can fallback to other Runnables if it fails.

runnables.graph.Edge(source, target[, data])

Edge in a graph.

runnables.graph.Graph(nodes, ...)

Graph of nodes and edges.

runnables.graph.LabelsDict

Dictionary of label overrides for a graph's nodes and edges.

runnables.graph.Node(id, data)

Node in a graph.

runnables.graph_ascii.AsciiCanvas(cols, lines)

Class for drawing in ASCII.

runnables.graph_ascii.VertexViewer(name)

Class to define vertex box boundaries that will be accounted for during graph building by grandalf.

runnables.graph_png.PngDrawer([fontname, labels])

A helper class to draw a state graph into a PNG file. Requires graphviz and pygraphviz to be installed.

:param fontname: The font to use for the labels.
:param labels: A dictionary of label overrides in the format {"nodes": {"node1": "CustomLabel1", "node2": "CustomLabel2", "__end__": "End Node"}, "edges": {"continue": "ContinueLabel", "end": "EndLabel"}}. The keys are the original labels, and the values are the new labels.

Usage: drawer = PngDrawer(); drawer.draw(state_graph, 'graph.png')

runnables.history.RunnableWithMessageHistory

Runnable that manages chat message history for another Runnable.

runnables.passthrough.RunnableAssign

A runnable that assigns key-value pairs to Dict[str, Any] inputs.

runnables.passthrough.RunnablePassthrough

Runnable to passthrough inputs unchanged or with additional keys.

runnables.passthrough.RunnablePick

Runnable that picks keys from Dict[str, Any] inputs.

runnables.retry.RunnableRetry

Retry a Runnable if it fails.

runnables.router.RouterInput

Router input.

runnables.router.RouterRunnable

Runnable that routes to a set of Runnables based on Input['key'].

runnables.schema.EventData

Data associated with a streaming event.

runnables.schema.StreamEvent

Streaming event.

runnables.utils.AddableDict

Dictionary that can be added to another dictionary.

runnables.utils.ConfigurableField(id[, ...])

Field that can be configured by the user.

runnables.utils.ConfigurableFieldMultiOption(id, ...)

Field that can be configured by the user with multiple default values.

runnables.utils.ConfigurableFieldSingleOption(id, ...)

Field that can be configured by the user with a default value.

runnables.utils.ConfigurableFieldSpec(id, ...)

Field that can be configured by the user.

runnables.utils.FunctionNonLocals()

Get the nonlocal variables accessed of a function.

runnables.utils.GetLambdaSource()

Get the source code of a lambda function.

runnables.utils.IsFunctionArgDict()

Check if the first argument of a function is a dict.

runnables.utils.IsLocalDict(name, keys)

Check if a name is a local dict.

runnables.utils.NonLocals()

Get nonlocal variables accessed.

runnables.utils.SupportsAdd(*args, **kwargs)

Protocol for objects that support addition.

Functions

runnables.base.chain()

Decorate a function to make it a Runnable.

runnables.base.coerce_to_runnable(thing)

Coerce a runnable-like object into a Runnable.

runnables.config.acall_func_with_variable_args(...)

Call function that may optionally accept a run_manager and/or config.

runnables.config.call_func_with_variable_args(...)

Call function that may optionally accept a run_manager and/or config.

runnables.config.ensure_config([config])

Ensure that a config is a dict with all keys present.

runnables.config.get_async_callback_manager_for_config(config)

Get an async callback manager for a config.

runnables.config.get_callback_manager_for_config(config)

Get a callback manager for a config.

runnables.config.get_config_list(config, length)

Get a list of configs from a single config or a list of configs.

runnables.config.get_executor_for_config(config)

Get an executor for a config.

runnables.config.merge_configs(*configs)

Merge multiple configs into one.

runnables.config.patch_config(config, *[, ...])

Patch a config with new values.

runnables.config.run_in_executor(...)

Run a function in an executor.

runnables.configurable.make_options_spec(...)

Make a ConfigurableFieldSpec for a ConfigurableFieldSingleOption or ConfigurableFieldMultiOption.

runnables.configurable.prefix_config_spec(...)

Prefix the id of a ConfigurableFieldSpec.

runnables.graph.is_uuid(value)

Check whether a string is a valid UUID.

runnables.graph.node_data_json(node)

Convert node data to a JSON-serializable representation.

runnables.graph.node_data_str(node)

Convert node data to a string representation.

runnables.graph_ascii.draw_ascii(vertices, edges)

Build a DAG and draw it in ASCII.

runnables.passthrough.aidentity(x)

Async identity function.

runnables.passthrough.identity(x)

Identity function.

runnables.utils.aadd(addables)

Asynchronously add a sequence of addable objects together.

runnables.utils.accepts_config(callable)

Check if a callable accepts a config argument.

runnables.utils.accepts_context(callable)

Check if a callable accepts a context argument.

runnables.utils.accepts_run_manager(callable)

Check if a callable accepts a run_manager argument.

runnables.utils.add(addables)

Add a sequence of addable objects together.

runnables.utils.create_model(__model_name, ...)

Create a pydantic model with the given model name and field definitions.

runnables.utils.gated_coro(semaphore, coro)

Run a coroutine with a semaphore.

runnables.utils.gather_with_concurrency(n, ...)

Gather coroutines with a limit on the number of concurrent coroutines.

runnables.utils.get_function_first_arg_dict_keys(func)

Get the keys of the first argument of a function if it is a dict.

runnables.utils.get_function_nonlocals(func)

Get the nonlocal variables accessed by a function.

runnables.utils.get_lambda_source(func)

Get the source code of a lambda function.

runnables.utils.get_unique_config_specs(specs)

Get the unique config specs from a sequence of config specs.

runnables.utils.indent_lines_after_first(...)

Indent all lines of text after the first line.

langchain_core.stores

The stores module implements key-value stores and storage helpers.

It provides implementations of various key-value stores that conform to a simple key-value interface.

The primary goal of these stores is to support caching.
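The interface is batched by design: get, set, and delete operate on lists of keys. A sketch of that shape (the `mget`/`mset`/`mdelete` names mirror BaseStore, but this in-memory class is illustrative only):

```python
from typing import Dict, List, Optional, Sequence, Tuple

class InMemoryStoreSketch:
    """Toy key-value store with batched operations over string keys."""
    def __init__(self) -> None:
        self._data: Dict[str, bytes] = {}

    def mset(self, pairs: Sequence[Tuple[str, bytes]]) -> None:
        self._data.update(pairs)

    def mget(self, keys: Sequence[str]) -> List[Optional[bytes]]:
        # Missing keys yield None, preserving position in the result.
        return [self._data.get(k) for k in keys]

    def mdelete(self, keys: Sequence[str]) -> None:
        for k in keys:
            self._data.pop(k, None)

store = InMemoryStoreSketch()
store.mset([("a", b"1"), ("b", b"2")])
values = store.mget(["a", "missing", "b"])
```

Batching the interface lets backends like Redis issue one round trip for many keys.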

Classes

stores.BaseStore()

Abstract interface for a key-value store.

langchain_core.sys_info

sys_info prints information about the system and langchain packages for debugging purposes.

Functions

sys_info.print_sys_info(*[, additional_pkgs])

Print information about the environment for debugging purposes.

langchain_core.tools

Tools are classes that an Agent uses to interact with the world.

Each tool has a description. The agent uses the description to choose the right tool for the job.
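Description-driven tool choice can be illustrated with a toy scorer; in reality the LLM reads the descriptions, but a word-overlap heuristic shows the mechanics (`ToolSketch` and `pick_tool` are hypothetical, not the real tools.Tool API):

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ToolSketch:
    """Stand-in for a tool: a name, a description, and the function to run."""
    name: str
    description: str
    func: Callable[[str], str]

def pick_tool(tools: List[ToolSketch], task: str) -> ToolSketch:
    """Toy stand-in for the agent's reasoning: pick the tool whose
    description shares the most words with the task."""
    words = set(task.lower().split())
    return max(tools, key=lambda t: len(words & set(t.description.lower().split())))

tools = [
    ToolSketch("search", "look up facts on the web", lambda q: f"results for {q}"),
    ToolSketch("calculator", "evaluate arithmetic expressions", lambda q: str(eval(q))),
]
chosen = pick_tool(tools, "evaluate the expressions 2*3")
```

This is why tool descriptions matter so much in practice: they are the only signal the agent has for routing.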

Class hierarchy:

RunnableSerializable --> BaseTool --> <name>Tool  # Examples: AIPluginTool, BaseGraphQLTool
                                      <name>      # Examples: BraveSearch, HumanInputRun

Main helpers:

CallbackManagerForToolRun, AsyncCallbackManagerForToolRun

Classes

tools.BaseTool

Interface LangChain tools must implement.

tools.SchemaAnnotationError

Raised when 'args_schema' is missing or has an incorrect type annotation.

tools.StructuredTool

Tool that can operate on any number of inputs.

tools.Tool

Tool that takes in function or coroutine directly.

tools.ToolException

Optional exception that a tool throws when an execution error occurs.

Functions

tools.create_schema_from_function(...)

Create a pydantic schema from a function's signature.

tools.tool(*args[, return_direct, ...])

Make tools out of functions; can be used with or without arguments.

langchain_core.tracers

Tracers are classes for tracing runs.

Class hierarchy:

BaseCallbackHandler --> BaseTracer --> <name>Tracer  # Examples: LangChainTracer, RootListenersTracer
                                   --> <name>  # Examples: LogStreamCallbackHandler
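A tracer is a callback handler that turns start/end events into a record of runs. A minimal sketch of that idea (hypothetical method names; real tracers key runs by run_id and nest child runs):

```python
class RunCollectorSketch:
    """Toy tracer: record each run's name and outcome as callbacks fire."""
    def __init__(self) -> None:
        self.runs = []

    def on_chain_start(self, name: str) -> None:
        self.runs.append({"name": name, "status": "running"})

    def on_chain_end(self, name: str, output) -> None:
        for run in self.runs:
            if run["name"] == name:
                run.update(status="done", output=output)

tracer = RunCollectorSketch()
tracer.on_chain_start("my_chain")
tracer.on_chain_end("my_chain", 42)
```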

Classes

tracers.base.BaseTracer(*[, _schema_format])

Base interface for tracers.

tracers.evaluation.EvaluatorCallbackHandler(...)

Tracer that runs a run evaluator whenever a run is persisted.

tracers.langchain.LangChainTracer([...])

Implementation of the SharedTracer that POSTs to the LangChain endpoint.

tracers.langchain_v1.LangChainTracerV1(**kwargs)

[Deprecated] Implementation of the SharedTracer that POSTs to the LangChain endpoint.

tracers.log_stream.LogEntry

A single entry in the run log.

tracers.log_stream.LogStreamCallbackHandler(*)

Tracer that streams run logs to a stream.

tracers.log_stream.RunLog(*ops, state)

Run log.

tracers.log_stream.RunLogPatch(*ops)

Patch to the run log.

tracers.log_stream.RunState

State of the run.

tracers.root_listeners.RootListenersTracer(*, ...)

Tracer that calls listeners on run start, end, and error.

tracers.run_collector.RunCollectorCallbackHandler([...])

Tracer that collects all nested runs in a list.

tracers.schemas.BaseRun

[Deprecated] Base class for Run.

tracers.schemas.ChainRun

[Deprecated] Class for ChainRun.

tracers.schemas.LLMRun

[Deprecated] Class for LLMRun.

tracers.schemas.Run

Run schema for the V2 API in the Tracer.

tracers.schemas.ToolRun

[Deprecated] Class for ToolRun.

tracers.schemas.TracerSession

[Deprecated] TracerSessionV1 schema for the V2 API.

tracers.schemas.TracerSessionBase

[Deprecated] Base class for TracerSession.

tracers.schemas.TracerSessionV1

[Deprecated] TracerSessionV1 schema.

tracers.schemas.TracerSessionV1Base

[Deprecated] Base class for TracerSessionV1.

tracers.schemas.TracerSessionV1Create

[Deprecated] Create class for TracerSessionV1.

tracers.stdout.ConsoleCallbackHandler(**kwargs)

Tracer that prints to the console.

tracers.stdout.FunctionCallbackHandler(...)

Tracer that calls a function with a single str parameter.

Functions

tracers.context.collect_runs()

Collect all run traces in context.

tracers.context.register_configure_hook(...)

Register a configure hook.

tracers.context.tracing_enabled([session_name])

[Deprecated] Get the deprecated LangChainTracer in a context manager.

tracers.context.tracing_v2_enabled([...])

Instruct LangChain to log all runs in context to LangSmith.

tracers.evaluation.wait_for_all_evaluators()

Wait for all tracers to finish.

tracers.langchain.get_client()

Get the client.

tracers.langchain.log_error_once(method, ...)

Log an error once.

tracers.langchain.wait_for_all_tracers()

Wait for all tracers to finish.

tracers.langchain_v1.get_headers()

Get the headers for the LangChain API.

tracers.schemas.RunTypeEnum()

[Deprecated] RunTypeEnum.

tracers.stdout.elapsed(run)

Get the elapsed time of a run.

tracers.stdout.try_json_stringify(obj, fallback)

Try to stringify an object to JSON.

langchain_core.utils

Utility functions for LangChain.

These functions do not depend on any other LangChain module.

Classes

utils.aiter.NoLock()

Dummy lock that provides the proper interface but no protection.

utils.aiter.Tee(iterable[, n, lock])

Create n separate asynchronous iterators over iterable.

utils.aiter.atee

alias of Tee

utils.formatting.StrictFormatter()

Formatter that checks for extra keys.

utils.function_calling.FunctionDescription

Representation of a callable function to send to an LLM.

utils.function_calling.ToolDescription

Representation of a callable function to the OpenAI API.

utils.iter.NoLock()

Dummy lock that provides the proper interface but no protection.

utils.iter.Tee(iterable[, n, lock])

Create n separate synchronous iterators over iterable.

utils.iter.safetee

alias of Tee
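Both Tee classes mirror the standard library's itertools.tee idea: n independent iterators over one underlying iterable, each yielding the full sequence. A quick stdlib illustration:

```python
from itertools import tee

# Two independent iterators over one source: each sees every item.
a, b = tee(iter([1, 2, 3]), 2)
left, right = list(a), list(b)
print(left, right)  # [1, 2, 3] [1, 2, 3]
```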

Functions

utils.aiter.py_anext(iterator[, default])

Pure-Python implementation of anext() for testing purposes.

utils.aiter.tee_peer(iterator, buffer, ...)

An individual iterator of a tee().

utils.env.env_var_is_set(env_var)

Check if an environment variable is set.

utils.env.get_from_dict_or_env(data, key, ...)

Get a value from a dictionary or an environment variable.
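The lookup order get_from_dict_or_env implements can be sketched as follows (a hypothetical stand-in, not the library's actual code; the environment variable name below is made up for the demo):

```python
import os

# Hedged sketch: prefer an explicit value from the dict, fall back
# to the environment, then to a default, else raise.
def from_dict_or_env(data: dict, key: str, env_key: str, default=None):
    if data.get(key):
        return data[key]
    if os.environ.get(env_key):
        return os.environ[env_key]
    if default is not None:
        return default
    raise ValueError(f"Did not find {key!r}; set the {env_key!r} environment variable.")

os.environ["DEMO_API_KEY"] = "from-env"
print(from_dict_or_env({}, "api_key", "DEMO_API_KEY"))                     # from-env
print(from_dict_or_env({"api_key": "explicit"}, "api_key", "DEMO_API_KEY"))  # explicit
```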

utils.env.get_from_env(key, env_key[, default])

Get a value from an environment variable, with an optional default.

utils.function_calling.convert_pydantic_to_openai_function(...)

[Deprecated] Convert a Pydantic model to a function description for the OpenAI API.

utils.function_calling.convert_pydantic_to_openai_tool(...)

[Deprecated] Convert a Pydantic model to a tool description for the OpenAI API.

utils.function_calling.convert_python_function_to_openai_function(...)

[Deprecated] Convert a Python function to an OpenAI function-calling API compatible dict.

utils.function_calling.convert_to_openai_function(...)

Convert a raw function/class to an OpenAI function.

utils.function_calling.convert_to_openai_tool(tool)

Convert a raw function/class to an OpenAI tool.
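The shape of the dict these converters produce can be sketched from a plain Python function using only the standard library (a hypothetical stand-in for convert_to_openai_function, not its actual implementation):

```python
import inspect

# Hedged sketch: build an OpenAI function-calling style dict from a
# Python function's signature and docstring.
TYPE_MAP = {int: "integer", str: "string", float: "number", bool: "boolean"}

def to_openai_function(func):
    sig = inspect.signature(func)
    props = {
        name: {"type": TYPE_MAP.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": props,
            "required": [n for n, p in sig.parameters.items()
                         if p.default is inspect.Parameter.empty],
        },
    }

def get_weather(city: str, days: int = 1) -> str:
    """Get a weather forecast for a city."""
    return f"{city}: sunny for {days} day(s)"

spec = to_openai_function(get_weather)
print(spec["name"], spec["parameters"]["required"])  # get_weather ['city']
```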

utils.function_calling.format_tool_to_openai_function(tool)

[Deprecated] Format tool into the OpenAI function API.

utils.function_calling.format_tool_to_openai_tool(tool)

[Deprecated] Format tool into the OpenAI tool API.

utils.html.extract_sub_links(raw_html, url, *)

Extract all links from a raw HTML string and convert them into absolute paths.

utils.html.find_all_links(raw_html, *[, pattern])

Extract all links from a raw HTML string.
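The two HTML helpers combine link extraction with URL resolution. A stdlib sketch of the idea (the regex and function below are illustrative, not the library's actual pattern):

```python
import re
from urllib.parse import urljoin

# Hedged sketch: find href targets in raw HTML, then resolve them
# against a base URL to get absolute links.
def find_links(raw_html: str) -> list:
    return re.findall(r'href=["\'](.*?)["\']', raw_html)

html = '<a href="/docs">Docs</a> <a href="https://example.com/x">X</a>'
links = [urljoin("https://example.com", link) for link in find_links(html)]
print(links)  # ['https://example.com/docs', 'https://example.com/x']
```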

utils.image.encode_image(image_path)

Get a base64-encoded string from an image path.

utils.image.image_to_data_url(image_path)

Convert an image at the given path to a base64-encoded data URL.

utils.input.get_bolded_text(text)

Get bolded text.

utils.input.get_color_mapping(items[, ...])

Get a mapping from items to a supported color.

utils.input.get_colored_text(text, color)

Get colored text.

utils.input.print_text(text[, color, end, file])

Print text with highlighting and no end characters.

utils.interactive_env.is_interactive_env()

Determine if running within IPython or Jupyter.

utils.iter.batch_iterate(size, iterable)

Utility batching function.
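The batching utility yields successive fixed-size chunks, with a shorter final chunk if the input does not divide evenly. A hypothetical stand-in implementation:

```python
# Hedged sketch of the batch_iterate idea: yield lists of up to
# `size` items from any iterable.
def batch_iterate(size, iterable):
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

print(list(batch_iterate(2, [1, 2, 3, 4, 5])))  # [[1, 2], [3, 4], [5]]
```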

utils.iter.tee_peer(iterator, buffer, peers, ...)

An individual iterator of a tee().

utils.json_schema.dereference_refs(schema_obj, *)

Try to substitute $refs in JSON Schema.
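$ref substitution walks the schema and replaces each reference node with the definition it points to. A hypothetical sketch for acyclic schemas (the real helper handles more cases):

```python
# Hedged sketch: replace {"$ref": "#/..."} nodes with the definition
# they point to, recursively. Assumes no circular references.
def deref(node, root):
    if isinstance(node, dict):
        if "$ref" in node:
            target = root
            for part in node["$ref"].lstrip("#/").split("/"):
                target = target[part]
            return deref(target, root)
        return {k: deref(v, root) for k, v in node.items()}
    if isinstance(node, list):
        return [deref(v, root) for v in node]
    return node

schema = {
    "$defs": {"Name": {"type": "string"}},
    "properties": {"name": {"$ref": "#/$defs/Name"}},
}
print(deref(schema, schema)["properties"]["name"])  # {'type': 'string'}
```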

utils.loading.try_load_from_hub(path, ...)

[Deprecated] Load configuration from hub.

utils.pydantic.get_pydantic_major_version()

Get the major version of Pydantic.

utils.strings.comma_list(items)

Convert a list to a comma-separated string.

utils.strings.stringify_dict(data)

Stringify a dictionary.

utils.strings.stringify_value(val)

Stringify a value.

utils.utils.build_extra_kwargs(extra_kwargs, ...)

Build extra kwargs from values and extra_kwargs.

utils.utils.check_package_version(package[, ...])

Check the version of a package.

utils.utils.convert_to_secret_str(value)

Convert a string to a SecretStr if needed.

utils.utils.get_pydantic_field_names(...)

Get field names, including aliases, for a pydantic class.

utils.utils.guard_import(module_name, *[, ...])

Dynamically import a module and raise a helpful exception if the module is not installed.

utils.utils.mock_now(dt_value)

Context manager for mocking out datetime.now() in unit tests.

utils.utils.raise_for_status_with_text(response)

Raise an error with the response text.

utils.utils.xor_args(*arg_groups)

Validate specified keyword args are mutually exclusive.
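The mutual-exclusion check can be sketched as a decorator that verifies exactly one argument from each group is provided (a hypothetical stand-in, not the library's actual implementation):

```python
import functools

# Hedged sketch of the xor_args idea: for each declared group of
# keyword arguments, exactly one must be set.
def xor_args(*arg_groups):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(**kwargs):
            for group in arg_groups:
                provided = [a for a in group if kwargs.get(a) is not None]
                if len(provided) != 1:
                    raise ValueError(
                        f"Exactly one of {group} must be set, got {provided}"
                    )
            return func(**kwargs)
        return wrapper
    return decorator

@xor_args(("path", "url"))
def load(path=None, url=None):
    return path or url

print(load(path="data.txt"))  # data.txt
```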

langchain_core.vectorstores

Vector store stores embedded data and performs vector search.

One of the most common ways to store and search over unstructured data is to embed it and store the resulting embedding vectors, and then query the store and retrieve the data that are ‘most similar’ to the embedded query.

Class hierarchy:

VectorStore --> <name>  # Examples: Annoy, FAISS, Milvus

BaseRetriever --> VectorStoreRetriever --> <name>Retriever  # Example: VespaRetriever

Main helpers:

Embeddings, Document

Classes

vectorstores.VectorStore()

Interface for vector store.

vectorstores.VectorStoreRetriever

Base Retriever class for VectorStore.
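The embed-and-search idea described above can be sketched end to end with toy two-dimensional vectors (an illustrative stand-in; real vector stores use learned embeddings and optimized indexes such as Annoy or FAISS):

```python
import math

# Illustrative sketch: store documents with their vectors, then return
# the documents most similar (by cosine similarity) to a query vector.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

store = {
    "cats are mammals": [1.0, 0.1],
    "stocks fell today": [0.0, 1.0],
}

def similarity_search(query_vec, k=1):
    ranked = sorted(store.items(), key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(similarity_search([0.9, 0.2]))  # ['cats are mammals']
```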