langchain.memory.zep_memory.ZepMemory¶
- class langchain.memory.zep_memory.ZepMemory[source]¶
Bases: ConversationBufferMemory
Persist your chain history to the Zep MemoryStore.
Both the number of messages returned by Zep and the point at which the Zep server summarizes chat histories are configurable. See the Zep documentation for more details.
Documentation: https://docs.getzep.com
Example
memory = ZepMemory(
    session_id=session_id,      # Identifies your user or a user's session
    url=ZEP_API_URL,            # Your Zep server's URL
    api_key=<your_api_key>,     # Optional
    memory_key="history",       # Ensure this matches the key used in
                                # your chain's prompt template
    return_messages=True,       # Does your prompt template expect a string
                                # or a list of Messages?
)
chain = LLMChain(memory=memory, ...)  # Configure your chain to use the ZepMemory instance
Note
To persist metadata alongside your chat history, you will need to create a custom Chain class that overrides the prep_outputs method to include the metadata in the call to self.memory.save_context.
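The override described in the Note can be sketched as follows. This is an illustrative stand-alone sketch, not the real langchain classes: RecordingMemory stands in for a memory object exposing the Zep-style save_context(inputs, outputs, metadata) signature, and MetadataChain stands in for a custom Chain subclass.

```python
from typing import Any, Dict, Optional

# Stand-in for a memory with a Zep-style save_context that accepts metadata
# (illustrative; not the real ZepMemory class).
class RecordingMemory:
    def __init__(self) -> None:
        self.turns = []

    def save_context(
        self,
        inputs: Dict[str, Any],
        outputs: Dict[str, str],
        metadata: Optional[Dict[str, Any]] = None,
    ) -> None:
        self.turns.append((inputs, outputs, metadata))

# Sketch of the override from the Note: a chain whose prep_outputs forwards
# per-turn metadata into memory.save_context instead of dropping it.
class MetadataChain:
    def __init__(self, memory: RecordingMemory, metadata: Optional[Dict[str, Any]] = None) -> None:
        self.memory = memory
        self.metadata = metadata or {}

    def prep_outputs(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> Dict[str, Any]:
        # In a real Chain subclass, the base prep_outputs calls
        # self.memory.save_context(inputs, outputs); here we add metadata.
        self.memory.save_context(inputs, outputs, metadata=self.metadata)
        return {**inputs, **outputs}
```

In a real override you would subclass your Chain type and call self.memory.save_context with the metadata relevant to that turn.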
Zep - Fast, scalable building blocks for LLM Apps¶
Zep is an open source platform for productionizing LLM apps. Go from a prototype built in LangChain or LlamaIndex, or a custom app, to production in minutes without rewriting code.
For server installation instructions and more, see: https://docs.getzep.com/deployment/quickstart/
For more information on the zep-python package, see: https://github.com/getzep/zep-python
Initialize ZepMemory.
- param session_id
Identifies your user or a user’s session
- type session_id
str
- param url
Your Zep server’s URL. Defaults to “http://localhost:8000”.
- type url
str, optional
- param api_key
Your Zep API key. Defaults to None.
- type api_key
Optional[str], optional
- param output_key
The key to use for the output message. Defaults to None.
- type output_key
Optional[str], optional
- param input_key
The key to use for the input message. Defaults to None.
- type input_key
Optional[str], optional
- param return_messages
Does your prompt template expect a string or a list of Messages? Defaults to False, i.e. return a string.
- type return_messages
bool, optional
- param human_prefix
The prefix to use for human messages. Defaults to “Human”.
- type human_prefix
str, optional
- param ai_prefix
The prefix to use for AI messages. Defaults to “AI”.
- type ai_prefix
str, optional
- param memory_key
The key to use for the memory. Defaults to “history”. Ensure that this matches the key used in your chain’s prompt template.
- type memory_key
str, optional
- param ai_prefix: str = 'AI'¶
- param chat_memory: ZepChatMessageHistory [Required]¶
- param human_prefix: str = 'Human'¶
- param input_key: Optional[str] = None¶
- param output_key: Optional[str] = None¶
- param return_messages: bool = False¶
- async abuffer() Any ¶
String buffer of memory.
- Return type
Any
- async abuffer_as_messages() List[BaseMessage] ¶
Exposes the buffer as a list of messages, even if return_messages is False.
- Return type
List[BaseMessage]
- async abuffer_as_str() str ¶
Exposes the buffer as a string, even if return_messages is True.
- Return type
str
- async aclear() None ¶
Clear memory contents.
- Return type
None
- async aload_memory_variables(inputs: Dict[str, Any]) Dict[str, Any] ¶
Return key-value pairs given the text input to the chain.
- Parameters
inputs (Dict[str, Any]) –
- Return type
Dict[str, Any]
- async asave_context(inputs: Dict[str, Any], outputs: Dict[str, str]) None ¶
Save context from this conversation to buffer.
- Parameters
inputs (Dict[str, Any]) –
outputs (Dict[str, str]) –
- Return type
None
- clear() None ¶
Clear memory contents.
- Return type
None
- classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) Model ¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
- Parameters
_fields_set (Optional[SetStr]) –
values (Any) –
- Return type
Model
- copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) Model ¶
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in new model
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include
update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data
deep (bool) – set to True to make a deep copy of the model
self (Model) –
- Returns
new model instance
- Return type
Model
- dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny ¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
- Return type
DictStrAny
- classmethod from_orm(obj: Any) Model ¶
- Parameters
obj (Any) –
- Return type
Model
- classmethod get_lc_namespace() List[str] ¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the namespace is [“langchain”, “llms”, “openai”]
- Return type
List[str]
- classmethod is_lc_serializable() bool ¶
Is this class serializable?
- Return type
bool
- json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode ¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
encoder (Optional[Callable[[Any], Any]]) –
models_as_dict (bool) –
dumps_kwargs (Any) –
- Return type
unicode
- classmethod lc_id() List[str] ¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path to the object.
- Return type
List[str]
- load_memory_variables(inputs: Dict[str, Any]) Dict[str, Any] ¶
Return history buffer.
- Parameters
inputs (Dict[str, Any]) –
- Return type
Dict[str, Any]
- classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- Parameters
path (Union[str, Path]) –
content_type (unicode) –
encoding (unicode) –
proto (Protocol) –
allow_pickle (bool) –
- Return type
Model
- classmethod parse_obj(obj: Any) Model ¶
- Parameters
obj (Any) –
- Return type
Model
- classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- Parameters
b (Union[str, bytes]) –
content_type (unicode) –
encoding (unicode) –
proto (Protocol) –
allow_pickle (bool) –
- Return type
Model
- save_context(inputs: Dict[str, Any], outputs: Dict[str, str], metadata: Optional[Dict[str, Any]] = None) None [source]¶
Save context from this conversation to buffer.
- Parameters
inputs (Dict[str, Any]) – The inputs to the chain.
outputs (Dict[str, str]) – The outputs from the chain.
metadata (Optional[Dict[str, Any]], optional) – Any metadata to save with the context. Defaults to None
- Returns
None
- Return type
None
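The metadata parameter accepts an arbitrary dict that is persisted alongside the turn. A minimal sketch of the call shape, using a stand-in object rather than a live Zep server (the metadata keys are illustrative):

```python
from typing import Any, Dict, Optional

# Stand-in capturing the call shape of ZepMemory.save_context (illustrative;
# a real ZepMemory forwards the turn to the Zep server instead of a list).
class FakeMemory:
    def __init__(self) -> None:
        self.saved: list = []

    def save_context(
        self,
        inputs: Dict[str, Any],
        outputs: Dict[str, str],
        metadata: Optional[Dict[str, Any]] = None,
    ) -> None:
        self.saved.append({"inputs": inputs, "outputs": outputs, "metadata": metadata})

memory = FakeMemory()
memory.save_context(
    {"input": "Hi, I'd like to book a flight."},       # chain inputs
    {"output": "Sure, where would you like to go?"},   # chain outputs
    metadata={"user_id": "1234"},                      # hypothetical metadata keys
)
```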
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny ¶
- Parameters
by_alias (bool) –
ref_template (unicode) –
- Return type
DictStrAny
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode ¶
- Parameters
by_alias (bool) –
ref_template (unicode) –
dumps_kwargs (Any) –
- Return type
unicode
- to_json() Union[SerializedConstructor, SerializedNotImplemented] ¶
- Return type
Union[SerializedConstructor, SerializedNotImplemented]
- to_json_not_implemented() SerializedNotImplemented ¶
- Return type
SerializedNotImplemented
- classmethod update_forward_refs(**localns: Any) None ¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
- Parameters
localns (Any) –
- Return type
None
- classmethod validate(value: Any) Model ¶
- Parameters
value (Any) –
- Return type
Model
- property buffer: Any¶
String buffer of memory.
- property buffer_as_messages: List[BaseMessage]¶
Exposes the buffer as a list of messages, even if return_messages is False.
- property buffer_as_str: str¶
Exposes the buffer as a string, even if return_messages is True.
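How return_messages, human_prefix, and ai_prefix shape the buffer can be sketched as follows. This is an illustrative stand-in, not the real ConversationBufferMemory implementation: Msg stands in for BaseMessage.

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Msg:
    role: str      # "human" or "ai"
    content: str

def buffer(
    messages: List[Msg],
    return_messages: bool,
    human_prefix: str = "Human",
    ai_prefix: str = "AI",
) -> Union[str, List[Msg]]:
    if return_messages:
        # return_messages=True: the prompt template receives Message objects.
        return messages
    # return_messages=False: turns are flattened into a prefixed transcript string.
    prefix = {"human": human_prefix, "ai": ai_prefix}
    return "\n".join(f"{prefix[m.role]}: {m.content}" for m in messages)
```

Set return_messages to match what your prompt template expects: True for chat-style templates that take a list of Messages, False for templates with a single string history slot.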
- property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
- property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
- For example,
{"openai_api_key": "OPENAI_API_KEY"}