langchain_community.cache.SQLAlchemyCache¶
- class langchain_community.cache.SQLAlchemyCache(engine: ~sqlalchemy.engine.base.Engine, cache_schema: ~typing.Type[~langchain_community.cache.FullLLMCache] = <class 'langchain_community.cache.FullLLMCache'>)[source]¶
Cache that uses SQLAlchemy as a backend.
Initialize by creating all tables.
Methods
__init__(engine[, cache_schema]): Initialize by creating all tables.
aclear(**kwargs): Async clear the cache; accepts additional keyword arguments.
alookup(prompt, llm_string): Async look up based on prompt and llm_string.
aupdate(prompt, llm_string, return_val): Async update the cache based on prompt and llm_string.
clear(**kwargs): Clear the cache; accepts additional keyword arguments.
lookup(prompt, llm_string): Look up based on prompt and llm_string.
update(prompt, llm_string, return_val): Update the cache based on prompt and llm_string.
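The lookup/update contract above can be illustrated with a minimal in-memory stand-in (a hypothetical sketch of the cache interface semantics, not the SQLAlchemy-backed implementation; generations are stubbed as plain strings):

```python
from typing import Dict, Optional, Sequence, Tuple

class InMemoryCacheSketch:
    """Hypothetical stand-in: keys combine the prompt with a string
    serialization of the LLM configuration (llm_string)."""

    def __init__(self) -> None:
        self._store: Dict[Tuple[str, str], Sequence[str]] = {}

    def lookup(self, prompt: str, llm_string: str) -> Optional[Sequence[str]]:
        # Return cached generations, or None on a cache miss.
        return self._store.get((prompt, llm_string))

    def update(self, prompt: str, llm_string: str,
               return_val: Sequence[str]) -> None:
        # Store generations under the (prompt, llm_string) key.
        self._store[(prompt, llm_string)] = return_val

    def clear(self, **kwargs) -> None:
        self._store.clear()

cache = InMemoryCacheSketch()
assert cache.lookup("hi", "model-a") is None        # miss before update
cache.update("hi", "model-a", ["hello!"])
assert cache.lookup("hi", "model-a") == ["hello!"]  # hit after update
```

Keying on both the prompt and `llm_string` ensures that the same prompt sent to differently configured models does not collide in the cache.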
- Parameters
engine (Engine) –
cache_schema (Type[FullLLMCache]) –
- __init__(engine: ~sqlalchemy.engine.base.Engine, cache_schema: ~typing.Type[~langchain_community.cache.FullLLMCache] = <class 'langchain_community.cache.FullLLMCache'>)[source]¶
Initialize by creating all tables.
- Parameters
engine (Engine) –
cache_schema (Type[FullLLMCache]) –
- async aclear(**kwargs: Any) → None¶
Clear the cache; accepts additional keyword arguments.
- Parameters
kwargs (Any) –
- Return type
None
- async alookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]]¶
Look up based on prompt and llm_string.
- Parameters
prompt (str) –
llm_string (str) –
- Return type
Optional[Sequence[Generation]]
- async aupdate(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None¶
Update cache based on prompt and llm_string.
- Parameters
prompt (str) –
llm_string (str) –
return_val (Sequence[Generation]) –
- Return type
None
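The async variants follow the same contract as their sync counterparts. A minimal sketch of how they may delegate to the same store (an illustration of the interface, not the library's actual implementation):

```python
import asyncio
from typing import Dict, Optional, Sequence, Tuple

class AsyncCacheSketch:
    """Hypothetical async stand-in mirroring aclear/alookup/aupdate."""

    def __init__(self) -> None:
        self._store: Dict[Tuple[str, str], Sequence[str]] = {}

    async def alookup(self, prompt: str,
                      llm_string: str) -> Optional[Sequence[str]]:
        # Async lookup: returns cached generations or None on a miss.
        return self._store.get((prompt, llm_string))

    async def aupdate(self, prompt: str, llm_string: str,
                      return_val: Sequence[str]) -> None:
        # Async update: stores generations under (prompt, llm_string).
        self._store[(prompt, llm_string)] = return_val

    async def aclear(self, **kwargs) -> None:
        self._store.clear()

async def main() -> None:
    cache = AsyncCacheSketch()
    await cache.aupdate("hi", "model-a", ["hello!"])
    assert await cache.alookup("hi", "model-a") == ["hello!"]
    await cache.aclear()
    assert await cache.alookup("hi", "model-a") is None

asyncio.run(main())
```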
- lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶
Look up based on prompt and llm_string.
- Parameters
prompt (str) –
llm_string (str) –
- Return type
Optional[Sequence[Generation]]
- update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶
Update the cache based on prompt and llm_string.
- Parameters
prompt (str) –
llm_string (str) –
return_val (Sequence[Generation]) –
- Return type
None