langchain_core.caches.BaseCache
- class langchain_core.caches.BaseCache[source]
Base interface for a cache. Implementations store sequences of Generation results keyed by a prompt and an llm_string.
Methods
- __init__()
- aclear(**kwargs): Clear cache that can take additional keyword arguments.
- alookup(prompt, llm_string): Look up based on prompt and llm_string.
- aupdate(prompt, llm_string, return_val): Update cache based on prompt and llm_string.
- clear(**kwargs): Clear cache that can take additional keyword arguments.
- lookup(prompt, llm_string): Look up based on prompt and llm_string.
- update(prompt, llm_string, return_val): Update cache based on prompt and llm_string.
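Only the three abstract methods (clear, lookup, update) must be implemented by a subclass. Below is a minimal sketch of a dictionary-backed implementation; the class name SimpleDictCache and its internals are illustrative, not part of langchain_core.

from typing import Any, Dict, Optional, Sequence, Tuple

from langchain_core.caches import BaseCache
from langchain_core.outputs import Generation


class SimpleDictCache(BaseCache):
    """Illustrative in-memory cache; not part of langchain_core."""

    def __init__(self) -> None:
        # Entries are keyed by (prompt, llm_string) so the same prompt sent
        # to differently configured models does not collide.
        self._store: Dict[Tuple[str, str], Sequence[Generation]] = {}

    def lookup(self, prompt: str, llm_string: str) -> Optional[Sequence[Generation]]:
        # None signals a cache miss.
        return self._store.get((prompt, llm_string))

    def update(self, prompt: str, llm_string: str, return_val: Sequence[Generation]) -> None:
        self._store[(prompt, llm_string)] = return_val

    def clear(self, **kwargs: Any) -> None:
        # Keyword arguments are accepted for interface compatibility and ignored here.
        self._store.clear()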
- __init__()
- async aclear(**kwargs: Any) → None [source]
Clear cache that can take additional keyword arguments.
- Parameters
kwargs (Any) – additional keyword arguments whose meaning is left to the specific implementation.
- Return type
None
- async alookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]] [source]
Look up based on prompt and llm_string. Returns the cached generations, or None on a cache miss.
- Parameters
prompt (str) – the prompt text, used as part of the cache key.
llm_string (str) – a string representation of the LLM configuration, used as the other part of the cache key.
- Return type
Optional[Sequence[Generation]]
- async aupdate(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None [source]
Update cache based on prompt and llm_string.
- Parameters
prompt (str) – the prompt text, used as part of the cache key.
llm_string (str) – a string representation of the LLM configuration, used as the other part of the cache key.
return_val (Sequence[Generation]) – the generations to store for this key.
- Return type
None
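The async variants mirror the synchronous API one-to-one and are not abstract, so a subclass that only implements the sync methods still gets await-able behaviour. A short usage sketch, assuming the illustrative SimpleDictCache defined above:

import asyncio

from langchain_core.outputs import Generation


async def demo() -> None:
    cache = SimpleDictCache()  # illustrative class from the sketch above
    await cache.aupdate("2 + 2 = ?", "fake-llm-config", [Generation(text="4")])
    hit = await cache.alookup("2 + 2 = ?", "fake-llm-config")
    print(hit[0].text)  # "4"
    await cache.aclear()
    print(await cache.alookup("2 + 2 = ?", "fake-llm-config"))  # None


asyncio.run(demo())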
- abstract clear(**kwargs: Any) → None [source]
Clear cache that can take additional keyword arguments.
- Parameters
kwargs (Any) – additional keyword arguments whose meaning is left to the specific implementation.
- Return type
None
- abstract lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]] [source]
Look up based on prompt and llm_string. Returns the cached generations, or None on a cache miss.
- Parameters
prompt (str) – the prompt text, used as part of the cache key.
llm_string (str) – a string representation of the LLM configuration, used as the other part of the cache key.
- Return type
Optional[Sequence[Generation]]
- abstract update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None [source]
Update cache based on prompt and llm_string.
- Parameters
prompt (str) – the prompt text, used as part of the cache key.
llm_string (str) – a string representation of the LLM configuration, used as the other part of the cache key.
return_val (Sequence[Generation]) – the generations to store for this key.
- Return type
None
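Taken together, update() writes generations under the (prompt, llm_string) pair and lookup() only hits when both parts of the key match. A small sketch, again using the illustrative SimpleDictCache from above; set_llm_cache is the langchain_core.globals helper that installs a cache process-wide so model classes consult it automatically:

from langchain_core.globals import set_llm_cache
from langchain_core.outputs import Generation

cache = SimpleDictCache()  # illustrative class from the first sketch

# Stored under the (prompt, llm_string) pair...
cache.update("hello", "llm-config-a", [Generation(text="hi there")])

# ...so only an exact match on both parts is a hit.
assert cache.lookup("hello", "llm-config-a")[0].text == "hi there"
assert cache.lookup("hello", "llm-config-b") is None  # different LLM config: miss

cache.clear()
assert cache.lookup("hello", "llm-config-a") is None

# Install the cache globally so LLM and chat model calls use it transparently.
set_llm_cache(cache)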