langchain_community.cache.RedisSemanticCache¶

class langchain_community.cache.RedisSemanticCache(redis_url: str, embedding: Embeddings, score_threshold: float = 0.2)[source]¶

Cache that uses Redis as a vector-store backend.

Initialize the cache with a Redis connection URL, an embedding provider, and a similarity score threshold.

Parameters
  • redis_url (str) – URL to connect to Redis.

  • embedding (Embeddings) – Embedding provider for semantic encoding and search.

  • score_threshold (float) – Similarity score threshold for treating a cached prompt as a match. Defaults to 0.2.

Example:

from langchain.globals import set_llm_cache

from langchain_community.cache import RedisSemanticCache
from langchain_community.embeddings import OpenAIEmbeddings

set_llm_cache(RedisSemanticCache(
    redis_url="redis://localhost:6379",
    embedding=OpenAIEmbeddings()
))
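
Once the cache is registered with set_llm_cache, semantically similar prompts sent through an LLM in the same process are answered from Redis instead of re-calling the model. A minimal sketch of that behavior, assuming an OpenAI API key is configured and that the two prompts embed closely enough to fall within score_threshold:

from langchain_community.llms import OpenAI

llm = OpenAI()

# First call misses the cache; the generation is stored in Redis.
llm.invoke("Tell me a joke about penguins")

# A semantically similar prompt can now be served from the cache.
llm.invoke("Tell me a penguin joke")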

Attributes

DEFAULT_SCHEMA

Methods

__init__(redis_url, embedding[, score_threshold])

Initialize the cache with a Redis connection URL, an embedding provider, and a similarity score threshold.

clear(**kwargs)

Clear semantic cache for a given llm_string.

lookup(prompt, llm_string)

Look up based on prompt and llm_string.

update(prompt, llm_string, return_val)

Update cache based on prompt and llm_string.

__init__(redis_url: str, embedding: Embeddings, score_threshold: float = 0.2)[source]¶

Initialize the cache with a Redis connection URL, an embedding provider, and a similarity score threshold.

Parameters
  • redis_url (str) – URL to connect to Redis.

  • embedding (Embeddings) – Embedding provider for semantic encoding and search.

  • score_threshold (float) – Similarity score threshold for treating a cached prompt as a match. Defaults to 0.2.

Example:

from langchain.globals import set_llm_cache

from langchain_community.cache import RedisSemanticCache
from langchain_community.embeddings import OpenAIEmbeddings

set_llm_cache(RedisSemanticCache(
    redis_url="redis://localhost:6379",
    embedding=OpenAIEmbeddings()
))
clear(**kwargs: Any) → None[source]¶

Clear semantic cache for a given llm_string.
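
A minimal sketch of clearing the cache directly, assuming the instance is constructed by hand and that llm_string matches the serialized LLM configuration used when the entries were written (the placeholder value below is hypothetical):

from langchain_community.cache import RedisSemanticCache
from langchain_community.embeddings import OpenAIEmbeddings

cache = RedisSemanticCache(
    redis_url="redis://localhost:6379",
    embedding=OpenAIEmbeddings(),
)

# Drop all cached generations associated with this llm_string.
cache.clear(llm_string="<serialized-llm-configuration>")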

lookup(prompt: str, llm_string: str) → Optional[Sequence[Generation]][source]¶

Look up based on prompt and llm_string.
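
A minimal sketch of a direct lookup, reusing the cache instance from the clear() example above; a miss returns None, while a hit returns the cached generations stored for the closest matching prompt:

result = cache.lookup(
    prompt="Tell me a joke about penguins",
    llm_string="<serialized-llm-configuration>",
)
if result is not None:
    print(result[0].text)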

update(prompt: str, llm_string: str, return_val: Sequence[Generation]) → None[source]¶

Update cache based on prompt and llm_string.
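
A minimal sketch of writing an entry by hand with the same cache instance; in normal use the LLM client calls update for you after a cache miss:

from langchain_core.outputs import Generation

cache.update(
    prompt="Tell me a joke about penguins",
    llm_string="<serialized-llm-configuration>",
    return_val=[Generation(text="Why don't penguins like flying? ...")],
)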
