langchain_community.embeddings.gradient_ai.TinyAsyncGradientEmbeddingClient¶

class langchain_community.embeddings.gradient_ai.TinyAsyncGradientEmbeddingClient(access_token: Optional[str] = None, workspace_id: Optional[str] = None, host: str = 'https://api.gradient.ai/api', aiosession: Optional[ClientSession] = None)[source]¶

A helper client for creating embeddings with Gradient. Not part of LangChain's or Gradient's stable API; direct use is discouraged.

To use, set the environment variable GRADIENT_ACCESS_TOKEN with your API token and GRADIENT_WORKSPACE_ID with your Gradient workspace ID, or alternatively provide them as keyword arguments to the constructor of this class.

Example

mini_client = TinyAsyncGradientEmbeddingClient(
    workspace_id="12345614fc0_workspace",
    access_token="gradientai-access_token",
)
embeds = mini_client.embed(
    model="bge-large",
    texts=["doc1", "doc2"],
)
# or, from within an async context
embeds = await mini_client.aembed(
    model="bge-large",
    texts=["doc1", "doc2"],
)

Methods

__init__([access_token, workspace_id, host, ...])

aembed(model, texts)

Call the embedding model asynchronously.

embed(model, texts)

Call the embedding model.

__init__(access_token: Optional[str] = None, workspace_id: Optional[str] = None, host: str = 'https://api.gradient.ai/api', aiosession: Optional[ClientSession] = None) None[source]¶
async aembed(model: str, texts: List[str]) List[List[float]][source]¶

Call the embedding model asynchronously.

Parameters
  • model (str) – Name of the embedding model to use.

  • texts (List[str]) – List of sentences to embed.

Returns

List of embedding vectors, one per input sentence.

Return type

List[List[float]]

embed(model: str, texts: List[str]) List[List[float]][source]¶

Call the embedding model.

Parameters
  • model (str) – Name of the embedding model to use.

  • texts (List[str]) – List of sentences to embed.

Returns

List of embedding vectors, one per input sentence.

Return type

List[List[float]]
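For illustration, the call shape and return contract documented above (one vector per input sentence, from either the sync or the async method) can be sketched with a hypothetical stub. The _StubEmbeddingClient below is not the real client and performs no network calls; it only mirrors the embed/aembed signatures.

```python
import asyncio
from typing import List


class _StubEmbeddingClient:
    """Hypothetical stand-in mirroring the embed/aembed signatures.

    The real TinyAsyncGradientEmbeddingClient calls the Gradient API;
    this stub just returns fixed-size zero vectors for demonstration.
    """

    def embed(self, model: str, texts: List[str]) -> List[List[float]]:
        # One vector per input sentence, as documented above.
        return [[0.0, 0.0, 0.0] for _ in texts]

    async def aembed(self, model: str, texts: List[str]) -> List[List[float]]:
        # Async variant with the same return contract.
        return self.embed(model, texts)


client = _StubEmbeddingClient()
sync_vectors = client.embed(model="bge-large", texts=["doc1", "doc2"])
async_vectors = asyncio.run(
    client.aembed(model="bge-large", texts=["doc1", "doc2"])
)
```

Both calls return a List[List[float]] with one inner list per element of texts; the real client differs only in that the vectors come from the Gradient API.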