langchain_community.llms.bedrock.LLMInputOutputAdapter¶
- class langchain_community.llms.bedrock.LLMInputOutputAdapter[source]¶
Adapter class to prepare the inputs from LangChain in the format that the LLM model expects.
It also provides helper functions to extract the generated text from the model response.
Attributes
provider_to_output_key_map
Methods
__init__()
aprepare_output_stream(provider, response[, ...])
prepare_input(provider, model_kwargs[, ...])
prepare_output(provider, response)
prepare_output_stream(provider, response[, ...])
- __init__()¶
- classmethod aprepare_output_stream(provider: str, response: Any, stop: Optional[List[str]] = None) → AsyncIterator[GenerationChunk] [source]¶
- Parameters
provider (str) –
response (Any) –
stop (Optional[List[str]]) –
- Return type
AsyncIterator[GenerationChunk]
- classmethod prepare_input(provider: str, model_kwargs: Dict[str, Any], prompt: Optional[str] = None, system: Optional[str] = None, messages: Optional[List[Dict]] = None) → Dict[str, Any] [source]¶
- Parameters
provider (str) –
model_kwargs (Dict[str, Any]) –
prompt (Optional[str]) –
system (Optional[str]) –
messages (Optional[List[Dict]]) –
- Return type
Dict[str, Any]
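To illustrate what `prepare_input` does conceptually, here is a minimal standalone sketch (not the library's verbatim implementation): each Bedrock provider expects a differently shaped JSON request body, so the adapter maps the prompt and model kwargs into that provider-specific shape. The exact bodies below are assumptions based on the public Bedrock request formats.

```python
from typing import Any, Dict, Optional

def prepare_input_sketch(
    provider: str,
    model_kwargs: Dict[str, Any],
    prompt: Optional[str] = None,
) -> Dict[str, Any]:
    """Sketch of provider-specific request-body assembly."""
    if provider == "amazon":
        # Amazon Titan models take the prompt as "inputText" and nest
        # generation parameters under "textGenerationConfig".
        return {"inputText": prompt, "textGenerationConfig": dict(model_kwargs)}
    # Several other providers (e.g. ai21, cohere, meta) accept a flat body
    # with the prompt alongside the model kwargs.
    body = dict(model_kwargs)
    body["prompt"] = prompt
    return body

body = prepare_input_sketch("amazon", {"maxTokenCount": 256}, prompt="Hello")
```

The real method additionally handles `system` and `messages` for chat-style (messages API) providers.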
- classmethod prepare_output(provider: str, response: Any) → dict [source]¶
- Parameters
provider (str) –
response (Any) –
- Return type
dict
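The inverse direction can be sketched the same way: `prepare_output` parses the raw response body and pulls the generated text out of the provider-specific key, in the spirit of the `provider_to_output_key_map` attribute. The mapping and response shapes below are illustrative assumptions, not the attribute's actual contents.

```python
import json
from typing import Dict

# Hypothetical mapping in the spirit of provider_to_output_key_map.
output_key_map: Dict[str, str] = {
    "anthropic": "completion",
    "cohere": "text",
}

def prepare_output_sketch(provider: str, response_body: bytes) -> dict:
    """Sketch: extract generated text from a provider-specific response body."""
    parsed = json.loads(response_body)
    if provider == "amazon":
        # Titan responses wrap generations in a "results" list.
        text = parsed["results"][0]["outputText"]
    else:
        text = parsed[output_key_map[provider]]
    return {"text": text}
```

The real method also surfaces metadata such as token usage and stop reasons where the provider returns them.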
- classmethod prepare_output_stream(provider: str, response: Any, stop: Optional[List[str]] = None, messages_api: bool = False) → Iterator[GenerationChunk] [source]¶
- Parameters
provider (str) –
response (Any) –
stop (Optional[List[str]]) –
messages_api (bool) –
- Return type
Iterator[GenerationChunk]
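The streaming variants follow the same idea incrementally: each event from the Bedrock response stream carries a small JSON payload, and the adapter yields one chunk per event, honoring the optional `stop` sequences. A simplified sketch over plain byte chunks (yielding strings rather than `GenerationChunk` objects, and with an assumed payload key):

```python
import json
from typing import Iterator, List, Optional

def prepare_output_stream_sketch(
    chunks: Iterator[bytes],
    output_key: str = "outputText",  # assumed key; varies by provider
    stop: Optional[List[str]] = None,
) -> Iterator[str]:
    """Sketch: yield text pieces from streamed JSON events, stopping early
    if a stop sequence appears in a piece."""
    for raw in chunks:
        piece = json.loads(raw).get(output_key, "")
        if stop and any(s in piece for s in stop):
            return
        yield piece
```

`aprepare_output_stream` is the async counterpart, consuming the event stream with `async for` and yielding the same chunks.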