langchain_community.chat_models.azureml_endpoint.LlamaChatContentFormatter¶
- class langchain_community.chat_models.azureml_endpoint.LlamaChatContentFormatter[source]¶
Content formatter for LLaMA.
Attributes
- SUPPORTED_ROLES
- accepts: The MIME type of the response data returned from the endpoint
- content_type: The MIME type of the input data passed to the endpoint
- format_error_msg
- supported_api_types: Supported APIs for the given formatter.
Methods
- __init__()
- escape_special_characters(prompt): Escapes any special characters in the prompt
- format_messages_request_payload(messages, ...): Formats the request according to the chosen API
- format_request_payload(prompt, model_kwargs): Formats the request body according to the input schema of the model.
- format_response_payload(output[, api_type]): Formats the response
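A minimal usage sketch, assuming an Azure ML online endpoint serving a LLaMA chat model is already deployed; the endpoint URL and API key below are placeholders, and the realtime API type matches the default documented here:

```python
from langchain_community.chat_models.azureml_endpoint import (
    AzureMLChatOnlineEndpoint,
    AzureMLEndpointApiType,
    LlamaChatContentFormatter,
)
from langchain_core.messages import HumanMessage

# Placeholder endpoint URL and key; replace with your deployment's values.
chat = AzureMLChatOnlineEndpoint(
    endpoint_url="https://<your-endpoint>.<region>.inference.ml.azure.com/score",
    endpoint_api_type=AzureMLEndpointApiType.realtime,
    endpoint_api_key="<your-api-key>",
    content_formatter=LlamaChatContentFormatter(),
)

response = chat.invoke([HumanMessage(content="Tell me a joke about llamas.")])
print(response.content)
```

The content formatter is what translates LangChain chat messages into the request body the deployed model expects and parses its raw response back into a ChatGeneration.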
- __init__()¶
- static escape_special_characters(prompt: str) → str¶
Escapes any special characters in the prompt.
- Parameters
prompt (str) –
- Return type
str
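For illustration, the static helper can be called directly to escape characters such as quotes, backslashes, and newlines before a prompt is embedded in a JSON request body (the exact escaping rules depend on the installed version):

```python
from langchain_community.chat_models.azureml_endpoint import LlamaChatContentFormatter

raw_prompt = 'She said "hello"\nand left.'
escaped = LlamaChatContentFormatter.escape_special_characters(raw_prompt)
print(escaped)  # quotes and the newline are escaped so the prompt is JSON-safe
```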
- format_messages_request_payload(messages: List[BaseMessage], model_kwargs: Dict, api_type: AzureMLEndpointApiType) → bytes[source]¶
Formats the request according to the chosen API.
- Parameters
messages (List[BaseMessage]) –
model_kwargs (Dict) –
api_type (AzureMLEndpointApiType) –
- Return type
bytes
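A short sketch of building a request payload directly from chat messages; the model_kwargs shown (temperature, max_tokens) are illustrative and depend on what the deployed model accepts:

```python
from langchain_community.chat_models.azureml_endpoint import (
    AzureMLEndpointApiType,
    LlamaChatContentFormatter,
)
from langchain_core.messages import HumanMessage, SystemMessage

formatter = LlamaChatContentFormatter()
payload = formatter.format_messages_request_payload(
    messages=[
        SystemMessage(content="You are a concise assistant."),
        HumanMessage(content="What is LangChain?"),
    ],
    model_kwargs={"temperature": 0.2, "max_tokens": 128},  # illustrative kwargs
    api_type=AzureMLEndpointApiType.realtime,
)
print(payload)  # bytes, ready to POST to the scoring endpoint
```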
- format_request_payload(prompt: str, model_kwargs: Dict, api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.realtime) → Any¶
Formats the request body according to the input schema of the model. Returns bytes or a seekable file-like object in the format specified in the content_type request header.
- Parameters
prompt (str) –
model_kwargs (Dict) –
api_type (AzureMLEndpointApiType) –
- Return type
Any
- format_response_payload(output: bytes, api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.realtime) → ChatGeneration[source]¶
Formats the response returned by the endpoint into a ChatGeneration.
- Parameters
output (bytes) –
api_type (AzureMLEndpointApiType) –
- Return type
ChatGeneration
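A hedged sketch of parsing a raw endpoint response; the JSON shape below (the completion under an "output" key for realtime endpoints) is an assumption about the deployed model's response schema, not something guaranteed by this class:

```python
import json

from langchain_community.chat_models.azureml_endpoint import (
    AzureMLEndpointApiType,
    LlamaChatContentFormatter,
)

formatter = LlamaChatContentFormatter()

# Assumed realtime-style response body with the completion under "output";
# the actual schema depends on the deployed model and API type.
raw_output = json.dumps({"output": "Hello! How can I help you today?"}).encode("utf-8")

generation = formatter.format_response_payload(
    raw_output, api_type=AzureMLEndpointApiType.realtime
)
print(generation.message.content)  # ChatGeneration wrapping the assistant message
```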