langchain_core.prompts.chat.SystemMessagePromptTemplate¶
- class langchain_core.prompts.chat.SystemMessagePromptTemplate[source]¶
Bases:
BaseStringMessagePromptTemplate
System message prompt template. This formats to a system message, which primes the model's behavior and is not rendered as a user-facing turn.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
- param additional_kwargs: dict [Optional]¶
Additional keyword arguments to pass to the prompt template.
- param prompt: langchain_core.prompts.string.StringPromptTemplate [Required]¶
String prompt template.
- classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) Model ¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
- copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) Model ¶
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters
include – fields to include in new model
exclude – fields to exclude from the new model; like update values, this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating the new model; you should trust this data
deep – set to True to make a deep copy of the model
- Returns
new model instance
- dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny ¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- format(**kwargs: Any) BaseMessage [source]¶
Format the prompt template.
- Parameters
**kwargs – Keyword arguments to use for formatting.
- Returns
Formatted message.
- format_messages(**kwargs: Any) List[BaseMessage] ¶
Format messages from kwargs.
- Parameters
**kwargs – Keyword arguments to use for formatting.
- Returns
List of BaseMessages.
- classmethod from_orm(obj: Any) Model ¶
- classmethod from_template(template: str, template_format: str = 'f-string', partial_variables: Optional[Dict[str, Any]] = None, **kwargs: Any) MessagePromptTemplateT ¶
Create a class from a string template.
- Parameters
template – a template string.
template_format – format of the template; defaults to 'f-string'.
partial_variables – a dictionary of variables used to partially fill in the template. For example, if the template is "{variable1} {variable2}" and partial_variables is {"variable1": "foo"}, then the final prompt will be "foo {variable2}".
**kwargs – keyword arguments to pass to the constructor.
- Returns
A new instance of this class.
- classmethod from_template_file(template_file: Union[str, Path], input_variables: List[str], **kwargs: Any) MessagePromptTemplateT ¶
Create a class from a template file.
- Parameters
template_file – path to a template file. String or Path.
input_variables – list of input variables.
**kwargs – keyword arguments to pass to the constructor.
- Returns
A new instance of this class.
- classmethod is_lc_serializable() bool ¶
Return whether or not the class is serializable.
- json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) str ¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
- classmethod lc_id() List[str] ¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path to the object.
- classmethod parse_file(path: Union[str, Path], *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- classmethod parse_obj(obj: Any) Model ¶
- classmethod parse_raw(b: Union[str, bytes], *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- classmethod schema(by_alias: bool = True, ref_template: str = '#/definitions/{model}') DictStrAny ¶
- classmethod schema_json(*, by_alias: bool = True, ref_template: str = '#/definitions/{model}', **dumps_kwargs: Any) str ¶
- to_json() Union[SerializedConstructor, SerializedNotImplemented] ¶
- to_json_not_implemented() SerializedNotImplemented ¶
- classmethod update_forward_refs(**localns: Any) None ¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
- classmethod validate(value: Any) Model ¶
- property input_variables: List[str]¶
Input variables for this prompt template.
- Returns
List of input variable names.
- property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
- property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example, {"openai_api_key": "OPENAI_API_KEY"}.