langchain_community.llms.bedrock.BedrockBase

class langchain_community.llms.bedrock.BedrockBase[source]

Bases: BaseModel, ABC

Base class for Bedrock models.

Create a new model by parsing and validating input data from keyword arguments.

Raises ValidationError if the input data cannot be parsed to form a valid model.

param config: Optional[Config] = None

An optional botocore.config.Config instance to pass to the client.

param credentials_profile_name: Optional[str] = None

The name of the profile in the ~/.aws/credentials or ~/.aws/config files, which has either access keys or role information specified. If not specified, the default credential profile or, if on an EC2 instance, credentials from IMDS will be used. See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

param endpoint_url: Optional[str] = None

Needed if you don't want to default to the us-east-1 endpoint.
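
Example (a minimal sketch; the profile name and endpoint URL below are placeholder values, not defaults of this class):

    from langchain_community.llms import Bedrock

    llm = Bedrock(
        model_id="amazon.titan-text-express-v1",
        credentials_profile_name="my-bedrock-profile",  # placeholder profile from ~/.aws/credentials
        region_name="us-west-2",
        endpoint_url="https://bedrock-runtime.us-west-2.amazonaws.com",  # placeholder regional endpoint
    )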

param guardrails: Optional[Mapping[str, Any]] = {'id': None, 'trace': False, 'version': None}

An optional dictionary to configure guardrails for Bedrock.

The 'guardrails' field consists of two keys, 'id' and 'version', which should be strings but are initialized to None, plus an optional boolean 'trace' key. It is used to determine whether specific guardrails are enabled and properly configured.

Type:

Optional[Mapping[str, str]]: A mapping with 'id' and 'version' keys.

Example:

    llm = Bedrock(model_id="<model_id>", client=<bedrock_client>,
                  model_kwargs={},
                  guardrails={"id": "<guardrail_id>",
                              "version": "<guardrail_version>"})

To enable tracing for guardrails, set the 'trace' key to True and pass a callback handler to the 'run_manager' parameter of the 'generate' or '_call' methods.

Example:

    llm = Bedrock(model_id="<model_id>", client=<bedrock_client>,
                  model_kwargs={},
                  guardrails={"id": "<guardrail_id>",
                              "version": "<guardrail_version>",
                              "trace": True},
                  callbacks=[BedrockAsyncCallbackHandler()])

See https://python.langchain.com/docs/modules/callbacks/ for more information on callback handlers.

    from typing import Any

    from langchain_core.callbacks import AsyncCallbackHandler

    class BedrockAsyncCallbackHandler(AsyncCallbackHandler):
        async def on_llm_error(
            self,
            error: BaseException,
            **kwargs: Any,
        ) -> Any:
            reason = kwargs.get("reason")
            if reason == "GUARDRAIL_INTERVENED":
                ...  # Logic to handle guardrail intervention

param model_id: str [Required]

Id of the model to call, e.g., amazon.titan-text-express-v1. This is equivalent to the modelId property in the list-foundation-models API.

param model_kwargs: Optional[Dict] = None

Keyword arguments to pass to the model.
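
Example (a minimal sketch; the keyword names below are assumptions specific to the Amazon Titan text models and vary by provider):

    from langchain_community.llms import Bedrock

    llm = Bedrock(
        model_id="amazon.titan-text-express-v1",
        # Provider-specific generation parameters, forwarded in the request body.
        model_kwargs={"temperature": 0.3, "maxTokenCount": 512},
    )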

param provider_stop_sequence_key_name_map: Mapping[str, str] = {'ai21': 'stop_sequences', 'amazon': 'stopSequences', 'anthropic': 'stop_sequences', 'cohere': 'stop_sequences'}
param region_name: Optional[str] = None

The AWS region, e.g., us-west-2. Falls back to the AWS_DEFAULT_REGION environment variable or the region specified in ~/.aws/config if it is not provided here.

param streaming: bool = False

Whether to stream the results.
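
Example (a minimal sketch using the standard LLM stream() interface; the model id is a placeholder):

    from langchain_community.llms import Bedrock

    llm = Bedrock(model_id="amazon.titan-text-express-v1", streaming=True)

    # stream() yields text chunks as they arrive instead of one final string.
    for chunk in llm.stream("Write a haiku about rivers."):
        print(chunk, end="", flush=True)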

classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) Model

Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
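
Example (a minimal sketch; because validation is skipped, validators that normally set up the boto3 client do not run, so the resulting instance is only suitable for inspecting pre-validated field data):

    from langchain_community.llms import Bedrock

    # Build an instance from trusted data without running validators.
    llm = Bedrock.construct(model_id="amazon.titan-text-express-v1", streaming=False)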

copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) Model

Duplicate a model, optionally choose which fields to include, exclude and change.

Parameters
  • include – fields to include in new model

  • exclude – fields to exclude from new model, as with values this takes precedence over include

  • update – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data

  • deep – set to True to make a deep copy of the model

Returns

new model instance
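
Example (a minimal sketch; values passed via update are not re-validated):

    from langchain_community.llms import Bedrock

    llm = Bedrock(model_id="amazon.titan-text-express-v1")
    # Duplicate the model with one field changed; no validation is re-run.
    streaming_llm = llm.copy(update={"streaming": True})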

dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

classmethod from_orm(obj: Any) Model
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode

Generate a JSON representation of the model, include and exclude arguments as per dict().

encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().

classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model
classmethod parse_obj(obj: Any) Model
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode
classmethod update_forward_refs(**localns: Any) None

Try to update ForwardRefs on fields based on this Model, globalns and localns.

classmethod validate(value: Any) Model