Optional
additionalModelRequestFields
Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field. For more information, see the model parameters link below.
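As an illustration, a model-specific sampling parameter can be passed through this field alongside the base inference settings. A minimal sketch — the request shape below is illustrative rather than the SDK's actual types, and top_k is an assumed Anthropic-model parameter (check your model's documentation for supported names):

```typescript
// Illustrative shape only: extra model-specific parameters ride alongside
// the base inferenceConfig that the Converse API understands directly.
interface ConverseRequestSketch {
  modelId: string;
  inferenceConfig: { maxTokens?: number; temperature?: number; topP?: number };
  additionalModelRequestFields?: Record<string, unknown>;
}

const request: ConverseRequestSketch = {
  modelId: "anthropic.claude-3-haiku-20240307-v1:0",
  inferenceConfig: { maxTokens: 512, temperature: 0.2 }, // base Converse parameters
  additionalModelRequestFields: { top_k: 200 },          // model-specific extra (assumed name)
};
```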
Optional
authorizationToken
BROWSER ONLY. Providing this value will set an "Authorization" request header value on the GET request.
Optional
awsContainerAuthorizationToken
An alternative to awsContainerAuthorizationTokenFile, this is the token value itself. For browser environments, use authorizationToken instead.
Optional
awsContainerAuthorizationTokenFile
Will be read on each credentials request to add an Authorization request header value. Not supported in browsers.
Optional
awsContainerCredentialsFullUri
If this value is provided, it will be used as-is. For browser environments, use credentialsFullUri instead.
Optional
awsContainerCredentialsRelativeUri
If this value is provided instead of the full URI, it will be appended to the default link-local host of 169.254.170.2. Not supported in browsers.
Optional
configFilepath
The path at which to locate the ini config file. Defaults to the value of the AWS_CONFIG_FILE environment variable (if defined) or ~/.aws/config otherwise.
Optional
credentials
AWS credentials. If no credentials are provided, the default credentials from @aws-sdk/credential-provider-node will be used.
Optional
credentialsFullUri
BROWSER ONLY. In browsers, a relative URI is not allowed, and a full URI must be provided. HTTPS is required. This value is required for the browser environment.
Optional
ec2MetadataV1Disabled
Only used in the IMDS credential provider.
Optional
endpointHost
Override the default endpoint hostname.
Optional
filepath
The path at which to locate the ini credentials file. Defaults to the value of the AWS_SHARED_CREDENTIALS_FILE environment variable (if defined) or ~/.aws/credentials otherwise.
Optional
guardrailConfig
Configuration information for a guardrail that you want to use in the request.
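A sketch of what such a configuration might look like. The field names follow the Bedrock Converse API's GuardrailConfiguration shape, and the identifier and version values here are placeholders:

```typescript
// Placeholder values: guardrailIdentifier and guardrailVersion must refer
// to a guardrail that actually exists in your AWS account.
const guardrailConfig = {
  guardrailIdentifier: "my-guardrail-id", // placeholder identifier
  guardrailVersion: "1",                  // placeholder version
  trace: "enabled" as const,              // ask for guardrail trace output
};
```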
Optional
ignoreCache
Configuration files are normally cached after the first time they are loaded. When this property is set, the provider will always reload any configuration files loaded before.
Optional
logger
For credential resolution trace logging.
Optional
maxRetries
Default is 3 retry attempts or 4 total attempts.
Optional
maxTokens
Max tokens.
Optional
mfaCodeProvider
A function that returns a promise fulfilled with an MFA token code for the provided MFA serial code. If a profile requires an MFA code and mfaCodeProvider is not a valid function, the credential provider promise will be rejected. The function receives the serial code of the MFA device specified.
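A minimal sketch of such a provider, assuming it receives the MFA device's serial code and resolves to the one-time code. The code here is hard-coded for illustration; a real implementation would prompt the user or query an OTP source:

```typescript
// Hypothetical provider: resolves with the MFA token code for the given
// device serial. The fixed return value stands in for user input.
const mfaCodeProvider = async (mfaSerial: string): Promise<string> => {
  console.log(`MFA code requested for device ${mfaSerial}`);
  return "123456"; // stand-in for a user-supplied one-time code
};
```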
Optional
model
Model to use, e.g. "anthropic.claude-3-haiku-20240307-v1:0". This is equivalent to the modelId property in the list-foundation-models API. See the link below for a full list of models.
https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html#model-ids-arns
Default: anthropic.claude-3-haiku-20240307-v1:0
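Putting the core fields together, a sketch of the inputs one might pass. Only a plain options object is shown; constructing the actual client requires @langchain/aws, which is not included here:

```typescript
// Field names are the ones documented on this page; values are examples.
const chatOptions = {
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  region: "us-west-2",
  maxTokens: 1024,
  temperature: 0,
  streaming: false,
};

// With @langchain/aws installed, these inputs would be used as:
//   const chat = new ChatBedrockConverse(chatOptions);
```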
Optional
profile
The configuration profile to use.
Optional
region
The AWS region, e.g. us-west-2. Falls back to the AWS_DEFAULT_REGION environment variable or the region specified in ~/.aws/config if not provided here.
Optional
roleAssumer
A function that assumes a role and returns a promise fulfilled with credentials for the assumed role. The function receives the credentials with which to assume a role.
Optional
roleAssumerWithWebIdentity
A function that assumes a role with web identity and returns a promise fulfilled with credentials for the assumed role.
Optional
roleSessionName
The IAM session name used to distinguish sessions.
Optional
ssoAccountId
The ID of the AWS account to use for temporary credentials.
Optional
ssoClient
Optional
ssoRegion
The AWS region to use for temporary credentials.
Optional
ssoRoleName
The name of the AWS role to assume.
Optional
ssoSession
SSO session identifier. Presence implies usage of the SSOTokenProvider.
Optional
ssoStartUrl
The URL to the AWS SSO service.
Optional
streamUsage
Whether or not to include usage data, like token counts, in the streamed response chunks. Passing this as a call option will take precedence over the class-level setting.
Optional
streaming
Whether or not to stream responses.
Optional
supportsToolChoiceValues
Which types of tool_choice values the model supports. Inferred if not specified: ['auto', 'any', 'tool'] for 'claude-3' models, ['auto', 'any'] for 'mistral-large' models, and empty otherwise.
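The inference rule described above can be sketched as follows. This is a restatement of the documented defaults, not the library's actual code:

```typescript
type ToolChoiceValue = "auto" | "any" | "tool";

// Mirrors the documented defaults: claude-3 models support all three values,
// mistral-large supports auto/any, anything else gets an empty list.
function inferToolChoiceValues(model: string): ToolChoiceValue[] {
  if (model.includes("claude-3")) return ["auto", "any", "tool"];
  if (model.includes("mistral-large")) return ["auto", "any"];
  return [];
}
```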
Optional
temperature
Temperature.
Optional
timeout
Default is 1000ms. Time in milliseconds to spend waiting between retry attempts.
Optional
topP
The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for topP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence. The default value is the default value for the model that you are using. For more information, see the inference parameters for foundation models link below.
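To illustrate what the parameter controls, here is a toy nucleus (top-p) filter. It is purely illustrative of the idea — keep the most-likely tokens until their cumulative probability reaches p — and not how Bedrock implements sampling:

```typescript
// Keep tokens in descending probability order until their cumulative
// probability reaches p; sampling then happens only among the kept tokens.
function topPFilter(probs: Record<string, number>, p: number): string[] {
  const sorted = Object.entries(probs).sort((a, b) => b[1] - a[1]);
  const kept: string[] = [];
  let cumulative = 0;
  for (const [token, prob] of sorted) {
    kept.push(token);
    cumulative += prob;
    if (cumulative >= p) break;
  }
  return kept;
}
```

For example, with token probabilities {a: 0.5, b: 0.3, c: 0.15, d: 0.05} and p = 0.75, only a and b survive the cut.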
Optional
webIdentityTokenFile
File location of where the OIDC token is stored.
Inputs for ChatBedrockConverse.