When using ChatBedrock with Llama 2 models and generating text in non-streaming mode, setting a stop sequence causes a malformed input request error.
For example, running the following code:

```python
from langchain_community.chat_models import BedrockChat

model_id = "meta.llama2-70b-chat-v1"
model_kwargs = {
    "temperature": 0,
    "max_gen_len": 2048,
}
llm = BedrockChat(
    model_id=model_id,
    model_kwargs=model_kwargs,
)
llm.invoke("show me the weather in sf", stop=["Humidity"])
```
will result in the following error message:

```
ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: extraneous key [stop_sequences] is not permitted, please reformat your input and try again.
```
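The error indicates that the Bedrock request schema for this Llama 2 model does not accept a `stop_sequences` key. Until stop sequences are supported for this model, one possible client-side workaround is to omit `stop` on `invoke` and truncate the returned text at the first stop sequence instead. This is only a sketch; `truncate_at_stop` is a hypothetical helper, not part of LangChain, and the canned response stands in for a live Bedrock call:

```python
def truncate_at_stop(text: str, stop: list[str]) -> str:
    """Cut `text` at the earliest occurrence of any stop sequence."""
    cut = len(text)
    for s in stop:
        idx = text.find(s)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]

# Example with a canned response instead of calling llm.invoke(...):
response = "Sunny, 18 C. Humidity: 60%."
print(truncate_at_stop(response, ["Humidity"]))  # -> "Sunny, 18 C. "
```

This reproduces the usual stop-sequence behavior (everything from the stop string onward is dropped), at the cost of generating tokens past the stop point on the server side.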