Issue:
When attempting any prompt with Claude v2.1 using the llm-bedrock-anthropic plugin, I encounter an InvokeModelWithResponseStream boto error. Claude v2 works correctly.
$ llm -m anthropic.claude-v2 "Write me a one sentence Haiku about cheese"
Here is a one sentence haiku about cheese:
Aged cheddar, sharp and crumbly, brings joy with each bite.
$ llm -m anthropic.claude-v2:1 "Write me a one sentence Haiku about cheese"
Error: An error occurred (ValidationException) when calling the InvokeModelWithResponseStream operation: The provided model identifier is invalid.
Input validation in llm does recognize the registered model; the ValidationException is raised by the Bedrock InvokeModelWithResponseStream call itself.
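To rule out the plugin layer, the failing call can be reproduced directly with boto3. This is a minimal sketch, not the plugin's exact code: it assumes the `bedrock-runtime` client in `us-east-1`, AWS credentials in the environment, and the Anthropic text-completion request format that Claude v2 models use on Bedrock.

```python
import json


def build_body(prompt: str, max_tokens: int = 256) -> str:
    """Build the Anthropic text-completion request body used by Claude v2 on Bedrock."""
    return json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    })


def invoke(model_id: str, prompt: str) -> str:
    """Call Bedrock's streaming endpoint directly, bypassing the plugin."""
    # boto3 is imported here so build_body stays usable without AWS installed.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model_with_response_stream(
        modelId=model_id,
        body=build_body(prompt),
    )
    # Each streamed event carries a JSON chunk with a "completion" fragment.
    return "".join(
        json.loads(event["chunk"]["bytes"])["completion"]
        for event in response["body"]
    )
```

Calling `invoke("anthropic.claude-v2", ...)` versus `invoke("anthropic.claude-v2:1", ...)` should show whether the ValidationException occurs independently of the plugin.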
Python Versions: