sblakey / llm-bedrock-anthropic

Plugin for https://llm.datasette.io/en/stable/ to enable talking with Claude Instant and ClaudeV2 models on AWS Bedrock
Apache License 2.0

Claude v2.1 InvokeModelWithResponseStream Error #8

Closed jhaydter closed 8 months ago

jhaydter commented 8 months ago

Issue: When attempting any prompt with Claude v2.1 through the llm-bedrock-anthropic plugin, I get a ValidationException from boto's InvokeModelWithResponseStream call. The same prompt with Claude v2 works correctly.

$ llm -m anthropic.claude-v2 "Write me a one sentence Haiku about cheese"
Here is a one sentence haiku about cheese:

Aged cheddar, sharp and crumbly, brings joy with each bite.
$ llm -m anthropic.claude-v2:1 "Write me a one sentence Haiku about cheese"
Error: An error occurred (ValidationException) when calling the InvokeModelWithResponseStream operation: The provided model identifier is invalid.

The plugin's own input validation does recognize the registered model ID, so the failure happens only when the request reaches Bedrock:

$ llm models default anthropic.claude-v2:1
$ llm models default
anthropic.claude-v2:1
$ llm models default anthropic.claude-v4
Error: Unknown model: anthropic.claude-v4
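
For context on where the ValidationException originates: the plugin ultimately calls boto3's bedrock-runtime client, and Bedrock rejects any modelId that doesn't exactly match a registered identifier, including the ":1"-style version suffix. Below is a hedged sketch of how such a call is typically assembled; the helper name build_invoke_kwargs is hypothetical and the plugin's actual internals may differ.

```python
import json


def build_invoke_kwargs(model_id: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble kwargs for bedrock-runtime's invoke_model_with_response_stream."""
    body = {
        # Claude v2.x on Bedrock expects the Human/Assistant prompt format.
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    }
    return {
        # modelId must be the exact Bedrock identifier, including any
        # version suffix such as ":1" in "anthropic.claude-v2:1".
        "modelId": model_id,
        "contentType": "application/json",
        "accept": "application/json",
        "body": json.dumps(body),
    }


kwargs = build_invoke_kwargs("anthropic.claude-v2:1", "Write me a haiku about cheese")
# With AWS credentials configured, the streaming call would look like:
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model_with_response_stream(**kwargs)
```

If the plugin mapped its registered model name to a modelId without the ":1" suffix (or otherwise mangled it), Bedrock would return exactly the "provided model identifier is invalid" error shown above.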

Python package versions:

Package               Version
-----------------------------
boto3                 1.34.55
botocore              1.34.55
llm                   0.13.1
llm-bedrock-anthropic 0.3
sblakey commented 8 months ago

Fixed with release 0.4.1