BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Can't send image content blocks in AWS Bedrock via anthropic `/v1/messages` endpoint #5911

Open liuzhaolong765481 opened 1 month ago

liuzhaolong765481 commented 1 month ago

What happened?

When I send image content blocks to the /v1/messages endpoint, formatted the same way the official Anthropic API expects, the request fails. The environment is AWS Bedrock. The proxy returns this error:

{
  "error": {
    "message": "litellm.APIConnectionError: too many values to unpack (expected 2)",
    "type": null,
    "param": null,
    "code": "500"
  }
}

The message field also carries this traceback:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 2415, in completion
    response = bedrock_converse_chat_completion.completion(
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/bedrock/chat/converse_handler.py", line 305, in completion
    _data = litellm.AmazonConverseConfig()._transform_request(
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/bedrock/chat/converse_transformation.py", line 243, in _transform_request
    bedrock_messages: List[MessageBlock] = _bedrock_converse_messages_pt(
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/prompt_templates/factory.py", line 2391, in _bedrock_converse_messages_pt
    _part = _process_bedrock_converse_image_block(  # type: ignore
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/prompt_templates/factory.py", line 2175, in _process_bedrock_converse_image_block
    image_metadata, img_without_base_64 = image_url.split(",")
ValueError: too many values to unpack (expected 2)

Received Model Group=claude-3-sonnet-20240229
Available Model Group Fallbacks=None
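The last traceback frame unpacks image_url.split(",") into exactly two names, so any URL string with more than one comma raises this ValueError. A minimal sketch of that failure mode is below; the input string is hypothetical (the issue does not show the exact value litellm constructs for the Bedrock converse path), and the maxsplit variant is only one possible hardening, not a confirmed fix.

```python
# The failing line (factory.py:2175) assumes the image URL contains exactly one comma:
#     image_metadata, img_without_base_64 = image_url.split(",")
# A hypothetical input with a second comma reproduces the ValueError from the traceback.
image_url = "data:image/webp;base64,UklGRgo9AAB...,extra"  # hypothetical, not from the issue

try:
    image_metadata, img_without_base_64 = image_url.split(",")
except ValueError as exc:
    print(exc)  # too many values to unpack (expected 2)

# Splitting only on the first comma tolerates commas in the remainder:
image_metadata, img_without_base_64 = image_url.split(",", 1)
print(image_metadata)       # data:image/webp;base64
print(img_without_base_64)  # UklGRgo9AAB...,extra
```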

Here is the request that triggers the error:

{ "messages": [ { "role": "user", "content": [ { "type": "image", "source": { "type": "base64", "media_type": "image/webp", "data": "UklGRgo9AABXRUJQV..." } }, { "type": "text", "text": "whats that in this images?" } ] } ], "model": "claude-3-sonnet-20240229", "stream": false, "max_tokens": 2000 } and here is my request code

Relevant log output

No response

Twitter / LinkedIn details

No response

zengbo commented 2 weeks ago

I got the same error.