BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: 422 Unprocessable Entity - List index out of range #4003

Closed: karlschelhammer-1159 closed this issue 3 months ago

karlschelhammer-1159 commented 3 months ago

What happened?

I'm trying to use the popular framework AutoGen with AWS Bedrock Claude 3 Sonnet, routed through the LiteLLM proxy.

For the most part, the proxy works fine; single-agent chats are easy enough to get up and running.

However, when I try to set up a simple group chat, things stop working. I get the 422 error shown below, and the message suggests something is indexing into an empty list.

Does anyone know how I might approach troubleshooting an issue like this?
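
One way to narrow it down (taking AutoGen out of the loop) is to replay a group-chat-style history against the proxy directly with the OpenAI SDK. This is a sketch; the message list below is illustrative, not the exact payload AutoGen sends:

from openai import OpenAI

# point the OpenAI client at the LiteLLM proxy instead of api.openai.com
client = OpenAI(base_url="http://0.0.0.0:4000", api_key="dummy-key")

response = client.chat.completions.create(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[
        {"role": "user", "content": "I'd like to buy a life insurance product."},
        # group chat replays the accumulated history, including
        # back-to-back assistant turns like the two Interviewer
        # messages in the transcript below
        {"role": "assistant", "content": "Certainly, I'll be happy to assist you."},
        {"role": "assistant", "content": "First, can you provide your age?"},
    ],
)
print(response.choices[0].message.content)

If this reproduces the 422, the problem is in the proxy's Bedrock translation rather than in AutoGen.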

Here's the minimum reproducible example:

import autogen

llm_config = {
    "config_list": [{
        "model": "bedrock/anthropic.claude-3-sonnet-20240229-v1:0", 
        "api_key": "dummy-key",
        "base_url": "http://0.0.0.0:4000" 
    }]
}

user_proxy = autogen.UserProxyAgent(
    name="User_proxy",
    code_execution_config=False,
    human_input_mode="ALWAYS"
)

interviewer = autogen.AssistantAgent(
    name="Interviewer",
    system_message="Interview a user for a life insurance product.  Asks questions about health conditions, lifestyle, and family history.",
    llm_config=llm_config,
)
product_recommender = autogen.AssistantAgent(
    name="Product Recommender",
    system_message="Uses the results of the interview to recommend a life insurance product.",
    llm_config=llm_config,
)

groupchat = autogen.GroupChat(
    agents=[user_proxy, interviewer, product_recommender],
    messages=[{"role": "assistant", "content": "Let's interview!"}],
    max_round=22,
)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)

user_proxy.initiate_chat(
    manager, message="I'd like to buy a life insurance product."
)
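
For completeness: the original post doesn't show how the proxy was started, so the exact launch command is an assumption, but a minimal setup for this model looks like the following (with AWS credentials for Bedrock already in the environment; the dummy api_key above is only a placeholder for the proxy):

litellm --model bedrock/anthropic.claude-3-sonnet-20240229-v1:0 --port 4000 --debug

The --debug flag makes the proxy log the requests and responses it exchanges with Bedrock, which helps with the troubleshooting question above.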

Here's the error from the notebook:

User_proxy (to chat_manager):

I'd like to buy a life insurance product.

--------------------------------------------------------------------------------
Interviewer (to chat_manager):

Certainly, I'll be happy to assist you with getting some quotes for life insurance products. Before I can provide accurate quotes, I need to ask you some questions about your personal situation. This information will help determine your risk profile and the appropriate coverage amount and type. Please feel free to let me know if you have any other questions as we go through this process.

--------------------------------------------------------------------------------
Interviewer (to chat_manager):

First, can you provide the following basic information:

- Your age:
- Your gender: 
- If you are married or have dependents:
- Your state/province of residence:
- Whether you are a smoker or non-smoker:

This will allow me to get started gathering some initial quotes for you. I'll need some additional health and lifestyle information as well, but those basics will be helpful to start. Let me know if you have any other initial questions!

--------------------------------------------------------------------------------
---------------------------------------------------------------------------
UnprocessableEntityError                  Traceback (most recent call last)
Cell In[5], line 1
----> 1 user_proxy.initiate_chat(
      2     manager, message="I'd like to buy a life insurance product."
      3 )

File ~/anaconda3/envs/goai/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py:1007, in ConversableAgent.initiate_chat(self, recipient, clear_history, silent, cache, max_turns, summary_method, summary_args, message, **kwargs)
   1005     else:
   1006         msg2send = self.generate_init_message(message, **kwargs)
-> 1007     self.send(msg2send, recipient, silent=silent)
   1008 summary = self._summarize_chat(
   1009     summary_method,
   1010     summary_args,
   1011     recipient,
   1012     cache=cache,
   1013 )
   1014 for agent in [self, recipient]:

File ~/anaconda3/envs/goai/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py:645, in ConversableAgent.send(self, message, recipient, request_reply, silent)
    643 valid = self._append_oai_message(message, "assistant", recipient)
    644 if valid:
--> 645     recipient.receive(message, self, request_reply, silent)
    646 else:
    647     raise ValueError(
...
   (...)
   1027     stream_cls=stream_cls,
   1028 )

UnprocessableEntityError: Error code: 422 - {'error': {'message': 'BedrockException - Error processing={"id":"msg_bdrk_01W745phrgHMP77t3Xbmh9ey","type":"message","role":"assistant","model":"claude-3-sonnet-20240229","stop_sequence":null,"usage":{"input_tokens":217,"output_tokens":3},"content":[],"stop_reason":"end_turn"}, Received error=list index out of range', 'type': None, 'param': None, 'code': 422}}

The proxy server logs the matching BedrockException; the full text is in the relevant log output below.

Relevant log output

litellm.exceptions.BadRequestError: BedrockException - Error processing={"id":"msg_bdrk_01W745phrgHMP77t3Xbmh9ey","type":"message","role":"assistant","model":"claude-3-sonnet-20240229","stop_sequence":null,"usage":{"input_tokens":217,"output_tokens":3},"content":[],"stop_reason":"end_turn"}, Received error=list index out of range
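
The Bedrock payload embedded in the error is the interesting part: the model finished with stop_reason "end_turn" after only 3 output tokens, and its "content" list is empty. Below is a hypothetical reconstruction of what the proxy's (pre-Converse) Bedrock response handler presumably does with such a response; this is not litellm's actual code:

# The raw Anthropic-on-Bedrock response from the error message:
bedrock_response = {
    "role": "assistant",
    "content": [],              # Claude returned no content blocks
    "stop_reason": "end_turn",
    "usage": {"input_tokens": 217, "output_tokens": 3},
}

# Grabbing the first content block without checking for emptiness
# raises IndexError: list index out of range
text = bedrock_response["content"][0]["text"]

That IndexError would then be caught, wrapped as a BedrockException, and surfaced to the client as the 422 above.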


krrishdholakia commented 3 months ago

hey @karlschelhammer-1159, this should be fixed now with the move to the Bedrock Converse API. Closing for now; please bump if not.
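
After upgrading to a litellm version with the Converse-based Bedrock integration, a quick sanity check is a direct SDK call against the same model (a sketch, assuming AWS credentials are configured in the environment):

import litellm

response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)

An empty-content completion should now come back as a normal (possibly empty) response rather than a 422.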