BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: When using LiteLLM Proxy with tool calling, Autogen and AWS Bedrock Claude, Bedrock errors when content fields are empty #4820

Open seam-ctooley opened 2 months ago

seam-ctooley commented 2 months ago

What happened?

Setup:

Autogen Agent:

CLAUDE_CONFIG = {
    "config_list": [
        {
            "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",  # Loaded with LiteLLM command
            "api_key": "NotRequired",  # Not needed
            "base_url": "http://localhost:4000/",  # Your LiteLLM URL
        }
    ],
    "cache_seed": None,
}

LiteLLM Proxy Config:

model_list:
  - model_name: anthropic.claude-3-5-sonnet-20240620-v1:0
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
      aws_region_name: us-east-1

litellm_settings:
  drop_params: True
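
For context, a proxy using this config is typically launched with the LiteLLM CLI; the config filename here is an assumption:

litellm --config config.yaml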

Minimal Reproducible Autogen setup:

import autogen
from typing import Literal, Annotated

chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For currency exchange tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config={
        "config_list": [
            {
                "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",  # Loaded with LiteLLM command
                "api_key": "NotRequired",  # Not needed
                "base_url": "http://localhost:4000/",  # Your LiteLLM URL
            }
        ],
        "cache_seed": None,
    },
)

# create a UserProxyAgent instance named "user_proxy"
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
)

CurrencySymbol = Literal["USD", "EUR"]

def exchange_rate(base_currency: CurrencySymbol, quote_currency: CurrencySymbol) -> float:
    if base_currency == quote_currency:
        return 1.0
    elif base_currency == "USD" and quote_currency == "EUR":
        return 1 / 1.1
    elif base_currency == "EUR" and quote_currency == "USD":
        return 1.1
    else:
        raise ValueError(f"Unknown currencies {base_currency}, {quote_currency}")

@user_proxy.register_for_execution()
@chatbot.register_for_llm(description="Currency exchange calculator.")
def currency_calculator(
    base_amount: Annotated[float, "Amount of currency in base_currency"],
    base_currency: Annotated[CurrencySymbol, "Base currency"] = "USD",
    quote_currency: Annotated[CurrencySymbol, "Quote currency"] = "EUR",
) -> str:
    quote_amount = exchange_rate(base_currency, quote_currency) * base_amount
    return f"{quote_amount} {quote_currency}"

res = user_proxy.initiate_chat(
    chatbot, message="How much is 123.45 USD in EUR?"
)

Relevant log output

09:25:12 - LiteLLM Proxy:DEBUG: proxy_server.py:3004 - An error occurred: litellm.BadRequestError: BedrockException - {"message":"The text field in the ContentBlock object at messages.2.content.0 is blank. Add text to the text field, and try again."} LiteLLM Retried: 1 times, LiteLLM Max Retries: 2 None
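
The error indicates that Bedrock's Converse API rejects any text ContentBlock whose text is an empty string, and Autogen appears to emit the assistant's tool-call turn with content set to "". Below is a minimal sketch of the failing request shape, sent straight through the proxy with the OpenAI SDK; the tool-call ID, arguments, and exact message indices are illustrative assumptions, not values from the debug log.

from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="NotRequired")

# Mirrors the message shape Autogen produces: the assistant's tool-call
# turn carries an empty content string, which the proxy appears to forward
# to Bedrock as a blank text ContentBlock.
response = client.chat.completions.create(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[
        {"role": "user", "content": "How much is 123.45 USD in EUR?"},
        {
            "role": "assistant",
            "content": "",  # blank string -> blank Bedrock ContentBlock
            "tool_calls": [
                {
                    "id": "call_1",
                    "type": "function",
                    "function": {
                        "name": "currency_calculator",
                        "arguments": '{"base_amount": 123.45}',
                    },
                }
            ],
        },
        {"role": "tool", "tool_call_id": "call_1", "content": "112.23 EUR"},
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "currency_calculator",
                "description": "Currency exchange calculator.",
                "parameters": {
                    "type": "object",
                    "properties": {"base_amount": {"type": "number"}},
                    "required": ["base_amount"],
                },
            },
        }
    ],
)
# With the affected versions, this call is expected to raise the
# BadRequestError shown in the log above rather than return a response.
print(response.choices[0].message)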


krrishdholakia commented 2 months ago

Hey @seam-ctooley, what version of autogen is this? I can't seem to run your script.

krrishdholakia commented 2 months ago

Also, can you run your proxy with --detailed_debug? It should print the raw request being made, which should help with the repro.

seam-ctooley commented 1 month ago

I'm on the latest Autogen version. Here is a repo that reproduces the issue I'm seeing: https://github.com/seam-ctooley/litellm-bedrock-bug-repro

I've got a detailed debug log, but it seems to contain AWS creds, so I'll share it tomorrow once my session expires. If we could share it over Discord as well, that would be greatly appreciated; I am "christiant_47581" on the LiteLLM server.

seam-ctooley commented 1 month ago

Here is the full log file, @krrishdholakia: stderr.txt

dbpprt commented 1 month ago

Same issue here with the latest LiteLLM running locally, Autogen, and Claude 3 Haiku.

haandol commented 1 month ago

Same here.

astroalek commented 1 month ago

> Hey @seam-ctooley, what version of autogen is this? I can't seem to run your script.

This usually occurs when you install "autogen" instead of "pyautogen".

seam-ctooley commented 1 month ago

I've been able to work around the issues mentioned here by using Autogen directly with a custom model client: https://gist.github.com/seam-ctooley/d22f8319f313bc160388ae5949cc20b8

So I imagine the issue lies in LiteLLM's translation layer to Bedrock: there are specific format requirements for tool calling that aren't being met.
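
As an illustration of what such a custom client makes possible, here is a minimal sketch of a client-side scrub that normalizes blank content before the request is translated for Bedrock. The function name and integration point are hypothetical; they are not taken from the linked gist.

def scrub_blank_content(messages: list[dict]) -> list[dict]:
    """Return a copy of `messages` with no empty-string content fields,
    so the Bedrock translation never produces a blank text ContentBlock."""
    cleaned = []
    for msg in messages:
        msg = dict(msg)
        if msg.get("content") == "":
            if msg.get("tool_calls"):
                # A tool-call turn needs no text at all; send null content
                # instead of an empty string.
                msg["content"] = None
            else:
                # Fall back to a minimal non-blank placeholder.
                msg["content"] = " "
        cleaned.append(msg)
    return cleaned

The same normalization could equally live in LiteLLM's Bedrock transformation, dropping or replacing blank text blocks instead of forwarding them.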