BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Starting with v1.52.0 logging with Langfuse leads to error: "team_id missing from team" #6787

Open · CalabiYau14 opened this issue 4 days ago

CalabiYau14 commented 4 days ago

What happened?

First of all, thanks for the great work. Unfortunately, I've encountered a small bug starting with v1.52.0.

I need to log to Langfuse, and for this I use the following config:

litellm_settings:
  default_team_settings:
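i.e., a default_team_settings block of roughly the following shape (the team id and Langfuse keys are placeholders here; any entries beyond the keys visible in the error below are assumptions based on the documented layout):

litellm_settings:
  default_team_settings:
    - team_id: my-team                  # placeholder
      success_callback: ["langfuse"]
      failure_callback: ["langfuse"]
      langfuse_public_key: "pk-lf-..."  # placeholder
      langfuse_secret: "sk-lf-..."      # placeholder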

This then leads to the following error (see also the attached stack trace): Exception: team_id missing from team

Do you have any idea why this happens? Thanks in advance, and cheers.

Relevant log output

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 3295, in chat_completion
    data = await add_litellm_data_to_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/litellm_pre_call_utils.py", line 512, in add_litellm_data_to_request
    team_config = await proxy_config.load_team_config(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 1472, in load_team_config
    raise Exception(f"team_id missing from team: {team}")
Exception: team_id missing from team: {'success_callback': ['langfuse'], 'failure_callback': ['langfuse'], 'langfuse_public_key': 'pk-lf-008c75dc-f7a6-4e8b-8f3a-3fa8631cedf6', 'langfuse_secret': 'sk-lf-64ca0c4b-8a4f-4c81-896f-7e59864d057c'}
07:38:24 - LiteLLM Proxy:ERROR: _common.py:120 - Giving up chat_completion(...) after 1 tries (litellm.proxy._types.ProxyException)
INFO:     10.13.0.29:36316 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
07:38:24 - LiteLLM Proxy:ERROR: proxy_server.py:3482 - litellm.proxy.proxy_server.chat_completion(): Exception occured - team_id missing from team: {'success_callback': ['langfuse'], 'failure_callback': ['langfuse'], 'langfuse_public_key': 'pk-lf-008c75dc-f7a6-4e8b-8f3a-3fa8631cedf6', 'langfuse_secret': 'sk-lf-64ca0c4b-8a4f-4c81-896f-7e59864d057c'}
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 3295, in chat_completion
    data = await add_litellm_data_to_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/litellm_pre_call_utils.py", line 512, in add_litellm_data_to_request
    team_config = await proxy_config.load_team_config(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 1472, in load_team_config
    raise Exception(f"team_id missing from team: {team}")
Exception: team_id missing from team


nomarek commented 22 hours ago

I've also encountered this issue. It seems that the team_id is being removed from the dictionary, which is apparent from the error message reported by CalabiYau14.

07:38:24 - LiteLLM Proxy:ERROR: proxy_server.py:3482 - litellm.proxy.proxy_server.chat_completion(): Exception occured - team_id missing from team: {'success_callback': ['langfuse'], 'failure_callback': ['langfuse'], 'langfuse_public_key': '***', 'langfuse_secret': '***'}

nomarek commented 21 hours ago

I think the bug might be in ProxyConfig.load_team_config. This assignment:

team_config = team

should be a dictionary copy instead:

team_config = team.copy()

This is necessary because add_litellm_data_to_request pops team_id from team_config here:

team_id = team_config.pop("team_id", None)

As a result, team_id is missing from the stored team settings on subsequent calls to load_team_config.
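The aliasing is easy to reproduce outside litellm. Below is a minimal, self-contained sketch; load_team_config and add_litellm_data_to_request here are simplified stand-ins for the proxy code, not the actual implementation:

# Sketch of the aliasing bug: the stored team settings dict is returned
# by reference, so a pop() in the caller mutates the stored config.

default_team_settings = [
    {
        "team_id": "team-a",
        "success_callback": ["langfuse"],
        "failure_callback": ["langfuse"],
        "langfuse_public_key": "pk-lf-placeholder",
        "langfuse_secret": "sk-lf-placeholder",
    }
]

def load_team_config(team_id):
    # Simplified stand-in for ProxyConfig.load_team_config.
    for team in default_team_settings:
        if team.get("team_id") is None:
            raise Exception(f"team_id missing from team: {team}")
        if team["team_id"] == team_id:
            return team           # bug: hands out a reference to the stored dict
            # return team.copy()  # fix: hand out a shallow copy instead
    return {}

def add_litellm_data_to_request(team_id):
    # Simplified stand-in for the pre-call hook, which pops team_id
    # out of the returned config.
    team_config = load_team_config(team_id)
    team_config.pop("team_id", None)
    return team_config

add_litellm_data_to_request("team-a")  # first request: works
add_litellm_data_to_request("team-a")  # second request: raises
                                       # "team_id missing from team: {...}"

With team.copy(), the pop() only affects the caller's copy, the entry in default_team_settings keeps its team_id, and subsequent requests keep working.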