BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Braintrust integration doesn't work outside of default project #6412

Open huyouare opened 4 hours ago

huyouare commented 4 hours ago

What happened?

The project configuration in the Braintrust logging integration doesn't work.

Specifying a project_id in metadata doesn't actually set the Braintrust project as expected. Instead, logs still land in the default project (litellm).

Looking at the implementation, it seems to read project_id from kwargs, but passing it there forwards it to the provider request, which isn't allowed IIUC:

litellm.exceptions.BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"project_id: Extra inputs are not permitted"}}
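The expected behavior would be for logging-only metadata keys to be split off before the provider payload is built, so they reach the Braintrust logger but are never sent to the Anthropic API. A minimal sketch of that idea (hypothetical helper, not LiteLLM's actual code; the key name project_id is taken from the error above):

```python
# Hypothetical sketch, not LiteLLM's implementation: separate metadata keys
# that are only meant for the Braintrust logger from keys that are safe to
# forward to the LLM provider.
LOGGING_ONLY_KEYS = {"project_id", "project_name"}  # assumed key names

def split_metadata(metadata: dict) -> tuple[dict, dict]:
    """Return (provider_metadata, logger_metadata).

    provider_metadata is what may be forwarded with the API request;
    logger_metadata is consumed by the logging callback only.
    """
    logger_meta = {k: v for k, v in metadata.items() if k in LOGGING_ONLY_KEYS}
    provider_meta = {k: v for k, v in metadata.items() if k not in LOGGING_ONLY_KEYS}
    return provider_meta, logger_meta

provider, logger_meta = split_metadata({"project_id": "proj-123", "user_id": "u1"})
# provider == {"user_id": "u1"}; logger_meta == {"project_id": "proj-123"}
```

With something like this in place, project_id would select the Braintrust project without ever appearing in the Anthropic request body that triggers the 400 below.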

Relevant log output

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Traceback (most recent call last):
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/main.py", line 1587, in completion
    response = anthropic_chat_completions.completion(
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/llms/anthropic/chat.py", line 1139, in completion
    response = client.post(
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/llms/custom_httpx/http_handler.py", line 371, in post
    raise e
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/llms/custom_httpx/http_handler.py", line 357, in post
    response.raise_for_status()
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/httpx/_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/jessehu/.pyenv/versions/3.10.10/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Users/jessehu/.pyenv/versions/3.10.10/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/Users/jessehu/code/agent/runner/run_benchmark.py", line 122, in <module>
    run_benchmark(existing_run)
  File "/Users/jessehu/code/agent/runner/run_benchmark.py", line 110, in run_benchmark
    result = future.result()
  File "/Users/jessehu/.pyenv/versions/3.10.10/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/Users/jessehu/.pyenv/versions/3.10.10/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/Users/jessehu/.pyenv/versions/3.10.10/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/braintrust/logger.py", line 1311, in wrapper_sync
    ret = f(*f_args, **f_kwargs)
  File "/Users/jessehu/code/agent/runner/run_benchmark.py", line 58, in process_example
    result = agent.run(instance_id, problem_statement, base_commit)
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/braintrust/logger.py", line 1311, in wrapper_sync
    ret = f(*f_args, **f_kwargs)
  File "/Users/jessehu/code/agent/core/swebench_agent.py", line 75, in run
    tool_name, arguments, result, thought_content = self.react_chain.run(
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/braintrust/logger.py", line 1311, in wrapper_sync
    ret = f(*f_args, **f_kwargs)
  File "/Users/jessehu/code/agent/chains/react_critique.py", line 45, in run
    response = completion(model="claude-3-5-sonnet-20241022",
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/utils.py", line 1086, in wrapper
    raise e
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/utils.py", line 974, in wrapper
    result = original_function(*args, **kwargs)
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/main.py", line 2848, in completion
    raise exception_type(
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/utils.py", line 8199, in exception_type
    raise e
  File "/Users/jessehu/code/agent/.venv/lib/python3.10/site-packages/litellm/utils.py", line 6646, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"project_id: Extra inputs are not permitted"}}

Twitter / LinkedIn details

No response

krrishdholakia commented 2 hours ago

can you share a minimal script to repro this error? @huyouare