BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: using pydantic model for structured output returning error for Anthropic models. #6766

Open dannylee1020 opened 2 hours ago

dannylee1020 commented 2 hours ago

What happened?

Description

litellm.completion returns an error when a pydantic model is used as the response_format for structured output with Anthropic models. I tested both Anthropic and OpenAI models, and only OpenAI models work without an error. It also works on previous versions (< 1.52.8), so I suspect something changed in that release.
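The exact request isn't shown in the report, so here is a hypothetical minimal shape of the call, plus a dump of the JSON schema pydantic generates for the model (which is roughly what gets forwarded to Anthropic). The `Answer` model and the field names are illustrative, not from the reporter's code:

```python
# Hypothetical reproduction sketch -- the reporter's actual request and
# pydantic model were not shared in the issue.
import json
from pydantic import BaseModel

class Answer(BaseModel):  # illustrative model, not from the issue
    title: str
    score: int

# Inspect the JSON schema pydantic emits; this is (approximately) what
# litellm forwards to Anthropic, where it is rejected as tools.0.input_schema.
schema = Answer.model_json_schema()
print(json.dumps(schema, indent=2))

# The failing call would look like this (requires ANTHROPIC_API_KEY;
# the model name is an assumption):
# import litellm
# res = litellm.completion(
#     model="claude-3-5-sonnet-20241022",
#     messages=[{"role": "user", "content": "Rate this repo from 1 to 10."}],
#     response_format=Answer,  # raises BadRequestError on >= 1.52.8
# )
```

Comparing this printed schema against what Anthropic's `/v1/messages` endpoint accepts as `input_schema` is one way to narrow down which key it objects to.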

Relevant log output

Traceback (most recent call last):
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 567, in completion
    response = client.post(
               ^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 389, in post
    raise e
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 375, in post
    response.raise_for_status()
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/httpx/_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/main.py", line 1770, in completion
    response = anthropic_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 582, in completion
    raise AnthropicError(
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"invalid_request_error","message":"tools.0.input_schema: JSON schema is invalid - please consult https://json-schema.org or our documentation at https://docs.anthropic.com/en/docs/tool-use"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/dannylee1020/repos/peony/tests/test.py", line 74, in <module>
    res = litellm.completion(
          ^^^^^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/utils.py", line 960, in wrapper
    raise e
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/utils.py", line 849, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/main.py", line 3034, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2125, in exception_type
    raise e
  File "/Users/dannylee1020/repos/peony/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 469, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"tools.0.input_schema: JSON schema is invalid - please consult https://json-schema.org or our documentation at https://docs.anthropic.com/en/docs/tool-use"}}


ishaan-jaff commented 2 hours ago

hi @dannylee1020 - can you share the request you're making with litellm?

We recently moved to use Anthropic Tool use for JSON responses - this might be related
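For context, Anthropic's tool-use pattern for structured output has the client define a tool whose `input_schema` is the desired JSON schema; the model then "calls" that tool with a conforming payload. A sketch of a valid tool definition (names and fields chosen for illustration) -- if litellm emits an `input_schema` the API considers invalid JSON Schema, the 400 in the traceback above is the result:

```python
# Illustrative tool definition in the shape Anthropic's /v1/messages
# endpoint expects; the tool name and fields are hypothetical.
tool = {
    "name": "json_response",
    "description": "Return the answer as structured JSON.",
    "input_schema": {           # must be a valid JSON Schema object
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "score": {"type": "integer"},
        },
        "required": ["title", "score"],
    },
}
```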