BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: messages.content array broken with ollama_chat #6808

Open · linznin opened 3 days ago

linznin commented 3 days ago

What happened?

A request whose `messages[].content` is an array of content parts fails with `ollama_chat` models. Example request body:

{
    "messages": [
        {
            "content": [
                {
                    "type": "text",
                    "text": "hi"
                }
            ],
            "role": "user"
        }
    ],
    "model": "ollama_chat/taide-chat",
    "stream": false
}

Response:

{
    "error": {
        "message": "litellm.APIConnectionError: Ollama_chatException - {\"error\":\"json: cannot unmarshal array into Go struct field ChatRequest.messages of type string\"}\nReceived Model Group=taide-chat\nAvailable Model Group Fallbacks=None",
        "type": null,
        "param": null,
        "code": "500"
    }
}
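The unmarshal error comes from Ollama itself: its `/api/chat` endpoint defines `messages[].content` as a plain string in its Go request struct, so the OpenAI-style list of content parts cannot be decoded. As a minimal client-side workaround (a sketch, not part of LiteLLM; `flatten_content` is a hypothetical helper), the text parts can be collapsed into a single string before the request is sent:

```python
def flatten_content(messages):
    """Collapse OpenAI-style content-part arrays into plain strings,
    the shape Ollama's /api/chat endpoint accepts."""
    flattened = []
    for msg in messages:
        content = msg.get("content")
        if isinstance(content, list):
            # Join only the text parts; non-text parts (e.g. images)
            # would need separate handling.
            text = "".join(
                part.get("text", "")
                for part in content
                if part.get("type") == "text"
            )
            msg = {**msg, "content": text}
        flattened.append(msg)
    return flattened

# {"content": [{"type": "text", "text": "hi"}]}  ->  {"content": "hi"}
```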

Relevant log output

13:30:26 - LiteLLM Proxy:ERROR: proxy_server.py:3463 - litellm.proxy.proxy_server.chat_completion(): Exception occured - litellm.APIConnectionError: Ollama_chatException - {"error":"json: cannot unmarshal array into Go struct field ChatRequest.messages of type string"}
Received Model Group=taide-chat
Available Model Group Fallbacks=None
Traceback (most recent call last):
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\main.py", line 470, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\llms\ollama_chat.py", line 609, in ollama_acompletion
    raise e  # don't use verbose_logger.exception, if exception is raised
    ^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\llms\ollama_chat.py", line 548, in ollama_acompletion
    raise OllamaError(status_code=resp.status, message=text)
litellm.llms.ollama_chat.OllamaError: {"error":"json: cannot unmarshal array into Go struct field ChatRequest.messages of type string"}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\proxy\proxy_server.py", line 3352, in chat_completion
    responses = await llm_responses
                ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 818, in acompletion
    raise e
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 794, in acompletion
    response = await self.async_function_with_fallbacks(**kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 2836, in async_function_with_fallbacks
    raise original_exception
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 2672, in async_function_with_fallbacks
    response = await self.async_function_with_retries(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 2920, in async_function_with_retries
    response = await self.make_call(original_function, *args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 3017, in make_call
    response = await response
               ^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 947, in _acompletion
    raise e
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\router.py", line 915, in _acompletion
    response = await _response
               ^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\utils.py", line 1227, in wrapper_async
    raise e
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\utils.py", line 1083, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\main.py", line 492, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 2116, in exception_type
    raise e
  File "C:\Users\10609302\AppData\Local\pypoetry\Cache\virtualenvs\litellm-custom-0ZH-kh0y-py3.11\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 2085, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Ollama_chatException - {"error":"json: cannot unmarshal array into Go struct field ChatRequest.messages of type string"}
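For comparison, the same message sent to Ollama directly with string content succeeds, which suggests the array-to-string conversion is what's missing on the `ollama_chat` path. A quick check (assuming a local Ollama instance on the default port 11434 and the model name from this report):

```python
import requests

# String content: accepted by Ollama's /api/chat.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "taide-chat",
        "messages": [{"role": "user", "content": "hi"}],
        "stream": False,
    },
)
print(resp.status_code, resp.json().get("message", {}).get("content"))

# Array content: expected to fail with the unmarshal error above.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "taide-chat",
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": "hi"}]}
        ],
        "stream": False,
    },
)
print(resp.status_code, resp.text)
```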

