BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Bug]: Stream requests failing for openai compatible endpoints. #4429

Closed: letmefocus closed this issue 4 days ago

letmefocus commented 4 days ago

What happened?

When using an OpenAI-compatible endpoint with streaming enabled, the server requests and receives a text/event-stream response but tries to parse it as a JSON object; parsing the streamed chunks then fails in handle_openai_chat_completion_chunk with TypeError: object of type 'NoneType' has no len() (see the traceback under "Relevant log output").
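A minimal reproduction sketch, assuming any OpenAI-compatible backend reachable over HTTP; the model name, api_base, and API key below are placeholders, not taken from the report:

```python
import litellm

# Placeholders: point these at any OpenAI-compatible server.
response = litellm.completion(
    model="openai/my-local-model",        # "openai/" prefix routes to the OpenAI-compatible handler
    api_base="http://localhost:8080/v1",  # hypothetical OpenAI-compatible endpoint
    api_key="sk-placeholder",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,                          # server responds with text/event-stream
)

# Iterating the stream runs each SSE chunk through
# handle_openai_chat_completion_chunk, which is where the
# TypeError in the log below is raised.
for chunk in response:
    print(chunk)
```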

Relevant log output

INFO:     127.0.0.1:54930 - "POST /v1/chat/completions HTTP/1.0" 200 OK
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 9381, in handle_openai_chat_completion_chunk
    if len(str_line.choices) > 0:
TypeError: object of type 'NoneType' has no len()
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 7601, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: OpenAIException - object of type 'NoneType' has no len()
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/proxy/proxy_server.py", line 3069, in async_data_generator
    async for chunk in response:
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 10447, in __anext__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 10345, in __anext__
    processed_chunk: Optional[ModelResponse] = self.chunk_creator(
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 10200, in chunk_creator
    raise exception_type(
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 8754, in exception_type
    raise original_exception
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 9972, in chunk_creator
    response_obj = self.handle_openai_chat_completion_chunk(chunk)
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 9418, in handle_openai_chat_completion_chunk
    raise e
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 9381, in handle_openai_chat_completion_chunk
    if len(str_line.choices) > 0:
TypeError: object of type 'NoneType' has no len()
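For context, the failing check at utils.py line 9381 corresponds to a pattern like the one below; this is an illustrative sketch with a stand-in chunk type, not the actual litellm source. A None check before len() avoids the crash:

```python
from typing import List, Optional


class Chunk:
    """Stand-in for a parsed streaming chunk; some OpenAI-compatible
    backends emit chunks whose `choices` field is None."""

    def __init__(self, choices: Optional[List[dict]] = None):
        self.choices = choices


def handle_chunk(chunk: Chunk) -> Optional[dict]:
    # Failing pattern from the traceback:
    #     if len(str_line.choices) > 0:
    # raises TypeError when choices is None.
    #
    # Defensive variant: treat a None or empty choices field as "no delta".
    if chunk.choices is not None and len(chunk.choices) > 0:
        return chunk.choices[0]
    return None


print(handle_chunk(Chunk(None)))                            # None, no TypeError
print(handle_chunk(Chunk([{"delta": {"content": "hi"}}])))  # {'delta': {'content': 'hi'}}
```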

Twitter / LinkedIn details

twitter: @ellsiecodes - https://x.com/ellsiecodes

ishaan-jaff commented 4 days ago

fixed here: https://github.com/BerriAI/litellm/commit/57852bada9075b4d831d80776bd057fb2905cb30
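To confirm the fix against a proxy built from that commit, streaming through the proxy with the OpenAI client should now yield chunks instead of raising the APIConnectionError; the base_url, API key, and model name below are placeholders for a local proxy setup:

```python
from openai import OpenAI

# Placeholders: a locally running litellm proxy and whatever model it routes to.
client = OpenAI(base_url="http://127.0.0.1:4000/v1", api_key="sk-anything")

stream = client.chat.completions.create(
    model="my-proxy-model",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)

for chunk in stream:
    # Chunks with no choices or an empty delta are skipped rather than crashing the stream.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```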