BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: clarifai Streaming not working #4543

Open gamenerd457 opened 2 months ago

gamenerd457 commented 2 months ago

What happened?

Calling Mistral-Large via the Clarifai provider with `stream=True` raises an error:

```python
from litellm import completion

# `messages` was not shown in the original report; a minimal example payload:
messages = [{"role": "user", "content": "Hello"}]

response = completion(
    model="clarifai/mistralai.completion.mistral-large",
    messages=messages,
    stream=True,
)

for msg in response:
    print(msg)
```
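Until the provider-side fix lands, the stream can be drained defensively so a single malformed chunk doesn't abort the client with an unhandled exception. This is a generic sketch (not LiteLLM-specific); note that once a Python generator raises, it is closed, so this degrades to "stop gracefully" rather than "skip and resume":

```python
def safe_stream(stream):
    """Drain a streaming iterator without letting one bad chunk crash the caller.

    Generic sketch only; `stream` is any iterator that may raise mid-iteration,
    like the failing clarifai stream above.
    """
    it = iter(stream)
    while True:
        try:
            yield next(it)
        except StopIteration:
            return
        except Exception as exc:  # e.g. litellm.APIConnectionError
            print(f"stream aborted by bad chunk: {exc}")
            return

def faulty_stream():
    # Stand-in for the clarifai response stream: two good chunks, then a failure.
    yield {"content": "Hel"}
    yield {"content": "lo"}
    raise RuntimeError("'finish_reason'")

chunks = list(safe_stream(faulty_stream()))
print(chunks)  # the two chunks received before the error
```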

Relevant log output

```
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Provider List: https://docs.litellm.ai/docs/providers

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/litellm/utils.py in chunk_creator(self, chunk)
   8831                 if response_obj["is_finished"]:
-> 8832                     self.received_finish_reason = response_obj["finish_reason"]
   8833             elif self.model == "replicate" or self.custom_llm_provider == "replicate":

KeyError: 'finish_reason'

During handling of the above exception, another exception occurred:

APIConnectionError                        Traceback (most recent call last)
8 frames
APIConnectionError: litellm.APIConnectionError: 'finish_reason'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 8832, in chunk_creator
    self.received_finish_reason = response_obj["finish_reason"]
KeyError: 'finish_reason'

During handling of the above exception, another exception occurred:

APIConnectionError                        Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/litellm/utils.py in exception_type(model, original_exception, custom_llm_provider, completion_kwargs, extra_kwargs)
   7606             exception_mapping_worked = True
   7607             if hasattr(original_exception, "request"):
-> 7608                 raise APIConnectionError(
   7609                     message="{}\n{}".format(
   7610                         str(original_exception), traceback.format_exc()

APIConnectionError: litellm.APIConnectionError: litellm.APIConnectionError: 'finish_reason'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 8832, in chunk_creator
    self.received_finish_reason = response_obj["finish_reason"]
KeyError: 'finish_reason'

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 8832, in chunk_creator
    self.received_finish_reason = response_obj["finish_reason"]
KeyError: 'finish_reason'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 9538, in __next__
    response: Optional[ModelResponse] = self.chunk_creator(chunk=chunk)
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 9468, in chunk_creator
    raise exception_type(
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 7644, in exception_type
    raise e
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 7617, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: 'finish_reason'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 8832, in chunk_creator
    self.received_finish_reason = response_obj["finish_reason"]
KeyError: 'finish_reason'
```
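The root cause is visible in the traceback: `chunk_creator` indexes `response_obj["finish_reason"]` unconditionally once a chunk reports `is_finished`, but the clarifai stream apparently emits finished chunks without that key. Below is a minimal reconstruction of the failure and a hedged defensive alternative; the names mirror the traceback but this is illustrative, not LiteLLM's actual code or fix:

```python
# Illustrative reconstruction of the pattern seen in utils.py / chunk_creator.

def handle_chunk_strict(response_obj):
    # The failing pattern: direct indexing assumes the provider always
    # sends "finish_reason" alongside is_finished=True.
    if response_obj["is_finished"]:
        return response_obj["finish_reason"]  # KeyError on clarifai chunks
    return None

def handle_chunk_defensive(response_obj):
    # Defensive variant: tolerate chunks that omit "finish_reason".
    if response_obj.get("is_finished"):
        return response_obj.get("finish_reason", "stop")
    return None

bad_chunk = {"is_finished": True}  # a finished chunk missing "finish_reason"

try:
    handle_chunk_strict(bad_chunk)
except KeyError as exc:
    print(f"strict handler raised KeyError: {exc}")  # reproduces the bug

print(handle_chunk_defensive(bad_chunk))  # prints: stop
```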


nitinbhojwani commented 1 month ago

@gamenerd457 @ishaan-jaff This will most likely be fixed as part of https://github.com/BerriAI/litellm/pull/4170

@mogith-pn Please confirm.