BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: UnboundLocalError: cannot access local variable 'exception_provider' #4414

Closed · paul-gauthier closed this issue 3 months ago

paul-gauthier commented 3 months ago

What happened?

With litellm==1.40.26 I am seeing an UnboundLocalError coming out of litellm.completion().

File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7400, in exception_type
message=f"APIError: {exception_provider} - {message}",
                     ^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'exception_provider' where it is not associated with a value
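
For context, this is the classic Python failure mode where a local variable is only assigned on some branches. A minimal standalone sketch of the pattern (hypothetical names and branches, not litellm's actual code):

    def exception_type(error_message, provider=None):
        # `exception_provider` is only bound inside branches that recognize
        # the provider. Any path that skips them all and still reaches the
        # f-string below raises UnboundLocalError.
        if provider == "openai":
            exception_provider = "OpenAI"
        elif provider == "anthropic":
            exception_provider = "Anthropic"
        # ... no fallback assignment for other providers ...
        raise Exception(f"APIError: {exception_provider} - {error_message}")

    exception_type("boom", provider="openrouter")  # -> UnboundLocalError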

Several exceptions are chained here, but I believe it starts from an AssertionError:

File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 5203, in convert_to_model_response_object
assert response_object["choices"] is not None and isinstance(
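
The provider returned an error body rather than a normal completion, so choices is None and the assert trips. A simplified sketch of the check and the failing input (the isinstance continuation is assumed; see received_args in the log below for the real payload):

    def convert_to_model_response_object(response_object):
        # Fails on error payloads like the one in this issue:
        # {'id': None, 'choices': None, ..., 'error': {'message': '...', 'code': 400}}
        assert response_object["choices"] is not None and isinstance(
            response_object["choices"], list
        )

    convert_to_model_response_object({"choices": None, "error": {"code": 400}})
    # -> AssertionError, later wrapped as "Exception: Invalid response object"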

Relevant log output

Traceback (most recent call last):
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 5203, in convert_to_model_response_object
assert response_object["choices"] is not None and isinstance(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 825, in completion
raise e
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 792, in completion
return convert_to_model_response_object(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 5328, in convert_to_model_response_object
raise Exception(
Exception: Invalid response object Traceback (most recent call last):
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 5203, in convert_to_model_response_object
assert response_object["choices"] is not None and isinstance(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
received_args={'response_object': {'id': None, 'choices': None, 'created': None, 'model': None, 'object': None, 'service_tier': None, 'system_fingerprint': None, 'usage': None, 'error': {'message': '{"type":"error","error":{"type":"invalid_request_error","message":"Output blocked by content filtering policy"}}', 'code': 400}}, 'model_response_object': ModelResponse(id='chatcmpl-b88ce43a-7bfc-437c-b8cc-e90d59372cfb', choices=[Choices(finish_reason='stop', index=0, message=Message(content='default', role='assistant'))], created=1719376241, model='None/anthropic/claude-3.5-sonnet', object='chat.completion', system_fingerprint=None, usage=Usage()), 'response_type': 'completion', 'stream': False, 'start_time': None, 'end_time': None, 'hidden_params': None}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/main.py", line 1794, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 833, in completion
raise OpenAIError(status_code=500, message=traceback.format_exc())
litellm.llms.openai.OpenAIError: Traceback (most recent call last):
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 5203, in convert_to_model_response_object
assert response_object["choices"] is not None and isinstance(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 825, in completion
raise e
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 792, in completion
return convert_to_model_response_object(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 5328, in convert_to_model_response_object
raise Exception(
Exception: Invalid response object Traceback (most recent call last):
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 5203, in conv
ert_to_model_response_object
assert response_object["choices"] is not None and isinstance(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
received_args={'response_object': {'id': None, 'choices': None, 'created': None, 'model': None, 'object': None, 'service_tier': None, 'system_fingerprint': None, 'usage': None, 'error': {'message': '{"type":"error","error":{"type":"invalid_request_error","message":"Output blocked by content filtering policy"}}', 'code': 400}}, 'model_response_object': ModelResponse(id='chatcmpl-b88ce43a-7bfc-437c-b8cc-e90d59372cfb', choices=[Choices(finish_reason='stop', index=0, message=Message(content='default', role='assistant'))], created=1719376241, model='None/anthropic/claude-3.5-sonnet', object='chat.completion', system_fingerprint=None, usage=Usage()), 'response_type': 'completion', 'stream': False, 'start_time': None, 'end_time': None, 'hidden_params': None}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ubuntu/Projects/aider/aider/coders/base_coder.py", line 816, in send_new_user_message
yield from self.send(messages, functions=self.functions)
File "/home/ubuntu/Projects/aider/aider/coders/base_coder.py", line 1052, in send
hash_object, completion = send_with_retries(
^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/backoff/_sync.py", line 105, in retry
ret = target(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider/aider/sendchat.py", line 71, in send_with_retries
res = litellm.completion(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 956, in wrapper
raise e
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 851, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/main.py", line 2583, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7470, in exception_type
raise e
File "/home/ubuntu/Projects/aider-swe-bench/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7400, in exception_type
message=f"APIError: {exception_provider} - {message}",
^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'exception_provider' where it is not associated with a value
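
Until this is fixed upstream, callers can at least fail readably. A caller-side workaround sketch (safe_completion is a hypothetical helper; it only wraps the call, it does not recover the blocked completion):

    import litellm

    def safe_completion(**kwargs):
        try:
            return litellm.completion(**kwargs)
        except UnboundLocalError as e:
            # In litellm==1.40.26 this escapes from litellm.utils.exception_type
            # when the provider returns an error body (e.g. content filtering).
            raise RuntimeError(f"litellm failed to map a provider error: {e}") from e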

Twitter / LinkedIn details

No response

krrishdholakia commented 3 months ago

acknowledging this

krrishdholakia commented 3 months ago

received_args={'response_object': {'id': None, 'choices': None, 'created': None, 'model': None, 'object': None, 'service_tier': None, 'system_fingerprint': None, 'usage': None, 'error': {'message': '{"type":"error","error":{"type":"invalid_request_error","message":"Output blocked by content filtering policy"}}', 'code': 400}}, 'model_response_object': ModelResponse(id='chatcmpl-b88ce43a-7bfc-437c-b8cc-e90d59372cfb', choices=[Choices(finish_reason='stop', index=0, message=Message(content='default', role='assistant'))], created=1719376241, model='None/anthropic/claude-3.5-sonnet', object='chat.completion', system_fingerprint=None, usage=Usage()), 'response_type': 'completion', 'stream': False, 'start_time': None, 'end_time': None, 'hidden_params': None}

Looks like this is raised when a bad request (400) error comes back from the provider.
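
The UnboundLocalError half of this is straightforward to harden: give exception_provider a default before any branching, so the error-formatting path can never hit an unbound local. A sketch of that pattern (not necessarily the exact change made in litellm):

    def exception_type(error_message, provider=None):
        # Defensive default: always bound, even for providers no branch recognizes.
        exception_provider = provider or "unknown provider"
        if provider == "openai":
            exception_provider = "OpenAI"
        # ... provider-specific branches ...
        raise Exception(f"APIError: {exception_provider} - {error_message}")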

krrishdholakia commented 3 months ago

@paul-gauthier I also looked at the way OpenRouter returns errors and added a check to raise this correctly: https://github.com/BerriAI/litellm/commit/ca04244a0ab76291a819f0f9a475f5e0706d0808
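
For reference, the shape of such a check would be roughly: detect an error body before trying to convert it into a ModelResponse, and raise a proper API error instead of asserting on the missing choices. A sketch under the assumption that the response dict carries an 'error' key, as in the received_args above (this is not the literal commit diff):

    def convert_to_model_response_object(response_object):
        error = (response_object or {}).get("error")
        if error:
            # e.g. OpenRouter content-filter rejections arrive this way:
            # {'message': '{"type":"error",...}', 'code': 400}
            raise Exception(
                f"Provider returned an error (code {error.get('code')}): "
                f"{error.get('message')}"
            )
        # ... normal conversion continues here ...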