Closed kun432 closed 6 months ago
looking into this
hmm I don't see the error on my local machine
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/litellm/main.py", line 289, in acompletion
response = await loop.run_in_executor(None, func_with_context) # type: ignore
File "/usr/local/lib/python3.9/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
File "/usr/local/lib/python3.9/site-packages/litellm/utils.py", line 2732, in wrapper
return litellm.completion_with_retries(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/litellm/main.py", line 2091, in completion_with_retries
return retryer(original_function, *args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 379, in __call__
do = self.iter(retry_state=retry_state)
File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 325, in iter
raise retry_exc.reraise()
File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 158, in reraise
raise self.last_attempt.result()
File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 439, in result
return self.__get_result()
File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result
raise self._exception
File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 382, in __call__
result = fn(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/litellm/utils.py", line 2761, in wrapper
raise e
File "/usr/local/lib/python3.9/site-packages/litellm/utils.py", line 2662, in wrapper
result = original_function(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/litellm/main.py", line 2059, in completion
raise exception_type(
File "/usr/local/lib/python3.9/site-packages/litellm/utils.py", line 8229, in exception_type
raise e
File "/usr/local/lib/python3.9/site-packages/litellm/utils.py", line 7341, in exception_type
raise ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: BedrockException - An error occurred (ResourceNotFoundException) when calling the InvokeModel operation: Could not resolve the foundation model from the provided model identifier.
Okay I'm able to repro this on a new server
I believe this is because the completion call goes through completion_with_retries.
Is this necessary since the router has its own retry logic? @ishaan-jaff
Fix - it should not even go through this logic when on the router/proxy: https://github.com/BerriAI/litellm/pull/2620
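To illustrate the double-retry concern, here is a minimal sketch in plain Python (not litellm's actual code; the attempt counts, function names, and layering are assumptions): an inner completion_with_retries-style wrapper retries every exception, including non-transient ones like the ResourceNotFoundException above, and a router layer then retries the whole call again, multiplying the wasted attempts.

```python
class ServiceUnavailableError(Exception):
    """Stand-in for litellm.exceptions.ServiceUnavailableError."""

def with_retries(fn, attempts=3):
    # Mimics a completion_with_retries-style wrapper: it retries on ANY
    # exception, even errors that can never succeed on retry.
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as e:
            last_exc = e
    raise last_exc  # re-raise the last failure, like tenacity's reraise()

calls = 0

def invoke_model():
    # A ResourceNotFoundException is not transient: every retry fails the
    # same way and only adds latency.
    global calls
    calls += 1
    raise ServiceUnavailableError("Could not resolve the foundation model")

def router_call(num_retries=2):
    # The router layer retries the already-retried call again.
    for _ in range(num_retries + 1):
        try:
            return with_retries(invoke_model)
        except ServiceUnavailableError:
            pass
    raise ServiceUnavailableError("all retries exhausted")

try:
    router_call()
except ServiceUnavailableError:
    pass

print(calls)  # 3 inner attempts x 3 router attempts = 9 calls for one request
```

With both layers active, a single failing request triggers nine upstream calls, which is why skipping the inner retry wrapper when running behind the router/proxy makes sense.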
@kun432 we'd love to get on a call and learn how we can improve litellm proxy for you. Any chance you're free sometime this week? Sharing my cal for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Here's my linkedin if you want to message https://www.linkedin.com/in/reffajnaahsi/
thank you for the fix! and sorry for the late reply.
this happened only once after the server started and didn't seem like a big deal on my end, but I found it and just reported it. I tried the current version and confirmed this error is now handled correctly, like:
{"error":{"message":"BedrockException - BedrockException - An error occurred (ResourceNotFoundException) when calling the InvokeModel operation: Could not resolve the foundation model from the provided model identifier.","type":null,"param":null,"code":500}}
thanks.
What happened?
Using litellm as a proxy with docker compose, it seems the latest image does not include tenacity, and it raises an error when a request fails.

litellm_config.yaml:
docker-compose.yml:
log:
but somehow the logs show that retrying seems to be working. I can see the following logs 2 times:
and also this 1 time:
I don't know if this is a bug or not.
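One way to confirm whether tenacity is actually missing from the image is to run an importability check inside the container (e.g. via `docker compose exec`; the exact service name depends on your docker-compose.yml):

```python
# Check whether tenacity is importable without actually importing it;
# if this prints False, litellm's completion_with_retries path cannot work.
import importlib.util

has_tenacity = importlib.util.find_spec("tenacity") is not None
print(has_tenacity)
```

If this prints `False`, the image is missing the dependency and any code path that reaches completion_with_retries will fail with an import error.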
Relevant log output
Twitter / LinkedIn details
No response