BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: The 429 error code hides the real cause. #5764

Closed · kmyczkowska-hypatos closed this issue 1 week ago

kmyczkowska-hypatos commented 1 week ago

What happened?

Hello, would it be possible to return the 529 error code (as originally provided by the vendor) instead of mapping it to 429? The 429 suggests the issue is on our side and hides the real cause.

Relevant log output

Error code: 429 - {'error': {'message': 'No deployments available for selected model, Try again in 60 seconds. Passed model=vertex-claude-3-5-sonnet. pre-call-checks=False, allowed_model_region=n/a, cooldown_list=[(\'...\', {\'Exception Received\': \'litellm.APIConnectionError: Server error \\\'529 Internal Server Error\\\' for url \\\'..../anthropic/models/claude-3-5-sonnet@

Twitter / LinkedIn details

No response

krrishdholakia commented 1 week ago

@kmyczkowska-hypatos if you bump to the latest version, this issue re: cooldowns is now fixed.

krrishdholakia commented 1 week ago

Please feel free to re-open this issue, or create a new one with minimal repro steps, if this persists.