BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

fix: duplicate exception_type for gemini #6768

Open · nobu007 opened 5 days ago

nobu007 commented 5 days ago

Type

๐Ÿ› Bug Fix ๐Ÿงน Refactoring

Changes

The exception_type for "gemini" is currently handled the same way as "palm". The original "gemini" handling was duplicated across both the "vertex_ai" and "palm" branches.

Recently, Gemini has frequently been returning 503 errors, so this change adds a switch case that handles "gemini" the same way as "vertex_ai".
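As a rough illustration of that switch case, here is a minimal Python sketch. `map_gemini_exception` is a hypothetical helper, not litellm's actual code (the real mapping lives in litellm's `exception_type` utility, and exception constructor signatures may differ by version):

```python
# Hypothetical sketch, not litellm's actual code: route "gemini" through the
# same status-code switch used for "vertex_ai", so a 503 surfaces as a
# retryable ServiceUnavailableError instead of falling into the "palm" branch.
import litellm


def map_gemini_exception(error_message: str, status_code: int, model: str):
    """Translate a raw Gemini API error into a typed litellm exception."""
    if status_code == 503:
        # "The model is overloaded. Please try again later." -> retryable.
        raise litellm.ServiceUnavailableError(
            message=f"GeminiException - {error_message}",
            llm_provider="gemini",
            model=model,
        )
    if status_code == 400:
        raise litellm.BadRequestError(
            message=f"GeminiException - {error_message}",
            llm_provider="gemini",
            model=model,
        )
    # Anything unmapped is re-raised unchanged.
    raise Exception(error_message)
```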

This is sample output. Note: `logger_fn` in `exception_logging()` is always `None`, which made the code flow hard to follow at first.

```
{'exception': litellm.BadRequestError: GeminiException - {
  "error": {
    "code": 503,
    "message": "The model is overloaded. Please try again later.",
    "status": "UNAVAILABLE"
  }
}
```
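Once the 503 maps to `ServiceUnavailableError`, callers can retry on overload instead of failing on a misleading `BadRequestError`. A minimal caller-side sketch, with an illustrative model name and backoff policy:

```python
import time

import litellm


def complete_with_retry(prompt: str, attempts: int = 3):
    """Retry Gemini calls that fail with a transient 503 overload error."""
    for attempt in range(attempts):
        try:
            return litellm.completion(
                model="gemini/gemini-pro",
                messages=[{"role": "user", "content": prompt}],
            )
        except litellm.ServiceUnavailableError:
            # Exponential backoff before retrying the overloaded model.
            time.sleep(2 ** attempt)
    raise RuntimeError("Gemini remained overloaded after retries")
```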

vercel[bot] commented 5 days ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Nov 16, 2024 3:50pm |