BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

(fix) don't block proxy startup if license check fails & using prometheus #6839

Closed ishaan-jaff closed 1 day ago

ishaan-jaff commented 1 day ago
  1. Modify license check to be non-blocking for Prometheus, preventing startup interruption
  2. Create comprehensive tests validating proxy initialization, health endpoints, and chat completion functionality
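The non-blocking pattern in step 1 can be sketched as follows. This is an illustrative sketch only; the function and module names (`check_license`, `init_prometheus_metrics`, `start_proxy`) are hypothetical stand-ins, not litellm's actual API:

```python
# Sketch: a failed license check is caught and logged instead of
# aborting startup, so the proxy comes up with non-premium features.
import logging

logger = logging.getLogger("proxy_startup")


def check_license() -> bool:
    """Stand-in for a remote license validation call that may fail."""
    raise ConnectionError("license server unreachable")


def init_prometheus_metrics(premium: bool) -> str:
    """Enable premium Prometheus metrics only when the license is valid."""
    return "premium metrics enabled" if premium else "basic metrics only"


def start_proxy() -> str:
    try:
        premium = check_license()
    except Exception as exc:
        # Log and continue: a failed license check must not block startup.
        logger.warning(
            "License check failed, continuing without premium features: %s", exc
        )
        premium = False
    return init_prometheus_metrics(premium)
```

With this structure, `start_proxy()` still returns successfully even when the license server is unreachable.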

Relevant issues

Type

🐛 Bug Fix

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

Description by Korbit AI

What change is being made?

Modify the code to prevent blocking the proxy startup if the license check fails while using Prometheus and add comprehensive tests for basic proxy startup functionality, including testing health endpoints and chat completions.

Why are these changes being made?

These changes improve the robustness of the proxy startup by allowing it to continue operating with non-premium features despite license check failures, thereby enhancing user experience. The additional tests are crucial to ensure that fundamental proxy functionalities operate correctly under various scenarios, mitigating the risk of undetected issues during deployment.
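The kind of startup test described above can be sketched with only the standard library: spin up a stub server exposing a `/health` endpoint and assert it responds with HTTP 200. This is a hypothetical illustration of the testing approach, not litellm's actual test suite:

```python
# Sketch: verify a minimal proxy-style /health endpoint responds 200.
import http.client
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class StubProxyHandler(BaseHTTPRequestHandler):
    """Tiny stand-in for the proxy's health endpoint."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "healthy"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging in the test output.
        pass


def check_health(port: int) -> int:
    """Hit /health on the stub server and return the HTTP status code."""
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("GET", "/health")
    status = conn.getresponse().status
    conn.close()
    return status


server = HTTPServer(("127.0.0.1", 0), StubProxyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status = check_health(server.server_address[1])
server.shutdown()
print(status)  # → 200
```

A real test would point `check_health` at the running proxy instead of a stub, and add a similar request against the chat completions route.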


vercel[bot] commented 1 day ago

The latest updates on your projects. Learn more about Vercel for Git.

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Nov 21, 2024 1:30am |