BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

add custom health probes in helm chart #6851

Open mohittalele opened 17 hours ago

mohittalele commented 17 hours ago

Title

Add custom health probes in helm chart

`helm template` and deployment both work.

Description by Korbit AI

What change is being made?

Add custom health probes to the Helm chart for improved configurability of liveness, readiness, and startup probes.
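A minimal sketch of how such probes could be wired into the chart's deployment template, assuming standard Helm conventions; the key names and template structure here are illustrative, not the exact diff in this PR:

```yaml
# templates/deployment.yaml (excerpt) -- hypothetical sketch, not the actual PR diff
spec:
  template:
    spec:
      containers:
        - name: litellm
          # Render each probe only if the user defines it in values.yaml
          {{- with .Values.livenessProbe }}
          livenessProbe:
            {{- toYaml . | nindent 12 }}
          {{- end }}
          {{- with .Values.readinessProbe }}
          readinessProbe:
            {{- toYaml . | nindent 12 }}
          {{- end }}
          {{- with .Values.startupProbe }}
          startupProbe:
            {{- toYaml . | nindent 12 }}
          {{- end }}
```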

Why are these changes being made?

These changes allow for greater flexibility and customization in deploying applications using the litellm Helm chart by enabling users to specify their own health probe configurations through the values.yaml file. This approach addresses the need for different health checks in various deployment environments and simplifies the maintenance and adaptation of probe configurations.
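For example, a user might then override the probes in their values file like this. The probe paths and port shown are assumptions based on the LiteLLM proxy's documented health endpoints and default port, and the top-level key names depend on how the chart actually exposes them:

```yaml
# custom-values.yaml -- illustrative override only; actual keys depend on the chart
livenessProbe:
  httpGet:
    path: /health/liveliness   # assumed LiteLLM proxy liveness endpoint
    port: 4000                  # assumed default proxy port
  initialDelaySeconds: 30
  periodSeconds: 15
readinessProbe:
  httpGet:
    path: /health/readiness    # assumed LiteLLM proxy readiness endpoint
    port: 4000
  initialDelaySeconds: 30
  periodSeconds: 15
startupProbe:
  httpGet:
    path: /health/readiness
    port: 4000
  failureThreshold: 30
  periodSeconds: 10
```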


vercel[bot] commented 17 hours ago

The latest updates on your projects:

litellm: ✅ Ready. Preview deployed Nov 21, 2024, 3:52 pm (UTC).
mohittalele commented 17 hours ago

@krrishdholakia can you please have a look at this PR? I need custom health probes in order to use the Helm chart. Thanks