BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: LiteLLM doesn't support Redis Sentinel !!! #4674

Closed 2 weeks ago

cheng92hao commented 1 month ago

What happened?

LiteLLM doesn't support connecting to Redis through Redis Sentinel, so it can only be pointed at a single fixed Redis host. Please add Sentinel support.
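
For reference, the kind of Sentinel-aware connection being requested looks roughly like the sketch below. It uses redis-py's `Sentinel` client directly; the hostnames, ports, and the `mymaster` service name are placeholders, and wiring such a client into LiteLLM's cache/config is exactly the part that doesn't exist yet.

```python
# Minimal sketch of a Redis Sentinel connection with redis-py.
# Hostnames, ports, and the "mymaster" service name are placeholders;
# LiteLLM does not expose an option like this today (that is the request).
from redis.sentinel import Sentinel

# Sentinel nodes monitor the Redis master and handle failover; clients ask
# them which node is currently the master instead of hard-coding one host.
sentinel = Sentinel(
    [("sentinel-1.example.com", 26379), ("sentinel-2.example.com", 26379)],
    socket_timeout=0.5,
)

# Resolve the current master for the monitored service and use it like a
# normal Redis connection (sentinel.slave_for() would give a read replica).
master = sentinel.master_for("mymaster", socket_timeout=0.5)
master.set("litellm-test-key", "ok")
print(master.get("litellm-test-key"))
```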

Relevant log output

No response

Twitter / LinkedIn details

No response

acuciureanu commented 1 month ago

This can't be a bug if it doesn't support Redis Sentinel. Isn't this more like a feature request?

Also, this is a dup of #4381

krrishdholakia commented 2 weeks ago

Thanks @acuciureanu, closing as a duplicate of #4381.