Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
[Feature]: support wildcard fallbacks #5971
Open
krrishdholakia opened 1 day ago
The Feature
Currently, we need to add a separate model entry for each model on Azure. It would be great if we could instead do:
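A hedged sketch of what this might look like in the proxy `config.yaml` — the single `azure/*` wildcard entry and the wildcard key inside `fallbacks` are the proposed syntax being requested here, not confirmed current behavior:

```yaml
model_list:
  # Proposed: one wildcard entry instead of a separate
  # model entry per Azure deployment
  - model_name: "azure/*"
    litellm_params:
      model: "azure/*"
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY

litellm_settings:
  # Proposed: a wildcard fallback rule that applies to
  # every model matching "azure/*"
  fallbacks:
    - "azure/*": ["gpt-3.5-turbo"]
```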
Motivation, pitch
user request
Twitter / LinkedIn details
No response