BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: support wildcard fallbacks #5971

Open krrishdholakia opened 1 day ago

krrishdholakia commented 1 day ago

The Feature

Currently, we need to add a separate model entry for each model on Azure. It would be great if we could instead do:

```yaml
model_list:
  - model_name: azure/*
    litellm_params:
      model: azure/*
      api_base: https://my-openai.openai.azure.com/
      api_version: 2024-05-01-preview
      api_key: os.environ/AZURE_OPENAI_API_KEY

litellm_settings:
  fallbacks:
    - gpt-4o:
        - azure/gpt-4o
```
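Conceptually, wildcard fallback resolution just means matching the requested model name against glob-style keys and returning the first matching fallback list. The sketch below is a hypothetical illustration of that lookup using the stdlib `fnmatch` module; it is not litellm's internal routing code, and the `resolve_fallbacks` helper and the sample fallback table are made up for this example.

```python
import fnmatch

def resolve_fallbacks(requested_model: str, fallbacks: list[dict]) -> list[str]:
    """Hypothetical helper: return the fallback models for the first
    (possibly wildcarded) key that matches the requested model name."""
    for entry in fallbacks:
        for pattern, candidates in entry.items():
            # fnmatch gives shell-style globbing, so "azure/*" matches
            # any model name routed under the azure/ prefix.
            if fnmatch.fnmatch(requested_model, pattern):
                return candidates
    return []

# Example table: an exact-match key plus a wildcard key.
fallbacks = [
    {"gpt-4o": ["azure/gpt-4o"]},
    {"azure/*": ["openai/gpt-4o-mini"]},
]

print(resolve_fallbacks("gpt-4o", fallbacks))              # → ['azure/gpt-4o']
print(resolve_fallbacks("azure/gpt-35-turbo", fallbacks))  # → ['openai/gpt-4o-mini']
```

With this kind of matching, a single `azure/*` entry in the config can cover every Azure deployment instead of one `model_list` entry per model.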

Motivation, pitch

user request

Twitter / LinkedIn details

No response

krrishdholakia commented 1 day ago

This is a no-op as it's already supported. I'll add an example of wildcard fallbacks to our docs.