BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

Python SDK doesn't have an option to get the list of models from a provider #5894

Open soorisoft opened 1 month ago

soorisoft commented 1 month ago

Referring to a couple of provider APIs that return the list of supported models: https://platform.openai.com/docs/api-reference/models/list https://docs.mistral.ai/api/#tag/models/operation/list_models_v1_models_get

It seems LiteLLM has this function - https://github.com/BerriAI/litellm/blob/main/litellm/router.py#L4691 - which returns the list of models configured within litellm. Is there a way to get the list of supported models directly from providers like Azure/OpenAI/Mistral etc.? If not, would it make sense to implement this in the litellm Python SDK?
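As a workaround until such a helper exists, the provider endpoints linked above can be called directly. A minimal sketch, assuming Bearer-token auth and the OpenAI-style `{"data": [{"id": ...}]}` response shape both endpoints document (the function names and env var here are hypothetical, not part of litellm):

```python
import json
import os
import urllib.request

# Documented "list models" endpoints for two providers (from the links above).
PROVIDER_MODEL_ENDPOINTS = {
    "openai": "https://api.openai.com/v1/models",
    "mistral": "https://api.mistral.ai/v1/models",
}


def list_provider_models(provider: str, api_key: str) -> list[str]:
    """Fetch model IDs straight from a provider's /models endpoint."""
    request = urllib.request.Request(
        PROVIDER_MODEL_ENDPOINTS[provider],
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(request) as response:
        payload = json.load(response)
    return extract_model_ids(payload)


def extract_model_ids(payload: dict) -> list[str]:
    """Pull model IDs out of an OpenAI-style list response:
    {"object": "list", "data": [{"id": "...", ...}, ...]}
    """
    return [entry["id"] for entry in payload.get("data", [])]


if __name__ == "__main__" and "OPENAI_API_KEY" in os.environ:
    # Hypothetical usage; requires a valid key in the environment.
    print(list_provider_models("openai", os.environ["OPENAI_API_KEY"]))
```

Azure OpenAI would need different handling (API-key header and an `api-version` query parameter), which is part of why a per-provider helper inside litellm would be useful.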

krrishdholakia commented 1 month ago

We don't have a helper for that yet, but it would definitely make sense.

Would welcome a PR here @soorisoft