It seems LiteLLM has this function - https://github.com/BerriAI/litellm/blob/main/litellm/router.py#L4691 - which returns the list of models configured within LiteLLM. Is there a way to get the list of supported models directly from providers like Azure/OpenAI/Mistral etc.? If not, would it make sense to implement this in the LiteLLM Python SDK?
For reference, a couple of provider APIs that return the list of supported models:
- https://platform.openai.com/docs/api-reference/models/list
- https://docs.mistral.ai/api/#tag/models/operation/list_models_v1_models_get
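For context, here is a rough sketch of what such a helper could do under the hood, calling the two documented endpoints above directly with `requests`. The function names and structure are hypothetical and not part of LiteLLM today; the endpoints and response shape (`data` list of objects with an `id`) are taken from the docs linked above:

```python
import os
import requests


def list_openai_models(api_key: str) -> list[str]:
    """Return model IDs from OpenAI's GET /v1/models endpoint."""
    resp = requests.get(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    resp.raise_for_status()
    return [m["id"] for m in resp.json()["data"]]


def list_mistral_models(api_key: str) -> list[str]:
    """Return model IDs from Mistral's GET /v1/models endpoint."""
    resp = requests.get(
        "https://api.mistral.ai/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    resp.raise_for_status()
    return [m["id"] for m in resp.json()["data"]]


if __name__ == "__main__":
    # Assumes the standard API-key environment variables are set.
    print(list_openai_models(os.environ["OPENAI_API_KEY"]))
    print(list_mistral_models(os.environ["MISTRAL_API_KEY"]))
```

Something like this, behind a common `provider` argument, is roughly what I'd imagine the SDK exposing.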