BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Bug]: models_by_provider not updated; utils.get_valid_models() returns incomplete list #4240

Closed · liffiton closed this 1 week ago

liffiton commented 1 week ago

What happened?

utils.get_valid_models() returns an incomplete list: it omits all gemini/* models, for example. The cause is that the models_by_provider dict hasn't been updated to include all providers.
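
For context, a minimal reproduction sketch. If I'm reading the helper right, it detects providers from `<PROVIDER>_API_KEY` environment variables (only the key's presence matters, not its value) and then looks each provider up in models_by_provider:

```python
import os
from litellm.utils import get_valid_models

# Presence of the env var is what marks the provider as available;
# the key's value isn't validated by this call.
os.environ["GEMINI_API_KEY"] = "placeholder"

valid = get_valid_models()
# Expected to include gemini/* models, but none show up because
# litellm.models_by_provider has no "gemini" entry:
print([m for m in valid if m.startswith("gemini")])  # -> []
```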

It looks like you might be moving toward a single source of truth, with all model info parsed out of model_prices_and_context_window.json. That would be a great way to fix this issue and prevent similar ones in the future, and it would probably simplify a lot of the code as well.
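
Roughly what I mean, as a sketch (the per-model "litellm_provider" field and the "sample_spec" placeholder entry are what that JSON file uses today; the derivation itself is just a suggestion, not existing litellm code):

```python
import json
from collections import defaultdict

# Derive models_by_provider from the pricing/context-window JSON instead
# of maintaining it by hand, so newly added providers can't be forgotten.
with open("model_prices_and_context_window.json") as f:
    model_info = json.load(f)

models_by_provider: dict[str, list[str]] = defaultdict(list)
for model_name, spec in model_info.items():
    if model_name == "sample_spec":  # documentation entry, not a real model
        continue
    provider = spec.get("litellm_provider")
    if provider:
        models_by_provider[provider].append(model_name)

print(models_by_provider["gemini"][:3])
```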


krrishdholakia commented 1 week ago

Hey @liffiton, how're you using get_valid_models() today? It would be helpful to understand the context.

liffiton commented 1 week ago

@krrishdholakia, I have a command-line utility where the user enters a model as a command-line argument. I was going to use get_valid_models() to validate the model string before proceeding. When that didn't work, I saw valid_model() (which, by the way, doesn't appear to be used anywhere and could perhaps be removed) and check_valid_key(), and then I implemented something like those myself. In my case, it's probably better to check that the model can actually be used (not just that it exists) anyway, along the lines of the sketch below.
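
Something like this, using litellm.completion as the cheapest end-to-end check (the helper name and prompt are illustrative, not my actual code):

```python
import litellm

def model_is_usable(model: str) -> bool:
    """Return True if a minimal completion call against `model` succeeds."""
    try:
        litellm.completion(
            model=model,
            messages=[{"role": "user", "content": "ping"}],
            max_tokens=1,
        )
        return True
    except Exception:
        return False
```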