BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: register_model trigger some ugly debug prints #5982

Open antoine-lizee opened 3 weeks ago

antoine-lizee commented 3 weeks ago

What happened?

This is technically a bug: it doesn't break anything, but it's quite annoying because it floods our logs.

What's happening is that when you call litellm.register_model(), it checks whether a model already exists for the given key by calling get_model_info(), which itself calls get_llm_provider() (twice!?), which contains these rather odd lines:

            if litellm.suppress_debug_info == False:
                print()  # noqa
                print(  # noqa
                    "\033[1;31mProvider List: https://docs.litellm.ai/docs/providers\033[0m"  # noqa
                )  # noqa
                print()  # noqa

...before raising the error.

Problems I see:

In this case, I would vote for just removing the print() statements, given that the error message is quite clear and already references the URL. If you catch the exception, nothing happens, as expected (including when we register models); if you don't catch it, the error is shown, and it's much more helpful than these print statements.

Relevant log output

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Twitter / LinkedIn details

No response

krrishdholakia commented 3 weeks ago

call litellm.register_model()

Can you share a sample script for repro? @antoine-lizee

antoine-lizee commented 3 weeks ago

I'm not at my computer, but anything like this would print it:

litellm.register_model({"some_new_model_with_unknown_or_no_provider": {...}})

I can find a better example tomorrow.