BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Cannot disable logging properly #6813

Closed · Seluj78 closed this issue 1 day ago

Seluj78 commented 2 days ago

What happened?

I want to disable all logging from litellm, or at least set it to logging.WARNING, for the Python SDK.

I've tried everything, and there's no documentation on how to do it.
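
For reference, this is roughly the kind of standard-library approach I expected to work (just a sketch; the "LiteLLM" logger names are my assumption about how the SDK registers its loggers):

```python
import logging

# Raise litellm's SDK logger to WARNING so info/debug output is suppressed.
# Assumption: the SDK uses the stdlib logging module with logger names that
# start with "LiteLLM" (e.g. "LiteLLM", "LiteLLM Router", "LiteLLM Proxy").
logging.getLogger("LiteLLM").setLevel(logging.WARNING)

# Broader variant: quiet every registered logger whose name mentions litellm.
for name in list(logging.root.manager.loggerDict):
    if "litellm" in name.lower():
        logging.getLogger(name).setLevel(logging.WARNING)
```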

Relevant log output

No response

Twitter / LinkedIn details

No response

krrishdholakia commented 1 day ago

I believe you can just call litellm._logging.disable_debugging():

https://github.com/BerriAI/litellm/blob/ddfe687b13e9f31db2fb2322887804e3d01dd467/litellm/_logging.py#L88
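
A minimal usage sketch, assuming that helper exists in the installed version (it lives in the internal litellm/_logging.py module linked above, so the exact name may change between releases; the call is guarded for that reason):

```python
import logging

import litellm._logging

# Try the helper suggested above. It is an internal (underscore-prefixed)
# module, so the function name may differ between litellm versions.
try:
    litellm._logging.disable_debugging()
except AttributeError:
    # Fallback assumption: the SDK loggers are stdlib loggers named "LiteLLM*",
    # so raising their level to WARNING hides info/debug output.
    for name in ("LiteLLM", "LiteLLM Router", "LiteLLM Proxy"):
        logging.getLogger(name).setLevel(logging.WARNING)
```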