BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Bug]: Crash if lago is defined as callback with last main branch #4224

Closed: superpoussin22 closed this issue 1 week ago

superpoussin22 commented 1 week ago

What happened?

Just tested the latest main branch: litellm crashes when lago is used as a callback.

Relevant log output

File "/usr/local/lib/python3.11/site-packages/litellm/proxy/utils.py", line 247, in _init_litellm_callbacks
"NameError: name 'LagoLogger' is not defined",

Twitter / LinkedIn details

No response

ishaan-jaff commented 1 week ago

cc @krrishdholakia, might be related to recent changes in utils.py

krrishdholakia commented 1 week ago

Fixed here - https://github.com/BerriAI/litellm/commit/2623bb260f351d459e0d3a8abf149a335585a98e

should be live in v1.40.16+.

@superpoussin22 can we set up a direct support channel? Would love to learn how you're using litellm today

superpoussin22 commented 1 week ago

@krrishdholakia no more error. I'll see whether Lago receives the info, but I'm confident it will. No problem chatting about litellm.

krrishdholakia commented 1 week ago

@superpoussin22 how do you use litellm today?

(want to learn so we can improve for a specific use-case)