BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: NameError: name 'GenericAPILogger' is not defined when applying Custom Callback APIs (generic) with the proxy server #3290

Open hiep-dinh opened 4 months ago

hiep-dinh commented 4 months ago

What happened?

When I apply the Custom Callback APIs [Async], it does not work. I also see that the enterprise directory is not included in the installed library.

Relevant log output

No response

Twitter / LinkedIn details

No response

krrishdholakia commented 4 months ago

@hiep-dinh have we already spoken? We manually onboard enterprise users - https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

hiep-dinh commented 4 months ago

Thank you very much @krrishdholakia. Apologies, there was a mistake in the information I originally raised. Version: 1.35.26

  1. In proxy/utils.py we import GenericAPILogger, but I don't see the module enterprise.enterprise_callbacks.generic_api_callback in the installed library: `from .proxy.enterprise.enterprise_callbacks.generic_api_callback import ( GenericAPILogger, )` (screenshot 01; see the guarded-import sketch at the end of this comment)

  2. Some of the debug information is not correct (screenshot 02)

  3. This is my configuration for the proxy server with the generic callback (screenshots 03, 04; a rough sketch of such a config is shown below)
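Since the configuration screenshots did not come through, here is a rough sketch of what a proxy config.yaml enabling the generic API logger can look like. The callback name "generic_api" and the GENERIC_LOGGER_ENDPOINT environment variable are assumptions for illustration, not values taken from the screenshots; check the enterprise callback docs for the exact keys.

```yaml
# Sketch only: a minimal proxy config with a logging callback enabled.
# "generic_api" and GENERIC_LOGGER_ENDPOINT are assumed names, not confirmed values.
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  callbacks: ["generic_api"]   # assumed name of the enterprise generic API logger

environment_variables:
  GENERIC_LOGGER_ENDPOINT: "https://logging.example.com/ingest"   # assumed env var
```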

Note: the Custom Callback APIs [Async] are working without issue.
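On point 1: if the enterprise directory is not shipped with the installed package, an unconditional import of GenericAPILogger fails, and any later use of the name raises the NameError from the title. As a rough sketch (a general pattern, not the actual contents of litellm's proxy/utils.py), a guarded import avoids the hard failure:

```python
# Sketch of a guarded import for an optional enterprise module; not the actual
# litellm proxy/utils.py code. If the enterprise directory is missing from the
# installed package, GenericAPILogger is set to None instead of leaving the
# name undefined, so callers can check availability explicitly.
try:
    from litellm.proxy.enterprise.enterprise_callbacks.generic_api_callback import (
        GenericAPILogger,
    )
except ImportError:
    GenericAPILogger = None


def generic_logger_available() -> bool:
    """Return True if the enterprise generic API logger can be used in this install."""
    return GenericAPILogger is not None
```

The same availability check could then gate whatever constructs the logger at proxy startup, so a missing enterprise package produces a clear message instead of a NameError.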