BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: improved diagnostic logging for invalid requests (litellm-proxy) #5836

Open · lee-b opened this issue 1 month ago

lee-b commented 1 month ago

The Feature

When connecting to litellm-proxy with msty, I get:

"litellm-proxy-1 | WARNING: Invalid HTTP request received."

This is not very informative on its own. The proxy config works fine from other clients (like open-webui), but from this log entry I really have no idea what is going wrong with msty. I'm running with --debug and --detailed_debug, and with LITELLM_LOG set to TRACE, but that's all I get in the logs regardless.
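
A minimal way to check whether the warning is tied to litellm's logging at all: send deliberately non-HTTP bytes to the proxy port. If the same message appears, it is coming from the HTTP parsing layer before litellm's debug flags apply. This is just a sketch; the host and port below are assumptions for a locally running proxy.

```python
import socket

# Assumed address of a locally running litellm-proxy (default port is 4000).
HOST, PORT = "127.0.0.1", 4000

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    # Deliberately non-HTTP bytes, roughly what a client speaking TLS to a
    # plain-HTTP port would send. The trailing CRLFs let the server attempt
    # to parse this as a request line and fail.
    sock.sendall(b"\x16\x03\x01\x00\xa5 not an http request\r\n\r\n")
    try:
        # The server usually answers with a 400 and/or closes the connection.
        print(sock.recv(1024) or b"<connection closed>")
    except socket.timeout:
        print("<no response>")
```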

Motivation, pitch

Usability

Twitter / LinkedIn details

No response

krrishdholakia commented 1 month ago

can you share a sample query which throws this error?

krrishdholakia commented 1 month ago

WARNING: Invalid HTTP request received

this might be happening before the request even reaches litellm

https://stackoverflow.com/questions/70726187/invalid-http-request-received-with-fast-api
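
If it helps with diagnosis, one rough way to see exactly what bytes the client (msty) sends before they ever reach the proxy is a small logging relay: point the client at the relay and forward to the proxy. A sketch, not litellm code; the listen/target addresses are placeholders and assume the proxy is reachable on its default port 4000.

```python
import socket
import threading

# Placeholder addresses: point the client at LISTEN, forward to the proxy at TARGET.
LISTEN = ("127.0.0.1", 9999)
TARGET = ("127.0.0.1", 4000)

def pipe(src: socket.socket, dst: socket.socket, label: str) -> None:
    """Copy bytes from src to dst, printing every chunk so malformed data is visible."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            print(f"[{label}] {data!r}")
            dst.sendall(data)
    finally:
        dst.close()

def handle(client: socket.socket) -> None:
    upstream = socket.create_connection(TARGET)
    threading.Thread(target=pipe, args=(client, upstream, "client->proxy"), daemon=True).start()
    pipe(upstream, client, "proxy->client")

with socket.socket() as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(LISTEN)
    server.listen()
    print(f"relaying {LISTEN} -> {TARGET}")
    while True:
        conn, _ = server.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()
```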