Open Manouchehri opened 5 months ago
hey @Manouchehri can you explain how you would use this information?
The first use case I wanted it for was https://github.com/orgs/langfuse/discussions/1070.
With the switch to httpx, hopefully this feature won't be as complicated to add, right?
Related to https://github.com/BerriAI/litellm/issues/1724.
The Feature
Similar to `proxy_server_request`, it'd be awesome if we had a `proxy_server_response` object for logging. e.g. I use `proxy_server_request` already for Langfuse logging, and it'd be great to have a simple/universal `proxy_server_response`.

https://github.com/BerriAI/litellm/blob/ce13d2fa7c0957f53ed37576719aa52e3a54e1ec/litellm/integrations/langfuse.py#L323-L339
If the request was served from LiteLLM's S3 cache, then `proxy_server_response` should contain info about the S3 object. If the response is from a non-HTTP source, then `proxy_server_response` should probably be null.

Motivation, pitch
We already talked about this in several other tickets (that were implementation-specific, like Azure OpenAI), but I think a general meta-ticket is better suited.
Twitter / LinkedIn details
https://www.linkedin.com/in/davidmanouchehri/