BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Forward user parameter to LLM providers #6854

Open cyberjunk opened 16 hours ago

cyberjunk commented 16 hours ago

The Feature

It seems the LiteLLM SDK and the LiteLLM Proxy both drop the user parameter, which can be used to identify the end user, when forwarding requests to LLM providers.

They both seem to use it exclusively for their internal usage tracking. But some LLM providers support this parameter as well (see OpenAI)!

I would suggest forwarding it to LLM providers that support it. One could still explicitly drop it using additional_drop_params...
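
For example, if the parameter were forwarded by default, opting out would look roughly like this (just a sketch; the model name is only an example, not from our setup):

import litellm

# minimal sketch: keep `user` for LiteLLM's internal tracking,
# but explicitly stop it from being sent to the provider
response = litellm.completion(
  model="gpt-4o-mini",                # example model
  messages=[{"role": "user", "content": "hello"}],
  user="SOME_USER",                   # still available for LiteLLM usage tracking
  additional_drop_params=["user"]     # dropped before the provider call
)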

Motivation, pitch

We would like to see end-user stats in the dashboards of LLM providers as well.

Twitter / LinkedIn details

No response

krrishdholakia commented 16 hours ago

hey @cyberjunk where do you see it being dropped?

if the provider supports it, we send it across - currently this is OpenAI/OpenAI-Compatible Endpoints/Anthropic

cyberjunk commented 16 hours ago

Thanks for the fast response! Then I guess this happens for us because we are using the LiteLLM SDK together with the LiteLLM Proxy. I know this is not a recommended setup, but is there any way to prevent the LiteLLM SDK from dropping the parameter in this scenario? Here is a minimal Python snippet:

import litellm

litellm.api_key = 'sk-1234'                     # key for our LiteLLM Proxy
litellm.api_base = 'https://llm.our-proxy.com'  # point the SDK at our proxy

messages = [
  {
    "role": "user",
    "content": "please write a small poem"
  }
]

response = litellm.completion(
  model="anthropic/claude-3-5-sonnet-20241022",
  messages=messages,
  temperature=0.0,
  top_p=1.0,
  stream=False,
  user="SOME_USER"
)

print(response)

PS: We started with the LiteLLM SDK and are currently migrating to the Proxy. We want to drop the SDK at some point, but code migration is a lot easier if we can use the combination of both at least temporarily.

krrishdholakia commented 4 hours ago

Please use this route - https://docs.litellm.ai/docs/providers/litellm_proxy

Let me know if this solves your problem.
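
For reference, a minimal sketch of that route adapted to the snippet above (the proxy URL and key are the placeholders from earlier, and the model name after the litellm_proxy/ prefix is assumed to be whatever name the proxy exposes); with this, the user field should reach the proxy and be forwarded to providers that support it:

import litellm

# sketch: call the LiteLLM Proxy via the litellm_proxy/ provider prefix
response = litellm.completion(
  model="litellm_proxy/anthropic/claude-3-5-sonnet-20241022",  # model name as exposed by the proxy
  api_base="https://llm.our-proxy.com",  # proxy URL from the snippet above
  api_key="sk-1234",                     # proxy key from the snippet above
  messages=[{"role": "user", "content": "please write a small poem"}],
  temperature=0.0,
  top_p=1.0,
  stream=False,
  user="SOME_USER"                       # end-user id, passed through to the proxy
)

print(response)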