BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: forward headers to API Hosts configured in `model_list` #6217

Open · dogacancolak-kensho opened this issue 4 days ago

dogacancolak-kensho commented 4 days ago

The Feature

It seems that headers passed into the Python SDK completion method as `extra_headers` are not forwarded to the API host.
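
Roughly, the kind of call in question (a minimal sketch; the model, base URL, key, and header name below are placeholders, not values from this issue):

```python
import litellm

# Minimal sketch, not the reporter's actual code: model alias, base URL,
# API key, and header name are placeholders.
response = litellm.completion(
    model="openai/gpt-4o",                               # any OpenAI-compatible route
    api_base="https://llm-gateway.internal.example/v1",  # hypothetical API host
    api_key="sk-placeholder",
    messages=[{"role": "user", "content": "ping"}],
    extra_headers={"x-requesting-app": "my-service"},    # header we expect to reach the API host
)
```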

Apparently Pass-Through Endpoints support forwarding headers, but we'd like `model_list` entries to support this as well.

Motivation, pitch

We use self-hosted load balancers as the API host in `config.yaml`; they then load balance across our Azure subscriptions.

Our load balancers need a header from the requesting application in order to create metrics.
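
For illustration, a `model_list` entry pointing at a self-hosted load balancer might look like this (hostnames, deployment names, and env vars below are placeholders, not the actual config):

```yaml
# Sketch of a model_list entry routed through a self-hosted load balancer.
model_list:
  - model_name: gpt-4o                                # alias that callers use
    litellm_params:
      model: azure/gpt-4o                             # Azure deployment behind the load balancer
      api_base: https://llm-lb.internal.example.com   # self-hosted load balancer
      api_key: os.environ/AZURE_API_KEY
```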

Twitter / LinkedIn details

No response

krrishdholakia commented 3 days ago

Hi @dogacancolak-kensho, `extra_headers` should be getting sent along.

Can you share sample code to repro the error?
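
For reference, one self-contained way to check whether `extra_headers` reach the API host is to point `litellm.completion` at a local stub server that records incoming headers. The sketch below is illustrative only; the stub server, port, model name, and header name are assumptions, not code from this thread:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

import litellm

received_headers = {}


class StubHandler(BaseHTTPRequestHandler):
    """Minimal OpenAI-compatible stub that records the headers it receives."""

    def do_POST(self):
        # Record what the "API host" actually received.
        received_headers.update({k.lower(): v for k, v in self.headers.items()})
        # Consume the request body, then return a minimal chat completion response.
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        body = json.dumps({
            "id": "chatcmpl-stub",
            "object": "chat.completion",
            "created": 0,
            "model": "stub",
            "choices": [{"index": 0, "finish_reason": "stop",
                         "message": {"role": "assistant", "content": "ok"}}],
            "usage": {"prompt_tokens": 1, "completion_tokens": 1, "total_tokens": 2},
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass


server = HTTPServer(("127.0.0.1", 8080), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

litellm.completion(
    model="openai/stub-model",
    api_base="http://127.0.0.1:8080/v1",
    api_key="not-needed",
    messages=[{"role": "user", "content": "ping"}],
    extra_headers={"x-requesting-app": "repro"},  # header under test
)
server.shutdown()

print("x-requesting-app received:", received_headers.get("x-requesting-app"))
```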