BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: How to pass some args when requesting to litellm using sdk or server #6753

Open Chinglish123 opened 1 week ago

Chinglish123 commented 1 week ago

What happened?

When I use the SDK or the proxy server to send requests through litellm, I must append a query argument to the end of the URL, like https://xxx.com/openai/deployments/gpt-4-0125-preview/chat/completions?api-version=2024-02-15-preview. According to the documentation, api_base must be https://xxx.com/openai/deployments/gpt-4-0125-preview and /chat/completions is appended automatically. But what if I must also pass api-version at the end of the URL?
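For Azure-style deployments, litellm accepts an `api_version` argument on `completion()` (and an `api_version` key under `litellm_params` in the proxy config), which it turns into the `?api-version=...` query string rather than requiring it in `api_base`. The sketch below shows the URL shape being asked about; `build_azure_url` is a hypothetical helper written for illustration, not a litellm function, and the commented `litellm.completion` call assumes the `api_version` kwarg behaves as described:

```python
import urllib.parse

def build_azure_url(api_base: str, api_version: str) -> str:
    # Illustrative only: mirrors the final request URL litellm is expected
    # to produce -- api_base + "/chat/completions" + the api-version query arg.
    return f"{api_base.rstrip('/')}/chat/completions?" + urllib.parse.urlencode(
        {"api-version": api_version}
    )

print(build_azure_url(
    "https://xxx.com/openai/deployments/gpt-4-0125-preview",
    "2024-02-15-preview",
))

# Assumed SDK usage (requires an API key, so not executed here):
# import litellm
# response = litellm.completion(
#     model="azure/gpt-4-0125-preview",
#     api_base="https://xxx.com/openai/deployments/gpt-4-0125-preview",
#     api_version="2024-02-15-preview",
#     messages=[{"role": "user", "content": "hello"}],
# )
```

If this is right, the api-version never needs to be hand-appended to `api_base`; the SDK composes the query string itself.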

Relevant log output

No response

Twitter / LinkedIn details

No response