Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
[Feature]: Support native vllm server #1114
Open
krrishdholakia opened 11 months ago
The Feature
The normal vllm server supports these inputs. By supporting them, we can handle prompt formatting on the proxy instead of on each individual vllm server (see the sketch below).
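For reference, a minimal sketch of calling the native vllm server directly with a raw prompt. The endpoint path and payload fields assume vllm's demo API server (`vllm.entrypoints.api_server`), which may differ across vllm versions; the server URL is hypothetical.

```python
# Minimal sketch: POST a raw prompt to a native vllm server's /generate
# endpoint. No chat template is applied here -- that is exactly the
# formatting work the proxy would take over under this feature.
import requests

response = requests.post(
    "http://localhost:8000/generate",  # assumed local vllm server address
    json={
        "prompt": "San Francisco is a",  # raw prompt, no template applied
        "max_tokens": 64,                # sampling params passed through
        "temperature": 0.7,
    },
)
# The demo api_server returns generated text under the "text" key.
print(response.json()["text"])
```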
Motivation, pitch
A user had to do extra setup work after their vllm + litellm setup was complete, in order to support each model's individual prompt template.
Twitter / LinkedIn details
No response