Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
Acknowledging this issue - I don't feel comfortable abstracting too much of this away from the user, but I think you might be right. Let me think about it tonight. Will update the ticket.
The current interface feels complicated.
Desired interface:
Here is what my existing code looks like: