Closed georgeseifada closed 5 months ago
Already supported via the OpenAI-compatible endpoints support - https://docs.litellm.ai/docs/providers/openai_compatible
Anything additional you require here?
I'm not sure how that would help with an API that's different?
For example, let's say we have a new LLM embedding API called `amplify` (I just made it up for the example) that takes:

```json
{"text": "Hi, my name is Bob.", "metadata": {"source": "audio", "language": "english"}}
```
How can I use this new embedding API with LiteLLM, with all the standard functionality of routing, fallbacks, etc.?
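To make the question concrete, here is a minimal sketch of the adapter such an integration would need: translating an OpenAI-style embeddings request into the made-up `amplify` payload and wrapping the response back into the OpenAI shape, so the new API could sit behind an OpenAI-compatible endpoint that LiteLLM already routes to. Every name here (`amplify`, the `vector` field) is invented for the example, not a real API.

```python
# Hypothetical adapter for the made-up "amplify" embedding API.
# All field names below are assumptions for illustration only.

def to_amplify_payload(openai_request: dict, metadata: dict) -> dict:
    """Map OpenAI's {"model": ..., "input": ...} shape to amplify's shape."""
    text = openai_request["input"]
    if isinstance(text, list):  # OpenAI allows batched inputs
        text = text[0]          # amplify (as imagined) takes a single string
    return {"text": text, "metadata": metadata}

def from_amplify_response(amplify_response: dict, model: str) -> dict:
    """Wrap amplify's raw vector in an OpenAI-style embeddings response."""
    return {
        "object": "list",
        "data": [{"object": "embedding", "index": 0,
                  "embedding": amplify_response["vector"]}],
        "model": model,
        "usage": {"prompt_tokens": 0, "total_tokens": 0},
    }

req = {"model": "amplify-embed-v1", "input": "Hi, my name is Bob."}
payload = to_amplify_payload(req, {"source": "audio", "language": "english"})
print(payload)  # {'text': 'Hi, my name is Bob.', 'metadata': {...}}
```

The open question in this issue is whether LiteLLM can host this translation layer itself, so routing and fallbacks apply without a separate proxy in front of the new API.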
The Feature
Let's say OpenAI comes out with GPT-5 and it has a totally different API, or Amazon releases a model called Bezos-AI. Users should be able to easily add support for that model's API and seamlessly integrate it with routing, fallbacks, etc., without waiting for it to be officially supported in the library.
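The kind of plugin hook being requested could look something like the sketch below: users register a handler under a custom provider prefix, and the router resolves `"provider/model"` strings through that registry, falling back to the next deployment on failure. This is not LiteLLM's actual API, just an illustration of the extensibility point.

```python
# Sketch of a user-extensible provider registry (NOT LiteLLM's real API).
from typing import Callable, Dict, List

PROVIDER_REGISTRY: Dict[str, Callable[[str, str], str]] = {}

def register_provider(prefix: str, handler: Callable[[str, str], str]) -> None:
    """Let users plug in a handler for an unsupported provider."""
    PROVIDER_REGISTRY[prefix] = handler

def complete(model: str, prompt: str) -> str:
    """Resolve 'provider/model' through the registry."""
    prefix, _, name = model.partition("/")
    handler = PROVIDER_REGISTRY.get(prefix)
    if handler is None:
        raise KeyError(f"no provider registered for {prefix!r}")
    return handler(name, prompt)

def complete_with_fallbacks(models: List[str], prompt: str) -> str:
    """Try each deployment in order; a real router would be more selective."""
    last_err: Exception | None = None
    for m in models:
        try:
            return complete(m, prompt)
        except Exception as err:
            last_err = err
    raise RuntimeError("all deployments failed") from last_err

# The imaginary "Bezos-AI" provider from the example above.
register_provider("bezos-ai", lambda name, prompt: f"[{name}] echo: {prompt}")
print(complete_with_fallbacks(["gpt5/unregistered", "bezos-ai/bezos-1"], "hi"))
# → [bezos-1] echo: hi  (first deployment fails, router falls back)
```

Because custom providers go through the same registry as built-in ones, routing and fallbacks work for them automatically, which is the core of the request.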
Motivation, pitch
The library should be as extensible as possible so that users can rely on it long-term.