BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: Custom Model Endpoint Support #1480

Closed: georgeseifada closed this issue 5 months ago

georgeseifada commented 5 months ago

The Feature

Let's say OpenAI comes out with GPT-5 and it has a totally different API, or Amazon releases a model called Bezos-AI. Users should be able to add support for that model's API themselves and integrate it seamlessly with routing, fallbacks, etc., without waiting for it to be officially supported in the library.

Motivation, pitch

The library should be as extensible as possible so that users can rely on it long-term.

Twitter / LinkedIn details

No response

krrishdholakia commented 5 months ago

This is already supported via OpenAI-compatible endpoint support: https://docs.litellm.ai/docs/providers/openai_compatible
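
For reference, usage along the lines of the linked docs looks like this (a minimal sketch; the base URL, API key, and model name below are placeholders):

```python
import litellm

# The "openai/" prefix routes the call through litellm's OpenAI-compatible
# client, so any endpoint that speaks the OpenAI chat-completions format
# can be targeted via api_base.
response = litellm.completion(
    model="openai/my-custom-model",                 # placeholder model name
    api_base="https://my-endpoint.example.com/v1",  # placeholder endpoint
    api_key="sk-placeholder",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```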

Anything additional you require here?

georgeseifada commented 5 months ago

I'm not sure how that would help with an API whose format is different.

For example, let's say we have a new LLM embedding API whose request and response formats don't match any existing provider.

How can I use that new embedding API with LiteLLM, with all the standard functionality of routing, fallbacks, etc.?
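
Later litellm versions document a CustomLLM handler interface for exactly this case (https://docs.litellm.ai/docs/providers/custom_llm_server); below is a minimal sketch of registering a custom provider, with placeholder names throughout:

```python
import litellm
from litellm import CustomLLM


class MyApiHandler(CustomLLM):
    # Translate litellm's call into the new API's format and map the
    # reply back into a litellm ModelResponse. The mock_response below
    # stands in for the real HTTP call to the new API.
    def completion(self, *args, **kwargs) -> litellm.ModelResponse:
        return litellm.completion(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "hello"}],
            mock_response="stub reply from the new API",
        )


# Register the handler under a provider prefix of our choosing.
litellm.custom_provider_map = [
    {"provider": "my-new-api", "custom_handler": MyApiHandler()}
]

# Any "my-new-api/..." model now routes through the handler, so it can
# be listed in a Router's model_list for routing and fallbacks as usual.
resp = litellm.completion(
    model="my-new-api/some-model",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
```

The docs show the same handler pattern for async and streaming calls; an embedding handler would presumably follow the same shape.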

krrishdholakia commented 5 months ago