BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Custom API Server enhancements for embedding support #6754

Open jasonkuoWT opened 2 weeks ago

jasonkuoWT commented 2 weeks ago

The Feature

We are currently developing a custom LLM provider and found that there isn’t a way to implement embeddings. We attempted to use the embedding method within BaseLLM, but it did not work as expected.

Could support for this feature be added?
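For context, whatever hook LiteLLM eventually exposes, a custom provider would need to return embeddings in the OpenAI response shape. A minimal sketch of the kind of `embedding()` method being requested — the class name, method signature, and stub backend below are hypothetical, not an existing LiteLLM API:

```python
# Hypothetical sketch of the embedding hook being requested for custom
# providers -- names and signatures here are illustrative, not LiteLLM APIs.
from typing import Any, Dict, List


class MyCustomProvider:
    """Imagined custom provider exposing an embedding() method."""

    def embedding(self, model: str, input: List[str], **kwargs: Any) -> Dict[str, Any]:
        # A real provider would call its backend here; this stub returns
        # fixed-size zero vectors in the OpenAI embedding response shape.
        dim = 4
        return {
            "object": "list",
            "data": [
                {"object": "embedding", "index": i, "embedding": [0.0] * dim}
                for i, _ in enumerate(input)
            ],
            "model": model,
            "usage": {"prompt_tokens": 0, "total_tokens": 0},
        }


resp = MyCustomProvider().embedding("my-custom/embed-v1", ["hello", "world"])
print(len(resp["data"]))  # 2
```

The point of the sketch is the response contract: one `data` entry per input string, each with an `index` and an `embedding` vector, so the result plugs into code already written against OpenAI-format embedding responses.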

Motivation, pitch

LiteLLM currently does not expose a way for custom providers to implement embeddings, and we would appreciate the addition of this functionality.

Twitter / LinkedIn details

No response