BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: models/text-embedding-004 on google ai studio #5385

Closed traderpedroso closed 2 months ago

traderpedroso commented 2 months ago

The Feature

Add embedding support (e.g. models/text-embedding-004) for the Google AI Studio provider on the LiteLLM proxy, as sketched below.
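
For context, a call through the LiteLLM Python SDK might look like the sketch below once this is supported. The `gemini/text-embedding-004` model string and the `GEMINI_API_KEY` auth are assumptions based on how other Google AI Studio (Gemini) models are addressed in LiteLLM, not confirmed behavior for embeddings.

```python
import os
import litellm

# Assumption: Google AI Studio models would keep the "gemini/" prefix and
# authenticate via GEMINI_API_KEY, as Gemini chat models already do in LiteLLM.
os.environ["GEMINI_API_KEY"] = "your-google-ai-studio-key"

response = litellm.embedding(
    model="gemini/text-embedding-004",  # hypothetical model string for this request
    input=["Hello from LiteLLM", "Embeddings via Google AI Studio"],
)

# litellm.embedding() returns an OpenAI-style EmbeddingResponse
print(len(response.data), "embeddings returned")
print(response.data[0]["embedding"][:5])
```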

Motivation, pitch

Dear LiteLLM Team, we are excited about the existing Gemini support in LiteLLM and recognize its significant contribution to AI development. However, we believe that adding Gemini embedding support for Google AI Studio would be a major benefit for developers working in that environment.

Twitter / LinkedIn details

No response

krrishdholakia commented 2 months ago

Looks like we cover this for vertex ai but not google ai studio - https://docs.litellm.ai/docs/embedding/supported_embedding#vertex-ai-embedding-models
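
For reference, the existing Vertex AI embedding path works roughly as in the sketch below, following the linked docs page; the project ID, location, and model name are illustrative values.

```python
import litellm
from litellm import embedding

# Illustrative Vertex AI settings, per the supported_embedding docs page.
litellm.vertex_project = "my-gcp-project"   # your GCP project ID
litellm.vertex_location = "us-central1"     # your project location

# Vertex AI embeddings are already routed through litellm.embedding()
response = embedding(
    model="vertex_ai/textembedding-gecko",
    input=["good morning from litellm"],
)
print(response.data[0]["embedding"][:5])
```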

traderpedroso commented 2 months ago

> Looks like we cover this for vertex ai but not google ai studio - https://docs.litellm.ai/docs/embedding/supported_embedding#vertex-ai-embedding-models

Thank you for your prompt response. It was indeed Vertex AI where this was already available and working fantastically; it was only for Google AI Studio that I noticed this gap.