Closed ironore15 closed 5 months ago
Hey @ironore15, the 'base_url' param is for Azure OpenAI.
You don't need to set base_url for palm/gemini/vertex ai.
We use the Google libraries, which handle the URL construction for this. Is there something I'm missing here?
Here's how to make the call - https://docs.litellm.ai/docs/providers/vertex#usage-with-litellm-proxy-server
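For reference, the linked docs describe exposing Vertex AI models through the litellm proxy via a config file. A minimal sketch, with placeholder project/location values:

```yaml
# litellm proxy config.yaml -- minimal sketch based on the linked docs;
# vertex_project and vertex_location values here are placeholders
model_list:
  - model_name: gemini-pro
    litellm_params:
      model: vertex_ai/gemini-pro
      vertex_project: "my-gcp-project"
      vertex_location: "us-central1"
```

The proxy is then started with `litellm --config config.yaml` and called like any OpenAI-compatible endpoint.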
We might want base_url support for Vertex AI, since Cloudflare AI Gateway is one use-case for it.
https://developers.cloudflare.com/ai-gateway/providers/vertex/
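To make the use-case concrete: a sketch of what routing Vertex AI traffic through Cloudflare AI Gateway would look like if litellm honored a custom endpoint for vertex_ai. The account/gateway IDs are placeholders, and the provider path segment is taken from the Cloudflare docs linked above; the litellm call itself is shown commented out since it needs credentials and network access.

```python
# Placeholders -- substitute your own Cloudflare account and gateway IDs.
ACCOUNT_ID = "your-account-id"
GATEWAY_ID = "your-gateway-id"

# Cloudflare AI Gateway endpoint for Vertex AI (per the linked docs).
api_base = (
    f"https://gateway.ai.cloudflare.com/v1/"
    f"{ACCOUNT_ID}/{GATEWAY_ID}/google-vertex-ai"
)

# The desired call (not run here -- requires litellm + GCP credentials):
# from litellm import completion
# resp = completion(
#     model="vertex_ai/gemini-pro",
#     messages=[{"role": "user", "content": "hello"}],
#     api_base=api_base,  # currently ignored for vertex_ai, per this issue
# )
print(api_base)
```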
It doesn't work with openrouter either.
Use case: Use base_url to integrate with helicone auto proxy.
See #3732 for tracking base_url support for Vertex AI with Cloudflare AI Gateway.
Setting base_url is now supported for gemini + vertex_ai (for vertex_ai_beta/ calls).
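A minimal sketch of the now-supported pattern, assuming `api_base` is the parameter name and that the custom endpoint works for `vertex_ai_beta/` models; the endpoint URL is hypothetical and the actual call is commented out since it needs credentials and network access.

```python
# Hypothetical custom endpoint (e.g. a gateway or self-hosted proxy).
kwargs = {
    "model": "vertex_ai_beta/gemini-pro",
    "messages": [{"role": "user", "content": "hello"}],
    "api_base": "https://my-proxy.example.com/vertex",  # placeholder
}

# from litellm import completion
# resp = completion(**kwargs)  # requires litellm + credentials; not run here
print(kwargs["model"])
```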
@ironore15 can we do a 10min call? Would love to learn how you're using litellm, so we can improve
https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
What happened?
Using gemini-pro through a litellm proxy is currently impossible.
Some llm_providers, such as palm, gemini, and vertex_ai, ignore the base_url argument: https://github.com/BerriAI/litellm/blob/ef4c85522c001c930f02e2ec2c32dea9a7816b74/litellm/main.py#L1629