BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

fix: Include vertex_ai_beta in vertex_ai param mapping/Do not use google auth project_id #4461

Open t968914 opened 2 days ago

t968914 commented 2 days ago

Title

vertex_ai_beta chat completion does not work with a project_id provided in model info, because it is always overwritten by the project_id returned from google.auth.default(). However, google.auth.default() returns None for the project when there is no service account. This also breaks when litellm is hosted in a separate Google project from the Vertex AI models. The project_id passed in as a param should take precedence to support this.
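
For context, a minimal sketch (not litellm's actual implementation) of the precedence argued for here: a caller-supplied project_id should win over the one resolved from Application Default Credentials, which can be None when no service account is attached. The helper name below is hypothetical.

```python
# Minimal sketch of the intended precedence; the helper is hypothetical and
# does not mirror litellm's real internals.
from typing import Optional

import google.auth


def resolve_vertex_project(explicit_project_id: Optional[str]) -> Optional[str]:
    # Prefer the project_id passed explicitly (e.g. via model info / request params).
    if explicit_project_id:
        return explicit_project_id
    # Otherwise fall back to Application Default Credentials; the returned
    # project can be None when no service account is attached.
    _credentials, adc_project_id = google.auth.default()
    return adc_project_id
```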

Relevant issues

Type

🐛 Bug Fix

Changes

krrishdholakia commented 2 days ago

hey @t968914, this should already be fixed by https://github.com/BerriAI/litellm/commit/6b14cf765708376490c5d88d3e54edc173c343b6

Waiting on a new release (v1.40.31). Should be live by EOD.

Feel free to reopen if that doesn't fix it.

krrishdholakia commented 2 days ago

Realized you had two changes here. Since we've addressed the default project issue, can you isolate this PR to:

Include vertex_ai_beta when checking for vertex_ai provider type to map params

lgtm for merge
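
For reference, a hedged sketch of what the remaining change might look like: treating vertex_ai_beta the same as vertex_ai when deciding whether Vertex-specific param mapping applies. The function and set names below are illustrative, not litellm's actual identifiers.

```python
# Illustrative only: these names are hypothetical, not litellm's real internals.
VERTEX_PROVIDERS = {"vertex_ai", "vertex_ai_beta"}


def uses_vertex_param_mapping(custom_llm_provider: str) -> bool:
    # Before the fix, only "vertex_ai" was matched, so requests routed through
    # "vertex_ai_beta" skipped the Vertex param mapping path.
    return custom_llm_provider in VERTEX_PROVIDERS
```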