BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Support other authentication method for Vertex AI in LiteLLM Proxy #4892

Closed: xingyaoww closed this issue 1 month ago

xingyaoww commented 2 months ago

The Feature

Support other authentication methods for Vertex AI in LiteLLM Proxy, potentially using the recommended workload identity federation.

Motivation, pitch

In some orgs, generating service account secrets is disabled.

[Screenshot: organization policy blocks service account key creation]

However, the LiteLLM proxy asks for credentials from a service account:

[Screenshot: LiteLLM proxy requesting service account credentials]


Manouchehri commented 2 months ago

This should be coming soon with OIDC.

krrishdholakia commented 2 months ago

hey @xingyaoww i believe this should already work - as long as the environment has the credentials, the vertex ai package should be able to handle this.

Do you have GOOGLE_APPLICATION_CREDENTIALS set as an env var in your environment?

Code for handling credentials - https://github.com/BerriAI/litellm/blob/2f773d9cb6388c6e1dcd7a742101ecd17506181b/litellm/llms/vertex_httpx.py#L765
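
For illustration, here is a minimal sketch of the path described above, assuming GOOGLE_APPLICATION_CREDENTIALS points at a file google-auth can load; the file path and project id below are placeholders:

import os
import litellm

# Placeholder path; any credential file google-auth understands works here,
# e.g. a service account key. Application Default Credentials (ADC) reads it.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/credentials.json"

response = litellm.completion(
    model="vertex_ai/gemini-1.0-pro",
    messages=[{"role": "user", "content": "Hello"}],
    vertex_project="my-project",    # placeholder project id
    vertex_location="us-central1",
)
print(response.choices[0].message.content)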

Manouchehri commented 2 months ago

@krrishdholakia It looks like @xingyaoww is asking about Workload Identity Federation, since https://cloud.google.com/iam/docs/workload-identity-federation#providers was linked. I don't think any of the existing code in vertex_httpx.py handles that?
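
For context on the distinction: a workload identity federation credential is not a service account key but an "external_account" config, which tells google-auth to exchange an ambient OIDC token for Google credentials via STS. A rough sketch of its shape (all identifiers below are placeholders):

{
  "type": "external_account",
  "audience": "//iam.googleapis.com/projects/123456789/locations/global/workloadIdentityPools/my-pool/providers/my-provider",
  "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
  "token_url": "https://sts.googleapis.com/v1/token",
  "credential_source": {
    "file": "/var/run/secrets/oidc/token"
  }
}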

krrishdholakia commented 2 months ago

you're right - reassigned ticket @Manouchehri, thanks for your work on this!

elvis-cai commented 1 month ago

looking forward to having workload identity working in GCP instead of a static JSON key as well 🙏

ishaan-jaff commented 1 month ago

Done here: https://github.com/BerriAI/litellm/pull/5354. @xingyaoww @elvis-cai @Manouchehri, can you help me verify it works on your side too?

Supported for the provider vertex_ai_beta.

I was able to create a workload identity for an OIDC provider.

Config used:

model_list:
  - model_name: gemini-1.0-pro-vision-001
    litellm_params:
      model: vertex_ai_beta/gemini-1.0-pro-vision-001
      vertex_project: "adroit-crow-413218"
      vertex_location: "us-central1"
      vertex_credentials: "test3.json"

litellm_settings:
  drop_params: True
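
For anyone verifying on their side, one quick sanity check is to call the route through the proxy with the standard OpenAI client. A minimal sketch, assuming the proxy runs locally on the default port 4000 with a virtual key sk-1234 (both placeholders):

import openai

# Point the standard OpenAI client at the LiteLLM proxy.
client = openai.OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

response = client.chat.completions.create(
    model="gemini-1.0-pro-vision-001",  # model_name from the config above
    messages=[{"role": "user", "content": "Hello from Vertex AI"}],
)
print(response.choices[0].message.content)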

elvis-cai commented 1 month ago

thanks @ishaan-jaff, can confirm it's working using vertex_ai_beta without adding vertex_credentials:

      - model_name: gemini-1.5-pro
        litellm_params:
          model: vertex_ai_beta/gemini-1.5-pro
          vertex_project: "ai-xxxx-03fc153e"

btw, will it support the Vertex Claude models as well?

krrishdholakia commented 1 month ago

yup, this works with claude, since they leverage the same auth function @elvis-cai

elvis-cai commented 1 month ago

thanks, tried with the following config; probably beta is not supported yet:
model: vertex_ai_beta/claude-3-5-sonnet@20240620

Error occurred while generating model response. Please try again. Error: Error: litellm.BadRequestError: VertexAIException BadRequestError -
{
  "error": {
    "code": 400,
    "message": "Project `635554402465` is not allowed to use Publisher Model `projects/ai-xxxx-03fc153e/locations/us-central1/publishers/google/models/claude-3-5-sonnet@20240620`",
    "status": "FAILED_PRECONDITION"
  }
}

krrishdholakia commented 1 month ago

Hi @elvis-cai just do vertex_ai/claude-3-5-sonnet@20240620

krrishdholakia commented 1 month ago

We've also completed our vertex_ai_beta -> vertex_ai migration, so you can just use vertex_ai/, without needing to put _beta/ in the provider name
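
Concretely, the only change needed in existing configs is the provider prefix (shown here against the gemini-1.5-pro entry from above):

model: vertex_ai_beta/gemini-1.5-pro   # before
model: vertex_ai/gemini-1.5-pro        # after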

elvis-cai commented 1 month ago

tested with the latest image v1.44.6-stable, works perfectly. One thing to note: claude-3-5-sonnet is only supported in us-east5 (Ohio) and europe-west1 (Belgium):

      - model_name: claude-3-5-sonnet
        litellm_params:
          model: vertex_ai/claude-3-5-sonnet@20240620
          vertex_project: "ai-xxx-xxxx"
          vertex_location: "us-east5"

ishaan-jaff commented 1 month ago

awesome - great to hear @elvis-cai

andreimerfu commented 3 weeks ago

Is this working only for OIDC, or should it also work with the AWS provider? I have a LiteLLM instance deployed in an AWS App Runner container, and I'm trying to access Vertex AI using Workload Identity Federation.

If I use this example, connectivity is established, but it fails when I go through LiteLLM.

krrishdholakia commented 3 weeks ago

@andreimerfu how are you using that code with litellm? it looks like a python script to me

andreimerfu commented 2 weeks ago

I've managed to make it work by using a LiteLLM custom hook that retrieves the temporary credentials using boto3 (like in the above example) and overrides the AWS environment variables with them. This way, the customer_credentials values from the GCP JSON auth are ignored. Otherwise, LiteLLM tries to use the EC2 metadata endpoint to get the AWS temporary credentials, but that endpoint isn't available on App Runner.
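
For anyone hitting the same App Runner limitation, a rough sketch of such a hook, based on LiteLLM's documented async_pre_call_hook interface; the class name and exact wiring are assumptions, not the commenter's actual code:

import os

import boto3
from litellm.integrations.custom_logger import CustomLogger


class AwsCredentialHook(CustomLogger):  # hypothetical name
    async def async_pre_call_hook(self, user_api_key_dict, cache, data, call_type):
        # boto3's default provider chain resolves App Runner's temporary
        # credentials itself, so the EC2 metadata endpoint is never needed.
        creds = boto3.Session().get_credentials().get_frozen_credentials()
        os.environ["AWS_ACCESS_KEY_ID"] = creds.access_key
        os.environ["AWS_SECRET_ACCESS_KEY"] = creds.secret_key
        if creds.token:
            os.environ["AWS_SESSION_TOKEN"] = creds.token
        return data


proxy_handler_instance = AwsCredentialHook()

The instance would then be registered under callbacks in litellm_settings, per the proxy's custom-callbacks docs.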