BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: GCP Bucket support #4300

Open mdziedzic-zoe opened 1 week ago

mdziedzic-zoe commented 1 week ago

The Feature

Support for GCP Buckets alongside S3

Motivation, pitch

We'd be interested in dedicated support for Google Cloud Storage buckets on GCP, with native GCP IAM authentication.

The ask here is to add an option to use GCS buckets instead of AWS S3, using the dedicated GCP SDK for the bucket operations.

Twitter / LinkedIn details

No response

Manouchehri commented 1 week ago

This is actually supported by the S3 API, if you're okay with using HMAC!

https://dzlab.github.io/gcp/2022/02/26/gs-with-s3-sdk/

Currently using GCS in production with LiteLLM. =)

mdziedzic-zoe commented 1 week ago

Thanks for the pointer! This is the approach we've also adopted. However, we're not happy with HMAC keys and would like to see an option to use native GCP IAM instead.