GoogleCloudDataproc / hadoop-connectors

Libraries and tools for interoperability between Hadoop-related open-source software and Google Cloud Platform.

Is there any way to set per-bucket credentials? #623

Open ruiyang2015 opened 3 years ago

ruiyang2015 commented 3 years ago

Our Spark session is long-running, and we need to access different buckets over its lifetime. Our permission setup requires different credentials for each bucket, but the GCS Hadoop connector only seems to support a single set of credentials for everything. The S3 and Azure connectors both allow fine-grained access control at the bucket/container level; can we do the same for GCS buckets?
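For comparison, here is a rough sketch of what per-bucket configuration looks like with the S3A connector versus the single, connector-wide credential the GCS connector accepts. The bucket names, key values, and key file path are placeholders for illustration; the GCS property names shown (`google.cloud.auth.service.account.*`) are the ones documented for the 2.x connector, and a per-bucket variant such as `fs.gs.bucket.<bucket>.auth.*` is not something the connector provides.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("per-bucket-creds").getOrCreate()
val hconf = spark.sparkContext.hadoopConfiguration

// S3A: per-bucket overrides via fs.s3a.bucket.<bucket>.*; these override the
// global fs.s3a.* settings for that bucket only. Bucket names are placeholders.
hconf.set("fs.s3a.bucket.sales-data.access.key", "<access-key-for-sales-data>")
hconf.set("fs.s3a.bucket.sales-data.secret.key", "<secret-key-for-sales-data>")
hconf.set("fs.s3a.bucket.audit-logs.access.key", "<access-key-for-audit-logs>")
hconf.set("fs.s3a.bucket.audit-logs.secret.key", "<secret-key-for-audit-logs>")

// GCS connector: a single, connector-wide credential; every gs:// bucket
// accessed by this session is authenticated with the same service-account key.
hconf.set("google.cloud.auth.service.account.enable", "true")
hconf.set("google.cloud.auth.service.account.json.keyfile", "/path/to/global-key.json")
```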

gegose commented 1 year ago

Are there any plans to implement this?

josecsotomorales commented 1 year ago

I'm facing the same challenge with the GCS connector. I also created an issue: https://github.com/GoogleCloudDataproc/hadoop-connectors/issues/1009
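Until per-bucket credentials are supported natively, one possible (untested) workaround is to open a dedicated, uncached FileSystem instance per bucket, each with its own Hadoop Configuration pointing at that bucket's service-account key. This only helps code paths that go through an explicitly created FileSystem; Spark's own readers and writers will still use the globally configured credentials. The helper below is a hypothetical sketch, not part of the connector's documented API.

```scala
import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem

// Hypothetical helper: build a per-bucket Configuration that points at that
// bucket's own service-account key file and open an uncached FileSystem for it.
def gcsFileSystemFor(bucket: String, keyFile: String, base: Configuration): FileSystem = {
  val conf = new Configuration(base)
  conf.set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
  conf.set("google.cloud.auth.service.account.enable", "true")
  conf.set("google.cloud.auth.service.account.json.keyfile", keyFile)
  // newInstance (unlike FileSystem.get) bypasses the FileSystem cache, so one
  // bucket's credentials are not silently reused for another bucket.
  FileSystem.newInstance(new URI(s"gs://$bucket/"), conf)
}

// Usage sketch: each bucket gets its own FileSystem with its own credentials.
// val salesFs = gcsFileSystemFor("sales-data", "/keys/sales.json", spark.sparkContext.hadoopConfiguration)
// val auditFs = gcsFileSystemFor("audit-logs", "/keys/audit.json", spark.sparkContext.hadoopConfiguration)
```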