josecsotomorales opened this issue 1 year ago
@medb @singhravidutt do you know if this is even possible with the current implementation?
Looking for a solution that allows using different credentials for reading from a GCS bucket in one project and writing to another GCS bucket in a different project.
Hey folks, I have a Spark application that reads from a source bucket and writes into a target bucket. I'm experiencing issues when setting the keyfile for the second operation as a Hadoop configuration: in theory the keyfile should get overridden, but that's not the case. The application always uses the first keyfile. I tried unsetting and clearing the Hadoop configs, but for whatever reason the connector always uses the first credentials file. Here is a code snippet of what I'm trying to accomplish:
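A minimal sketch of the pattern (bucket names and keyfile paths are placeholders, and the exact property name can differ between connector versions):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("cross-project-copy").getOrCreate()
val hadoopConf = spark.sparkContext.hadoopConfiguration

// Read from the source bucket using the first service account keyfile.
hadoopConf.set("google.cloud.auth.service.account.json.keyfile",
  "/path/to/source-project-key.json")
val df = spark.read.parquet("gs://source-bucket/input/")

// Swap the keyfile before writing. In theory the new value should apply,
// but the connector keeps authenticating with the first keyfile, likely
// because Hadoop caches the already-initialized gs:// FileSystem instance.
hadoopConf.set("google.cloud.auth.service.account.json.keyfile",
  "/path/to/target-project-key.json")
df.write.parquet("gs://target-bucket/output/")
```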
For the Hadoop AWS and Hadoop Azure connectors, there are multiple ways to set credentials per bucket; I would like to have the same in the GCS connector. For example:
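The S3A connector supports per-bucket overrides through `fs.s3a.bucket.<bucket>.<option>`, and ABFS keys credentials per storage account. A GCS analogue might look like the sketch below; note that the `fs.gs.bucket.*` property is invented for illustration and does not exist in the connector today:

```scala
// Real Hadoop AWS (S3A) per-bucket credential syntax:
hadoopConf.set("fs.s3a.bucket.source-bucket.access.key", "<access-key>")
hadoopConf.set("fs.s3a.bucket.source-bucket.secret.key", "<secret-key>")

// Hypothetical GCS equivalent requested in this issue (not a real
// connector option, just a proposed shape):
hadoopConf.set("fs.gs.bucket.source-bucket.auth.service.account.json.keyfile",
  "/path/to/source-project-key.json")
hadoopConf.set("fs.gs.bucket.target-bucket.auth.service.account.json.keyfile",
  "/path/to/target-project-key.json")
```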