ksauzz opened 1 month ago
Hi @ksauzz, you can supply your own custom AWS credential supplier to the library that handles your use case. See here.
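For reference, wiring in a custom supplier looks roughly like the sketch below. This is a non-authoritative sketch that assumes the `AwsSecurityCredentialsSupplier` interface and the `AwsCredentials` builder available in recent google-auth-library-java releases; the credential values, region, and audience string are placeholders, not working values:

```java
import com.google.auth.oauth2.AwsCredentials;
import com.google.auth.oauth2.AwsSecurityCredentials;
import com.google.auth.oauth2.AwsSecurityCredentialsSupplier;
import com.google.auth.oauth2.ExternalAccountSupplierContext;

// Sketch: a supplier that returns credentials from a source of your choosing,
// instead of environment variables or the EC2 metadata server.
class CustomAwsSupplier implements AwsSecurityCredentialsSupplier {
  @Override
  public AwsSecurityCredentials getCredentials(ExternalAccountSupplierContext context) {
    // Placeholder values; fetch these however your environment allows.
    return new AwsSecurityCredentials("accessKeyId", "secretAccessKey", "sessionToken");
  }

  @Override
  public String getRegion(ExternalAccountSupplierContext context) {
    return "us-east-1"; // placeholder region
  }
}

class Example {
  static AwsCredentials build() {
    // Placeholder audience/workload identity pool values.
    return AwsCredentials.newBuilder()
        .setAwsSecurityCredentialsSupplier(new CustomAwsSupplier())
        .setAudience("//iam.googleapis.com/projects/PROJECT/locations/global/"
            + "workloadIdentityPools/POOL/providers/PROVIDER")
        .setSubjectTokenType("urn:ietf:params:aws:token-type:aws4_request")
        .build();
  }
}
```

Note that this requires the application to construct the `AwsCredentials` object itself, which is the crux of the follow-up below: a connector that builds credentials internally has no hook for injecting the supplier.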
I don't think that works for spark-bigquery-connector, because the connector doesn't expose a config item to change the supplier. I hope the core auth library will provide this functionality without requiring patches from users; otherwise, GCP users have to patch every Google library that depends on google-auth-library-java. Thank you.
InternalAwsSecurityCredentialsSupplier only supports environment variables or the EC2 metadata server for obtaining AWS credentials.
In my use case, I can't use workload identity federation from AWS Glue (Spark) to load data into a BigQuery table with spark-bigquery-connector. This Spark environment has no EC2 metadata endpoint, and the Spark driver process's environment variables cannot be updated from a job.
Environment details
AWS Glue 4.0 (Spark) + PySpark
Steps to reproduce
External references such as API reference guides
Any additional information below
I think the AWS SDKs, including aws-sdk-java, provide comprehensive ways to obtain credentials in various AWS environments, so it would be nice to use DefaultCredentialsProvider or something similar instead of a custom implementation in this library. But I guess the Google team wouldn't want to depend on another vendor's library...
DefaultCredentialsProvider's docs
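To illustrate the suggestion, a custom supplier could delegate to the AWS SDK's default chain. This is a hedged sketch, not a proposed patch: it assumes the `AwsSecurityCredentialsSupplier` interface from recent google-auth-library-java releases, and uses aws-sdk-java v2's `DefaultCredentialsProvider` and `DefaultAwsRegionProviderChain`, which resolve credentials and region from environment variables, system properties, profile files, container credentials, and instance metadata:

```java
import com.google.auth.oauth2.AwsSecurityCredentials;
import com.google.auth.oauth2.AwsSecurityCredentialsSupplier;
import com.google.auth.oauth2.ExternalAccountSupplierContext;
import software.amazon.awssdk.auth.credentials.AwsCredentials;
import software.amazon.awssdk.auth.credentials.AwsSessionCredentials;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain;

// Sketch: delegate AWS credential and region resolution to the AWS SDK v2
// default provider chains, covering environments (like Glue) that the
// built-in env-var/EC2-metadata supplier cannot handle.
class SdkBackedAwsSupplier implements AwsSecurityCredentialsSupplier {
  private final DefaultCredentialsProvider provider = DefaultCredentialsProvider.create();

  @Override
  public AwsSecurityCredentials getCredentials(ExternalAccountSupplierContext context) {
    AwsCredentials creds = provider.resolveCredentials();
    // Session credentials (e.g. from an assumed role) carry a token; static keys do not.
    String token = (creds instanceof AwsSessionCredentials)
        ? ((AwsSessionCredentials) creds).sessionToken()
        : null;
    return new AwsSecurityCredentials(creds.accessKeyId(), creds.secretAccessKey(), token);
  }

  @Override
  public String getRegion(ExternalAccountSupplierContext context) {
    return new DefaultAwsRegionProviderChain().getRegion().id();
  }
}
```

Shipping something like this in the core library would of course add a vendor dependency, which is the trade-off raised above.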