julienrf closed this pull request 3 months ago
After a second thought, I wonder if we should also support reading credentials from environment variables (see also https://github.com/awslabs/emr-dynamodb-connector/issues/185).
The ideal way would be to read credentials from every source, with a defined order of preference (and to stick to the Spark approach, https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties, or something similar with a pre-defined hierarchy and inheritance).
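For context, a sketch of the precedence model the linked Spark docs describe, which a credentials-resolution hierarchy could mirror (the property values here are only placeholders):

```shell
# Spark resolves a property from these sources, highest precedence first:
#   1. values set explicitly on SparkConf in the application code
#   2. flags passed to spark-submit (e.g. --conf)
#   3. entries in conf/spark-defaults.conf
#
# e.g. this --conf value overrides the same key in spark-defaults.conf:
spark-submit --conf spark.executor.memory=4g my-app.jar
```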
@julienrf just fix the commit message - add a reference to https://github.com/scylladb/scylla-migrator/issues/122 (but don't close it)
I'd like to merge this so that the credentials are fixed to the ones provided by `config.yaml`.
Then, in later PRs, we can tackle environment variables or assumed-role approaches for DynamoDB access.
Currently, users of the migrator have to provide their AWS credentials through the `config.yaml` file. According to the description of #122, in some cases it is desirable to also read the AWS credentials from the user profile (`~/.aws/credentials`).

This PR addresses this need by using a capability of the Hadoop connector to set a custom AWS credentials provider. We set it to `com.amazonaws.auth.profile.ProfileCredentialsProvider`, which is the standard profile credentials provider from the AWS SDK.

Ultimately, if necessary, we could make this configurable and allow our users to supply which credentials provider to use. This would give them full control over it.
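For reference, a minimal sketch of what this setup relies on. `ProfileCredentialsProvider` reads the standard AWS shared credentials file; the Hadoop property name shown below is the one used by the emr-dynamodb-connector (`dynamodb.customAWSCredentialsProvider`) and should be checked against the connector version in use:

```shell
# ~/.aws/credentials -- standard AWS shared credentials file read by
# com.amazonaws.auth.profile.ProfileCredentialsProvider
# (values are placeholders):
#
#   [default]
#   aws_access_key_id = <access-key-id>
#   aws_secret_access_key = <secret-access-key>

# Wiring the provider through the Hadoop configuration; the
# "spark.hadoop." prefix forwards the property to the Hadoop conf:
spark-submit \
  --conf spark.hadoop.dynamodb.customAWSCredentialsProvider=com.amazonaws.auth.profile.ProfileCredentialsProvider \
  migrator.jar
```

The migrator sets this property programmatically rather than via `spark-submit`, but the effect on the Hadoop configuration is the same.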
According to https://github.com/scylladb/scylla-migrator/issues/122#issuecomment-2028190781, this PR fixes #122.