amazon-archives / amazon-kinesis-connectors

Unable to load AWS credentials from any provider in the chain #42

Open andrewrguy opened 9 years ago

andrewrguy commented 9 years ago

I am trying to run the RedshiftBasic sample, and I have included the aws.accessKeyId and aws.secretKey properties at the end of the RedshiftBasicSample.properties file per the instructions. When I run 'ant run', I get the following trace:

run:
    [javac] /Users/andrewguy/www/amazon-kinesis-connectors/src/main/samples/redshiftbasic/build.xml:48: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 28 source files to /Users/andrewguy/www/amazon-kinesis-connectors/src/main/build
    [javac] warning: Supported source version 'RELEASE_6' from annotation processor 'com.amazonaws.eclipse.simpleworkflow.asynchrony.annotationprocessor.AsynchronyDeciderAnnotationProcessor' less than -source '1.8'
    [javac] 1 warning
     [java] Exception in thread "main" com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
     [java]     at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:117)
     [java]     at com.amazonaws.services.kinesis.AmazonKinesisClient.invoke(AmazonKinesisClient.java:2486)
     [java]     at com.amazonaws.services.kinesis.AmazonKinesisClient.describeStream(AmazonKinesisClient.java:861)
     [java]     at samples.utils.KinesisUtils.streamExists(Unknown Source)
     [java]     at samples.utils.KinesisUtils.createAndWaitForStreamToBecomeAvailable(Unknown Source)
     [java]     at samples.utils.KinesisUtils.createInputStream(Unknown Source)
     [java]     at samples.KinesisConnectorExecutor.setupAWSResources(Unknown Source)
     [java]     at samples.KinesisConnectorExecutor.<init>(Unknown Source)
     [java]     at samples.redshiftbasic.RedshiftBasicExecutor.<init>(Unknown Source)
     [java]     at samples.redshiftbasic.RedshiftBasicExecutor.main(Unknown Source)
     [java] Java Result: 1

BUILD SUCCESSFUL
Total time: 3 seconds

What am I missing?
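
For reference, the lines I appended to the end of the properties file look something like this (actual key values redacted):

aws.accessKeyId = AKIA...
aws.secretKey = ...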

Thanks.

andrewrguy commented 9 years ago

Update: this approach worked:

$ export AWS_ACCESS_KEY_ID=...
$ export AWS_SECRET_ACCESS_KEY=...

However, this approach will not work for me in a production-ready system. Is there a reason why the properties file is not picking up my access keys?

jcastrov commented 8 years ago

I solved this by adding those credentials as the root user.

$ sudo su
$ mkdir /root/.aws
$ nano /root/.aws/credentials

# After you create the file, exit as root user
$ exit
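
The credentials file itself uses the standard AWS shared-credentials format, something like this (key values redacted):

[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...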

Aparee20 commented 7 years ago

Status: FAILED
Reason:
Log File: s3://aws-logs-388546034523-ap-southeast-2/elasticmapreduce/j-3MMYU5AZJ1UNE/steps/s-2B4ONS94M7B00/stderr.gz
Details: Exception in thread "main" com.amazon.ws.emr.hadoop.fs.shaded.com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain
JAR location: command-runner.jar
Main class: None
Arguments: spark-submit --class com.example.project.SimpleApp --deploy-mode cluster --master yarn s3://XXXXX/spark-log-job_2.10-1.0.jar
Action on failure: Continue

Hi, I am facing this issue when I submit a Spark Scala job via spark-submit on EMR. Looking for some help on this.

Thanks in advance.

sudhir05 commented 6 years ago

I am still facing this issue. I am running a Docker container and trying to connect to DynamoDB; when I curl one of the endpoints I get "message":"com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain". I am logging in through saml2aws and the credentials are already present in the .aws folder. Please advise.
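
I suspect the container simply cannot see the host's .aws directory. If so, mounting it into the container, or exporting the keys and passing them through the environment, should help. A rough sketch, assuming the container process runs as root and using a hypothetical image name (note that saml2aws sessions also set a session token):

$ docker run -v $HOME/.aws:/root/.aws:ro my-image
# or, with the keys exported in the host shell first:
$ docker run -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_SESSION_TOKEN my-image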

landon9720 commented 5 years ago

I'm having a similar issue running kcl-bootstrap (https://github.com/awslabs/amazon-kinesis-client-nodejs). It seems unable to read the credentials in ~/.aws/credentials and fails with SdkClientException: Unable to load AWS credentials from any provider in the chain, although my credentials setup is working elsewhere. As mentioned above, if I directly set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, it does work. It's as if the environment isn't being passed through somewhere.
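
A quick way to confirm the profile file itself is readable is the AWS CLI:

$ aws sts get-caller-identity

That succeeds here, which is why I think the Java daemon that kcl-bootstrap spawns is the piece that is not reading ~/.aws/credentials.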