awslabs / kinesis-kafka-connector

kinesis-kafka-connector is a connector based on Kafka Connect that publishes messages to Amazon Kinesis streams or Amazon Kinesis Firehose.
Apache License 2.0

Unable to load AWS credentials from any provider in the chain #28

Open SrinivasaDK opened 5 years ago

SrinivasaDK commented 5 years ago

Hi All,

I am getting the error below; please advise.

We are trying to transfer data from Kafka to Kinesis and have exported the AWS secret and access keys:

[opc@kafkanode CDH-6.1.0-1.cdh6.1.0.p0.770702]$ env | grep -i aws
AWS_SECRET_ACCESS_KEY=**
AWS_ACCESS_KEY_ID=**
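For context, the AWS SDK's default provider chain also reads a shared credentials file, so an alternative sketch (assuming the Connect worker process runs as the same opc user; the placeholder values are not from this report) is to put the keys in ~/.aws/credentials rather than relying on variables exported in a login shell:

~/.aws/credentials:

[default]
aws_access_key_id = <access-key-id>
aws_secret_access_key = <secret-access-key>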

Our configuration details are:

[opc@kafkanode config]$ cat worker.properties
bootstrap.servers=kafkanode:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter

internal.value.converter=org.apache.kafka.connect.storage.StringConverter
internal.key.converter=org.apache.kafka.connect.storage.StringConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter=org.apache.kafka.connect.json.JsonConverter

key.converter.schemas.enable=true
value.converter.schemas.enable=true
internal.key.converter.schemas.enable=true
internal.value.converter.schemas.enable=true

offset.storage.file.filename=offset.log
schemas.enable=false

# Rest API
rest.port=8096

# passing custom jar location
plugin.path=/home/opc/kinesis-kafka-connector-master/target/

rest.host.name=

[opc@kafkanode config]$

[opc@kafkanode ~]$ aws firehose list-delivery-streams
{
    "DeliveryStreamNames": [
        "kafka-s3-stream"
    ],
    "HasMoreDeliveryStreams": false
}
[opc@kafkanode ~]$

My connector properties:

[opc@kafkanode config]$ cat kinesis-firehose-kafka-connector.properties
name=kafka_kinesis_sink_connector
connector.class=com.amazon.kinesis.kafka.FirehoseSinkConnector
tasks.max=1
topics=OGGTest
region=eu-central-1
batch=true
batchSize=500
batchSizeInBytes=1024
deliveryStream=kafka-s3-stream
[opc@kafkanode-instance-20190103-1627 config]$

The error it is showing:

[2019-01-26 09:20:10,785] INFO Kafka version : 2.0.0-cdh6.1.0 (org.apache.kafka.common.utils.AppInfoParser:109)
[2019-01-26 09:20:10,785] INFO Kafka commitId : unknown (org.apache.kafka.common.utils.AppInfoParser:110)
[2019-01-26 09:20:10,788] INFO Created connector kafka_kinesis_sink_connector (org.apache.kafka.connect.cli.ConnectStandalone:104)
[2019-01-26 09:20:11,670] ERROR WorkerSinkTask{id=kafka_kinesis_sink_connector-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain
        at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:131)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
        at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
        at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.doInvoke(AmazonKinesisFirehoseClient.java:826)
        at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.invoke(AmazonKinesisFirehoseClient.java:802)
        at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.describeDeliveryStream(AmazonKinesisFirehoseClient.java:451)
        at com.amazon.kinesis.kafka.FirehoseSinkTask.validateDeliveryStream(FirehoseSinkTask.java:95)
        at com.amazon.kinesis.kafka.FirehoseSinkTask.start(FirehoseSinkTask.java:77)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:301)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:190)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
[2019-01-26 09:20:11,671] ERROR WorkerSinkTask{id=kafka_kinesis_sink_connector-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)
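The DescribeDeliveryStream call in FirehoseSinkTask.validateDeliveryStream resolves credentials through the SDK's DefaultAWSCredentialsProviderChain (environment variables, JVM system properties, the ~/.aws/credentials file, then instance/container metadata). Variables exported in one login shell are not visible to a worker JVM started elsewhere (for example, by Cloudera Manager), so a minimal sketch for a hand-started standalone worker would be to export the keys in the same shell that launches it:

# sketch only; the placeholder keys and the script path (stock Kafka layout) are assumptions,
# and the path will differ under a CDH parcel
export AWS_ACCESS_KEY_ID=<access-key-id>
export AWS_SECRET_ACCESS_KEY=<secret-access-key>
bin/connect-standalone.sh worker.properties kinesis-firehose-kafka-connector.properties

A worker managed by Cloudera Manager would instead need the credentials supplied through its service environment, or through a credentials file readable by the service user.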

SrinivasaDK commented 5 years ago

I am using Cloudera version CDH-6.1.0-1.cdh6.1.0.p0.770702, which ships with Kafka 2.1.2 (0.10.0.1+kafka2.1.2+6).

FAYiEKcbD0XFqF2QK2E4viAHg8rMm2VbjYKdjTg commented 3 years ago

I get the same thing with LocalStack. It also asks for the role ARN and session name, but even with those it fails.
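A sketch for the LocalStack case (assuming LocalStack's usual convention of accepting arbitrary static credentials; the role/session settings belong to the connector configuration and are not shown here): export dummy keys into the environment of the process that starts the Connect worker so the provider chain can resolve something.

# placeholder values; LocalStack accepts any static credentials
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test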