Closed arjun180 closed 3 years ago
I believe it's a problem with permissions on the user role you're using: https://aws.amazon.com/it/premiumsupport/knowledge-center/s3-403-forbidden-error/
Thanks @oscerd . I thought the same, but my user has admin access and I can use AWS access key and secret key to upload files to the bucket using the AWS CLI. And I use the same access key and secret key in the Kafka connector yaml file. So, I was wondering if I was missing out on something.
Without access to, or information about, the environment and role, it's really impossible to say anything.
Hi @oscerd - Thanks. So, I was following the thread https://github.com/apache/camel-kafka-connector/issues/282. My access key and secret key were not working (possibly because I use temporary credentials, which involve a session token in addition to the access key and secret key).
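For context on why plain key pairs fail with temporary credentials: the AWS default credentials chain (used by the CLI and the SDKs) expects the session token alongside the key pair. A minimal sketch, assuming the standard AWS environment variables (all values below are placeholders, not real credentials):

```shell
# Hedged sketch: the AWS default credentials chain reads temporary
# credentials from these three environment variables; the values here
# are placeholders only.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
export AWS_SECRET_ACCESS_KEY="example-secret-key"
export AWS_SESSION_TOKEN="example-session-token"  # required for temporary credentials
```

Temporary access keys (the `ASIA…`-prefixed kind) are rejected if the matching session token is not supplied along with them.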
I was trying to use the `camel.component.aws2-s3.useIAMCredentials: true` option instead. We made sure the default EC2 node group IAM role has full S3 access. The YAML file for the connector looks like this:
```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: s3-sink-connector
  labels:
    strimzi.io/cluster: my-connect-cluster
spec:
  class: org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SinkConnector
  tasksMax: 1
  config:
    key.converter: org.apache.kafka.connect.storage.StringConverter
    value.converter: org.apache.kafka.connect.storage.StringConverter
    topics: kafka-connect-topic
    camel.sink.path.bucketNameOrArn: my-connect-bucket
    camel.sink.endpoint.keyName: ${date:now:yyyyMMdd-HHmmssSSS}-${exchangeId}
    camel.component.aws2-s3.useIAMCredentials: true
    camel.component.aws2-s3.region: <region>
```
On running the connector, I end up getting this error:

```
Caused by: org.apache.camel.PropertyBindingException: Error binding property (camel.component.aws2-s3.useIAMCredentials=true) with name: useIAMCredentials on bean: org.apache.camel.component.aws2.s3.AWS2S3Component@6c7f1a7 with value: true
```
Any pointers on where we could be going wrong?
To add some more information, my KafkaConnect YAML looks like this:
```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnect
metadata:
  namespace: my-cluster
  name: my-connect-cluster
  annotations:
    strimzi.io/use-connector-resources: "true"
spec:
  replicas: 1
  bootstrapServers: kafka-bootstrap.my-cluster.com:9093
  tls:
    trustedCertificates:
      - secretName: my-secret
        certificate: server.crt
  authentication:
    type: tls
    certificateAndKey:
      secretName: kafka-tls-client-credentials
      certificate: user.crt
      key: user.key
  config:
    group.id: my-connect-cluster
    offset.storage.topic: my-connect-cluster-offsets
    config.storage.topic: mya-connect-cluster-configs
    status.storage.topic: my-connect-cluster-status
    key.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter: org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable: true
    value.converter.schemas.enable: true
    config.storage.replication.factor: 1
    offset.storage.replication.factor: 1
    status.storage.replication.factor: 1
  build:
    output:
      type: docker
      image: image:latest
      pushSecret: <secret>
    plugins:
      - name: camel-timer
        artifacts:
          - type: tgz
            url: https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector/camel-timer-kafka-connector/0.8.0/camel-timer-kafka-connector-0.8.0-package.tar.gz
      - name: awss2-kafka-connect
        artifacts:
          - type: tgz
            url: https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector/camel-aws2-s3-kafka-connector/0.8.0/camel-aws2-s3-kafka-connector-0.8.0-package.tar.gz
```
This got resolved. Thank you for the help.
For completeness, the correct property name was `useDefaultCredentialsProvider`.
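With that fix, the credentials-related lines of the connector spec above would read something like this (a sketch showing only the changed keys; everything else in the spec stays as posted, and `<region>` remains a placeholder):

```yaml
# Corrected property name, per the resolution above. This tells the
# aws2-s3 component to use the AWS default credentials provider chain
# (environment variables, instance profile, etc.) instead of static keys.
camel.component.aws2-s3.useDefaultCredentialsProvider: true
camel.component.aws2-s3.region: <region>
```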
Hello,
I am trying to get the Camel AWSS2-S3-Kafka connector sink to work with my Kafka cluster. I have a functioning Kafka cluster as well as a Kafka connect cluster up and running. My aim is to move messages from my Kafka topic to the s3 bucket I have created.
My KafkaConnect cluster is fine and has the following plugins:
My KafkaConnector looks like the following:
After running the connector above, I get the following error:
I imagine the 403 error is because my S3 bucket is not allowing the connector access. Are there some configuration changes I need to make to the S3 bucket? I did try connecting to my S3 bucket through the AWS CLI, and that worked fine. Currently the bucket is set to "Bucket and objects not public".
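If the bucket enforces a bucket policy, one thing worth checking is whether the identity the connector runs as is allowed to write objects. A hedged sketch of a minimal allow statement (the account ID, role name, and bucket name are placeholders, not values from this thread):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowConnectorWrites",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<account-id>:role/<connector-role>" },
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::<bucket-name>",
        "arn:aws:s3:::<bucket-name>/*"
      ]
    }
  ]
}
```

Note that "Bucket and objects not public" only blocks anonymous access; an authenticated caller still needs an IAM or bucket policy that grants these actions.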
I'd appreciate any help! Thanks