Closed dz902 closed 3 years ago
Config, which I don't think is very relevant to the problem:
name=s3-test
rest.port=8084
connector.class=io.confluent.connect.s3.S3SinkConnector
storage.class=io.confluent.connect.s3.storage.S3Storage
s3.region=us-west-2
s3.bucket.name=xxxxx
topics.dir=xxxxx
topics.regex=xxxxx\..+
format.class=io.confluent.connect.s3.format.parquet.ParquetFormat
partitioner.class=io.confluent.connect.storage.partitioner.TimeBasedPartitioner
partition.duration.ms=600000
path.format='year'=YYYY/'month'=MM/'day'=dd
locale=en-US
timezone=Africa/Abidjan
flush.size=1
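With the partitioner settings above, each record lands under a time-derived key prefix. A minimal sketch of how that prefix is built from `path.format='year'=YYYY/'month'=MM/'day'=dd` (illustrative only, not the connector's actual code; names like `s3_prefix` are my own, and since Africa/Abidjan is UTC+0 plain UTC is used):

```python
from datetime import datetime, timezone

def s3_prefix(topics_dir: str, topic: str, ts_ms: int) -> str:
    # Mirrors path.format='year'=YYYY/'month'=MM/'day'=dd.
    # Africa/Abidjan is UTC+0, so plain UTC is equivalent here.
    dt = datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc)
    return f"{topics_dir}/{topic}/year={dt:%Y}/month={dt:%m}/day={dt:%d}/"

# Example: a record timestamped 2021-05-03 00:00:00 UTC
print(s3_prefix("mydir", "mytopic", 1620000000000))
# → mydir/mytopic/year=2021/month=05/day=03/
```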
I read data from Debezium.
Avro works; only Parquet fails. This happens on 10.0.0 and 10.0.1, and I'm not sure why no one seems to have noticed.
Problem solved: add the corresponding Guava jar, then enjoy.
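For anyone hitting the same issue, a quick sketch for checking whether any jar under the plugin directory actually bundles Guava classes (the helper name and the directory path are my assumptions; point it at wherever your plugin.path leads):

```python
import pathlib
import zipfile

def find_guava_jars(plugin_dir: pathlib.Path) -> list[pathlib.Path]:
    """Return jars under plugin_dir that bundle Guava classes."""
    hits = []
    jars = sorted(plugin_dir.glob("*.jar")) if plugin_dir.is_dir() else []
    for jar in jars:
        with zipfile.ZipFile(jar) as zf:
            # Guava classes live under com/google/common/
            if any(n.startswith("com/google/common/") for n in zf.namelist()):
                hits.append(jar)
    return hits

# Assumed default Confluent Hub install location; adjust to your plugin.path.
plugin_dir = pathlib.Path(
    "/usr/share/confluent-hub-components/confluentinc-kafka-connect-s3/lib"
)
found = find_guava_jars(plugin_dir)
print(found or f"No bundled Guava under {plugin_dir}; add a guava jar there")
```

If the check comes up empty, dropping a Guava jar into that directory and restarting the Connect worker is what resolved it here.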
Version: confluentinc-kafka-connect-s3-10.0.1.
Here is the error:
But all the jars are there, untouched, and plugin.path is set correctly. In the logs I can see the correct partition writer and file name. Why is this happening? I think Guava is bundled with the Hadoop jars, and those jars are present. Please advise! Thanks.