Open slvrtrn opened 3 years ago
I am noticing a similar issue as well. Were you able to figure out what caused it, @slvrtrn?
@aakarshg unfortunately, no. We decided not to use this connector.
+1
Adding an MR with a simple retry; I stopped seeing serialization problems after applying it: #220
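The retry approach can be sketched generically as retry-with-exponential-backoff (a minimal illustration only, not the actual code from #220; the function name here is hypothetical):

```python
import time


def with_retries(operation, max_attempts=3, base_delay_s=1.0):
    """Retry a flaky zero-argument callable with exponential backoff.

    Transient failures (e.g. BigQuery serialization conflicts) are
    retried until max_attempts is exhausted, then re-raised.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                raise
            # Back off exponentially: base, 2*base, 4*base, ...
            time.sleep(base_delay_s * 2 ** (attempt - 1))
```

A production version would retry only on the specific retriable exception types rather than a bare `Exception`.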
@FreCap Looks like CI is failing.
What's interesting is that I only notice the issue when setting tasks.max to a number higher than 1. However, in the OP's configuration that parameter was set to 1, so I'm not really sure where the concurrent updates would have been coming from.
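For context, `tasks.max` is the standard Kafka Connect setting that caps how many tasks run in parallel for a connector; with a value above 1, several tasks may write to BigQuery concurrently. A minimal sketch of pinning it to 1, expressed as a Python dict (the connector name is illustrative; only `tasks.max` is the point here):

```python
# Hypothetical sink connector config fragment.
connector_config = {
    "name": "bigquery-sink",  # illustrative connector name
    "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
    "tasks.max": "1",  # a single task avoids concurrent writes to the same table
}
```

The trade-off is throughput: one task serializes all writes for the connector.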
> @aakarshg unfortunately, no. We decided not to use this connector.
Thanks for letting me know. If you don't mind, can you share what alternative solution you ended up using?
@aakarshg looking at the CI, the tests are failing only due to a missing BigQuery KeyFile: https://jenkins.public.confluent.io/job/kafka-connect-bigquery/job/PR-220/1/testReport/junit/com.wepay.kafka.connect.bigquery.integration/BigQueryErrorResponsesIT/testWriteToTableWithoutSchema/
https://jenkins.public.confluent.io/job/kafka-connect-bigquery/job/PR-220/1/
Hello, I have the following error happening randomly in my BigQuery connector.
I am using the following configuration:
Is there anything I can do in the configuration to prevent this error? The connector version is 2.1.4.
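One configuration-level mitigation worth trying is the connector's built-in retry settings: kafka-connect-bigquery exposes `bigQueryRetry` (number of retries after a retriable backend error) and `bigQueryRetryWait` (minimum wait in milliseconds between retries). Treat the exact option names and semantics as something to verify against the docs for your connector version (2.1.4); sketched here as a Python dict fragment:

```python
# Hypothetical fragment to merge into the sink connector config --
# verify these option names against the kafka-connect-bigquery docs
# for your connector version.
retry_overrides = {
    "bigQueryRetry": "5",         # retry up to 5 times on retriable errors
    "bigQueryRetryWait": "1000",  # wait at least 1000 ms between retries
}
```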