Closed — San13 closed this issue 4 years ago
Hi @San13,
Thanks for your issue report, which in fact doesn't seem to be related to the sink connector itself. You can see from the stack trace you posted that execution never even reaches any code from the sink connector packages.
The issue is related to Avro de/serialization and the schema registry, which happens before the connector even gets to see the sink records. What you are facing may be similar to the issue described here: https://github.com/confluentinc/schema-registry/issues/825
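Since the failure happens on the deserialization side, it is usually worth double-checking the converter settings in the connector configuration first. A minimal sketch of the relevant properties, assuming a local Schema Registry (the registry URL and the choice of key converter here are assumptions, not taken from your report):

```properties
# Hypothetical sink connector config excerpt: the Avro value converter must
# point at the Schema Registry that holds the writer schema for the topic.
# A mismatch here fails before the MongoDB sink connector ever sees a record.
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
# If your keys are plain strings rather than Avro, use a string key converter.
key.converter=org.apache.kafka.connect.storage.StringConverter
```

These are standard Kafka Connect converter properties; the exact registry URL depends on your deployment.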
Good luck resolving the issue! Feel free to comment further with details on how you eventually fixed it; it might help others who run into the same problem.
I am trying to save a KStream (Kafka Streams) topic to MongoDB via the sink connector. Below is my configuration:
I have this Avro value (data is from the Avro console consumer):
{"endPoint":"test1","createdDate":1584617333519,"companyName":"test","productName":"Brainy Bin","path":"/3303/0/5700","timestamp":1584617233036,"AvroResponsePayload":{"code":"CONTENT","kind":"observe","body":{"io.teamone.leshan.avro.response.AvroReadResponseBody":{"content":{"io.teamone.leshan.avro.resource.AvroResource":{"id":5700,"path":"/3303/0/5700","kind":"SINGLE_RESOURCE","type":"FLOAT","value":{"double":-315.2}}}}}}}
Below is my error log: