Closed excentrik closed 7 years ago
Hi @excentrik, this is a known limitation.
mongo-sink is the counterpart of mongo-source; it will only consume records that were serialized from a struct. For example, mongo-source will produce records like:
```json
{
  "schema": {
    "type": "struct",
    "fields": [
      {"type": "int32", "optional": true, "field": "ts"},
      {"type": "int32", "optional": true, "field": "inc"},
      {"type": "string", "optional": true, "field": "id"},
      {"type": "string", "optional": true, "field": "database"},
      {"type": "string", "optional": true, "field": "op"},
      {"type": "string", "optional": true, "field": "object"}
    ],
    "optional": false,
    "name": "mongo_21_schema_kafka_t"
  },
  "payload": {
    "ts": 1488440425,
    "inc": 11150,
    "id": "58b7cc69304bff419f0992a2",
    "database": "kafka_t",
    "op": "i",
    "object": "{ \"_id\" : { \"$oid\" : \"58b7cc69304bff419f0992a2\" }, \"key\" : 0.6001637088183821, \"ts\" : { \"$timestamp\" : { \"t\" : 1488440425, \"i\" : 0 } } }"
  }
}
```
Each record contains exactly two top-level fields: a schema and a payload.
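For anyone wanting to produce records the sink can consume: a minimal Python sketch (the `make_envelope` helper is hypothetical, not part of the connector) that wraps a flat record in the schema/payload envelope shown above:

```python
import json

def make_envelope(payload: dict) -> str:
    # Hypothetical helper: wrap a flat record in the schema/payload
    # envelope illustrated above. Field types are inferred naively here
    # (int -> int32, everything else -> string) for demonstration only.
    fields = [
        {"type": "int32" if isinstance(v, int) else "string",
         "optional": True,
         "field": k}
        for k, v in payload.items()
    ]
    envelope = {
        "schema": {
            "type": "struct",
            "fields": fields,
            "optional": False,
            "name": "example_schema",  # assumed name, adjust as needed
        },
        "payload": payload,
    }
    return json.dumps(envelope)

# Example: a record shaped like the mongo-source output above.
msg = make_envelope({"ts": 1488440425, "inc": 11150,
                     "id": "58b7cc69304bff419f0992a2",
                     "database": "kafka_t", "op": "i"})
```

The resulting string is what you would publish as the Kafka message value.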
Hi @sailxjx,
Thanks for the reply. Hmm, for our use case it would be essential to use the JsonConverter, but we'll see if we can work around the limitation. Thanks again.
@excentrik you're welcome, and a PR is welcome.
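For context on the converter setting involved: with Kafka's stock `JsonConverter`, the `schemas.enable` flag controls whether the schema/payload envelope is required. A sketch of the relevant worker settings (assuming a standard Connect worker properties file; adjust to your deployment):

```properties
# With schemas.enable=true, JsonConverter requires the
# {"schema": ..., "payload": ...} envelope on every record;
# with false, it reads and writes plain JSON without a schema.
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
```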
Hi
Using the Kafka MongoDB connector as a sink for a Kafka topic, I'm getting the following error:
I tried several different payloads, from the simple:
to the one seen in the error message with a schema specified, but to no avail. The same error always occurs. I was wondering if this is a known limitation or if you have any ideas on how to fix it.
The connector is created with:
ENVIRONMENT variables for the kafka-connect container: