I have this weird scenario:

A Kafka JDBC connector producing Avro-encoded messages from a table with 4 columns and publishing them to an "accounts" topic.
A consumer pulling those messages from the "accounts" topic and applying some custom business logic.
This is more of an Avro question, but it is tightly bound to Schema Registry, the JDBC Connector, and the Avro Converters from Confluent. The consumer is Java code with the schema.registry.url property set and value.deserializer set to KafkaAvroDeserializer.
My table has 4 columns: id (int), email (text), balance (decimal), is_active (boolean).
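For reference, the consumer looks roughly like this (a minimal sketch; the bootstrap servers, registry URL, and group id are placeholders for my local setup):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AccountsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder
        props.put("group.id", "accounts-consumer");                // placeholder
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder

        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("accounts"));
            while (true) {
                ConsumerRecords<String, GenericRecord> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, GenericRecord> record : records) {
                    GenericRecord value = record.value();
                    // custom business logic goes here
                    System.out.println(value);
                }
            }
        }
    }
}
```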
Is KafkaAvroDeserializer the right deserializer to use?
What kind of Avro record should be expected/consumed: IndexedRecord or GenericRecord?
I tried both of them; the boolean column doesn't work, whereas the decimal column gets converted to an Avro logical type.
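For the decimal column, what seems to be happening is that the connector writes it as Avro bytes with a decimal logical type, so the field comes back from the GenericRecord as a ByteBuffer. This is what I'm trying in order to turn it back into a BigDecimal (the field name balance comes from my table above; Conversions.DecimalConversion is the stock Avro helper):

```java
import java.math.BigDecimal;
import java.nio.ByteBuffer;

import org.apache.avro.Conversions;
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;

public final class DecimalFields {
    // Convert the "balance" field from its on-the-wire form
    // (ByteBuffer + decimal logical type) into a BigDecimal.
    public static BigDecimal readBalance(GenericRecord record) {
        Schema fieldSchema = record.getSchema().getField("balance").schema();
        // The connector may wrap the type in a union with null; unwrap it.
        if (fieldSchema.getType() == Schema.Type.UNION) {
            for (Schema s : fieldSchema.getTypes()) {
                if (s.getType() != Schema.Type.NULL) {
                    fieldSchema = s;
                    break;
                }
            }
        }
        Object raw = record.get("balance");
        if (raw instanceof BigDecimal) {
            return (BigDecimal) raw; // a decimal conversion was already applied
        }
        ByteBuffer bytes = (ByteBuffer) raw;
        LogicalTypes.Decimal logicalType =
                (LogicalTypes.Decimal) fieldSchema.getLogicalType();
        return new Conversions.DecimalConversion()
                .fromBytes(bytes, fieldSchema, logicalType);
    }
}
```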
Basically, how do I get Kafka Connect to produce Avro data and a normal Kafka consumer to consume it?
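For context, the source connector is configured along these lines (the connection URL and names are placeholders; in query mode topic.prefix is used as the full topic name, and depending on the Connect version the converter settings may live in the worker config instead):

```
{
  "name": "accounts-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://localhost:5432/mydb",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "query": "SELECT id, email, balance, is_active FROM accounts",
    "topic.prefix": "accounts",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}
```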
I'd appreciate any help; any reference example would do a lot of good.
Have you managed to solve this? I'm facing the same problem: a JDBC connector producing Avro messages and a Java Kafka client consuming them. However, I'm not able to decode the Avro message.