Open uqix opened 2 years ago
Can you try with the dev version please? There is a fix for primitive Avro types.
Sorry for the ambiguity: the message key type is the Kafka built-in long (not an Avro long), and the value type is Avro.
OK, but it's still a primitive; please try with the dev branch.
Will do later, thanks
It seems that the Kafka long key is still parsed as an Avro long:
Maybe the UI could let us select key/value type?
Long doesn't exist for a Kafka key, since keys are always stored as bytes. If AKHQ displays this message, it means that the first bytes of your key match the special schema-registry wire format, with a schema id (in your case 0).
Maybe you can try with kafkacat or kafka-console-consumer, but I'm pretty sure the same error will be displayed by these tools, meaning an incorrectly formatted message was stored in Kafka.
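To illustrate why this collision happens, here is a minimal sketch (the class and method names are hypothetical, not AKHQ code): the Confluent wire format starts with magic byte 0 followed by a 4-byte schema id, and a key produced by `LongSerializer` is 8 big-endian bytes, so any small long value starts with a 0 byte and looks like a schema-registry payload with schema id 0.

```java
import java.nio.ByteBuffer;

// Hypothetical helper showing the schema-registry wire-format check:
// a registry-encoded message begins with magic byte 0 and a 4-byte
// schema id. A plain long key can accidentally match this shape.
public class WireFormatCheck {
    // Returns true if the bytes *look like* a schema-registry payload.
    static boolean looksLikeSchemaRegistry(byte[] bytes) {
        return bytes != null && bytes.length >= 5 && bytes[0] == 0x0;
    }

    public static void main(String[] args) {
        // A key produced by LongSerializer: 8 big-endian bytes.
        byte[] longKey = ByteBuffer.allocate(8).putLong(42L).array();
        // First byte is 0 and the next 4 bytes are 0 too, so this is
        // misread as a schema-registry message with schema id 0.
        System.out.println(looksLikeSchemaRegistry(longKey)); // prints "true"
    }
}
```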
Long keys are consumed well by Conduktor:
Can you try with the other tools I mentioned? Conduktor is closed source and I can't figure out how it handles this. Or can you provide a way to reproduce the issue? I don't know how to produce this kind of message, and I need the exact message on my side to fix it.
> Maybe you can try with kafkacat or kafka-console-consumer, but I'm pretty sure the same error will be displayed by these tools, meaning an incorrectly formatted message was stored in Kafka.
I made some tests with kafkacat. The keys are unreadable (e.g. ``��s``) when no deserializer is provided, but you can select the deserializer you need for the key (e.g. `-s key=q` for a signed 64-bit integer) and consuming key/value will work.
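What kafkacat's `-s key=q` does is interpret the raw 8 key bytes as a big-endian signed 64-bit integer, which is also what Kafka's `LongDeserializer` does. A minimal sketch (class and method names are my own, for illustration):

```java
import java.nio.ByteBuffer;

// Sketch of decoding a raw Kafka key the way `-s key=q` (kafkacat) or
// LongDeserializer (Kafka) does: 8 bytes, big-endian, signed long.
public class LongKeyDecode {
    static long decodeLongKey(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getLong();
    }

    public static void main(String[] args) {
        byte[] raw = {0, 0, 0, 0, 0, 0, 0, 7}; // bytes as stored on the wire
        System.out.println(decodeLongKey(raw)); // prints "7"
    }
}
```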
It seems you need to provide us a unit test with a message produced in the wrong format to get a fix here. I just can't figure out what you are doing to produce this kind of message.
@tchiotludo FYI, here's our spring-boot kafka properties:
```yaml
spring.kafka:
  properties:
    schema.registry.url: http://schema-registry.kafka
    specific.avro.reader: true
  consumer:
    keyDeserializer: org.apache.kafka.common.serialization.LongDeserializer
    valueDeserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
  producer:
    keySerializer: org.apache.kafka.common.serialization.LongSerializer
    valueSerializer: io.confluent.kafka.serializers.KafkaAvroSerializer
```
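For reproducing outside Spring, here is a plain-Java sketch of the equivalent producer configuration (the broker address is a placeholder; running it requires a Kafka cluster and the Confluent serializer on the classpath, so only the property assembly is shown):

```java
import java.util.Properties;

// Sketch of the plain Kafka producer properties equivalent to the
// Spring Boot config above. Addresses are placeholders.
public class ProducerConfigSketch {
    static Properties producerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        p.put("schema.registry.url", "http://schema-registry.kafka");
        p.put("key.serializer",
              "org.apache.kafka.common.serialization.LongSerializer");
        p.put("value.serializer",
              "io.confluent.kafka.serializers.KafkaAvroSerializer");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("key.serializer"));
    }
}
```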
I've made an attempt: https://github.com/tchiotludo/akhq/commit/c46c5de7958c94672366f486a9756b9067da4e2d. It doesn't work, since the Double deserializer is matched first (same byte length) and displays an incorrect double value.
I have no clue how to determine the serde for a topic automatically for these standard types; we would need per-topic configuration to pick the right serde (like Conduktor does manually).
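The ambiguity is easy to demonstrate: a long and a double are both 8 bytes, so the same key payload decodes successfully either way, and there is no way to tell from the bytes alone which type was intended. A small sketch (class and method names are mine):

```java
import java.nio.ByteBuffer;

// Shows why a long key cannot be distinguished from a double by byte
// length alone: the same 8 bytes decode "successfully" as both types.
public class AmbiguousBytes {
    static long asLong(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getLong();
    }

    static double asDouble(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getDouble();
    }

    public static void main(String[] args) {
        byte[] raw = ByteBuffer.allocate(8).putLong(42L).array();
        // The long interpretation gives 42; the double interpretation
        // reuses the same bits and yields a tiny denormal, not 42.0.
        System.out.println(asLong(raw) + " vs " + asDouble(raw));
    }
}
```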
@tchiotludo Should this issue be solved in 0.24.0?
@haroldlbrown No, it's not fixed for now; PRs are welcome.