tchiotludo / akhq

Kafka GUI for Apache Kafka to manage topics, topics data, consumers group, schema registry, connect and more...
https://akhq.io/
Apache License 2.0

Kafka long key (with Avro value) is mis-parsed as Avro type #842

Open uqix opened 2 years ago

uqix commented 2 years ago

image

tchiotludo commented 2 years ago

Can you try with the dev version please? There is a fix for primitive Avro types.

uqix commented 2 years ago

Sorry for the ambiguity: the message key type is the Kafka built-in long (not Avro long), and the value type is Avro.

tchiotludo commented 2 years ago

OK, but it's still a primitive; please try with the dev branch.

uqix commented 2 years ago

Will do later, thanks

uqix commented 2 years ago

It seems that the Kafka long key is still parsed as an Avro long:

image

uqix commented 2 years ago

Maybe the UI could let us select the key/value type?

tchiotludo commented 2 years ago

Long doesn't exist as a Kafka key type, since keys are always stored as bytes. If AKHQ displays this, it means the first byte of your key is the special schema-registry format marker followed by a schema id (in your case 0).
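A minimal sketch of the clash, using only `java.nio` (class and method names are illustrative, not AKHQ code): Confluent's schema-registry wire format starts with a 0x00 magic byte followed by a 4-byte schema id, and any big-endian long small enough to have a leading zero byte is byte-for-byte indistinguishable from it.

```java
import java.nio.ByteBuffer;

public class WireFormatClash {
    // Encode a long the way Kafka's LongSerializer does: 8 bytes, big-endian.
    static byte[] serializeLong(long value) {
        return ByteBuffer.allocate(8).putLong(value).array();
    }

    // Read the 4 bytes after the first one, the way a schema-registry-aware
    // deserializer reads the schema id after the 0x00 magic byte.
    static int apparentSchemaId(byte[] payload) {
        return ByteBuffer.wrap(payload, 1, 4).getInt();
    }

    public static void main(String[] args) {
        byte[] key = serializeLong(42L);
        // Any small long starts with 0x00, which looks like the magic byte,
        // and the next 4 bytes then read as schema id 0.
        System.out.println("first byte: " + key[0]);
        System.out.println("apparent schema id: " + apparentSchemaId(key));
    }
}
```

So for any key value below 2^55 the first byte is 0x00, and for values below 2^24 the "schema id" also reads as 0, matching the screenshot above.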

Maybe you can try with kafkacat or kafka-console-consumer, but I'm pretty sure the same error will be displayed by those tools, meaning an incorrectly stored message on Kafka.

uqix commented 2 years ago

Long keys are consumed well by Conduktor:

image

tchiotludo commented 2 years ago

Can you try with the other tools I mentioned? Conduktor is closed source and I can't see how it handles this. Or can you provide a way to reproduce the issue? I don't know how to produce this kind of message, and I need the exact message on my side to fix this.

tstuber commented 2 years ago

> Maybe you can try with kafkacat or kafka-console-consumer, but I'm pretty sure the same error will be displayed by those tools, meaning an incorrectly stored message on Kafka.

I made some tests with kafkacat. The keys are unreadable (e.g. `��s`) when no deserializer is provided, but you can select the deserializer you need for the key (e.g. `-s key=q`, signed 64-bit integer) to consume key/value, and then it works.

tchiotludo commented 2 years ago

It seems you need to provide us a unit test with a message produced in this format to get a fix here. I just can't figure out what you are doing to produce this kind of message.

uqix commented 2 years ago

@tchiotludo FYI, here are our Spring Boot Kafka properties:

```yaml
spring.kafka:
  properties:
    schema.registry.url: http://schema-registry.kafka
    specific.avro.reader: true
  consumer:
    keyDeserializer: org.apache.kafka.common.serialization.LongDeserializer
    valueDeserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
  producer:
    keySerializer: org.apache.kafka.common.serialization.LongSerializer
    valueSerializer: io.confluent.kafka.serializers.KafkaAvroSerializer
```

tchiotludo commented 2 years ago

I've made a try: https://github.com/tchiotludo/akhq/commit/c46c5de7958c94672366f486a9756b9067da4e2d. This doesn't work, since the double deserializer is matched first (same byte length) and a false double value is displayed.

I have no clue how to determine the serde for these standard types automatically; we would need to add some per-topic configuration in order to find the right serde (like Conduktor does manually).
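The ambiguity can be sketched in plain Java (class and method names here are illustrative, not AKHQ code): any 8-byte key decodes without error as both a long and a double, so probing serdes in a fixed order simply returns whichever candidate is tried first.

```java
import java.nio.ByteBuffer;

public class SerdeAmbiguity {
    // Decode 8 big-endian bytes as a long (what LongDeserializer does).
    static long asLong(byte[] data) {
        return ByteBuffer.wrap(data).getLong();
    }

    // Decode the same 8 bytes as a double (what DoubleDeserializer does).
    static double asDouble(byte[] data) {
        return ByteBuffer.wrap(data).getDouble();
    }

    public static void main(String[] args) {
        byte[] key = ByteBuffer.allocate(8).putLong(1234L).array();
        // Both decoders accept the same 8 bytes without error, so a
        // try-each-serde loop picks whichever is registered first.
        System.out.println(asLong(key));   // 1234
        System.out.println(asDouble(key)); // a meaningless subnormal double
    }
}
```

Nothing in the bytes themselves says which serde the producer used, which is why some per-topic hint from the user seems unavoidable.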

haroldlbrown commented 1 year ago

@tchiotludo Should this issue be solved in 0.24.0?

tchiotludo commented 1 year ago

@haroldlbrown no, it's not fixed yet. PRs are welcome.