yunhappy opened this issue 6 years ago
Is there any update on this issue? I'm also facing the same error. Any input is much appreciated.
Change the deserializer: when there is an error in the data, the consumer receives something like this: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
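For context on where "id -1" comes from: the Confluent Avro deserializer expects every payload to start with a one-byte magic marker (0) followed by a 4-byte big-endian schema ID. A plain string key has neither, so the bytes decoded as the schema ID are garbage (often -1). A minimal sketch that inspects a raw payload, assuming nothing beyond the standard JDK:

```java
import java.nio.ByteBuffer;

// Sketch: check whether a raw record payload follows the Confluent wire
// format (magic byte 0x0 + 4-byte big-endian schema ID + serialized data).
public final class WireFormatCheck {
    public static void describe(byte[] payload) {
        if (payload == null || payload.length < 5 || payload[0] != 0x0) {
            // A plain String key lands here: no magic byte, no schema ID,
            // so an Avro deserializer would throw a SerializationException.
            System.out.println("Not Confluent-framed; Avro deserialization would fail");
            return;
        }
        int schemaId = ByteBuffer.wrap(payload, 1, 4).getInt();
        System.out.println("Confluent-framed payload, schema id = " + schemaId);
    }
}
```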
Any progress on this? In my case, a Kafka topic with a string key and Avro data has this problem. It could be fixed by the pull requests from @yunhappy.
Has any workaround been found for this issue?
Why is this issue not marked as a BUG?
I had something similar; it turns out that in my case it was assuming both the key and the value of the message to be in Avro format, and was failing because the key was not.
To solve this, I believe that when creating the consumer, users should have the option either to specify format (as is the case right now), in which case both the key and the value will have the same format, or to specify keyFormat and valueFormat separately. A sketch of the equivalent split with the plain Java client follows.
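For what it's worth, the plain Java client already allows exactly this split, which can serve as a workaround while the REST Proxy only takes a single format. A minimal sketch of a consumer reading String keys and Avro values; the broker and registry URLs, group id, and topic name are placeholders:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;

public final class StringKeyAvroValueConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // placeholder
        // Key is a plain string, value is Confluent-framed Avro:
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put("schema.registry.url", "http://localhost:8081");            // placeholder

        try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));      // placeholder
            while (true) {
                for (ConsumerRecord<String, Object> record : consumer.poll(Duration.ofSeconds(1))) {
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```

With the raw client, the key and value deserializers have always been independent settings; the REST Proxy's single format field is exactly the gap this comment is pointing at.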
Is there any update on this issue? I have the same problem.
My message has two different schemas, one for the key and one for the value:
{ "key":{ "type":"JSON" "data": "test", }, "value": { "type":"PROTOBUF" "data": { "myfield": "myvalue" } } }
But the consumer doesn't deserialize two different schemas for the key and the value. It is mandatory that the message use the single schema type given in the Accept header, e.g. Accept: application/vnd.kafka.protobuf.v2+json.
Thank you in advance for any news.
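As above, the mixed-format read is possible today with the plain Java client. A sketch of just the consumer configuration, relative to the Avro example earlier in the thread: the JSON key stays a raw string (parsed by the application), while the Protobuf value goes through Schema Registry. This assumes the value was produced with the Confluent Protobuf serializer (shipped with Confluent Platform 5.5+); URLs and group id are placeholders:

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import io.confluent.kafka.serializers.protobuf.KafkaProtobufDeserializer;

// Same poll loop as the earlier Avro sketch; only the deserializers change.
// Without a specific type configured, KafkaProtobufDeserializer returns a
// com.google.protobuf.DynamicMessage resolved via Schema Registry.
Properties props = new Properties();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder
props.put(ConsumerConfig.GROUP_ID_CONFIG, "mixed-format-group");        // placeholder
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaProtobufDeserializer.class);
props.put("schema.registry.url", "http://localhost:8081");              // placeholder
```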
Hello, I am facing the same exception. Is there a workaround for the unknown magic byte error?
version: 4.1.1
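One common workaround with the plain Java client is to catch the deserialization failure around poll() and seek one offset past the poison record. The sketch below assumes a reasonably recent kafka-clients (roughly 2.7+), where RecordDeserializationException carries the failing partition and offset; the 4.1.x-era client mentioned above only throws a plain SerializationException without position info, so this exact recovery does not apply there. The helper class name is made up for illustration:

```java
import java.time.Duration;

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.common.errors.RecordDeserializationException;

public final class PoisonPillSkipper {
    // Poll until a batch deserializes cleanly; on failure, log the bad
    // record's position and seek one past it so the consumer does not
    // spin forever on the same undeserializable message.
    public static ConsumerRecords<String, Object> poll(Consumer<String, Object> consumer) {
        while (true) {
            try {
                return consumer.poll(Duration.ofSeconds(1));
            } catch (RecordDeserializationException e) {
                System.err.printf("Skipping bad record at %s offset %d%n",
                        e.topicPartition(), e.offset());
                consumer.seek(e.topicPartition(), e.offset() + 1);
            }
        }
    }
}
```

Note that skipping silently drops the record; depending on the use case it may be better to forward the raw bytes to a dead-letter topic instead.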