confluentinc / kafka-rest

Confluent REST Proxy for Kafka
https://docs.confluent.io/current/kafka-rest/docs/index.html

Error deserializing Avro message "Error deserializing key/value for partition " #441

Open yunhappy opened 6 years ago

yunhappy commented 6 years ago

version: 4.1.1

curl -X GET -H "Accept: application/vnd.kafka.avro.v2+json" \
http://localhost:8082/consumers/testgroup2/instances/ym_test/records
[2018-06-28 20:29:15,439] INFO 0:0:0:0:0:0:0:1 - - [28/Jun/2018:20:28:04 +0800] "GET /consumers/testgroup2/instances/ym_test/records HTTP/1.1" 500 187  70513 (io.confluent.rest-utils.requests:77)
[2018-06-28 20:29:15,446] ERROR Unexpected exception in consumer read task id=io.confluent.kafkarest.v2.KafkaConsumerReadTask@5506eafe  (io.confluent.kafkarest.v2.KafkaConsumerReadTask:154)
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition tools_example0627-0 at offset 20. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
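
For context: the Confluent Avro serializer prefixes every payload with a magic byte (0x00) followed by a 4-byte schema-registry id, and "Unknown magic byte!" means a record's key or value does not start with that prefix, i.e. it was not written with the Confluent serializer. One way to get unblocked is to read the raw bytes through the binary embedded format, which skips Avro deserialization entirely. A minimal sketch, assuming the same localhost:8082 proxy and the topic name from the log above (the consumer and group names are arbitrary examples):

# 1. Create a binary-format consumer.
curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" \
  --data '{"name": "bin_inspect", "format": "binary", "auto.offset.reset": "earliest"}' \
  http://localhost:8082/consumers/testgroup_bin

# 2. Subscribe it to the topic from the log.
curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" \
  --data '{"topics": ["tools_example0627"]}' \
  http://localhost:8082/consumers/testgroup_bin/instances/bin_inspect/subscription

# 3. Read records; keys and values come back base64-encoded, so a record
#    missing the wire-format prefix can be inspected instead of failing the read.
curl -X GET -H "Accept: application/vnd.kafka.binary.v2+json" \
  http://localhost:8082/consumers/testgroup_bin/instances/bin_inspect/records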
PavanGurram-DevOps commented 5 years ago

Is there any update on this issue? I'm also facing the same error. Any input is much appreciated.

yunhappy commented 5 years ago

Change the deserializer, so that when a record's data is bad, the consumer receives an error like this: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1

Ritaja commented 4 years ago

Any progress on this? In my case, a Kafka topic with a string key and Avro data has this problem. It could be fixed by the pull requests from @yunhappy.

xeeaax commented 3 years ago

Has any workaround been found for this issue?

xeeaax commented 3 years ago

Why is this issue not marked as a BUG?

AbdulRahmanAlHamali commented 2 years ago

I had something similar; it turned out that, in my case, the proxy was assuming both the key and the value of the message to be in Avro format, and was failing because the key was not.

To solve this, I believe that when creating the consumer, users should have the option either to specify format (as is the case right now), in which case both the key and the value share the same format, or to specify keyFormat and valueFormat separately, as sketched below.
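
For reference, a minimal sketch of what that could look like against the v2 consumer-creation endpoint; the keyFormat and valueFormat fields are hypothetical and not part of the current API:

# Today: a single "format" field governs both key and value deserialization.
curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" \
  --data '{"name": "ym_test", "format": "avro"}' \
  http://localhost:8082/consumers/testgroup2

# Proposed shape (hypothetical, NOT implemented): independent key/value formats.
# --data '{"name": "ym_test", "keyFormat": "json", "valueFormat": "avro"}'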

paulolimarb commented 2 years ago

Is there any update on this issue? I have the same problem.

My message has two different schemas, one for the key and one for the value.

{ "key":{ "type":"JSON" "data": "test", }, "value": { "type":"PROTOBUF" "data": { "myfield": "myvalue" } } }

But the consumer doesn't deserialize two different schemas for the key and the value.

It is mandatory that the whole message use a single schema type, e.g. Accept: application/vnd.kafka.protobuf.v2+json.

Thanks in advance for any news.

Eqta commented 1 month ago

Hello, I am facing the same exception. Is there a workaround for the unknown magic byte error?
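
One workaround that follows from the error message itself ("please seek past the record to continue consumption") is to move the consumer's position past the bad offset, which helps when only a few records are affected. A minimal sketch using the v2 positions endpoint, assuming the consumer instance from the original report and the partition/offset named in the log (tools_example0627-0, offset 20):

# Seek to offset 21 so the undecodable record at offset 20 is skipped.
curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" \
  --data '{"offsets": [{"topic": "tools_example0627", "partition": 0, "offset": 21}]}' \
  http://localhost:8082/consumers/testgroup2/instances/ym_test/positions

# Subsequent reads resume after the bad record.
curl -X GET -H "Accept: application/vnd.kafka.avro.v2+json" \
  http://localhost:8082/consumers/testgroup2/instances/ym_test/records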