ymilky / franzy-avro


Serialization is not including the MAGIC_BYTE and schema version info. #1

Open erikjmiller opened 8 years ago

erikjmiller commented 8 years ago

I've been using the following code to test pushing an Avro object to a Confluent 3.0 server. When I use the 'bin/kafka-avro-console-consumer...' command I get the following error.

org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1 Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!

Running the plain 'bin/kafka-console-consumer' I get the data back as a byte stream without any errors (some of it readable, some not).

I'm using the following packages along with this one:

             [org.apache.kafka/kafka_2.10 "0.8.2.2"]  #need this to pull down message from a different older kafka server using clj-kafka
             [org.apache.kafka/kafka-clients "0.9.0.1"]
             [io.confluent/kafka-avro-serializer "1.0.1"]
             [io.confluent/kafka-schema-registry-client "1.0.1"]
             [io.confluent/kafka-schema-registry "1.0.1"]

I also tried updated versions of:

             [org.apache.kafka/kafka-clients "0.10.0.0"]
             [io.confluent/kafka-avro-serializer "3.0.1"]
             [io.confluent/kafka-schema-registry-client "3.0.1"]
             [io.confluent/kafka-schema-registry "3.0.1"]

Is this a known issue? How should the MAGIC_BYTE get into the message?

dspiteself commented 7 years ago

@erikjmiller did you ever make progress on this?

dspiteself commented 7 years ago

The Confluent ecosystem uses a 5-byte binary header (one magic byte plus a 4-byte big-endian schema ID) in front of the Avro payload, which is incompatible with franzy's output. The wire format is described here: http://docs.confluent.io/current/schema-registry/docs/serializer-formatter.html#wire-format

https://github.com/confluentinc/schema-registry/tree/master/avro-serializer
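For reference, the framing described above can be sketched in Java. This is a minimal illustration of the Confluent wire format, not franzy-avro or Confluent serializer code; the `frame` helper name and the schema ID value are made up for the example, and a real producer would obtain the ID from the schema registry.

```java
import java.nio.ByteBuffer;

public class WireFormatSketch {
    // Confluent's wire format starts every message with magic byte 0x0.
    private static final byte MAGIC_BYTE = 0x0;

    // Prepend the 5-byte header (1 magic byte + 4-byte big-endian schema ID)
    // to an already Avro-encoded payload. A plain Avro payload without this
    // header triggers the "Unknown magic byte!" error in the console consumer.
    public static byte[] frame(int schemaId, byte[] avroPayload) {
        ByteBuffer buf = ByteBuffer.allocate(1 + 4 + avroPayload.length);
        buf.put(MAGIC_BYTE);
        buf.putInt(schemaId); // ByteBuffer writes big-endian by default
        buf.put(avroPayload);
        return buf.array();
    }

    public static void main(String[] args) {
        // Hypothetical schema ID 42 with a 3-byte dummy payload.
        byte[] framed = frame(42, new byte[] {1, 2, 3});
        System.out.println(framed.length); // header (5) + payload (3) = 8
        System.out.println(framed[0]);     // 0, the magic byte
    }
}
```

The `kafka-avro-console-consumer` reads that header, looks the schema ID up in the registry, and only then decodes the remaining bytes as Avro, which is why a message serialized without the header fails immediately.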