confluentinc / kafka-rest

Confluent REST Proxy for Kafka
https://docs.confluent.io/current/kafka-rest/docs/index.html

Avro JSON Encoding for byte array keys #497

Open alexjg opened 5 years ago

alexjg commented 5 years ago

Could someone point me to any documentation on how to encode binary data for keys when producing Avro records? I'm aware that you're meant to encode the data as a string using the Unicode code points 0-255, but I think I'm getting stuck at an earlier step than that. I'm posting to the REST proxy with the following data:

{
    "value_schema_id": <some id>,
    "value": <value data>,
    "key_schema": "{\"name\": \"id\", \"type\": \"bytes\"}",
    "key": "<string encoded according to the avro spec>"
}
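
For completeness, this is roughly how the full request gets built. It's a minimal sketch: the topic name, schema id, value payload, key bytes, and proxy address are all placeholders, and the key/value pair sits inside a records array as the v2 produce API expects.

import json
import requests

raw_key = b"\x00\x01\xfe\xff"  # placeholder key bytes

# Avro's JSON encoding represents "bytes" as a string whose code points are
# the byte values 0-255, which is exactly a latin-1 decode of the raw bytes.
encoded_key = raw_key.decode("latin-1")

body = {
    "value_schema_id": 1,  # placeholder schema id
    "key_schema": json.dumps({"name": "id", "type": "bytes"}),
    "records": [
        {"key": encoded_key, "value": {"field": "placeholder"}},
    ],
}

resp = requests.post(
    "http://localhost:8082/topics/test-topic",  # placeholder proxy URL and topic
    data=json.dumps(body),
    headers={"Content-Type": "application/vnd.kafka.avro.v2+json"},
)
print(resp.status_code, resp.text)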

However, this request gives me the following stack trace in the REST proxy:

org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.lang.IllegalArgumentException: Unsupported Avro type. Supported types are null, Boolean, Integer, Long, Float, Double, String, byte[] and IndexedRecord
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerDe.getSchema(AbstractKafkaAvroSerDe.java:129)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:76)
    at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53)
    at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65)
    at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55)
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:812)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:797)
    at io.confluent.kafkarest.AvroRestProducer.produce(AvroRestProducer.java:132)
    at io.confluent.kafkarest.ProducerPool.produce(ProducerPool.java:171)
    at io.confluent.kafkarest.resources.TopicsResource.produce(TopicsResource.java:147)
    at io.confluent.kafkarest.resources.TopicsResource.produceAvro(TopicsResource.java:135)

If I change the type in the key_schema to string, the problem goes away, but then my keys are strings and I would prefer to use bytes if possible.
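
Concretely, the working variant differs only in the key schema and the key itself; a minimal sketch of the request body, with the same placeholder values as above:

import json

# Workaround variant: the key schema is "string" rather than "bytes", so the
# key is sent as an ordinary string. All values here are placeholders.
body = {
    "value_schema_id": 1,
    "key_schema": json.dumps({"name": "id", "type": "string"}),
    "records": [
        {"key": "some-string-key", "value": {"field": "placeholder"}},
    ],
}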

Any advice?

Nouuh commented 5 years ago

I think you need to add an Avro generator to your project.