confluentinc / schema-registry

Confluent Schema Registry for Kafka
https://docs.confluent.io/current/schema-registry/docs/index.html

Incorrect size for key serialization #1404

Closed jpsouplet-ADEO closed 4 years ago

jpsouplet-ADEO commented 4 years ago

Hello, I program in C. Producing data with a key in string or JSON format is not difficult, because the length of the key is simply the length of the string. But with an Avro object it is very difficult. In C there is one serializer (serdes), and I use it for both the value and the key; the difficulty is the length returned by the serdes serializer. For the value it is always correct, but for the key it is wrong 99% of the time.
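The thread does not show the producer code, so as a point of comparison, here is a minimal sketch of that flow, assuming libserdes (`serdes_schema_serialize_avro()`) for the key and librdkafka for the produce call. The registry URL, broker list, topic, and subject name are placeholders, and the schema is the Message schema quoted below:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <avro.h>
#include <librdkafka/rdkafka.h>
#include <libserdes/serdes-avro.h>

/* The key schema quoted later in this comment. */
static const char *KEY_SCHEMA =
    "{\"type\":\"record\",\"name\":\"Message\","
    "\"namespace\":\"com.adeo.datastreaming.examples.micronaut\","
    "\"fields\":[{\"name\":\"content\",\"type\":\"string\"}]}";

int main(void) {
    char errstr[512];

    /* Schema Registry client (URL is a placeholder). */
    serdes_conf_t *sconf = serdes_conf_new(NULL, 0,
            "schema.registry.url", "http://localhost:8081",
            NULL);
    serdes_t *serdes = serdes_new(sconf, errstr, sizeof(errstr));
    if (!serdes) {
        fprintf(stderr, "serdes_new: %s\n", errstr);
        return 1;
    }

    /* Register/look up the key schema under an assumed subject name. */
    serdes_schema_t *schema = serdes_schema_add(serdes, "mytopic-key", -1,
            KEY_SCHEMA, (int)strlen(KEY_SCHEMA), errstr, sizeof(errstr));
    if (!schema) {
        fprintf(stderr, "serdes_schema_add: %s\n", errstr);
        return 1;
    }

    /* Build the Avro record { "content": "helloworld" } with avro-c. */
    avro_schema_t avsc;
    avro_schema_from_json_length(KEY_SCHEMA, strlen(KEY_SCHEMA), &avsc);
    avro_value_iface_t *iface = avro_generic_class_from_schema(avsc);
    avro_value_t record, field;
    avro_generic_value_new(iface, &record);
    avro_value_get_by_name(&record, "content", &field, NULL);
    avro_value_set_string(&field, "helloworld");

    /* Serialize: serdes frames the payload (magic byte + schema id +
     * Avro body) and reports the total length in ser_key_len. */
    void *ser_key = NULL;
    size_t ser_key_len = 0;
    if (serdes_schema_serialize_avro(schema, &record, &ser_key, &ser_key_len,
                                     errstr, sizeof(errstr)) != SERDES_ERR_OK) {
        fprintf(stderr, "serialize: %s\n", errstr);
        return 1;
    }

    /* Produce, handing librdkafka exactly the length serdes reported
     * (broker list and topic are placeholders). */
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                      errstr, sizeof(errstr));
    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                  errstr, sizeof(errstr));

    rd_kafka_producev(rk,
            RD_KAFKA_V_TOPIC("mytopic"),
            RD_KAFKA_V_KEY(ser_key, ser_key_len),
            RD_KAFKA_V_VALUE("value-payload", 13),
            RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
            RD_KAFKA_V_END);

    rd_kafka_flush(rk, 10 * 1000);
    free(ser_key);
    avro_value_decref(&record);
    avro_value_iface_decref(iface);
    avro_schema_decref(avsc);
    rd_kafka_destroy(rk);
    serdes_destroy(serdes);
    return 0;
}
```

The point of the sketch is that `ser_key_len` is passed to `RD_KAFKA_V_KEY()` unchanged; no adjustment of the serdes-reported length should be needed.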

This is an example key schema:

```json
{
  "type": "record",
  "name": "Message",
  "namespace": "com.adeo.datastreaming.examples.micronaut",
  "fields": [
    { "name": "content", "type": "string" }
  ]
}
```

Results from one run:

- content=helloworld: the serializer gives a length of 16, but the producer needs 19 or 23
- content=1234567890: the serializer gives a length of 16, but the producer needs 23
- content=A1B2C3--D4: the serializer gives a length of 16, but the producer needs 16 or 24
- content=jeanfifi: the serializer gives a length of 14, but the producer needs 27

I don't know why!
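As an aside (not from the thread): the lengths the serializer reports are exactly what the Confluent wire format predicts for this schema, i.e. 1 magic byte + a 4-byte schema ID + the Avro body, where a short string is encoded as a 1-byte zig-zag varint length followed by its bytes. A sketch of that arithmetic (the helper name is ours):

```c
#include <stddef.h>
#include <string.h>

/* Expected framed size of a Message key whose only field is a short
 * string (< 64 bytes, so the Avro zig-zag varint length is one byte). */
static size_t expected_key_size(const char *content) {
    return 1              /* magic byte            */
         + 4              /* schema ID, big-endian */
         + 1              /* varint string length  */
         + strlen(content);
}

/* expected_key_size("helloworld") == 16 and expected_key_size("jeanfifi")
 * == 14, matching the serializer's reported lengths above; the "producer
 * needs" numbers fit no framing, which already hints that the serializer
 * is not the problem. */
```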

If I provide a different size, the call fails. With debug=all, I can see these lines:

```
Topic adeo-dev-europe-west1-EVENT-CASSIOPEE-LM-ES-P1-C3-CHECKOUT-V1 [1]: broker is down: re-query
Requesting metadata for 1/1 topics: refresh unavailable topics
```
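(debug=all here is librdkafka's `debug` configuration property; as a sketch, assuming the standard C client, it is enabled on the producer configuration like this:)

```c
#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Enable every librdkafka debug context (broker, topic, msg, protocol, ...);
 * the "broker is down" line above is typical of this output. */
static rd_kafka_conf_t *make_debug_conf(void) {
    char errstr[512];
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    if (rd_kafka_conf_set(conf, "debug", "all",
                          errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK)
        fprintf(stderr, "conf: %s\n", errstr);
    return conf;
}
```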

For information, I did not change the configuration during the tests.

tusharnt commented 4 years ago

@jpsouplet-ADEO can you please clarify your ask?

jpsouplet-ADEO commented 4 years ago

Hello, thank you for your reply. This problem is now fixed; it was due to a bad network configuration. It was very strange, because the produced values were always good, but with the key the result was random.