You're not really doing something wrong. The documentation could be better, but the idea of the raw encoders is to 'just' do the bits directly related to schema registry: prepending the magic byte and the registered schema reference, then appending the message bytes encoded with the schema. It does not do any validation on the provided bytes. I don't think encoding the JSON bytes is the proper approach, though it's been a while since I worked on the protobuf integration.
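To make that framing concrete, here is a minimal sketch (not the library's actual code) of the Confluent wire format a raw encoder produces: a zero magic byte, the 4-byte big-endian schema ID, and then the caller's bytes passed through untouched. For protobuf, message-index varints also precede the payload; they are omitted here for brevity.

```rust
// Sketch only: a raw encoder's job is framing, not validation.
fn frame_with_schema_registry_header(schema_id: u32, payload: &[u8]) -> Vec<u8> {
    let mut out = Vec::with_capacity(5 + payload.len());
    out.push(0u8); // magic byte
    out.extend_from_slice(&schema_id.to_be_bytes()); // 4-byte big-endian schema ID
    // (For protobuf, zigzag-varint message indexes would go here; omitted.)
    out.extend_from_slice(payload); // whatever bytes the caller provided, unchecked
    out
}
```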
Thank you @gklijs for the guide! 😃
I fixed my code at https://github.com/hongbo-miao/hongbomiao.com/pull/20005/files
Now it uses the library prost (protobuf) instead of serde (json) to serialize the data, and prost will throw an error if the data does not match the local proto file.
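As a minimal sketch of that approach (the `Motor` type and its fields are hypothetical stand-ins; in practice the struct would be generated from the .proto file, e.g. by prost-build), fields that are not declared in the .proto simply do not exist on the generated type, so they cannot be encoded:

```rust
use prost::Message;

// Hypothetical hand-written equivalent of prost-build output for the
// registered schema; an extra `temperature6` cannot even be expressed here.
#[derive(Clone, PartialEq, Message)]
pub struct Motor {
    #[prost(double, tag = "1")]
    pub temperature1: f64,
    // ... temperature2 through temperature4 elided for brevity
    #[prost(double, tag = "5")]
    pub temperature5: f64,
}

fn to_proto_bytes(motor: &Motor) -> Vec<u8> {
    // Produces the schema-conformant protobuf payload that the raw encoder
    // can then frame with the schema registry header.
    motor.encode_to_vec()
}
```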
My AKHQ can now see the Protobuf data correctly with the corresponding schema in Confluent Schema Registry.
I based the fix mainly on these two Python Protobuf examples. Hopefully this time the code is correct ☺️
Describe the bug

`EasyProtoRawEncoder`'s `encode` currently does not skip writing data that is not part of the protobuf schema.

To Reproduce
1. Create a schema (note it has `temperature1` to `temperature5`).
2. Encode and write data (note that besides `temperature1` to `temperature5`, it has one extra `temperature6`).

Expected behavior
I expect `EasyProtoRawEncoder`'s `encode` to only write `temperature1` to `temperature5` and to skip writing the `temperature6` value to Kafka, as it is not part of the schema. However, it got written to Kafka as well.

Please correct me if I did something wrong, thanks! 😃
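For illustration, here is a minimal sketch (struct and field names are hypothetical, modeled on the fields above) of why the extra field reached Kafka with the original JSON approach: serde knows nothing about the registered protobuf schema, so it serializes every Rust field, and the raw encoder then frames those bytes without inspecting them.

```rust
use serde::Serialize;

// Hypothetical struct mirroring the original approach; serde happily
// includes `temperature6` in the serialized output.
#[derive(Serialize)]
struct Motor {
    temperature1: f64,
    // ... temperature2 through temperature5 elided for brevity
    temperature6: f64, // not part of the registered schema
}

fn main() {
    let motor = Motor { temperature1: 20.0, temperature6: 26.0 };
    let json_bytes = serde_json::to_vec(&motor).expect("serialization failed");
    // A raw encoder would only prepend the wire-format header to these bytes,
    // so `temperature6` is written to Kafka verbatim.
    assert!(String::from_utf8_lossy(&json_bytes).contains("temperature6"));
}
```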