pegasystems / dataset-integrations

Kafka custom serializer/deserializer implementations
Apache License 2.0

Question about key #2

Open ilyasahmed24 opened 4 years ago

ilyasahmed24 commented 4 years ago

Hi, I have a question about publishing the key. I see in the code that only the value serde is configured: delegateValueSerializer.configure(additionalConfiguration, false); delegateValueDeserializer.configure(additionalConfiguration, false); there is no key serde. I have a requirement to publish both the key and the value in Avro format. Currently, when I publish messages, only the value is sent. How can this be done?
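
For reference, with a plain Kafka client I would normally do this by setting both key.serializer and value.serializer to an Avro serializer such as Confluent's KafkaAvroSerializer. A minimal sketch of that setup (broker, registry URL, topic and schemas are placeholders, not part of this component):

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class AvroKeyValueProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder broker
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry
        // Both the key and the value are serialized as Avro against the schema registry.
        props.put("key.serializer", KafkaAvroSerializer.class.getName());
        props.put("value.serializer", KafkaAvroSerializer.class.getName());

        // Placeholder schemas; in practice these would be the key and value schemas for the topic.
        Schema keySchema = SchemaBuilder.record("Key").fields().requiredString("id").endRecord();
        Schema valueSchema = SchemaBuilder.record("Value").fields().requiredString("payload").endRecord();

        GenericRecord key = new GenericData.Record(keySchema);
        key.put("id", "42");
        GenericRecord value = new GenericData.Record(valueSchema);
        value.put("payload", "hello");

        try (KafkaProducer<GenericRecord, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", key, value)); // placeholder topic
        }
    }
}
```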

commandini commented 4 years ago

@ilyasahmed24 The AvroSchemaRegistrySerde class could be refactored to support a key serde with its own schema. However, that would also require extending the schema registry configuration page to allow uploading schemas for keys; that is, to have both key and value in Avro format, two schemas would have to be provided. This is not possible with the current component.
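
To illustrate the idea, here is a minimal sketch of what such a refactoring could look like, assuming the delegates implement Kafka's standard Serializer/Deserializer interfaces. The names delegateKeySerializer and delegateKeyDeserializer are hypothetical and do not exist in the current component:

```java
import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

// Hypothetical sketch only; the real refactoring would also need a second (key) schema
// to be uploaded via the schema registry configuration page.
public class KeyAndValueAvroSerde {

    private final Serializer<Object> delegateKeySerializer;      // hypothetical
    private final Deserializer<Object> delegateKeyDeserializer;  // hypothetical
    private final Serializer<Object> delegateValueSerializer;
    private final Deserializer<Object> delegateValueDeserializer;

    public KeyAndValueAvroSerde(Serializer<Object> keySerializer,
                                Deserializer<Object> keyDeserializer,
                                Serializer<Object> valueSerializer,
                                Deserializer<Object> valueDeserializer) {
        this.delegateKeySerializer = keySerializer;
        this.delegateKeyDeserializer = keyDeserializer;
        this.delegateValueSerializer = valueSerializer;
        this.delegateValueDeserializer = valueDeserializer;
    }

    public void configure(Map<String, ?> additionalConfiguration) {
        // isKey = true tells the delegates they are handling record keys.
        delegateKeySerializer.configure(additionalConfiguration, true);
        delegateKeyDeserializer.configure(additionalConfiguration, true);
        // isKey = false configures the value delegates, as in the current code.
        delegateValueSerializer.configure(additionalConfiguration, false);
        delegateValueDeserializer.configure(additionalConfiguration, false);
    }

    public byte[] serializeKey(String topic, Object key) {
        return delegateKeySerializer.serialize(topic, key);
    }

    public byte[] serializeValue(String topic, Object value) {
        return delegateValueSerializer.serialize(topic, value);
    }
}
```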

ilyasahmed24 commented 4 years ago

@commandini Can you help me with that extension? Maybe we could pass two schema registry serdes to AvroSchemaRegistrySerde and then use them. Can you tell me how to do it? Let's say I want to use the same schema for both key and value; how can this be done? I do not see the producer code in the serde.

andreiadamian commented 4 years ago

@ilyasahmed24 Unfortunately, we do not support custom key serialization at the moment. The only way to achieve this is to set the correctly serialized value on a property before saving the record to a Kafka data set.
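
If it helps, one way to produce that serialized value for the key is to run Confluent's KafkaAvroSerializer yourself with isKey = true and store the resulting bytes in the property. A minimal sketch under those assumptions (registry URL and topic are placeholders, and wiring the bytes into the clipboard property is Pega-specific and not shown):

```java
import java.util.Map;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class ManualKeySerializationSketch {

    public static byte[] serializeKey(String topic, GenericRecord key) {
        try (KafkaAvroSerializer serializer = new KafkaAvroSerializer()) {
            // isKey = true registers the key schema under the "<topic>-key" subject
            // (with the default subject naming strategy).
            serializer.configure(
                    Map.of("schema.registry.url", "http://localhost:8081"), // placeholder registry
                    true);
            return serializer.serialize(topic, key);
        }
    }

    public static void main(String[] args) {
        // Placeholder key schema and record.
        Schema keySchema = SchemaBuilder.record("Key").fields().requiredString("id").endRecord();
        GenericRecord key = new GenericData.Record(keySchema);
        key.put("id", "42");

        byte[] serializedKey = serializeKey("my-topic", key); // placeholder topic
        // These bytes would then be set on the property read when saving to the Kafka data set.
        System.out.println("Serialized key length: " + serializedKey.length);
    }
}
```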

ilyasahmed24 commented 4 years ago

@andreiadamian Let's say I have serialized the key in the same serde and set it on a property of a clipboard page; how do I then pass this to the data set?