Closed Shikha4599 closed 2 years ago
Hi @Shikha4599
You configured your topology with a default value serde: new AssetDataDto().
In your pipeline (stream str2) you change the value type from AssetDataDto to AlertsDataDto.
So I guess you need to explicitly define your serde in the To method: because it's not defined there, Kafka Streams will try to use the default one, which is AssetDataDto and not AlertsDataDto.
Try using:
str2.To("test-acc4-output", new StringSerDes(), new AlertsDataDto());
Hi @Shikha4599 ,
@duke-bartholomew is right!
By the way, it's highly recommended to split the DTO and the DTO's serde into two different classes.
Best regards,
Hi @LGouellec
Where exactly is the split required?
Hi @Shikha4599 I think what @LGouellec means is to split your data and your serde logic into different classes. The serialization/deserialization logic is not bound to an instance of your data; it's just functionality that operates on your data objects.
Kafka Streams will simply instantiate the serde class to use for serializing/deserializing objects of that specific type. It does not care about the data embedded in that object. With your current setup, you also initialize whatever resources the base serdes class needs every time you create one of your data objects, which is not something you really want.
So, long story short:
public class AssetDataDto {
    public string AssetIdentifier { get; set; }
    public string AssetPropertyIdentifier { get; set; }
    public double Value { get; set; }
    public DateTime TimeStamp { get; set; }
}
and
// Requires: using System.Text; using Confluent.Kafka;
//           using Newtonsoft.Json; using Streamiz.Kafka.Net.SerDes;
public class AssetDataDtoSerde : AbstractSerDes<AssetDataDto> {
    public override AssetDataDto Deserialize(byte[] data, SerializationContext context) {
        var bytesAsString = Encoding.UTF8.GetString(data);
        return JsonConvert.DeserializeObject<AssetDataDto>(bytesAsString);
    }

    public override byte[] Serialize(AssetDataDto data, SerializationContext context) {
        var json = JsonConvert.SerializeObject(data);
        return Encoding.UTF8.GetBytes(json);
    }
}
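Once the serde lives in its own class, you can sanity-check it with a quick round trip. This is a sketch I'm adding for illustration, reusing the AssetDataDto and AssetDataDtoSerde classes from above; the field values are made up:

```csharp
using System;
using Confluent.Kafka;

// Hypothetical round-trip check for the split serde.
var serde = new AssetDataDtoSerde();
var original = new AssetDataDto {
    AssetIdentifier = "asset-1",
    AssetPropertyIdentifier = "temperature",
    Value = 21.5,
    TimeStamp = new DateTime(2022, 1, 1, 0, 0, 0, DateTimeKind.Utc)
};

byte[] bytes = serde.Serialize(original, new SerializationContext());
AssetDataDto copy = serde.Deserialize(bytes, new SerializationContext());
// copy now carries the same field values as original
```

Note that the serde here is stateless, so one instance can be shared safely across the topology.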
Also keep in mind that Kafka serdes should also be able to cope with 'null' values (in some cases). Deleting a message from a log-compacted topic for instance is done by publishing a 'null' value on a specific key, so the serializer used for that should be able to handle 'null' values. But that's also a bit up to how the serdes are used in your business logic and topology ...
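To illustrate the tombstone point, a minimal null-safe serde might look like this. This is a sketch I'm adding, not code from the thread; it assumes the same AbstractSerDes base, Newtonsoft, and AssetDataDto as above:

```csharp
using System.Text;
using Confluent.Kafka;
using Newtonsoft.Json;
using Streamiz.Kafka.Net.SerDes;

public class NullSafeAssetDataDtoSerde : AbstractSerDes<AssetDataDto> {
    public override byte[] Serialize(AssetDataDto data, SerializationContext context)
        => data == null
            ? null // a null payload is a tombstone on a compacted topic
            : Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(data));

    public override AssetDataDto Deserialize(byte[] data, SerializationContext context)
        => data == null
            ? null // propagate the tombstone instead of throwing
            : JsonConvert.DeserializeObject<AssetDataDto>(Encoding.UTF8.GetString(data));
}
```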
Hi @LGouellec, @duke-bartholomew
The given solution and suggestions worked for me. Thank you for your time and help.
Best Regards
@Shikha4599 ,
Btw, if you upgrade to 1.4.0-RC3, a Streamiz.Kafka.Net.SerDes.JsonSerDes is included in the library; it uses Newtonsoft and handles tombstones and so on ...
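If I understand that comment correctly, the explicit serde in the To call could presumably then be replaced like this (a sketch; I'm assuming the built-in JsonSerDes is generic over the payload type):

```csharp
// Assumed usage of the built-in JSON serde from 1.4.0-RC3.
str2.To("test-acc4-output", new StringSerDes(), new JsonSerDes<AlertsDataDto>());
```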
Description
I want to customize the value serdes before sending a Kafka stream to another output topic, but I am receiving an error and am not able to do so.
This is my AssetDataDto class:
This is my AlertsDataDto class:
This is the error:
How to reproduce
Checklist
Please provide the following information: