Closed: rav13 closed this issue 1 year ago
AppName.StreamsApp.3-store-aggregation-changelog's topic config:

cleanup.policy=compact,delete
segment.bytes=1073741824
retention.ms=172800000
message.timestamp.type=CreateTime
retention.bytes=-1
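For reference, the effective configuration of a topic like this can be checked with the `kafka-configs` tool that ships with Apache Kafka. A minimal sketch (the bootstrap server address is a placeholder; the topic name is taken from above):

```shell
# Describe the non-default configs set on the changelog topic.
# Replace broker:9092 with your actual bootstrap server.
kafka-configs.sh --bootstrap-server broker:9092 \
  --entity-type topics \
  --entity-name AppName.StreamsApp.3-store-aggregation-changelog \
  --describe
```

The same tool can change `cleanup.policy` back to `compact` with `--alter --add-config cleanup.policy=compact`.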
Hi @rav13,
Thanks for raising this. I will look into your issue as soon as possible.
First thing: a changelog topic with cleanup.policy set to compact,delete is a little bit dangerous. For your information, Streamiz auto-creates its changelog topics with cleanup.policy set to compact, to keep at least the last value for each given key. I imagine you created this changelog topic manually with your specific cleanup.policy.
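To illustrate why compact,delete is risky for a changelog: the delete policy removes records older than retention.ms regardless of key, so even the latest value for a key can disappear and be missing when the state store is restored. A toy model of the two cleanup policies (the helper below is hypothetical, not a Streamiz or Kafka API):

```python
def compacted_view(records, policy, retention_ms, now_ms):
    """Toy model of Kafka log cleanup (illustrative only).

    records: list of (timestamp_ms, key, value) in append order.
    With 'compact', the latest value per key always survives.
    With 'compact,delete', records older than retention.ms are
    removed first, so the latest value for a key can be lost.
    """
    if "delete" in policy:
        # Delete step: drop records past retention, regardless of key.
        records = [r for r in records if now_ms - r[0] <= retention_ms]
    # Compact step: keep only the last value seen per key.
    latest = {}
    for _ts, key, value in records:
        latest[key] = value
    return latest


now = 1_000_000
old = now - 200_000      # older than the 100 s retention below
recent = now - 1_000

records = [(old, "user-1", "v1"), (recent, "user-2", "v2")]

# 'compact' keeps the last value for every key.
print(compacted_view(records, "compact", 100_000, now))
# → {'user-1': 'v1', 'user-2': 'v2'}

# 'compact,delete' silently drops user-1: a state-store restore
# from this changelog would miss that key entirely.
print(compacted_view(records, "compact,delete", 100_000, now))
# → {'user-2': 'v2'}
```

This is why compact alone is the safe default for changelog topics: restoration needs the last value for every key, however old it is.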
Merged into the 1.4 branch. The 1.4.0 release is coming soon. Stay tuned!
Description
Hi @LGouellec,
We are heavily testing the Streamiz library in production and I think we stumbled across a potential issue with the .NET implementation. In some cases our app has trouble reading data from a stream and pushes the messages to a queue. These messages in the queue are not processed further. It also seems that the consumer group accumulates lag, and no further messages are processed from the moment the first message is added to the queue.
We are using Streamiz version 1.3.2 and Confluent.Kafka version 1.9.3.
Log:
Stream configuration:
Changelog topic config:
How to reproduce
We don't know how to reproduce it.