We're running a RabbitMQ consumer which is responsible for creating timeline entries through spreads in our application.
As this consumer is a long-running PHP process, we've noticed a potential bug in the implementation of the `flush()` method. The `persistedDatas` array is never cleared after a flush operation, so if you run the flush operation multiple times inside a RabbitMQ consumer, you'll end up with duplicate entries - at least that's what is happening in our application.
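A minimal sketch of what we'd expect the fix to look like, assuming the manager queues pending entries in a `persistedDatas` property (the property name is taken from the report; the class and method names around it are illustrative, not the actual library code):

```php
<?php

// Illustrative sketch only: everything except flush() and the
// persistedDatas property is an assumption made for this example.
class TimelineManager
{
    /** @var array Pending timeline entries queued for persistence. */
    private $persistedDatas = [];

    public function persist($data)
    {
        // Queue an entry; nothing is written until flush() is called.
        $this->persistedDatas[] = $data;
    }

    public function flush()
    {
        foreach ($this->persistedDatas as $data) {
            $this->doPersist($data); // write the entry to storage
        }

        // Without this reset, a second flush() in the same long-running
        // process re-persists every previously flushed entry.
        $this->persistedDatas = [];
    }

    private function doPersist($data)
    {
        // Storage-specific write, omitted here.
    }
}
```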