Closed cilerler closed 1 year ago
This is expected behavior with Redis and a limitation of Redis Stream itself.
Dapr cannot know when data should be cleared. Dapr cannot know if other consumer groups are reading this data or in need of this data. Therefore, Redis users are advised to run the Redis XTRIM
command themselves if desired.
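For reference, a minimal sketch of trimming the stream by hand with redis-cli. The stream key name `orders` is an assumption here (Dapr keys the stream by topic name, so substitute your own topic):

```shell
# Cap the stream backing the "orders" topic to roughly the newest 1000 entries.
# The ~ modifier lets Redis trim lazily, which is cheaper than an exact trim.
redis-cli XTRIM orders MAXLEN '~' 1000

# Or (Redis 6.2+) drop every entry older than a given stream ID:
redis-cli XTRIM orders MINID 1526985054069-0

# Check how many entries remain:
redis-cli XLEN orders
```

This would typically be run on a schedule (e.g. a cron job) rather than once, since new entries keep accumulating.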
Please use a different PubSub solution if this presents a problem. I would personally only recommend using Redis for local prototyping.
Thanks for your response, @berndverst. It's interesting, but if we have to interact with Redis directly, that seems to contradict the primary purpose of Dapr. So I'm puzzled why Dapr doesn't have an explicit way of signaling this. Technically, this would seem to be the desired behavior for anyone looking into pub/sub solutions, wouldn't you agree?
@cilerler my honest opinion: don't use Redis.
This is not a strong suit of Redis. Neither as state store nor as PubSub component.
Dapr relies on the actual server / service behavior for a lot of functionality. So I recommend using something like Azure Service Bus, or RabbitMQ or Kafka.
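Because Dapr abstracts the broker behind a component definition, switching off Redis is mostly a component YAML change on the application side. A sketch assuming a local RabbitMQ (the component name `pubsub` and the connection string are placeholders, not values from this issue):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: pubsub          # keep the same component name your app already uses
spec:
  type: pubsub.rabbitmq # was: pubsub.redis
  version: v1
  metadata:
  - name: connectionString
    value: "amqp://guest:guest@localhost:5672"
```

Application code that publishes and subscribes through the Dapr API is unchanged; only the component file differs between environments.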
With Redis, Dapr doesn't know what users want, and we don't want to control or prescribe that, because we cannot know whether other systems external to Dapr must read the same data.
With Dapr you still need to understand a bit about the limitations and behavior of the technology you choose.
Redis is primarily a component we ship for the local development experience. In production, however, users do not tend to rely on Redis.
By the way, the way XTRIM works, you have to specify a threshold for what to keep (a maximum number of entries with MAXLEN, or the oldest entry ID with MINID); Redis then deletes everything beyond it.
But how would Dapr even be able to track this? If the sidecar crashes and restarts, any state is lost, so we wouldn't know which messages were sent by Dapr and how many, but the data would still be in Redis. It would also be impossible to know what data can safely be deleted.
I get what you're saying. Usually, I use RabbitMQ, and sometimes Azure Service Bus if the provider is Azure.
I think the issue is with Dapr using Redis Streams instead of Redis Pub/Sub. Redis Pub/Sub deletes messages right after delivery, while Redis Streams function like a notification center that retains history. I initially thought Dapr used the Pub/Sub model, so I was surprised to realize it uses Redis Streams instead. Even though it's working as intended, I think there's a need for another option backed by real Redis Pub/Sub. You can find more info here.
Redis' Pub/Sub exhibits at-most-once message delivery semantics. As the name suggests, it means that a message will be delivered once if at all. Once the message is sent by the Redis server, there's no chance of it being sent again. If the subscriber is unable to handle the message (for example, due to an error or a network disconnect) the message is forever lost.
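To make the retention difference concrete, here is a sketch with redis-cli (the stream, group, and consumer names are made up) showing that acknowledging a Streams entry does not remove it, which is exactly why acknowledged records accumulate until someone runs XTRIM:

```shell
# Publish one entry to a stream and create a consumer group on it.
redis-cli XADD mystream '*' body hello
redis-cli XGROUP CREATE mystream mygroup 0

# Read the entry as a consumer, then acknowledge it
# (substitute the entry ID returned by XADD above).
redis-cli XREADGROUP GROUP mygroup consumer1 COUNT 1 STREAMS mystream '>'
redis-cli XACK mystream mygroup <entry-id>

# The entry is acknowledged but still stored: assuming the stream
# started empty, XLEN still reports 1.
redis-cli XLEN mystream
```

XACK only clears the entry from the group's pending list; the stream itself keeps it so that other consumer groups (or later readers) can still access it.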
Dapr cannot use Redis Pub/Sub because messages are not persisted, and thus it cannot comply with Dapr's at-least-once delivery guarantee.
Thanks for stating that @yaron2. I just created a ticket for the Redis team here to address the concern.
@cilerler I'll close this issue for now. If Redis Pubsub ever changes to persist data, feel free to open a new issue to add support for Redis PubSub (instead of Redis Streams). Until then however, there is nothing actionable for us. Thanks!
The code below works just fine with RabbitMQ and Azure Service Bus queues and topics. However, when I use it with Redis, it doesn't clear the acknowledged records (HTTP 200). Any ideas on how to fix it?