Open xgkk opened 1 year ago
Hi there. If you want to store events in Redis instead of a SQL database, you could write your own StorageBase implementation and then set settings.EVENTSTREAM_STORAGE_CLASS. However, the problem with multiple processes isn't storage, it's distribution. The approach django-eventstream takes when there are multiple processes is to rely on Pushpin for distribution. See https://github.com/fanout/django-eventstream#multiple-instances-and-scaling
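For reference, the two concerns end up as two separate settings. This is only a sketch: the `myapp.storage.RedisStorage` path is a hypothetical class you would write yourself, while `EVENTSTREAM_STORAGE_CLASS` and `GRIP_URL` are the real settings used by django-eventstream and django-grip.

```python
# settings.py (sketch) -- storage and distribution are configured separately.

# Storage: pluggable. 'myapp.storage.RedisStorage' is a hypothetical
# StorageBase subclass that you would implement yourself.
EVENTSTREAM_STORAGE_CLASS = 'myapp.storage.RedisStorage'

# Distribution: events still go directly from Django to Pushpin via GRIP.
# 5561 is Pushpin's default publish port.
GRIP_URL = 'http://localhost:5561'
```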
fine! thanks!
Hi! I'm bumping this. So in a multi-process situation, if I want to use Redis as the storage backend, I would let Pushpin rely on Redis instead of Django, right?
Like the Pushpin documentation shows with a Kafka example here: https://github.com/fanout/kafka-sse-example
In django-eventstream, storage is separate from distribution, and only storage is pluggable. You can subclass StorageBase to store messages in Redis, but distribution will continue to go directly from Django to Pushpin. This works with multiple processes.
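To make the storage side concrete, here is a rough sketch of what a Redis-backed storage class could look like. The method names (append_event, get_events, get_current_id) follow StorageBase's interface, but verify them against your installed version of django_eventstream.storage. A real implementation would subclass StorageBase and hold a redis.Redis() client; to keep this example self-contained, a tiny in-memory stand-in (FakeRedis) mimics the three Redis commands used (RPUSH, LLEN, LRANGE), and events are stored as JSON list entries whose 1-based list position doubles as the event id.

```python
import json

class FakeRedis:
    """In-memory stand-in for a redis.Redis client (rpush/llen/lrange only)."""
    def __init__(self):
        self.lists = {}

    def rpush(self, key, value):
        self.lists.setdefault(key, []).append(value)
        return len(self.lists[key])  # RPUSH returns the new list length

    def llen(self, key):
        return len(self.lists.get(key, []))

    def lrange(self, key, start, end):
        items = self.lists.get(key, [])
        if end == -1:
            return items[start:]
        return items[start:end + 1]

class RedisStorage:  # in real code: class RedisStorage(StorageBase)
    def __init__(self, client=None):
        self.client = client or FakeRedis()

    def _key(self, channel):
        return 'events:%s' % channel

    def append_event(self, channel, event_type, data):
        # Event ids are 1-based list positions; RPUSH returns the new length.
        eid = self.client.rpush(
            self._key(channel),
            json.dumps({'type': event_type, 'data': data}))
        return {'id': eid, 'type': event_type, 'data': data}

    def get_current_id(self, channel):
        return self.client.llen(self._key(channel))

    def get_events(self, channel, last_id, limit=100):
        # Fetch events after last_id; list offset == last_id because ids
        # are 1-based while list indexes are 0-based.
        raw = self.client.lrange(self._key(channel), last_id,
                                 last_id + limit - 1)
        events = []
        for offset, item in enumerate(raw):
            parsed = json.loads(item)
            events.append({'id': last_id + offset + 1,
                           'type': parsed['type'],
                           'data': parsed['data']})
        return events

storage = RedisStorage()
storage.append_event('room-1', 'message', {'text': 'hello'})
storage.append_event('room-1', 'message', {'text': 'world'})
print(storage.get_current_id('room-1'))            # 2
print(storage.get_events('room-1', 0)[0]['data'])  # {'text': 'hello'}
```

Note this sketch ignores trimming old events and atomicity; a production version would also need to decide how long to retain history for reconnecting clients.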
The Pushpin Kafka example is built differently, using a background process that reads from Kafka. It may be possible to hack django-eventstream to work similarly, by changing send_event so it doesn't send to Pushpin, and then running a background process that listens to Redis for changes and reuses django-eventstream's utility functions to publish to Pushpin. But this would be tricky, and I'm not sure why you'd want to do it.
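If someone did want to attempt that hack, the background process might look roughly like this. Everything here is an assumption: the JSON payload format is whatever your modified send_event would write to Redis, and publish_to_pushpin is a placeholder for whichever django-eventstream internals you end up reusing (the redis-py pub/sub calls themselves are real). The parsing step is factored out so it can be exercised without a running Redis server.

```python
import json

def handle_message(raw):
    """Parse one Redis pub/sub payload into (channel, event_type, data).

    The payload shape ({"channel": ..., "type": ..., "data": ...}) is an
    assumption -- it must match whatever your send_event replacement
    publishes to Redis.
    """
    parsed = json.loads(raw)
    return parsed['channel'], parsed['type'], parsed['data']

def run_listener():
    # Sketch only: requires redis-py and a running Redis server, so the
    # import lives here to keep handle_message testable on its own.
    import redis
    client = redis.Redis()
    pubsub = client.pubsub()
    pubsub.subscribe('events')
    for message in pubsub.listen():
        if message['type'] != 'message':
            continue  # skip subscribe confirmations
        channel, event_type, data = handle_message(message['data'])
        # publish_to_pushpin is a hypothetical placeholder for the
        # django-eventstream publishing utilities you would reuse.
        publish_to_pushpin(channel, event_type, data)

# Demo of the parsing step only:
print(handle_message('{"channel": "room-1", "type": "message", "data": {"x": 1}}'))
```

As noted above, though, this reintroduces a stateful background process, which defeats one of django-eventstream's main advantages.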
One nice thing about django-eventstream compared to the Kafka example is that it doesn't have any background processes, so it can be run statelessly.
I want to use Redis as the event storage backend. What should I do? When send_event is used in a multi-process scenario, clients don't receive messages consistently.