Open bitliner opened 6 years ago
One thing that I've noted is that in Redis log I have the following error:
events=rw cmd=subscribe scheduled to be closed ASAP for overcoming of output buffer limits.
related to https://redis.io/topics/clients#output-buffers-limits
I think the subscriber (kue) is reading data more slowly than Redis writes it, so the client's output buffer grows past the configured limit and Redis drops the connection to the client.
The same thing happens with a plain manual subscription:
subscribe q:events
==> after a few seconds the connection drops.
Possible ways to solve it: scale out the subscribers, or increase the pub/sub output buffer limit, e.g. to 512 MB:
config set client-output-buffer-limit "pubsub 536870912 536870912 60"
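For reference, a hedged sketch of inspecting and raising that limit with redis-cli (the 512 MB / 60 s values are only the example figures from above; tune them to your memory budget):

```shell
# Show the current limits (normal / slave / pubsub classes)
redis-cli config get client-output-buffer-limit

# Raise the pub/sub hard and soft limits to 512 MB, soft-limit window 60s
redis-cli config set client-output-buffer-limit "pubsub 536870912 536870912 60"

# To persist across restarts, also set it in redis.conf:
#   client-output-buffer-limit pubsub 512mb 512mb 60
```

Note that raising the limit only buys headroom; if the subscriber is persistently slower than the publishers, the buffer will eventually fill again.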
Each worker sends progress events while processing a task, but the client listening for progress does not receive all of them; somewhere along the way the progress events are lost.
Any recommendations on how to investigate this further?
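One mechanism that would explain the lost events: Redis pub/sub is fire-and-forget, so once a slow client's output buffer trips the hard limit, the connection is closed and everything still buffered (plus anything published before the client reconnects) is gone. A minimal sketch of that behavior, as a standalone model rather than real Redis code (`PubSubClient`, `publish`, and the tiny limit are all hypothetical, for illustration only):

```python
# Models Redis's hard-limit check on a pub/sub client's output buffer.
# Messages published while the subscriber is too slow accumulate in the
# buffer; once the hard limit is exceeded, the connection is closed and
# all buffered (and subsequent) messages are lost.

HARD_LIMIT = 32 * 1024 * 1024  # Redis's default pubsub hard limit: 32 MB


class PubSubClient:
    def __init__(self, hard_limit=HARD_LIMIT):
        self.hard_limit = hard_limit
        self.buffer = []           # pending messages not yet read by the client
        self.buffered_bytes = 0
        self.closed = False

    def publish(self, message: bytes) -> bool:
        """Buffer a message; close the client if the hard limit is exceeded.

        Returns True if the message was buffered, False if it was lost.
        """
        if self.closed:
            return False
        self.buffer.append(message)
        self.buffered_bytes += len(message)
        if self.buffered_bytes > self.hard_limit:
            # Redis logs "scheduled to be closed ASAP ..." here and drops
            # the client; buffered messages are discarded, not re-delivered.
            self.buffer.clear()
            self.closed = True
            return False
        return True


# A subscriber that never drains its buffer loses events silently:
client = PubSubClient(hard_limit=1024)   # tiny limit just for the demo
delivered = sum(client.publish(b"x" * 100) for _ in range(20))
print(delivered, client.closed)          # only the publishes before the limit tripped
```

If this matches what is happening, checking `redis-cli info clients` and the Redis log around the time events go missing (and whether kue reconnects afterwards) would be a reasonable next step.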