niwinz / django-sse

HTML5 Server-Sent Events integration for Django
BSD 3-Clause "New" or "Revised" License

cleanup connections #9

Closed: tbarbugli closed this issue 10 years ago

tbarbugli commented 11 years ago

Hi, I find this project really interesting and I am experimenting a bit with redis pubsub. I understand that with pubsub the webserver keeps an open connection with the browser and uses one of the connections from the redis pool to subscribe to the desired channel. What is bothering me is that the webserver will not notice when a browser drops its connection; that specific redis connection will then be kept open forever, and at some point the pool will run out of connections and start to complain (or the connection count will leak forever). Do you have any idea how to properly address this problem?
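
For readers following along, here is a minimal sketch of the pattern being described, using redis-py; the pool setup and names are illustrative, not django-sse's actual code:

import redis

pool = redis.ConnectionPool(host='localhost', port=6379, db=0)

def sse_stream(channel):
    client = redis.StrictRedis(connection_pool=pool)
    pubsub = client.pubsub()
    pubsub.subscribe(channel)        # checks out one connection from the pool
    for message in pubsub.listen():  # blocks for as long as the browser stays
        if message['type'] == 'message':
            yield 'data: %s\n\n' % message['data']

Each open browser tab therefore pins one redis connection for the lifetime of the stream, which is exactly the resource the rest of this thread is about.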

niwinz commented 11 years ago

Hi!

I'm glad you're interested in the project.

Regarding your question: actually, when a pubsub object is removed by the garbage collector, it disconnects and releases its connection back to the pool. Theoretically, there is no problem.

Have you found any unusual behavior?
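
One way to check this claim is a quick diagnostic against redis-py's internals (_available_connections is a private ConnectionPool attribute, so treat this sketch as illustrative only):

import gc
import redis

pool = redis.ConnectionPool()
client = redis.StrictRedis(connection_pool=pool)
pubsub = client.pubsub()
pubsub.subscribe('some-channel')

del pubsub    # drop the last reference
gc.collect()  # PubSub's __del__ should reset and release the connection

# if the claim holds, the connection is back in the pool
print(len(pool._available_connections))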

tbarbugli commented 11 years ago

Hi, you are right, the connection will get closed when GC kicks in, because there are no extra references in your code that could lead to a connection leak. My concern is about something else. Correct me if I am wrong, but as far as I know, when a client closes its connection the web server will not notice until a write is performed, because redis pubsub (for obvious reasons) does not have a timeout. This way the connection pool can fill up at some point, or the connection count can grow without bound.
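
A common way around this is to interleave reads with periodic heartbeat writes, so a dropped browser surfaces as a failed write instead of a connection that blocks forever. A hedged sketch, assuming a redis-py version that provides get_message with a timeout:

import redis

pool = redis.ConnectionPool()

def sse_stream(channel):
    client = redis.StrictRedis(connection_pool=pool)
    pubsub = client.pubsub()
    pubsub.subscribe(channel)
    try:
        while True:
            message = pubsub.get_message(timeout=30)  # never block forever
            if message and message['type'] == 'message':
                yield 'data: %s\n\n' % message['data']
            else:
                # an SSE comment line: when the browser is gone, the server's
                # write fails, the generator is closed, and the finally runs
                yield ': heartbeat\n\n'
    finally:
        pubsub.close()  # unsubscribe and return the connection to the pool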

niwinz commented 11 years ago

I think you're right. I'll do some research and see whether this behavior needs a fix or is acceptable.

Thank you very much! I'll post my conclusions in a couple of days at most.

tbarbugli commented 11 years ago

I really liked your approach with class based views :) I made a sort of copycat project for long polling and implemented a timeout approach to avoid this problem: https://github.com/tbarbugli/django_longpolling
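
The timeout idea looks roughly like this (a sketch of the general technique, not django_longpolling's actual code): each long-polling request waits at most a fixed number of seconds for one message and then returns, so no redis connection outlives a single request:

import time
import redis

pool = redis.ConnectionPool()

def longpoll(channel, timeout=30):
    client = redis.StrictRedis(connection_pool=pool)
    pubsub = client.pubsub(ignore_subscribe_messages=True)
    pubsub.subscribe(channel)
    deadline = time.time() + timeout
    try:
        while time.time() < deadline:
            message = pubsub.get_message(timeout=1)
            if message and message['type'] == 'message':
                return message['data']
        return None  # timed out; the client simply reconnects and polls again
    finally:
        pubsub.close()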

niwinz commented 11 years ago

Sorry it took me a while to respond, I have a lot of work... I've been doing some research and, sure enough, the connection is not returned to the connection pool.

Your approach is good, but I think it only works well for long polling. More research is needed; right now I don't know the best way to handle this...

Once again, thanks for making me aware of this problem.

xeor commented 10 years ago

Any news here? I have also run into this problem.

Too many open files errors, and 500+ entries like "addr=127.0.0.1:54999 fd=226 name= age=127922 idle=127922 flags=N db=0 sub=1 psub=0 multi=-1 qbuf=0 qbuf-free=0 obl=0 oll=0 omem=0 events=r cmd=subscribe" listed by "redis-cli client list". Is there any more information I can provide to make it easier to solve this problem?

tbarbugli commented 10 years ago

You have a few options (the last two should be combined):

- fix the library to close unused connections, or use only one redis connection per webserver process (see the sketch below)
- increase the kernel's file descriptor limit (and the redis server's max connections setting)
- restart your webserver every x hours (depending on how quickly you run out of connections)
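
On the single-connection option, a rough sketch (the channel pattern, queue fan-out, and names are illustrative assumptions, not part of django-sse): one background thread holds the process's only pubsub connection and fans messages out to in-process queues, so browser connections never touch the redis pool:

import queue
import threading

import redis

subscribers = {}  # channel name -> list of queue.Queue objects
lock = threading.Lock()

def listener():
    # the process's one and only redis connection
    client = redis.StrictRedis(decode_responses=True)
    pubsub = client.pubsub(ignore_subscribe_messages=True)
    pubsub.psubscribe('sse.*')  # one pattern subscription covers every channel
    for message in pubsub.listen():
        with lock:
            for q in subscribers.get(message['channel'], []):
                q.put(message['data'])

threading.Thread(target=listener, daemon=True).start()

Each SSE request would then register a queue under its channel name and read from that, so the number of redis connections stays constant no matter how many browsers are connected.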


xeor commented 10 years ago

Thanks for the quick reply.

About using 1 redis connection: I see https://github.com/niwibe/django-sse/blob/master/django_sse/redisqueue.py#L55 , but that information is static for each connection. Where can I change the logic so that I can, for example, enforce a maximum of 10 connections per IP? I am using unique channels per user, so having at most 10 connections is my goal.

Increasing the file descriptor count is something I will do, but not in a development phase where I want to detect leaks like this :)

I guess I can restart the web server holding the sse connections once in a while, but that really feels dirty.

Is there a way to kill off connections that haven't gotten any data in a minute or so? My sse server is sending out pings every 40 seconds, and the client will try to reconnect if there is no data for 45 seconds.

xeor commented 10 years ago

To whoever else wants to do this: I went for something like

from django_sse import redisqueue

redis = redisqueue._connect()
# kill every pubsub connection that has been idle for more than 360 seconds
kill_status = [redis.client_kill(c['addr']) for c in redis.client_list()
               if int(c['idle']) > 360 and c['cmd'] == 'subscribe']
print('Killed %s redis pubsub connections' % len(kill_status))

in my already existing rabbitmq scheduler, which also takes care of pinging all connected sse connections every 40 seconds. The clients themselves use javascript to reconnect if they don't get any data for 45 seconds.

niwinz commented 10 years ago

This project is unmaintained. Sorry for the inconvenience.