3rd-Eden / memcached

A fully featured Memcached client built on top of Node.js. Built with scaling in mind, so it supports Memcached clusters and consistent hashing.
MIT License

Going slow under load #145

Open agarciaGIT opened 11 years ago

agarciaGIT commented 11 years ago

We are using memcached in production, and under heavy load it is taking very long to respond.

We are using Amazon ElastiCache and the cluster itself is not having issues. We originally left the default pool size and got "pool is full" errors. We now have it set to 1000 and have also tried 100; both stop the error, but it runs very, very slowly.
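For reference, a minimal sketch of the kind of setup being described; the ElastiCache endpoint and option values below are placeholders, not the actual production configuration:

```js
var Memcached = require('memcached');

// Hypothetical configuration: poolSize raised from the default of 10 to
// avoid "pool is full" errors, retries left at the module's default of 5.
var memcached = new Memcached('my-cluster.cfg.use1.cache.amazonaws.com:11211', {
  poolSize: 100,
  retries: 5
});

memcached.get('some-key', function (err, value) {
  if (err) return console.error(err);
  console.log(value);
});
```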

Any ideas?

AG

ianshward commented 11 years ago

@agarciaGIT can you tell me what server locations and options you used for the memcached client?

ianshward commented 11 years ago

@agarciaGIT any news on this?

jhnlsn commented 11 years ago

@agarciaGIT I've run into this problem as well.

The problem we found lies in the way the memcached client's connection pool works in general.

If you have a pool size of 10 and want to make 100 parallel requests, the first 10 requests will each get a socket, and the 11th request will get a "pool is full" error. Based on your minTimeout setting, the 11th request will then try again after N seconds, up to the number of retries you have set (the default is 5). This essentially backs up the requests and causes everything to slow way down.
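Roughly, the failure mode looks like this (a sketch with a made-up server, keys, and the default pool size, following the behavior described above):

```js
var Memcached = require('memcached');

// With the default poolSize of 10, a burst of 100 parallel gets means
// everything past the 10th request waits on retry timers (or fails with
// "pool is full") instead of waiting for a free socket, so latency balloons.
var memcached = new Memcached('localhost:11211', { poolSize: 10 });

for (var i = 0; i < 100; i++) {
  memcached.get('key:' + i, function (err, value) {
    if (err) console.error(err);
  });
}
```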

It would have been nice for memcached to work in a queued fashion, where the 11th request is queued up and sent as soon as a connection in the pool becomes ready, instead of waiting to retry. I ended up building this type of functionality myself and it works pretty well: essentially my own queuing system on top of a max pool size of 200, with all requests to memcached centralized through it. A rough sketch of that idea is below.
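This is an illustrative sketch of such a queuing wrapper, not the actual implementation; the names and the simple array-based queue are assumptions:

```js
var Memcached = require('memcached');

// Cap the number of in-flight gets at the pool size and queue the rest,
// so excess requests wait for a free connection instead of hitting
// "pool is full" and backing off on retry timers.
var POOL_SIZE = 200;
var memcached = new Memcached('localhost:11211', { poolSize: POOL_SIZE });

var inFlight = 0;
var queue = [];

function queuedGet(key, callback) {
  if (inFlight >= POOL_SIZE) {
    // pool saturated: park the request until a connection frees up
    queue.push([key, callback]);
    return;
  }
  inFlight++;
  memcached.get(key, function (err, value) {
    inFlight--;
    // a slot just opened; dispatch the next queued request, if any
    var next = queue.shift();
    if (next) queuedGet(next[0], next[1]);
    callback(err, value);
  });
}
```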