spdy-http2 / node-spdy

SPDY server on Node.js

Connections don't close, or keep-alive lasts too long #338

Open · geekbrother opened this issue 6 years ago

geekbrother commented 6 years ago

Hi everybody! I've switched my API server from the https module to spdy by just replacing it in the code, from https.createServer... to spdy.createServer...
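In other words, the change was just this (a minimal sketch; the certificate paths and the route are placeholders, and node-spdy mirrors the https.createServer(options, app) signature):

const express = require("express");
const fs = require("fs");
// Previously: const https = require("https");
const spdy = require("spdy");

const app = express();
app.get("/", function(req, res) {
  res.send("ok");
});

// Hypothetical certificate paths.
const options = {
  key: fs.readFileSync("server.key"),
  cert: fs.readFileSync("server.crt")
};

// The only change: https.createServer(options, app) becomes
// spdy.createServer(options, app).
spdy.createServer(options, app).listen(8443);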

And it works. But after a few weeks we started to see a max-connections issue at the OS level. On inspection, we saw that connections to our spdy+express API server were not closing after the reply was sent and were still open several minutes later. According to netstat, the connections were in the ESTABLISHED state, not keep-alive...

We switched back to the https module and the problem was solved. Does the spdy implementation not close client connections after the response (maybe to reuse them later), or is this just a bug? Either way, it creates a massive number of open connections.

wsjwa commented 5 years ago

I have a similar problem. From time to time, spdy leaves a request in the "pending" state.

creemama commented 5 years ago

I may have encountered the same or a similar problem as well.

On a SPDY+Express production server, idle, established HTTPS connections crept up over time. At around 4000 of these connections, the server became unresponsive.

I monitored the number of these HTTPS connections using lsof. (I changed the IP addresses in the output below.)

$ lsof -i -P -n
COMMAND   PID USER   FD   TYPE  DEVICE SIZE/OFF NODE NAME
node\x20/  86 node   19u  IPv6  867166      0t0  TCP *:8080 (LISTEN)
node\x20/  86 node   21u  IPv6  867167      0t0  TCP *:8443 (LISTEN)
node\x20/  86 node   22u  IPv6 1381822      0t0  TCP 173.19.0.6:8443->10.11.12.13:12600 (ESTABLISHED)
node\x20/  86 node   23u  IPv6  895497      0t0  TCP 173.19.0.6:8443->14.15.16.17:30078 (ESTABLISHED)
node\x20/  86 node   24u  IPv6 3692433      0t0  TCP 173.19.0.6:8443->18.19.20.21:64266 (ESTABLISHED)
node\x20/  86 node   25u  IPv6  925328      0t0  TCP 173.19.0.6:8443->21.22.23.24:11360 (ESTABLISHED)
...
node\x20/  86 node 4953u  IPv6 3874470      0t0  TCP 173.19.0.6:8443->33.34.35.36:8271 (ESTABLISHED)
node\x20/  86 node 4957u  IPv6 3874474      0t0  TCP 173.19.0.6:8443->37.38.39.40:51377 (ESTABLISHED)
node\x20/  86 node 4980u  IPv6 3897705      0t0  TCP 173.19.0.6:8443->41.42.43.44:35749 (ESTABLISHED)
node\x20/  86 node 4999u  IPv6 3897724      0t0  TCP 173.19.0.6:8443->45.46.47.48:56338 (ESTABLISHED)

$ lsof -i -P -n | grep :8443 | wc -l
4126

I knew that there were not 4126 active connections to the website and concluded that the server was not timing out these idle connections.
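For monitoring from inside the process instead of with lsof, Node's built-in server.getConnections() gives a comparable count. A minimal sketch, assuming httpsServer is the SPDY server instance:

// Log the server's open-socket count once a minute.
setInterval(function() {
  httpsServer.getConnections(function(err, count) {
    if (err) return console.error(err);
    console.log("open connections: " + count);
  });
}, 60000);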

After some research, the following code snippet seems to have fixed the problem:

// This listener attempts to fix an issue with SPDY where idle connections do
// not close. Too many idle connections to our server (>4000) cause our server
// to be sluggish or outright nonfunctional. See
// https://github.com/spdy-http2/node-spdy/issues/338 and
// https://github.com/nodejs/node/issues/4560.

httpsServer.on("connection", function(socket) {
  // Set the socket's idle timeout in milliseconds. Two minutes is the
  // default for Node's HTTPS server, which we replicate here for SPDY:
  // https://nodejs.org/api/https.html#https_server_settimeout_msecs_callback
  socket.setTimeout(120000); // 1000 * 60 * 2

  // The socket emits "timeout" once it has been idle for that long; Node
  // does not close the socket on its own.
  socket.on("timeout", function() {
    // Close the idle connection explicitly.
    socket.destroy();
  });
});
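Note that socket.setTimeout() only schedules the "timeout" event; it never severs the connection by itself, so the explicit socket.destroy() in the handler is what actually tears down the idle connection.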

Perhaps node-spdy could incorporate this code snippet somewhere.