spdy-http2 / node-spdy

SPDY server on Node.js

Performance Hit When Implementing spdy #235

Closed CelticParser closed 8 years ago

CelticParser commented 8 years ago

I ran some performance tests using loadtest and noticed a significant drop in reqs/sec (-70%) after implementing node-spdy. It may be how I am using the module, so I would like some input.

(Results below)

Using: loadtest http://104.xxx.xxx.xx/ -t 40 -c 200 -k

Test serves a static index.html with a simple 40k jpg:

<html>
  <body>
    <img src="cat.jpg" />
  </body>
</html>

server.js uses buffet to grab the static content and put it into a buffer:

'use strict';
//------------------------------------------------------------------------------
// Load required modules
var buffet  = require('buffet')({root: '../build'})
  , fs      = require('fs')
  , spdy    = require('spdy')
  , confg   = require("../.config.json5") // Read config.json5 data
  , mainjs  = fs.readFileSync('../build/js/main.js'); // Sets the serving dir

// -----------------------------------------------------------------------------
// Let's serve it up!
spdy.createServer(confg.options, function(req, res) {

  // Add Content Security Rules to the header
  for(var i = 0; i < confg.security.length; i++) {
    res.setHeader(confg.security[i].name, confg.security[i].value);
  }

  // Push JavaScript assets to the client but!,
  // Check to see if client wanted the root page
  if (req.url == '/') {

    // Verify if client can be pushed to
    if (res.push) {

      // NOTE: `stream` is reassigned on each iteration, so only the last
      // pushed stream is ended (with main.js) below; earlier pushed
      // streams are never closed.
      for (var r = 0; r < confg.resource.length; r++) {
        var stream = res.push(confg.resource[r].file, {
          req: {'accept': '*/*'},
          res: {'content-type': confg.resource[r].type},
        });
      }

      stream.end(mainjs, function() {
        // TODO(CelticParser): Remove before release
        console.log("stream pushed");
      });
    }
  }

  req.addListener('end', function () {

    // TODO(CelticParser): Is setNoDelay() needed?
    // req.socket.setNoDelay();
    // @see https://www.npmjs.com/package/buffet
    buffet(req, res, function (err, result) {

      // If there was an error serving the file, log it and send the error response
      if (err) {
        console.error("Error serving " + req.url + " - " + err.message);
        // Respond to the client and end the response so it doesn't hang
        res.writeHead(err.status, err.headers);
        res.end();
      }
    });
  }).resume();
}).listen(confg.options.port, confg.options.host, function() {
  console.log("serving at https://" + confg.options.host + ":" + confg.options.port);
});

The configuration is pulled from .config.json5:

var fs  = require('fs');

module.exports = { 

  security: [
    { name:  'Cache-Control',
      value: 'public, max-age=30672000'
    }, // ...And so on...
 ],

  options: { // SPDY-specific options
      // Read the SSL keys
      key:  fs.readFileSync('../.key.pem')
    , cert: fs.readFileSync('../.cert.pem')
    , host: '104.236.182.19'
    , port: '443'
    , spdy: { // SPDY!
          protocols: [ 'h2','spdy/3.1', 'spdy/3', 'spdy/2','http/1.1', 'http/1.0' ]
        , plain: false
        , maxStreams: 200
        , connection: {
              windowSize: 1024 * 1024
            , autoSpdy31: false
          }
      }
  }
};

Load test using require('http').createServer(function(req, res) {...:

reqs/sec = ~3.2k, avg response = 95ms, errors = 0

Using spdy.createServer(confg.options, function(req, res) {... as above:

reqs/sec = ~400, avg response = 1600ms, errors = tons

I have also tested spdy with the setHeader loop omitted, which gives only about a 1% boost per test run. I have tried various ways of writing the server script (including hard-coding the config), but I consistently get the same results. I ran each test 5x and averaged the runs. I also scaled the tests back down and found that errors start at around 10 req/sec.

I would expect a slight drop in performance using spdy because of the TLS handshakes etc., but not a 70% drop. When I initially implemented the module I saw a decent performance increase (especially with push) in the developer console. A few days later I load tested a "real-world" scenario (20k index with 25 resources = ~600k downloaded) and saw a rate of 64 req/sec, so I started looking into where the hit was. Again, I am not sure whether my implementation is the cause. Serving static resources with Node and HTTP/1.1 is proving to be faster than NGINX (and NGINX doesn't support push yet), and I would very much like to use node-spdy over NGINX in our product if we can get this figured out. Any input or insight will be greatly appreciated.

indutny commented 8 years ago

@CelticParser Have you tried benchmarking nginx+http2 using the same node.js module?

It looks like this benchmark is measuring TLS handshake time. This is even more evident considering that loadtest does not seem to support either http2 or spdy (at least I don't see relevant dependencies, or any note in loadtest's readme), so you were just measuring a fallback server that is exactly equivalent to the default require('https').createServer(...).
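One way to check the fallback hypothesis is to log which protocol each request actually negotiated. This is a hypothetical helper, not part of the original server: it assumes node-spdy's `req.isSpdy` flag and the standard `socket.alpnProtocol` property on a TLSSocket.

```javascript
'use strict';
// Hypothetical helper: report the protocol a request was served over.
// `req.isSpdy` is set by node-spdy on spdy/h2 streams; `socket.alpnProtocol`
// is the standard TLSSocket property (null/false when the client sent no
// ALPN extension, i.e. an HTTP/1.x fallback).
function logProtocol(req) {
  var alpn = req.socket && req.socket.alpnProtocol;
  var label = req.isSpdy ? 'spdy/h2 stream' : (alpn || 'http/1.x fallback');
  console.log(req.method + ' ' + req.url + ' via ' + label);
  return label;
}
```

Calling this at the top of the request handler during a loadtest run would show whether the benchmark traffic is really h2/spdy or plain HTTPS.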

I would suggest replacing spdy with https in your test code and making another run to confirm that you get the same results without spdy.

To answer your question: loadtest doesn't seem suitable for benchmarking here. I would suggest using h2load instead to benchmark HTTP/2.
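A possible h2load invocation mirroring the loadtest run above (the request count and URL are assumptions; adjust to taste):

```shell
# -n total requests, -c concurrent clients, -m max concurrent streams per client.
# h2load negotiates HTTP/2 over TLS via ALPN, so this exercises the h2 path
# rather than the HTTP/1.x fallback that loadtest measures.
h2load -n 10000 -c 200 -m 10 https://104.xxx.xxx.xx/
```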

CelticParser commented 8 years ago

Thanks! h2load is showing 7k req/sec. Now I know better.