meltwater / served

A C++11 RESTful web server library
MIT License

Add caching #57

Closed jonasgloning closed 5 years ago

jonasgloning commented 5 years ago

motivation

Serving static files with Served is painfully slow and has a terrible memory footprint. served::response::to_buffer() seems to copy a lot.

proposed user-facing changes

Add a single method:

void response::set_response(const std::shared_ptr<const std::string> &res)

possible usage

// assumes served's public header; content_type() is a helper defined elsewhere
#include <fstream>
#include <functional>
#include <iterator>
#include <memory>
#include <string>

#include <served/served.hpp>

std::function<void(served::response &res, const served::request &)>
serve_file(const std::string &file_name, const std::string &file_extension)
{
    // load the file into memory once, when the handler is constructed
    const std::string filename{file_name + "." + file_extension};
    std::ifstream     ifs{filename, std::ios::in | std::ios::binary};
    const std::string file{(std::istreambuf_iterator<char>(ifs)),
                           std::istreambuf_iterator<char>()};

    // generate first response and cache it
    served::response my_res{};
    my_res.set_header("content-type", content_type(file_extension));
    my_res.set_body(file);
    const auto response_buffer{std::make_shared<const std::string>(my_res.to_buffer())};

    return
        [response_buffer](served::response &res, const served::request & /*unused*/) {
            res.set_response(response_buffer);
        };
}
mux.handle("/large_image.jpg").get(serve_file("large_image", "jpg"));

benchmarks

Serving a 13MB image over 100 connections 1000 times. Measured via ApacheBench 2.3.

Concurrency Level:      100
Time taken for tests:   2.357 seconds
Complete requests:      1000
Failed requests:        0
Keep-Alive requests:    0
Total transferred:      14488596000 bytes
HTML transferred:       14488498000 bytes
Requests per second:    424.24 [#/sec] (mean)
Time per request:       235.718 [ms] (mean)
Time per request:       2.357 [ms] (mean, across all concurrent requests)
Transfer rate:          6002519.76 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   1.0      0       5
Processing:   178  233  12.7    235     280
Waiting:        0    6  10.0      3      64
Total:        178  234  12.6    235     280

Percentage of the requests served within a certain time (ms)
  50%    235
  66%    236
  75%    237
  80%    237
  90%    243
  95%    243
  98%    266
  99%    273
 100%    280 (longest request)

About 6 GB/s (roughly 48 Gbit/s) on a single mobile core. That's faster than nginx in both response time and throughput on my machine, although CPU usage remains higher.

Jeffail commented 5 years ago

Interesting, looks good. Thanks @jonasgloning!