tarantool / nginx_upstream_module

Tarantool NginX upstream module (REST, JSON API, websockets, load balancing)

nginx auto batching feature for higher client & tarantool performance #67

Open · simonhf opened this issue 7 years ago

simonhf commented 7 years ago

Experiments have shown [1] that Tarantool handles transactions containing batches of operations much better than it handles lots of individual operations... and that is without the network layer even being a factor.
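
To make the batching benefit concrete, here is a minimal sketch (not taken from the gist) of a Tarantool stored procedure that applies a whole batch of tuples inside a single transaction instead of issuing one call per tuple. The function name `batch_replace` and the space name `metrics` are hypothetical; `box.begin()`, `box.commit()` and `space:replace()` are standard Tarantool APIs.

```lua
-- Hypothetical stored procedure: apply a batch of tuples in one transaction.
-- `batch_replace` and the `metrics` space are illustrative names only.
function batch_replace(tuples)
    box.begin()
    for _, tuple in ipairs(tuples) do
        box.space.metrics:replace(tuple)
    end
    box.commit()
    return #tuples
end
```

Called through the upstream module with the whole batch passed as a single argument, this costs one round trip and one transaction where N separate calls would otherwise be needed.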

I have a very large number of PHP processes, each of which wants to make many individual operations... which isn't the best for Tarantool performance.

How about a feature in nginx_upstream_module which does the following for certain types of HTTP requests that only write to Tarantool and don't have to return any data:

  1. Reads the HTTP request from the client.
  2. Queues the corresponding Tarantool request for forwarding upstream.
  3. Immediately replies via HTTP saying "thank you, request received".
  4. Later, when a buffer-size or elapsed-time threshold is reached, forwards the batch of queued requests upstream to Tarantool, where it can be processed more efficiently as a single transaction (a sketch of this flush logic follows the note below).

Note: this feature would only be for users who don't care about certain writes to Tarantool happening in absolutely real time...
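
Here is a minimal sketch of the flush policy from step 4, written as Tarantool Lua purely for illustration: in the proposed feature this queue would live inside nginx itself, and the names (`Batcher`, `max_batch`, `max_wait`, `flush`) are hypothetical; `clock.monotonic()` is Tarantool's monotonic clock.

```lua
-- Illustration only: the flush-on-size-or-time policy from step 4.
-- In the proposed feature this queue would live inside nginx; all names
-- here (Batcher, max_batch, max_wait, flush) are hypothetical.
local clock = require('clock')  -- Tarantool's monotonic clock module

local Batcher = {}
Batcher.__index = Batcher

function Batcher.new(max_batch, max_wait, flush)
    return setmetatable({
        queue = {},
        first_at = nil,          -- time the oldest queued request arrived
        max_batch = max_batch,   -- flush when this many requests are queued
        max_wait = max_wait,     -- ...or when the oldest one is this old (seconds)
        flush = flush,           -- callback that sends the batch upstream
    }, Batcher)
end

function Batcher:push(request)
    table.insert(self.queue, request)
    self.first_at = self.first_at or clock.monotonic()
    -- Flush when either the size or the elapsed-time threshold is hit.
    if #self.queue >= self.max_batch or
       clock.monotonic() - self.first_at >= self.max_wait then
        self.flush(self.queue)   -- e.g. one call to batch_replace() upstream
        self.queue, self.first_at = {}, nil
    end
end
```

Note that this sketch only checks the elapsed-time threshold when a new request arrives; a real implementation inside nginx would also need a timer to flush a queue that has gone idle.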

[1] https://gist.github.com/simonhf/e7c2f40d36f1a4bdedfffa40c575b63b

dedok commented 7 years ago

Yep, this is possible. If I implement this, you'll see better performance, better latency, and lower CPU usage on the nginx side. I've moved the issue to the next milestone.

Thanks for the idea!

jobs-git commented 2 years ago

any progress?

Totktonada commented 2 years ago

We have no work planned on the upstream module in the near future. You can open a pull request or contact our commercial support.