oven-sh / bun

Incredibly fast JavaScript runtime, bundler, test runner, and package manager – all in one
https://bun.sh

`uWebSockets.js` is faster by about ~3000 req/s than `Bun.serve` #8643

Open huseeiin opened 7 months ago

huseeiin commented 7 months ago

Node:

```js
import { App } from "uWebSockets.js";

App()
  .any("*", (res) => {
    res.end("Hello World");
  })
  .listen(3000, () => {});
```

```sh
node node.js
oha -n 10000 http://localhost:3000
```

Requests/sec: 41321.3633

Bun:

```js
Bun.serve({
  fetch() {
    return new Response("Hello World");
  },
});
```

```sh
bun bun.js
oha -n 10000 http://localhost:3000
```

Requests/sec: 37174.1187

Redskull-127 commented 7 months ago

I mean, aren't you comparing Web Sockets to a http server, my friend?

huseeiin commented 7 months ago

> I mean, aren't you comparing Web Sockets to a http server, my friend?

No. Both are http.

ghost commented 7 months ago

I'm sure they'll figure this one out since Bun's web server itself uses µWebSockets, but it's unexpected nonetheless.

Pomax commented 7 months ago

Small note: different CPUs will show different numbers, so rather than stating a specific gap in requests per second, it's better to express this as a relative difference: from the numbers above, uWebSockets.js on Node serves about 11% more requests per time unit, or equivalently Bun serves about 10% fewer.
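From the two throughput numbers reported in the opening post, the relative gap works out as follows (a quick arithmetic sanity check, nothing more):

```js
// Throughput figures reported in the opening post (req/s).
const node = 41321.3633; // uWebSockets.js on Node
const bun = 37174.1187;  // Bun.serve

// The same absolute gap, expressed relative to each baseline.
const nodeAdvantage = (node / bun - 1) * 100; // how much faster Node is
const bunDeficit = (1 - bun / node) * 100;    // how much slower Bun is

console.log(nodeAdvantage.toFixed(1)); // ~11.2
console.log(bunDeficit.toFixed(1));    // ~10.0
```

The two percentages differ because they are computed against different baselines, which is why "X% faster" and "Y% slower" are not interchangeable.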

ghost commented 7 months ago

Out of curiosity, I've spent a bit of time with this, just to see what I get on my old laptop.

First of all, the environment:

I didn't touch the Bun script, but changed the Node one slightly, just to make it easier to run right away:

```js
require("uWebSockets.js")
    .App()
    .any("*", (res) => res.end("Hello World"))
    .listen(3000, _ => {});
```

Then I made a simple bash loop to run the HTTP load test 10 times in a row:

```sh
for i in {1..10}
do
    oha --json -n 10000 http://localhost:3000 |
        jq ".summary.requestsPerSec * 100 | round / 100"
done
```
Finally, I ran the above 3 times for Bun and then another 3 times for Node.

Bun:

| Run 1 | Run 2 | Run 3 |
| --- | --- | --- |
| 16362.46 | 16602.01 | 15066.21 |
| 16596.55 | 16642 | 16596.26 |
| 16506.53 | 16405.59 | 16480.51 |
| 16590.54 | 16455.34 | 15175.14 |
| 16574.81 | 16590.73 | 16616.67 |
| 16427.12 | 16328.5 | 16551.74 |
| 16397.49 | 15930.53 | 16688.25 |
| 16563.96 | 16674.31 | 16481.32 |
| 16474.63 | 16350.3 | 15805.88 |
| 15220.12 | 16559.38 | 16501.57 |
| Total: 163_714.21 | Total: 164_538.69 | Total: 161_963.55 |

Node:

| Run 1 | Run 2 | Run 3 |
| --- | --- | --- |
| 16512.53 | 15802.19 | 16297.3 |
| 16359.78 | 16587.02 | 15009.95 |
| 16656.58 | 16520.79 | 15986.62 |
| 16687.91 | 16904.95 | 16469.07 |
| 16835.5 | 16751.16 | 16284.44 |
| 16227.54 | 16494.61 | 16483.25 |
| 16768.75 | 16760.52 | 16364.27 |
| 16289.36 | 15492.76 | 16511.66 |
| 16722.35 | 16564 | 16018.23 |
| 16527.35 | 15178.82 | 16482.87 |
| Total: 165_587.65 | Total: 163_056.82 | Total: 161_907.66 |

So as you can see, in this environment and under this very synthetic benchmark, they seem to be pretty much neck and neck. Should Bun be able to clearly pull ahead? I'd say so, given how tightly uWebSockets is integrated into Bun, but to be honest this is just my feeling. I have no technical insight into Bun/Node to set any expectation whatsoever.

mangs commented 7 months ago

@psevdaisthisi Great feedback. One thing that came to mind when looking at your results is that your CPU is VERY old, so I'm wondering if uWebSockets, Node, or Bun has any optimizations for newer microarchitectures and/or the SIMD instructions they enable. Your results are still valid, but I'm wondering if a newer CPU would show different results for that (or some other) reason.

TiBianMod commented 7 months ago

I also ran the same test as @psevdaisthisi.

Environment:

Node v21.6.1 - uWebSockets.js

| Run 1 | Run 2 | Run 3 |
| --- | --- | --- |
| 120511.94 | 120561.52 | 121035.37 |
| 122684.38 | 122400.32 | 119253.84 |
| 123435.5 | 119785.83 | 120742.77 |
| 122853.24 | 122195.57 | 122203.17 |
| 122686.37 | 122035.86 | 121514.48 |
| 123626.88 | 120536.51 | 120712.85 |
| 122499.38 | 116788.51 | 118317.57 |
| 122687.16 | 117109.41 | 117258.06 |
| 123999.95 | 121456.05 | 121243.9 |
| 123365.43 | 118507.77 | 122481.87 |
| Total: 1_228_350.23 | Total: 1_201_377.35 | Total: 1_204_763.88 |

Bun 1.0.25

| Run 1 | Run 2 | Run 3 |
| --- | --- | --- |
| 108830.91 | 110555.47 | 110059.78 |
| 107465.34 | 110653.9 | 109737.71 |
| 107197.4 | 110006.75 | 111668.32 |
| 111712.88 | 110220.19 | 110317.71 |
| 109712.82 | 109748.8 | 109454.73 |
| 106369.21 | 112088.13 | 110218.21 |
| 111178.5 | 111242.34 | 111329.65 |
| 111991.07 | 110176.26 | 111572.76 |
| 111408.07 | 112280.62 | 112142.67 |
| 110963.54 | 110307.78 | 112279.98 |
| Total: 1_096_829.74 | Total: 1_107_280.24 | Total: 1_108_781.52 |

One of my guesses is that around version 0.7.0 (if I remember correctly) Bun started normalizing request URLs, and at that point we took a performance hit. Related PR: https://github.com/oven-sh/bun/pull/4034

In my opinion, these tests don't tell the whole story: uWebSockets.js is much faster than these results suggest. Why?

Let me add two more benchmarks.

Any router implementation will make use of the URL and method, so let's read the request URL and method in the first bench:

```js
Bun.serve({
    port: 3000,
    fetch(request) {
        // read `url` and `method`, as any router would
        const url = request.url;
        const method = request.method;

        return new Response("Hello World");
    },
});
```

Bun 1.0.25

| Run 1 | Run 2 | Run 3 |
| --- | --- | --- |
| 102020.4 | 101465.79 | 102742.9 |
| 104469.73 | 101666.56 | 103376.73 |
| 103132.27 | 102966.11 | 103810.64 |
| 103399.73 | 103481.1 | 102007.45 |
| 103339.03 | 102092.66 | 102339.54 |
| 103479.06 | 103048.71 | 101245.88 |
| 104004.36 | 103480 | 101070.99 |
| 104101.88 | 102277.19 | 103400.33 |
| 102036.96 | 103041.08 | 101404.35 |
| 103620.65 | 102004.74 | 104210.16 |
| Total: 1_033_604.07 | Total: 1_025_523.94 | Total: 1_025_608.97 |

and you can see another drop in performance.

Now let's use ElysiaJS for the second bench. This example is the closest to the uWebSockets.js one: both use a router, etc.

```js
import { Elysia } from 'elysia'

const app = new Elysia();

app.all('*', () => {
    return 'Hello World'
});

app.listen(3010)
```

Bun 1.0.25 + ElysiaJS 0.8.0

| Run 1 | Run 2 | Run 3 |
| --- | --- | --- |
| 98959.65 | 99448.86 | 98968.32 |
| 100185.03 | 100826.56 | 99682.29 |
| 98412.88 | 100903.44 | 100917.71 |
| 100733.16 | 99026.49 | 100306.17 |
| 101610.91 | 100122.28 | 100151.74 |
| 100172 | 100452.09 | 101520.03 |
| 101202.46 | 100031.44 | 99362.75 |
| 101417.35 | 99462.35 | 100861.62 |
| 99547.79 | 99108.63 | 101432.36 |
| 101639.91 | 99678.31 | 98831.38 |
| Total: 1_003_881.14 | Total: 999_060.45 | Total: 1_002_034.37 |

and you can see another drop in performance.

So I'd say the performance difference is around 20-25%, in my opinion.