SaltyAom / bun-http-framework-benchmark

Compare throughput benchmarks across various Bun HTTP frameworks

feat: add bunzer #50

Closed · farzher closed this 9 months ago

farzher commented 9 months ago

with the release of bun v1.0 i decided to try it out and write my own server for personal use. it places 1st in this benchmark, even above uws! just wanted to share what's possible with bun!

you can see benchmark results here: https://github.com/farzher/bunzer

aquapi commented 9 months ago

This will not check whether the HTTP message is valid or not. I think you should create an engine benchmark for TCP.

aquapi commented 9 months ago

But the concept is really cool, loved it @farzher

aquapi commented 9 months ago

Parsing all of the text to a string will be much slower than something like reading a few bytes, checking the path, and only parsing the body later.

aquapi commented 9 months ago

The concept is that you still need to read a certain number of bytes from the request to do routing, and only then read the headers and the body and parse them into whatever format you prefer. But parsing with a Uint8Array is just too slow compared to what Bun.serve does.
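
To illustrate the idea (a hypothetical sketch, not bunzer's actual code), you can scan the raw bytes just far enough to recover the method and path, and leave the headers and body as bytes until a handler actually needs them:

```ts
// Minimal sketch: pull the method and path out of the raw request bytes
// without decoding the whole message. Header/body parsing is deferred.
const SPACE = 0x20; // ASCII ' '
const decoder = new TextDecoder();

function readRequestLine(bytes: Uint8Array): { method: string; path: string } | null {
  // Request line shape: "METHOD SP PATH SP HTTP/x.y" -> find the first two spaces.
  const firstSpace = bytes.indexOf(SPACE);
  if (firstSpace === -1) return null;
  const secondSpace = bytes.indexOf(SPACE, firstSpace + 1);
  if (secondSpace === -1) return null;
  return {
    method: decoder.decode(bytes.subarray(0, firstSpace)),
    path: decoder.decode(bytes.subarray(firstSpace + 1, secondSpace)),
  };
}

// Usage: only the request line is decoded; everything after it stays as bytes.
const raw = new TextEncoder().encode("GET /hello HTTP/1.1\r\nHost: localhost\r\n\r\n");
console.log(readRequestLine(raw)); // { method: "GET", path: "/hello" }
```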

farzher commented 9 months ago

This will not check whether the HTTP message is valid or not.

i'm not sure what you mean. it's definitely feature incomplete but this is a working http server. i'm using it to serve my website

aquapi commented 9 months ago

This will not check whether the HTTP message is valid or not.

i'm not sure what you mean. it's definitely feature incomplete but this is a working http server. i'm using it to serve my website

For some malformed HTTP message, such as:

HTP/1 GET /g

Just an example

aquapi commented 9 months ago

uWS HTTP server is really good at checking stuff like this because it is written in optimized native code.
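
For illustration only, here is a rough TypeScript sketch of the kind of request-line check being discussed; uWS does this in native code, and the regex below is a simplified stand-in for the real HTTP/1.1 grammar, not a complete validator:

```ts
// Sketch of request-line validation. A malformed line like "HTP/1 GET /g"
// fails the pattern and should get a 400 Bad Request, not be routed as if
// it were a normal request.
const REQUEST_LINE = /^(GET|HEAD|POST|PUT|DELETE|OPTIONS|PATCH) \S+ HTTP\/1\.[01]$/;

function isValidRequestLine(line: string): boolean {
  return REQUEST_LINE.test(line);
}

console.log(isValidRequestLine("GET /g HTTP/1.1")); // true
console.log(isValidRequestLine("HTP/1 GET /g"));    // false -> 400, not 404
```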

farzher commented 9 months ago

HTP/1 GET /g

in that case bunzer returns a 404. what's the problem?

aquapi commented 9 months ago

@farzher what will happen when I constantly send 200MB of JSON data that you don't even need to read for some endpoints, or maybe even more?

You need to parse and validate based on the spec.
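
To make the concern concrete, here is a minimal sketch of the kind of guard a raw server needs before it buffers a body; the helper name and the 256 KB limit are invented for the example:

```ts
// Sketch: reject oversized bodies before buffering them. A raw socket server
// has to enforce this itself; the limit below is an arbitrary example value.
const MAX_BODY_BYTES = 256 * 1024;

function shouldRejectBody(headers: Map<string, string>): boolean {
  const declared = Number(headers.get("content-length") ?? 0);
  // A non-numeric Content-Length is also suspect; here we only show the
  // simple size check.
  return !Number.isFinite(declared) || declared > MAX_BODY_BYTES;
}

// Example: a client declaring a 200 MB JSON payload gets refused up front
// (e.g. 413 Payload Too Large) instead of being read into memory.
const headers = new Map([["content-length", String(200 * 1024 * 1024)]]);
console.log(shouldRejectBody(headers)); // true
```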

Jordan-Hall commented 9 months ago

Your router is not exactly a router and doesn't handle HTTP methods correctly (see attached screenshot).

If you wish this to be treated as an HTTP server then you really need to make it HTTP/1.1 compliant. What you've basically made is a fast socket handler that can send HTTP responses, in a fashion.
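
As an illustration of what method-aware routing means (a hypothetical sketch, not a proposed implementation for bunzer), the same path can map to different handlers per method, and an unsupported method should produce a 405:

```ts
// Sketch of method-aware routing: routes are keyed by path, then by method.
// An unknown method on a known path yields 405 Method Not Allowed rather
// than falling through to some default handler.
type Handler = () => Response;

const routes = new Map<string, Map<string, Handler>>([
  ["/users", new Map<string, Handler>([
    ["GET",  () => new Response("list users")],
    ["POST", () => new Response("create user", { status: 201 })],
  ])],
]);

function dispatch(method: string, path: string): Response {
  const byMethod = routes.get(path);
  if (!byMethod) return new Response("Not Found", { status: 404 });
  const handler = byMethod.get(method);
  if (!handler) return new Response("Method Not Allowed", { status: 405 });
  return handler();
}

console.log(dispatch("DELETE", "/users").status); // 405, not 200
```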

farzher commented 9 months ago

Your router is not exactly a router and doesn't handle HTTP methods correctly. If you wish this to be treated as an HTTP server then you really need to make it HTTP/1.1 compliant. What you've basically made is a fast socket handler that can send HTTP responses, in a fashion.

like i said it's for personal use and i haven't needed that feature yet but it's trivial to add without affecting performance...

i'm not sure what kind of weird gatekeeping is going on in the benchmark, whatever helps you sleep at night. i'll be sure to let my fast http server know it's not valid.

Jordan-Hall commented 9 months ago

HTTP/1.1 standards would be nice. Ideally that's the only thing stopping it for me. I'm actually quite impressed by it tbh

aquapi commented 9 months ago

@farzher I can basically blow up the server if I send a really heavy request.

farzher commented 9 months ago

@farzher I can basically blow up the server if I send a really heavy request.

is that not true of other node.js servers? i feel like almost all of them are vulnerable to stuff like that. i always run my node servers behind an nginx reverse proxy to keep them safe and fast, so i think i'm fine. there are many use cases for an extra fast server that takes shortcuts to get there. internal microservices is another one.