djkoloski / rust_serialization_benchmark

Benchmarks for rust serialization frameworks

Add interactive website via GitHub Pages #46

Closed: caibear closed this 9 months ago

caibear commented 10 months ago

This pull request adds an interactive website. Given bandwidth and CPU limits, it calculates how many messages per second could be sent and received for different combinations of serialization crates and compression libraries.

See https://caibear.github.io/rust_serialization_benchmark/

For example, this is useful for calculating how many concurrent players, on average, an mk48.io server can handle. Given inputs of 1 TB/month and 0.01 cores, it returns 437 updates/s for bitcode. Since mk48.io sends 10 updates/s per player, a server can handle 43.7 players. The second best is serde_bare + zstd, which returns 387 updates/s, i.e. 38.7 players.
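To make the projection concrete, here is a minimal sketch of one way such a figure could be computed from a bandwidth budget and a CPU budget. The exact formula the site uses isn't shown in this thread; the function name, parameters, and example numbers below are illustrative assumptions, not the site's implementation.

```rust
/// Illustrative only: throughput is capped by whichever resource
/// (bandwidth or CPU) runs out first.
fn projected_updates_per_sec(
    bandwidth_bytes_per_month: f64, // e.g. 1 TB/month
    cpu_cores: f64,                 // e.g. 0.01 cores
    compressed_size_bytes: f64,     // serialized + compressed message size
    cpu_secs_per_message: f64,      // serialize + compress time per message
) -> f64 {
    const SECS_PER_MONTH: f64 = 30.0 * 24.0 * 3600.0;
    let bandwidth_limit =
        bandwidth_bytes_per_month / SECS_PER_MONTH / compressed_size_bytes;
    let cpu_limit = cpu_cores / cpu_secs_per_message;
    bandwidth_limit.min(cpu_limit)
}

fn main() {
    // Hypothetical inputs: 1 TB/month, 0.01 cores, 250-byte messages, 1 µs CPU each.
    let ups = projected_updates_per_sec(1e12, 0.01, 250.0, 1e-6);
    println!("~{ups:.0} updates/s");
}
```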

The data is taken from a copy of the README.md embedded in the binary. Compression speeds are currently based on constants; ideally, they would be measured during the benchmarks.
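A rough sketch of what "embedded in the binary" could look like, assuming the README is pulled in at compile time and its markdown tables are scanned for rows. The struct, field names, and relative path are hypothetical; the PR's actual parsing code may differ.

```rust
// Assumed approach: compile the README into the binary and extract table rows.
const README: &str = include_str!("../README.md"); // path is an assumption

/// One parsed row of a benchmark table (illustrative fields only).
struct BenchRow {
    crate_name: String,
    cells: Vec<String>,
}

fn parse_table_rows(markdown: &str) -> Vec<BenchRow> {
    markdown
        .lines()
        // Keep markdown table rows, skipping the `| --- |` separator lines.
        .filter(|l| l.starts_with('|') && !l.contains("---"))
        .map(|l| {
            let mut cells: Vec<String> = l
                .split('|')
                .map(|c| c.trim().to_string())
                .filter(|c| !c.is_empty())
                .collect();
            let crate_name = if cells.is_empty() {
                String::new()
            } else {
                cells.remove(0)
            };
            BenchRow { crate_name, cells }
        })
        .collect()
}

fn main() {
    println!("parsed {} table rows", parse_table_rows(README).len());
}
```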

TODO

djkoloski commented 10 months ago

Thanks for the PR; I think this is very exciting for the benchmarks. I'll break down my thought process a bit:

Direction

Technical

With that in mind, here's what I propose:

  1. Convert the benchmark parsing code to Rust. It should parse a raw log and output a JSON file (one possible shape for that file is sketched below).
  2. Switch the README formatter to consume the JSON file. This can also be converted to Rust.
  3. Switch this interactive site to consume that JSON file.
  4. Document the calculations used to make the projections in this visualizer.

Future work could add more input sizes to the benchmark data sets.
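For steps 1 to 3, the key artifact is the shared JSON file. Below is a minimal sketch of what a serde-based schema for it might look like; the field names, dataset keys, and overall shape are assumptions for illustration, not a schema decided in this thread.

```rust
// Assumes serde (with the "derive" feature) and serde_json as dependencies.
use serde::{Deserialize, Serialize};
use std::collections::BTreeMap;

#[derive(Serialize, Deserialize)]
struct BenchmarkData {
    /// Dataset name (e.g. "log", "mesh") -> crate name -> results.
    datasets: BTreeMap<String, BTreeMap<String, CrateResult>>,
}

#[derive(Serialize, Deserialize)]
struct CrateResult {
    serialize_ns: Option<f64>,
    deserialize_ns: Option<f64>,
    size_bytes: Option<u64>,
    zlib_bytes: Option<u64>,
    zstd_bytes: Option<u64>,
}

fn main() -> serde_json::Result<()> {
    // The log parser (step 1) would write this file; the README formatter
    // (step 2) and the interactive site (step 3) would read it back.
    let data = BenchmarkData { datasets: BTreeMap::new() };
    println!("{}", serde_json::to_string_pretty(&data)?);
    Ok(())
}
```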

@caibear I would appreciate your feedback and thoughts. I understand this is probably a significant expansion of the intended scope, so I would of course help get this work done.