openspeedtest / Speed-Test

SpeedTest by OpenSpeedTest™ is a Free and Open-Source HTML5 Network Performance Estimation Tool Written in Vanilla Javascript and only uses built-in Web APIs like XMLHttpRequest (XHR), HTML, CSS, JS, & SVG. No Third-Party frameworks or libraries are Required. Started in 2011 and moved to OpenSpeedTest.com dedicated Project/Domain Name in 2013.
https://openspeedtest.com
MIT License

Feature Request: Ping measurements also during load #142

Open zR-JB opened 1 month ago

zR-JB commented 1 month ago

Maybe never stop the ping measurement and let it run continuously, also during the upload and download phases. Then show a simple graph, or just the average for each load scenario. With this, one could investigate bufferbloat issues very easily!
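
A minimal sketch of how continuous sampling could look in the browser, staying with XHR since that is what the tool already uses; the `/ping` endpoint, the 200 ms interval, and the phase bookkeeping are illustrative assumptions, not part of OpenSpeedTest:

```javascript
// Hypothetical sketch only: sample RTT continuously while the test runs
// and group the samples by the current load phase.
const samples = { idle: [], download: [], upload: [] };
let phase = 'idle'; // the test harness would switch this when a phase starts/ends

function sampleOnce() {
  const start = performance.now();
  const xhr = new XMLHttpRequest();
  // "/ping" is an assumed tiny endpoint returning an empty 200 response.
  xhr.open('GET', '/ping?r=' + Math.random(), true);
  xhr.onloadend = () => samples[phase].push(performance.now() - start);
  xhr.send();
}

// Never stop pinging: keep sampling during download and upload as well.
const pingTimer = setInterval(sampleOnce, 200);

function averageRtt(name) {
  const s = samples[name];
  return s.length ? s.reduce((a, b) => a + b, 0) / s.length : NaN;
}
// After the test: clearInterval(pingTimer);
// console.log('download:', averageRtt('download'), 'vs idle:', averageRtt('idle'));
```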

Even Ookla's speedtest.net has had this feature for quite some time now: [screenshot]

openspeedtest commented 1 month ago

I'll address this in a future version, but there are known limitations, as outlined in https://github.com/openspeedtest/Speed-Test/issues/33. Otherwise, we'd need to overhaul the entire setup and potentially use something like WebTransport. I'll seriously consider this for the next major rewrite, which will support multiple protocols.

zR-JB commented 1 month ago

When looking at what is transferred during the speedtest.net test, I noticed that the downloads and uploads are also XHR, and the ping measurements are done via a websocket, so I think you are right.

[screenshot]

zR-JB commented 1 month ago

So I did a bit more testing and made a small POC with a simple Node.js echo WebSocket server, and performance.now() on the client side. It turns out Firefox by default also only provides millisecond accuracy, even with performance.now(); Chromium-based browsers, at least, have 100 μs resolution. I think there is still a bit too much overhead for extremely accurate results.

websocket_latency_test.zip
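
For reference, a minimal echo server along the lines of the POC described above might look like this; the `ws` npm package and port 8080 are assumptions, not taken from the attached zip:

```javascript
// server.js – minimal Node.js echo WebSocket server (assumes `npm install ws`).
const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 });
wss.on('connection', (ws) => {
  // Echo every message straight back so the client can measure round-trip time.
  ws.on('message', (data) => ws.send(data));
});
```

And a matching browser-side measurement with performance.now(), again only a sketch of the approach described above:

```javascript
// Browser side: measure WebSocket RTT with performance.now().
// Note: Firefox clamps performance.now() to 1 ms by default, while
// Chromium-based browsers offer roughly 100 μs resolution.
const ws = new WebSocket('ws://localhost:8080');
const rtts = [];
let lastSent = 0;

function sendPing() {
  lastSent = performance.now();
  ws.send('ping');
}

ws.onopen = () => sendPing();
ws.onmessage = () => {
  rtts.push(performance.now() - lastSent);
  if (rtts.length < 100) {
    sendPing();
  } else {
    console.log('min RTT (ms):', Math.min(...rtts),
                'avg RTT (ms):', rtts.reduce((a, b) => a + b, 0) / rtts.length);
  }
};
```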

zR-JB commented 1 month ago

So here is what I found:

  1. The browser WebSocket implementation adds around 0.1 ms of latency compared to a native Rust WebSocket client when running on the same machine.
  2. Using a Rust WebSocket server instead of a Node.js server only saves about 0.02 ms.
  3. On the same network but on a different machine, with native Rust <-> Rust WebSockets I get around 0.3 ms (between 0.24 and 0.4 ms).
  4. With the same setup but the browser client and the Rust WebSocket server, I get around 0.9 ms.
  5. Real latency via ICMP in this scenario is about 0.2 ms.
  6. So this is not ideal, but I am not sure whether any other protocol can achieve anything better than that (see the sketch below).
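
One way to partly work around that overhead, sketched below under the assumption that RTT samples have already been collected (e.g. the `rtts` array above): browser and event-loop delays can only add to the measured time, so the minimum of many samples is usually closest to the true network RTT, and latency under load can be reported relative to an idle baseline.

```javascript
// Hypothetical post-processing of collected RTT samples (milliseconds).
function summarize(rtts) {
  const sorted = [...rtts].sort((a, b) => a - b);
  return {
    min: sorted[0],
    median: sorted[Math.floor(sorted.length / 2)],
    avg: rtts.reduce((a, b) => a + b, 0) / rtts.length,
  };
}

// Illustrative numbers only: compare idle samples against samples taken
// while a download or upload is saturating the link.
const idle = summarize([0.9, 1.0, 0.9, 1.1]);
const loaded = summarize([4.2, 6.8, 5.1, 7.3]);
console.log('latency increase under load (ms):', loaded.median - idle.median);
```
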
openspeedtest commented 1 month ago

When you add browser extensions, web filters, antivirus software, and other busy tabs to the mix, you'll see a much higher RTT.

zR-JB commented 1 month ago

Yes, I think a WebSocket approach to measuring latency similar to this:

[screenshot]

is probably still the best compromise.