Open tarunban opened 6 years ago
For reference, current thresholds in the spec are:
The above round-trip and bandwidth values are based on real user measurement observations:
- slow-2g is the 66.6th percentile of 2G observations
- 2g is the 50th percentile of 2G observations
- 3g is the 50th percentile of 3G observations
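For concreteness, the classification those thresholds produce can be sketched as a small function. The RTT cutoffs below (2000 ms, 1400 ms, 270 ms) are the values from the spec's table as I recall them; treat them as illustrative rather than normative:

```javascript
// Sketch of how an effective connection type (ECT) is derived from
// round-trip time. The RTT cutoffs (slow-2g >= 2000 ms, 2g >= 1400 ms,
// 3g >= 270 ms) reflect the spec's table as I recall it -- treat them
// as illustrative, not normative.
function classifyEct(rttMs) {
  if (rttMs >= 2000) return "slow-2g";
  if (rttMs >= 1400) return "2g";
  if (rttMs >= 270) return "3g";
  return "4g";
}

console.log(classifyEct(300)); // "3g"
console.log(classifyEct(100)); // "4g"
```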
Some questions that come to mind...

Based on current telemetry, have the thresholds meaningfully changed for the above percentiles?
I do not think a 3G RTT of ~300 ms is fast enough to support high-resolution images. Since we did that investigation, page sizes have grown, which means we can't afford to serve high-resolution images on a 300 ms RTT network.
What is the rough global split between 2G / 3G / 4G usage in the telemetry?
- Slow 2G: 1.5%
- 2G: 0.8%
- 3G: 19%
- 4G: 78%
Do you think we should consider updating any of the current values in the spec?
If I'm interpreting your point correctly, it sounds like the current 270 ms threshold is still ~reasonable? Should we consider collapsing "Slow 2G" and "2G" into a single bucket?
I think there are 2 problems: (i) the distinction between Slow 2G and 2G is not very useful; (ii) there is a big jump in cumulative coverage from 2G to 3G (2.3% to 21.3%).
Ideally, Slow 2G would cover the bottom X% of slow connections, 2G the next Y%, and 3G the next Z%, with no big jumps between the buckets, e.g., 5%, 10%, and 15%, for a total coverage of 30%?
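A sketch of how such percentile-anchored cutoffs could be derived from a sample of RTT observations. `deriveThresholds` and the 5/15/30% cumulative targets are hypothetical, not part of NetInfo or any existing tooling:

```javascript
// Hypothetical sketch: derive RTT cutoffs so the buckets cover the
// slowest X%, next Y%, next Z% of observations (cumulative 5/15/30%,
// as in the example above). Illustration only, not part of NetInfo.
function deriveThresholds(rttSamples, coverage = [0.05, 0.15, 0.30]) {
  // Sort descending so the slowest connections come first.
  const sorted = [...rttSamples].sort((a, b) => b - a);
  return coverage.map((frac) => {
    const idx = Math.min(sorted.length - 1, Math.floor(frac * sorted.length));
    return sorted[idx]; // RTT at that cumulative percentile
  });
}

// With uniform samples of 1..1000 ms, the 5/15/30% cutoffs land at
// 950, 850, and 700 ms respectively.
const samples = Array.from({ length: 1000 }, (_, i) => i + 1);
console.log(deriveThresholds(samples)); // [950, 850, 700]
```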
Hmm, I see what you're after, but I think that's a very different signal from what we expose today?
That said, if you have the data handy, what would the above thresholds yield in terms of RTT and Downlink values?
We'd be interested in revisiting the ECT thresholds from the Chrome UX Report angle. For example, we're seeing an average of ~90% of origins reporting 4G mobile experiences.
Adding desktop users to the mix, the percent of 4G experiences rises to ~95%:
Rather than grouping the vast majority into 4G, it would be useful to have more granularity to identify whether these experiences are on the fast or slow ends of the spectrum.
The primary use case and intent for ECT is as an indicator for when the site should (strongly) consider altering what+how it delivers to the user, which in turn is concentrated on very slow connections where the user is likely to fail to load the page or abandon the navigation. As a result, the wide upper bucket is by design and intentional — the explanation column in https://github.com/WICG/netinfo/issues/68#issuecomment-364419902 captures this well.
That said, NetInfo does provide both downlink and RTT for those that want a high resolution view.
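As a sketch of that high-resolution view: a page can read `navigator.connection.rtt` and `downlink` directly and bucket them however it likes. The `fast-4g`/`slow-4g` labels and cutoffs below are invented for illustration only:

```javascript
// Hypothetical finer-grained bucketing within "4g", using the raw
// rtt (ms) and downlink (Mbps) that NetInfo already exposes. The
// labels and cutoffs here are made up for illustration.
function finerBucket(rttMs, downlinkMbps) {
  if (rttMs >= 270) return "3g-or-slower";
  if (rttMs < 100 && downlinkMbps >= 5) return "fast-4g";
  return "slow-4g";
}

// In a browser, feed it the live values (guarded, since the API
// may be absent on some platforms):
const conn = globalThis.navigator && globalThis.navigator.connection;
if (conn) {
  console.log(finerBucket(conn.rtt, conn.downlink));
}
```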
NetInfo thresholds for different effective connection types were decided more than a year ago. Since then, we have collected more metrics and have more visibility into how the API is being used. We should analyze the recent data and update the network quality thresholds based on that.
A few things off the top of my head: