Closed: altanai closed this issue 7 months ago
I agree that oscillations in the rate can really impact performance, and that this might be useful as a performance result. Do you think this translates into harm for other flows (e.g., starvation) or poor use of the available capacity?
Exactly. I think such cycles would be detrimental to end-user experience, to any telemetry system a network service provider may be using, and to both of the points you mentioned.
We have to be a bit careful about recommending stability. I remember Van Jacobson explaining that yes, a more stable version of Reno would be a bit more efficient, but then it would also not adapt so quickly to changes in network conditions. There might be a tradeoff there.
So maybe we need to be very clear about the characteristics of the oscillation that we do not want. @altanai are you thinking about something like the "sawtooth" patterns of Reno, or the similar patterns in Cubic? Or do you have some example of worse oscillations?
I'm not sure we need a new criterion here. Under-utilization is not a problem for network stability. Over-utilization is well-covered by our bufferbloat metric.
Even an algorithm that sawtooths around the correct link capacity will experience more buffering than a smoother algorithm.
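A toy discrete-time simulation can illustrate this point. This is just a sketch, not any real CC algorithm: the link capacity, additive-increase step, and loss-signal model are all made-up illustrative parameters. It compares an AIMD-style sender that sawtooths around link capacity with a hypothetical sender pinned exactly at capacity, and measures cumulative queue occupancy (a proxy for buffering).

```python
# Toy model: each tick, the sender offers `rate` packets and the link drains
# LINK_CAPACITY packets; the excess sits in a queue. All numbers illustrative.

LINK_CAPACITY = 100.0   # packets per tick the link can drain (assumed)
ADDITIVE_INC = 5.0      # AIMD additive increase per tick (assumed)
TICKS = 1000

def total_buffering(rates):
    """Sum of queue occupancy over the run for a sequence of send rates."""
    queue = 0.0
    total = 0.0
    for rate in rates:
        queue = max(0.0, queue + rate - LINK_CAPACITY)
        total += queue
    return total

def aimd_sender():
    """Reno-style sawtooth: additive increase, multiplicative decrease on
    overshoot (overshoot stands in for a loss signal here)."""
    rate = LINK_CAPACITY / 2
    for _ in range(TICKS):
        yield rate
        rate = rate / 2 if rate > LINK_CAPACITY else rate + ADDITIVE_INC

def steady_sender():
    """Idealized smooth sender that sits exactly at link capacity."""
    for _ in range(TICKS):
        yield LINK_CAPACITY

aimd_buffering = total_buffering(aimd_sender())
steady_buffering = total_buffering(steady_sender())
print(aimd_buffering > steady_buffering)  # the sawtooth queues more
```

Even though the AIMD sender averages below capacity, every peak of the sawtooth overshoots and deposits packets in the queue, so its cumulative buffering is strictly higher than the smooth sender's, which supports the point above.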
This is problematic behaviour because it leads to inefficient use of network resources and causes variable/unreliable performance (high speed followed by low speed, and so on). Maybe new CC algorithms can focus on stability in this area.