Imagine our max bandwidth function returns 92M for configuration X while tolerating up to 5% loss: 0.05 * 92M = 4.6M of lost traffic, which puts the actual loss-free rate somewhere around 87.4M. That means 90M (one step back) is still lossy. So when we get ~26ms on the base cfg and ~95ms on cfg X, maybe it's not because the changed parameter in cfg X broke something; maybe we're still looking at a lossy cfg and the results are skewed...
I think we should remodel our max bandwidth function to find the highest rate with no loss at all, then back off some Y (maybe 5) Mbps from it and run the test there. That would also speed up our initial tests, because we would test at only one bandwidth.
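A minimal sketch of what I have in mind, assuming a hypothetical `measure_loss(rate_mbps)` probe that returns the observed loss fraction (none of these names are from our actual code):

```python
def max_loss_free_throughput(start_mbps, measure_loss,
                             step_mbps=2, backoff_mbps=5):
    """Walk the offered rate down from start_mbps until a short probe
    reports zero loss, then back off a safety margin so the latency
    test runs on a clearly loss-free link."""
    rate = start_mbps
    while rate > step_mbps and measure_loss(rate) > 0.0:
        rate -= step_mbps  # still lossy, step down
    return max(rate - backoff_mbps, step_mbps)
```

With the numbers above: 92M and 90M probe lossy, say 88M comes back clean, so we would run the single latency test at 83M instead of sweeping a range.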
And please, let's rename the function from max bandwidth (or sth like that) to max throughput...