The -log CSV differs rather a lot though: the highest I get (shortly before ^Cing) is:
84352424 Media coder bitrate [bps]
85207824 Transmitted bitrate [bps]
84976040 ACKed bitrate [bps]
That’s about 10 Mbit/s less than the “Transmit rate” the command line reports! Where does this difference come from?
Looking at the code, the “Transmit rate” command line output seems to be tracked completely independently of the values that actually end up in the CSV. This doesn’t make any sense to me… shouldn’t the bandwidth be dependent on how many bytes per second you can actually get through the link?
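What I'd naively expect is a rate derived from the bytes actually handed to the link over a sliding window, so that the command-line figure and the CSV columns come from the same counter. A minimal sketch of that idea (my own illustration, not how the tool actually tracks it):

```python
import time
from collections import deque
from typing import Optional

class RateMeter:
    """Bitrate measured from bytes actually transmitted in a sliding window."""

    def __init__(self, window_s: float = 1.0):
        self.window_s = window_s
        self.samples = deque()  # (timestamp, nbytes) per transmitted packet

    def on_transmit(self, nbytes: int, now: Optional[float] = None) -> None:
        now = time.monotonic() if now is None else now
        self.samples.append((now, nbytes))
        # Drop samples that have fallen out of the window.
        while self.samples and now - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def bitrate_bps(self) -> float:
        return sum(n for _, n in self.samples) * 8 / self.window_s

# 100 packets of 1250 bytes spread over one second -> 1 Mbit/s
meter = RateMeter(window_s=1.0)
for i in range(100):
    meter.on_transmit(1250, now=i * 0.01)
print(meter.bitrate_bps())  # 1000000.0
```

If “Transmit rate” were computed like this from the same packet stream that feeds the CSV, the two outputs couldn't drift apart the way they do here.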
Isn’t ACKed bitrate (plus packet overhead) the better estimate?
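Part of the gap could simply be header overhead, assuming the CSV columns count payload only while the command-line rate is closer to the wire. A rough back-of-the-envelope sketch (the packet size and header sizes are my assumptions, not values from the tool):

```python
# Scale a payload-only bitrate up to an estimated on-the-wire bitrate
# by adding fixed per-packet overhead. All sizes in bytes; the 1200-byte
# payload and the RTP/UDP/IPv4/Ethernet framing are assumptions.

RTP_HEADER = 12
UDP_HEADER = 8
IPV4_HEADER = 20
ETH_OVERHEAD = 14 + 4 + 8 + 12  # MAC header + FCS + preamble + inter-frame gap

def wire_bitrate(payload_bps: float, payload_bytes: int) -> float:
    """Estimate link-level bitrate from a payload-only bitrate."""
    overhead = RTP_HEADER + UDP_HEADER + IPV4_HEADER + ETH_OVERHEAD
    return payload_bps * (payload_bytes + overhead) / payload_bytes

# The ~85 Mbit/s ACKed figure from the CSV, in ~1200-byte packets:
print(wire_bitrate(84_976_040, 1200) / 1e6)  # ≈ 90.5 Mbit/s on the wire
```

That accounts for roughly half of the ~10 Mbit/s discrepancy with these assumed packet sizes (smaller packets would close more of it), which is why I'd expect ACKed bitrate plus overhead to be the more honest throughput estimate.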
With the bandwidthtester receiver on a 100 Mbit/s ethernet link: