WebRTC.org website (*** This repository is deprecated ***) - The documentation for contributing to native code is available at https://webrtc.googlesource.com/src/+/refs/heads/master/docs/native-code/index.md; info about WebRTC is available at https://webrtc.org
In the function OnSentPacket(SentPacket msg) of the file "pcc_network_controller.cc", there is this code:

    if (last_receivedpackets.size() > 0)
      sending_time = last_receivedpackets.back().receive_time -
                     last_receivedpackets.front().receive_time;
    DataRate receiving_rate = bandwidthestimate;
    if (sending_time > TimeDelta::Zero())
      receiving_rate = received_size / sending_time;
    bandwidthestimate =
        std::min(bandwidthestimate * 0.5, receiving_rate);
But the unit of sending_time is ms (milliseconds) and the unit of received_size is bytes, so the unit of receiving_rate is B/ms, not bps - is that right? Yet the unit of bandwidthestimate is bps. So what does it mean that the units differ?
Another question:
In the function ComputeDelayGradient(double delay_gradient_threshold) of the file "monitor_interval.cc", why is the unit of delay us (microseconds)? By reading the source of us(), I found that the function does nothing but call ToValue(), so isn't the result still in ms (milliseconds)? I am confused. Can anyone answer me? Thanks a lot!