The values in `metricsDidReceive` should still work even if audio is muted. I would look at `currentRoundTripTimeMs` and audio packet loss. I would not use `availableOutgoingBitrate`, as it isn't meant to be a measurement of link bandwidth and may not be the value you expect.
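For illustration, here is a minimal sketch of reading those values inside `metricsDidReceive`. It assumes a `MeetingSession` already exists elsewhere in the app, and the packet-loss metric names (`audioPacketsSentFractionLoss`, `audioPacketsReceivedFractionLoss`) are assumptions that may vary by SDK version:

```typescript
import {
  AudioVideoObserver,
  ClientMetricReport,
  MeetingSession,
} from 'amazon-chime-sdk-js';

// Assumption: a MeetingSession has already been created elsewhere in the app.
declare const meetingSession: MeetingSession;

const connectionObserver: AudioVideoObserver = {
  metricsDidReceive(clientMetricReport: ClientMetricReport): void {
    // getObservableMetrics() returns a flat map of metric name -> value.
    const metrics = clientMetricReport.getObservableMetrics();
    const rttMs = metrics.currentRoundTripTimeMs;
    // Metric names below are assumptions; check the metric list for your SDK version.
    const audioLossUp = metrics.audioPacketsSentFractionLoss;
    const audioLossDown = metrics.audioPacketsReceivedFractionLoss;
    console.log(`RTT ${rttMs} ms, audio loss sent/received: ${audioLossUp}/${audioLossDown}`);
  },
};

meetingSession.audioVideo.addObserver(connectionObserver);
```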
Based on observation, it seems like audio `roundTripTime` updates when audio is muted, but video `roundTripTime` does not update when the camera is off. Is that correct?
That sounds correct to me. The STUN RTT (which is used for `currentRoundTripTimeMs`) should also always be available. It's determined by STUN pings on the same path as media, so all of these values should be the same.
Use case 10 in https://github.com/aws/amazon-chime-sdk-js/blob/main/README.md also has an example of how to check the audio signal strength using `realtimeSubscribeToVolumeIndicator`.
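Along the lines of that use case, a minimal sketch for watching the local attendee's signal strength might look like the following. It assumes an existing `MeetingSession`, and that the local attendee ID is taken from the session's configuration credentials; the 0 / 0.5 / 1 interpretation of `signalStrength` is based on the README description:

```typescript
import { MeetingSession } from 'amazon-chime-sdk-js';

// Assumption: a MeetingSession has already been created elsewhere in the app.
declare const meetingSession: MeetingSession;

// The local attendee ID is assumed to come from the session configuration credentials.
const localAttendeeId = meetingSession.configuration.credentials!.attendeeId!;

meetingSession.audioVideo.realtimeSubscribeToVolumeIndicator(
  localAttendeeId,
  (attendeeId, volume, muted, signalStrength) => {
    if (signalStrength === null) {
      // null means this field did not change in this callback.
      return;
    }
    // Per the README, signalStrength is 0 (none), 0.5 (weak), or 1 (strong).
    console.log(`Signal strength for ${attendeeId}: ${signalStrength}`);
  }
);
```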
What are you trying to do?
Hello everyone 👋,
I have an app similar to Zoom or Google Meet, where users can start their camera and microphone through the browser.
I need to implement a connection status indicator to let users know when their connection quality decreases.
However, users can join the meeting without enabling their camera and microphone, which causes issues. I tried using `connectionDidBecomeGood`, `connectionDidBecomePoor`, and `connectionDidSuggestStopVideo`, but these events aren't triggered unless the user is in a meeting with audio and/or video enabled.

I've also tried using `metricsDidReceive` to get metrics like `currentRoundTripTimeMs` and `availableOutgoingBitrate`, but it seems I need to have the microphone or video on to get accurate or updated values.

I attempted to use `realtimeSubscribeToLocalSignalStrengthChange`, but it also seems to require the video or microphone to be on to get updated data.

How can the documentation be improved to help your use case?
What documentation have you looked at so far?