webrtc-sdk / webrtc


Low resolution on some Android phones #133

Closed holzgeist closed 1 month ago

holzgeist commented 1 month ago

Hi there,

During development of our app, which uses LiveKit as its video call framework, we discovered that video quality is very low on some Android devices. I tracked down at least one bug in WebRTC (plus a combination of heuristics and some bad luck, I guess), so I'm reporting it here and not in https://github.com/livekit/client-sdk-flutter/, the entry point we use.

The issue

We set defaultCameraCaptureOptions to VideoParametersPresets.h720_43 (i.e. 960x720) with simulcast enabled. On some Android phones, the sent simulcast layers are only 144x192 and 288x384, even on a local network with a local LiveKit server. Even the larger layer is only ~16% of the requested pixel count (288x384 = 110,592 px vs. 960x720 = 691,200 px), which is definitely not suitable for high quality video conferencing.
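
For reference, this is a minimal sketch of how the per-layer sent resolutions can be read from the native Android stats; the class name and the peerConnection parameter are placeholders, not code from our app:

```java
import org.webrtc.PeerConnection;
import org.webrtc.RTCStats;

class OutboundLayerStatsSketch {
    // Minimal sketch: log the resolution of every outbound video layer using the
    // standard WebRTC stats API. `peerConnection` is assumed to be an established
    // org.webrtc.PeerConnection from the call.
    static void logOutboundResolutions(PeerConnection peerConnection) {
        peerConnection.getStats(report -> {
            for (RTCStats stats : report.getStatsMap().values()) {
                if ("outbound-rtp".equals(stats.getType())) {
                    Object width = stats.getMembers().get("frameWidth");
                    Object height = stats.getMembers().get("frameHeight");
                    if (width != null && height != null) {
                        System.out.println("outbound layer: " + width + "x" + height);
                    }
                }
            }
        });
    }
}
```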

The deep dive into WebRTC

After ruling out network issues, I started a deep dive into client-sdk-flutter and subsequently into WebRTC itself. Here are my findings (on a Fairphone 5 with Android 13):

The solution(s)

One more thing

With and without the bug fix, the LiveKit server reports a "Good" connection quality for that phone despite being on the same network. On a different phone, which has 960x720 among its native capture formats, layer selection works and connection quality is "Excellent". This is likely a bug in LiveKit's scorer that I will investigate separately; I still wanted to mention it here because it was the reason for my investigation.

davidliu commented 1 month ago

Hey, thanks for the report and PR! I'll take a look at this soon.

davidliu commented 1 month ago

> With and without the bug fix, the LiveKit server reports a "Good" connection quality for that phone despite being on the same network. On a different phone, which has 960x720 among its native capture formats, layer selection works and connection quality is "Excellent". This is likely a bug in LiveKit's scorer that I will investigate separately; I still wanted to mention it here because it was the reason for my investigation.

We haven't yet looked into this deeply, but someone mentioned that because the server expected three layers and got only two, the score could be dinged.

holzgeist commented 1 month ago

> We haven't yet looked into this deeply, but someone mentioned that because the server expected three layers and got only two, the score could be dinged.

That was me after some more digging :smiley:

davidliu commented 1 month ago
  1. Looks like this might be a flutter-webrtc issue, as it calls the adaptOutputFormat functions with the requested resolution. This causes the initial downscaling pointed out above at the Java/C++ layer whenever the actual capture format is larger.

    @cloudwebrtc I think it might be good to just remove the adaptOutputFormat calls here, or at least add an option to skip them so that frames are sent to the native layer unadapted (the capturer should already be capturing close to the target anyway). Alternatively, we could bump the values passed into adaptOutputFormat by some factor (say 1.5) to allow for the camera capture format being rounded up (see the first sketch after this list). The Android SDK doesn't use this method, so it should be fine to remove.

  2. The 4:3 buckets are probably not needed, since the LiveKit SDKs decide by the longest side whether to use 3 layers (so a 4:3 format has more pixels than a 16:9 format with the same longest side, e.g. 960x720 = 691,200 px vs. 960x540 = 518,400 px), and we require at least 960 on the longest side, which lines up with the WebRTC buckets.

  3. I think we actually can't use the PR, since it affects the layer ordering information and the SFUs are sensitive to that. However, I think fixing up the flutter-webrtc code to accurately report the actual camera capture dimensions will address the simulcast layer limiting issue and make the PR unnecessary.

    There's another alternative we can use (which also addresses point 2), which is to simply turn off the native simulcast layer limiting so that we don't run into this mismatch in the first place.

    @cloudwebrtc @hiroshihorie we can turn off the simulcast layer limiting by adding WebRTC-LegacySimulcastLayerLimit/Disabled/ as a field trial (see the second sketch below). Note that for smaller resolutions (under 480px on the longest side) the native code limited the stream to 1 layer only; we should similarly set a minimum longest side for sending 2 layers, regardless of whether we go with disabling the field trial or not.
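
To make point 1 concrete, here is a minimal sketch of the adaptOutputFormat behaviour being discussed; the class name, the capture sizes and the 1.5 factor are illustrative, not taken from the actual flutter-webrtc code:

```java
import org.webrtc.PeerConnectionFactory;
import org.webrtc.VideoSource;

class AdaptOutputFormatSketch {
    // Hypothetical illustration of point 1: the camera may deliver e.g. 1280x960 even
    // though 960x720 was requested. Clamping the output format to the requested size
    // scales every frame down before it ever reaches the native layer.
    static void configure(PeerConnectionFactory factory) {
        VideoSource videoSource = factory.createVideoSource(/* isScreencast= */ false);

        // Behaviour being discussed: clamp the output to the requested capture size.
        videoSource.adaptOutputFormat(960, 720, 30);

        // Alternative mentioned above (factor is illustrative): bump the requested size
        // so a camera format that was rounded up is not scaled back down.
        double factor = 1.5;
        videoSource.adaptOutputFormat((int) (960 * factor), (int) (720 * factor), 30);

        // Or skip the adaptOutputFormat call entirely so frames pass through unadapted.
    }
}
```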
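
And a minimal sketch of setting the field trial from point 3 on Android via the standard PeerConnectionFactory initialization; the class name and the Context parameter are placeholders:

```java
import android.content.Context;
import org.webrtc.PeerConnectionFactory;

class FieldTrialSketch {
    // Minimal sketch: disable the native simulcast layer limiting via a field trial,
    // as suggested in point 3. This must run before the PeerConnectionFactory is built.
    static void initWebRtc(Context applicationContext) {
        PeerConnectionFactory.initialize(
            PeerConnectionFactory.InitializationOptions.builder(applicationContext)
                .setFieldTrials("WebRTC-LegacySimulcastLayerLimit/Disabled/")
                .createInitializationOptions());
    }
}
```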

holzgeist commented 1 month ago

Hi @davidliu, thanks for your thorough review and quick fixes in the linked PRs. I tested them and they fix all the problems described in my bug report above :heart:

Even though it's not needed anymore, can you elaborate on the layer ordering? I don't see where I change it. The resulting layer array is the same as before, but has correct resolutions/scaling factors set.

davidliu commented 1 month ago

@holzgeist looking at it again, I think I just brain-farted and thought it reversed the order. The ordering should be fine, so I think we can actually use it. I'll need to do a little more digging and testing, though.

holzgeist commented 1 month ago

@davidliu thanks :) I'll close the issue for now though, because the problems are fixed and the PRs merged. Let's continue the discussion on PR #134 if needed.