SelimEmre opened this issue 4 years ago
Thank you for your question.
WebRTC changes the resolution according to the available bandwidth. Opening the camera at 1080p does not mean that WebRTC sends the stream at 1080p. It adapts the resolution to the bandwidth and can even drop below 480p.
Btw, according to the WebRTC standards, there are some upcoming settings that give developers more options to maintain resolution or frame rate.
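For example, a minimal browser-side sketch of those knobs (not Ant Media specific; `pc` and `videoTrack` are placeholders for your publishing RTCPeerConnection and camera track, and `degradationPreference` support still varies between browsers):

```js
// Sketch: ask the encoder to sacrifice frame rate instead of resolution
// when bandwidth gets tight. Assumes `pc` is the publishing RTCPeerConnection
// and `videoTrack` is the video track being sent over it.
async function preferResolution(pc, videoTrack) {
  // contentHint "detail" nudges browsers toward keeping resolution.
  videoTrack.contentHint = "detail";

  const sender = pc.getSenders().find((s) => s.track && s.track.kind === "video");
  if (!sender) return;

  const params = sender.getParameters();
  // Explicit knob from the spec; not implemented everywhere yet.
  params.degradationPreference = "maintain-resolution";
  await sender.setParameters(params);
}
```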
"It adapts the resolution according to the bandwidth and it can even send lower than 480p." - I have a 20 Mbps internet connection, yet Ant Media is still sending 480p to YouTube.
Thank you for your comment.
TL;DR:
Let me point out a few more things for the enthusiasts:
The bandwidth that matters is the one between the server and the client. A 20 Mbps internet speed is the speed between you and your service provider. Once your traffic leaves the provider's network and enters the Internet, the provider cannot guarantee 20 Mbps to every destination, because there may be a bottleneck somewhere along the route, or even in the data center where your server resides.
The WebRTC stack has a hard-coded upper limit of 2500 kbps per stream. Even if you are on a gigabit network, the browser will not use more than 2500 kbps for a stream. From my point of view this is because the Internet does not provide QoS for real-time communication. Even if your average network speed is a gigabit for an hour, you can still see jitter and pixelation during streaming, because an average speed does not guarantee the same speed at every moment without jitter or packet loss. In RTC, even sub-second network fluctuations can affect quality.
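For illustration only (this is browser-side, not something the server controls): the per-sender cap can be requested from the publishing page with `RTCRtpSender.setParameters`. Whether this actually lifts the browser's built-in default is browser-dependent, and munging the SDP bandwidth line (`b=AS`) has been the traditional workaround. `pc`, the function name and the numbers below are placeholders:

```js
// Sketch: request a higher bitrate ceiling for the outgoing video.
// Assumes `pc` is the publishing RTCPeerConnection.
async function raiseVideoBitrateCap(pc, maxKbps) {
  const sender = pc.getSenders().find((s) => s.track && s.track.kind === "video");
  if (!sender) return;

  const params = sender.getParameters();
  if (!params.encodings || params.encodings.length === 0) {
    params.encodings = [{}];
  }
  params.encodings[0].maxBitrate = maxKbps * 1000; // bits per second
  await sender.setParameters(params);
}

// e.g. raiseVideoBitrateCap(pc, 4000); // ask for up to ~4 Mbps
```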
Okay, I got it, but the thing is that when I start streaming, the resolution stays the same throughout the whole session. Even though I set the resolution to 1080p, it always ends up at a lower resolution.
Can you please help with this? It's a big issue for my live application, which has thousands of users.
Sorry for late reply.
It's something on the browser side: the browser decides which resolution to send. The resolution may increase over time, but that is the browser's decision.
You can search for how to get a fixed, high-quality resolution in the browser for WebRTC streaming. Maybe I'm, or we're, missing something.
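If it helps with debugging, you can check what the browser is actually sending (as opposed to what the camera was opened at) by polling getStats on the publishing connection. A minimal sketch, assuming `pc` is the publishing RTCPeerConnection:

```js
// Sketch: log the resolution and frame rate the browser is currently sending.
async function logOutgoingResolution(pc) {
  const stats = await pc.getStats();
  stats.forEach((report) => {
    if (report.type === "outbound-rtp" && report.kind === "video") {
      console.log(`sending ${report.frameWidth}x${report.frameHeight} @ ${report.framesPerSecond} fps`);
    }
  });
}
```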
Short description
Here is the scenario: add an RTMP endpoint to a WebRTC stream, then check the resolution of the added RTMP endpoint with ffprobe. You will see that the resolution is 640x480. I tried different cases and always ran into this issue.
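For reference, the resolution check was a plain ffprobe query along these lines (the RTMP URL is a placeholder for my endpoint):

```sh
ffprobe -v error -select_streams v:0 \
        -show_entries stream=width,height -of csv=s=x:p=0 \
        rtmp://example.com/live/stream1
# prints e.g. 640x480
```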
Steps to reproduce
Expected behavior
If we send 1080p with WebRTC, the RTMP endpoint should have 1080p resolution.
Actual behavior
The RTMP endpoint resolution is fixed at 480p, whatever resolution I send with WebRTC.
Logs
AMS 2.1-SNAPSHOT, with default settings.
FFprobe output: