Chrome and Safari behave roughly the same in terms of bitrate control when both bitrate and frame rate are provided.
I haven't checked Firefox.
Chrome seems to make sure the configured bitrate is honored consistently whatever the actual frame rate is, be it 30 fps or 100 fps.
When frame rate is not provided, Safari uses a default frame rate (30 fps).
If the actual frame rate is above this default, Safari's output bitrate per second ends up above the configured bitrate.
There is a note in the spec encouraging authors to provide both bitrate and frame rate.
Given this has interop impact, I wonder what the intent of the specification is when frame rate is not provided, or when the actual frame rate is far from the configured one (much lower or much higher).
Should we have more implementation guidelines in that area?
Experiment based on https://peaceful-genie-41c2b6.netlify.app/codec-compare.html and https://bugs.webkit.org/show_bug.cgi?id=274822.
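For reference, a minimal sketch of the kind of measurement involved (not the code from the linked codec-compare page): it configures a `VideoEncoder` with an explicit bitrate and frame rate, feeds it one second of frames at a different actual rate, and compares the measured output bitrate against the target. The codec string, resolution, bitrate values, and canvas-based frame source are illustrative assumptions.

```ts
// Sketch: compare measured output bitrate to the configured bitrate when the
// actual frame rate differs from the configured one. Run as a module.
const TARGET_BITRATE = 1_000_000; // bits per second (illustrative)
const CONFIGURED_FPS = 30;        // omit `framerate` below to reproduce the Safari default case
const ACTUAL_FPS = 100;           // feed frames faster than configured

let encodedBytes = 0;

const encoder = new VideoEncoder({
  output: (chunk) => { encodedBytes += chunk.byteLength; },
  error: (e) => console.error(e),
});

encoder.configure({
  codec: "avc1.42001f", // H.264 baseline, assumed supported
  width: 640,
  height: 480,
  bitrate: TARGET_BITRATE,
  framerate: CONFIGURED_FPS,
});

// Feed one second of frames (in media time) at the "actual" frame rate.
const canvas = new OffscreenCanvas(640, 480);
const ctx = canvas.getContext("2d")!;
for (let i = 0; i < ACTUAL_FPS; i++) {
  ctx.fillStyle = `hsl(${(i * 7) % 360}, 80%, 50%)`;
  ctx.fillRect(0, 0, 640, 480);
  const frame = new VideoFrame(canvas, {
    timestamp: Math.round((i * 1_000_000) / ACTUAL_FPS), // microseconds
  });
  encoder.encode(frame);
  frame.close();
}

await encoder.flush();
console.log(
  `target ${TARGET_BITRATE} bps, measured ~${encodedBytes * 8} bps over 1 s of frames`
);
```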