tonihei closed this issue 5 years ago
It's an interesting question, but one without a simple answer. Yes, using the resolution (height) as a quality indicator would be possible, but it's still not a one-to-one mapping to quality (and then there is the question of what "quality scale" is used).
The more complex answer is to instead use the (currently optional) ITU-T P.1203 standard; see the chapter "Video & Audio Quality Metrics".
The WG acknowledges that average bitrate is an imperfect measure of perceived quality, but switching to a metric that takes resolution into account would not improve the measure of perceived quality either. Hence the WG decides to stay with the current metric in the spec.
The perceived Quality of Experience is not necessarily related to the bitrate of the played streams.
A potentially better option would be to use the resolution height in pixels.
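To illustrate why resolution height is still only a coarse proxy, here is a minimal sketch of what such a mapping could look like. The bucket boundaries and labels below are illustrative assumptions, not taken from any spec; note that two tracks with the same height can still differ greatly in perceived quality (codec, bitrate, frame rate).

```python
# Hypothetical sketch: bucketing a video track's resolution height into a
# coarse quality label. Thresholds and labels are illustrative only.
def quality_label(height_px: int) -> str:
    """Map a track's height in pixels to a coarse quality bucket."""
    if height_px >= 2160:
        return "UHD"
    if height_px >= 1080:
        return "FHD"
    if height_px >= 720:
        return "HD"
    return "SD"

# Example: label a ladder of common track heights.
tracks = [240, 720, 1080, 2160]
print([(h, quality_label(h)) for h in tracks])
```

This also makes the WG's point concrete: the mapping is ordinal at best, and the choice of buckets is itself an arbitrary "quality scale".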