SerenityOS / serenity

The Serenity Operating System 🐞
https://serenityos.org

DisplaySettings: Can't select native resolution supported by monitor at 200% #5003

Closed · tomuta closed this issue 3 years ago

tomuta commented 3 years ago

This is hypothetical right now, but will be a real problem with actual hardware. Let's say the monitor only supports 1024x768. If I want to double the size because the pixel density is so high, I need to check 200%. But since 200% now means that the monitor receives 2048x1536, which it doesn't support, I have to pull out the calculator and work out that I'd need to set the resolution to 512x384, which is not available.

It comes back to this: the resolution really should be the actual resolution sent to the display. It seems really strange that I have to calculate unusual resolutions to be able to use scaling, just to make the display actually show something. Displays generally don't support arbitrary resolutions.
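
For concreteness, here's a minimal C++ sketch contrasting the two models; the names and structure are hypothetical, not the actual SerenityOS code:

```cpp
// Hypothetical sketch contrasting the two models discussed above.
// Model A (as described above): the selected "resolution" is the logical
// size, and the driver receives logical * scale.
// Model B (what is being argued for): the selected resolution is sent to the
// display as-is, and the logical desktop size is derived from it.
#include <cstdio>

struct Size {
    int width { 0 };
    int height { 0 };
};

int main()
{
    Size selected { 1024, 768 };
    int scale_percent = 200;

    // Model A: the display is asked for 2048x1536, which a 1024x768-only
    // monitor cannot drive.
    Size sent_to_display_a { selected.width * scale_percent / 100,
                             selected.height * scale_percent / 100 };

    // Model B: the display gets exactly what was selected; the desktop
    // simply becomes 512x384 logical pixels.
    Size logical_b { selected.width * 100 / scale_percent,
                     selected.height * 100 / scale_percent };

    printf("Model A sends %dx%d to the display\n",
        sent_to_display_a.width, sent_to_display_a.height);
    printf("Model B sends %dx%d, logical desktop %dx%d\n",
        selected.width, selected.height, logical_b.width, logical_b.height);
    return 0;
}
```

Under Model B, selecting 1024x768 at 200% yields a 512x384 logical desktop rather than asking the monitor for an unsupported 2048x1536.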

nico commented 3 years ago

(If your monitor only supports 1024x768, you can't run it at 200% because serenity requires min 640x480.)

I think the Right Fix here is to have a min and max resolution supported by the actual monitor and then clamp the selected res to that, right? You pick the size you'd want, you pick 200%, and then you get the closest the monitor can get to that.
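
A rough sketch of that clamping idea, assuming a hypothetical mode list queried from the driver (not an actual SerenityOS API):

```cpp
// Hypothetical sketch: compute the implied output size from the desired
// logical size and scale, then snap it to the closest mode the monitor
// actually reports. Names and types are illustrative only.
#include <algorithm>
#include <cstdlib>
#include <vector>

struct Mode {
    int width;
    int height;
};

// Assumes `supported` is non-empty (a driver always reports at least one mode).
Mode closest_supported_mode(std::vector<Mode> const& supported, Mode desired)
{
    return *std::min_element(supported.begin(), supported.end(),
        [&](Mode const& a, Mode const& b) {
            // Rank candidates by Manhattan distance from the desired size.
            auto distance = [&](Mode const& m) {
                return std::abs(m.width - desired.width) + std::abs(m.height - desired.height);
            };
            return distance(a) < distance(b);
        });
}
```

With supported modes {640x480, 800x600, 1024x768} and a desired output of 2048x1536, this would snap back to 1024x768.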

tomuta commented 3 years ago

It was a hypothetical example, but nonetheless: what are the odds that another available resolution happens to be an exact multiple?

What if I wanted to install it on my XPS laptop at work that has a 13" 4k screen? 100% is too small, 200% is way too big. I use 150% there. Granted, we don't support this yet, but if I had SerenityOS running there I'd have to select a resolution of 2560x1440 (which happens to be QHD) just so that I can drive the display at its native resolution of 3840x2160. Weird to say the least. Very confusing.

Now, this example likely happens to work because QHD is a common resolution. Let's take a QHD monitor. In order to drive it at its native resolution of 2560x1440 using 150% scaling, I'd have to select a "resolution" of 1706.66x960, so now we're in fractional pixel territory. Or are we flat out not going to support those?

Are we going to make people pull out their calculator just so they can set things up to use their display as intended? Are we going to show fractional "resolutions" that work with each multiplier? What if you pick a resolution with a multiplier, say 125%, that results in a resolution the display can't drive?

Bottom line, graphics cards/displays report a list of resolutions they support. Those are the resolutions people are familiar with, they know what their monitor can support. Those are the resolutions we should be sending to the display. Just like ALL the other OSs I know of do it.
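
To illustrate the calculator problem, here's a small hypothetical C++ check of which advertised modes even divide cleanly at a given scale factor (illustrative only, not SerenityOS code):

```cpp
#include <cstdio>
#include <vector>

struct Mode {
    int width;
    int height;
};

// A mode "scales cleanly" at a given percentage if both logical dimensions
// (physical * 100 / scale) come out as whole pixels.
static bool scales_cleanly(Mode mode, int scale_percent)
{
    return (mode.width * 100) % scale_percent == 0
        && (mode.height * 100) % scale_percent == 0;
}

int main()
{
    std::vector<Mode> supported { { 1920, 1080 }, { 2560, 1440 }, { 3840, 2160 } };
    int const scales[] = { 100, 125, 150, 200 };
    for (int scale : scales) {
        for (auto mode : supported) {
            printf("%dx%d @ %d%%: logical %.2fx%.2f (%s)\n",
                mode.width, mode.height, scale,
                mode.width * 100.0 / scale, mode.height * 100.0 / scale,
                scales_cleanly(mode, scale) ? "integral" : "fractional");
        }
    }
    return 0;
}
```

At 150%, 2560x1440 is the mode that comes out fractional (the 1706.66x960 mentioned above), while 3840x2160 cleanly gives a 2560x1440 logical desktop.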

nico commented 3 years ago

This whole list of resolutions is an artifact of the CRT days, right? If you have a non-CRT monitor, serenity should always use its native resolution and the only thing you should have to adjust is the scale factor.

tomuta commented 3 years ago

No, to this day all monitors/graphics cards provide the OS with a list of resolutions they support. Back in the day one had to know exactly what the monitor supported, and in rare cases it was even possible to damage hardware. Nowadays there is generally a two-way negotiation, e.g. using HDMI EDID. But you still can't set an arbitrary resolution; it has to be supported by the graphics card, the monitor, and everything in between (e.g. HDMI cables, which have built-in chips that are involved in the negotiation).
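
For reference, here is a hedged sketch of how the "standard timings" section of an EDID base block encodes that mode list; it's simplified and skips header/checksum validation and the detailed timing descriptors a real parser would also read:

```cpp
// Decode the standard timings (bytes 38..53 of the 128-byte EDID base block).
// Each 2-byte slot advertises one mode the monitor supports.
#include <cstdint>
#include <cstdio>

void decode_standard_timings(uint8_t const edid[128])
{
    for (int offset = 38; offset <= 52; offset += 2) {
        uint8_t b1 = edid[offset];
        uint8_t b2 = edid[offset + 1];
        if (b1 == 0x01 && b2 == 0x01)
            continue; // Unused slot.
        int width = (b1 + 31) * 8; // Horizontal active pixels.
        int height = 0;
        switch (b2 >> 6) { // Aspect ratio field (EDID 1.3+).
        case 0: height = width * 10 / 16; break; // 16:10
        case 1: height = width * 3 / 4;   break; // 4:3
        case 2: height = width * 4 / 5;   break; // 5:4
        case 3: height = width * 9 / 16;  break; // 16:9
        }
        int refresh_hz = (b2 & 0x3F) + 60;
        printf("%dx%d @ %d Hz\n", width, height, refresh_hz);
    }
}
```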

nico commented 3 years ago

Right, but everything not at the display's native resolution is scaled somewhere along the way, right? And it seems serenity has more information, so it should be able to scale "better" than letting the monitor do it.

Anyhoo, I don't have a super strong opinion here -- if you have a good idea on how the DisplaySettings UI should behave, go for it and let's see how it feels :)

tomuta commented 3 years ago

Well, you may scale at whatever point you want, as long as you send the bitmap to the device driver at a supported resolution. Generally, people would expect to select the resolution to send to the display, anything else is utterly confusing. Say, if I select 1024x768, that is what I would expect to be delivered to the display. Regardless of any scaling setting.
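
A minimal sketch of that model, with stand-in names for the driver and compositor entry points (not the actual WindowServer/DisplaySettings API):

```cpp
// Hypothetical sketch: the chosen mode goes to the display unmodified, and
// the scale factor only changes how content is rendered into the framebuffer.
#include <cstdio>

struct Mode {
    int width;
    int height;
};

struct DisplayConfig {
    Mode output_mode;  // Sent to the device driver verbatim.
    int scale_percent; // Purely a rendering concern of the compositor.
};

// Stand-ins for the real driver/compositor entry points.
static void driver_set_mode(Mode m) { printf("driver: mode set to %dx%d\n", m.width, m.height); }
static void compositor_set_scale(int percent) { printf("compositor: scale %d%%\n", percent); }

void apply(DisplayConfig const& config)
{
    // The display receives exactly what the user picked (e.g. 1024x768),
    // regardless of the scale setting.
    driver_set_mode(config.output_mode);
    compositor_set_scale(config.scale_percent);
}

int main()
{
    apply({ { 1024, 768 }, 200 });
    return 0;
}
```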

JonArcherII commented 3 years ago

> Well, you may scale at whatever point you want, as long as you send the bitmap to the device driver at a supported resolution. Generally, people would expect to select the resolution to send to the display, anything else is utterly confusing. Say, if I select 1024x768, that is what I would expect to be delivered to the display. Regardless of any scaling setting.

I have to agree with this. Setting my resolution (2560x1440) at 200% results in a WindowServer crash due to an "invalid resolution". I think changing this behavior would match other systems and user expectations, and seems so much more reasonable ;^)

tomuta commented 3 years ago

Now that #8032 has been merged, this should be fixed. The output resolution you select is what is sent to the display.