awesomeWM / awesome

awesome window manager
https://awesomewm.org/
GNU General Public License v2.0

Handling mixed/dynamic DPI setups #1480

Open Oblomov opened 7 years ago

Oblomov commented 7 years ago

I would like to open a discussion about how awesome could improve its handling of mixed DPI setups (multiple monitors attached, with significantly different DPIs). These setups are getting more and more common (e.g. with laptops with HiDPI integrated displays and external standard-DPI monitors attached), and we want to avoid issues such as #1210.

Additionally, there is the possibility of DPI changes (e.g. because the user decides to temporarily reduce the resolution of a monitor to accommodate programs that do not support HiDPI monitors ahem Steam ahem Java ahem), which we can track via RRChangeNotifyEvent.

The biggest obstacle to mixed and dynamic DPI handling currently is that themes do some of the handling themselves, using apply_dpi without a screen specification. This prevents lengths such as border_width, useless_gap, or the menu height and width from adapting to the display. The rest of the handling is done by the widgets themselves, which set the context DPI from the screen, allowing them to produce DPI-specific results.

I would argue that themes themselves should not do any DPI scaling, providing only reference lengths and sizes (i.e. what one would use at 96 DPI), letting widgets and layouts do the scaling as appropriate. This should be extended to all user-specified lengths. For example, in reference to #1210 and the code at fault indicated there, I would argue that apply_dpi should not be used there, and it should instead be up to the marginbox to scale the margins at render time.

Moreover, the actual conversion from reference to scaled size should happen at the last possible moment, during the draw. This would allow dynamic DPI changes, in the following way: on RRChangeNotifyEvent, xresources updates the screen DPIs (unless overridden by the user), and then emits a signal that widgets and layouts can connect to and simply redraw, automatically using the new DPI.
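To make the flow concrete, here is a minimal, hypothetical Lua sketch; the dpi_changed signal name and the compute_dpi_for helper are illustrative placeholders, not existing awesome API:

local beautiful = require("beautiful")

-- Placeholder DPI derivation; a real implementation would use the RANDR
-- data (physical size vs. resolution) unless the user overrode the value.
local function compute_dpi_for(s)
    return beautiful.xresources.get_dpi(s)
end

-- Hypothetical handler for a RandR change notification: refresh each
-- screen's DPI, then signal drawables to redraw. Because the
-- reference-to-device conversion happens at draw time, the redraw picks
-- up the new DPI automatically.
local function on_randr_change()
    for s in screen do
        beautiful.xresources.set_dpi(compute_dpi_for(s), s)
        s:emit_signal("dpi_changed") -- hypothetical signal name
    end
end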

What are the thoughts of the cognoscenti on the subject?

psychon commented 7 years ago

Sounds nice to me, but it might be hard to get there.

and then emits a signal that widgets and layouts can connect to and simply redraw, automatically using the new DPI.

Widgets don't have to handle DPI changes. They do not know which screen they are on anyway. Instead, the wibox (lib/wibox/drawable.lua) has to handle this and just trigger a redraw. The redraw will then use the new DPI value. (At least that's the theory).

Oblomov commented 7 years ago

Yes, sorry, when I talked about widgets handling it I was referring specifically to the part handled by wibox/drawable. OK, I'll start working on this and get a proof of concept out for testing.

actionless commented 7 years ago

my bad, i left the comment in the other discussion before noticing this one, but here i have nothing to add, sounds good

Oblomov commented 7 years ago

FWIW, I've started working on this on my mixed-dpi branch. The groundwork was easy to put down thanks to the awesomeness of the awesome internal API. Of course the devil will be in the details ;-)

actionless commented 7 years ago

btw, currently on a laptop i am using awesome with a dpi value of 157 and don't see any rendering problems from the 1.635 scaling factor: https://raw.githubusercontent.com/actionless/awesome_config/devel/screenshots/screenshot_new.png

however neither a factor of 1 nor 2 would look satisfying, so i am concerned about the idea of using only integer scaling factors by default

but i think it could be a very nice idea to round values like 0.85 and 1.1 to 1, but not touch values which are somewhere in the middle (like 1.3-1.7)
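A minimal sketch of that rounding rule in Lua; the 0.15 snapping threshold is an assumption chosen to match the examples above, not anything awesome implements:

-- Snap a scaling factor to the nearest integer only when it is already
-- close; leave mid-range factors (like 1.3-1.7) untouched.
local function snap_factor(f, threshold)
    threshold = threshold or 0.15
    local nearest = math.floor(f + 0.5)
    if nearest >= 1 and math.abs(f - nearest) <= threshold then
        return nearest
    end
    return f
end

print(snap_factor(0.85))  --> 1
print(snap_factor(1.1))   --> 1
print(snap_factor(1.635)) --> 1.635 (left alone)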

Oblomov commented 7 years ago

In my experience (and opinion), fractional scaling results in fatter lines, and in some cases a blurrier look (although this tends to depend on content and method). In your screenshot, for example, I would say the lines are a bit too thick (but maybe that's intentional), and I get the same feeling when I use the 2.5 scaling that is more appropriate for my display. This is particularly noticeable with some font rendering, especially fonts designed to be rendered at specific sizes (Terminus comes to mind: 8 or 16 is very nice and clean, 20 IMO looks 'bolder' than necessary). Or maybe it's just a matter of taste.

psychon commented 7 years ago

@Oblomov Font scaling is Pango's job (or some other library used by Pango). I do not know the details, but I think it only uses the DPI value to calculate the pixel size that it wants to use. All other stuff... ok.

Still, I don't think that we should start rounding scaling factors. If people want an integer scaling factor, I'll tell them to set e.g. a DPI of 192. Everyone who is ok with the current non-integer-scaling can continue to stay happy.

Oblomov commented 7 years ago

Font scaling is, but what about line thickness? How do you make a line 1.5 pixels thick?

That being said, even if we don't clamp scaling factors to integers, I think a separate overload for the scaling factor should be maintained, since there are cases where one may want "true" DPI scaling even when they prefer integer (or whatever else) scaling.

psychon commented 7 years ago

How do you make a line 1.5 pixels thick?

Font hinting is the art of trying to do just that. So is anti-aliasing and lots of other tricks.
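As a concrete illustration, here is a standalone lgi/cairo snippet (the same drawing library awesome uses, though this is not awesome code): cairo realizes a fractional line width through anti-aliasing, giving the edge pixels partial alpha.

local cairo = require("lgi").cairo

-- Draw a horizontal line 1.5 pixels thick; cairo approximates the
-- fractional width with coverage-based anti-aliasing on the edge rows.
local surface = cairo.ImageSurface.create(cairo.Format.ARGB32, 100, 10)
local cr = cairo.Context(surface)
cr:set_source_rgb(0, 0, 0)
cr:set_line_width(1.5)
cr:move_to(0, 5)
cr:line_to(100, 5)
cr:stroke()
surface:write_to_png("line-1.5px.png")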

I'm not sure what you mean by "separate overload". If you mean "people should still be able to call set_dpi to force a certain value", then I agree.

Oblomov commented 7 years ago

Font hinting is the art of trying to do just that. So is anti-aliasing and lots of other tricks.

Actually, font hinting is more about maintaining uniform thickness, but yeah, I get what you mean. My question was more geared towards: how does awesome handle this presently? And even assuming you get the drawing right, you also have to handle the positioning correctly. Overall, it's considerably more of a mess than integer scaling would be.

I'm not sure what you mean by "separate overload". If you mean "people should still be able to call set_dpi to force a certain value", then I agree.

I mean separate set_dpi and set_scaling_factor.

psychon commented 7 years ago

how does awesome handle this presently?

It tells Pango about the DPI and doesn't do anything more than that. Everything else is hiding behind Pango (It's the set_resolution() call in textbox.lua that informs Pango).

Oblomov commented 7 years ago

Sorry, I didn't mean about fonts, but about actual lines, like for borders etc.

psychon commented 7 years ago

There is still no DPI-related code for that in awesome. Awesome tells Pango about the DPI and then uses things like pango_layout_get_pixel_extents() to get the size of the PangoLayout in pixels.
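A self-contained lgi sketch of that mechanism (the same Pango calls awesome relies on, though not its actual textbox.lua code): the resolution set on the PangoCairo context determines how many pixels a point-sized font occupies.

local lgi = require("lgi")
local Pango, PangoCairo, cairo = lgi.Pango, lgi.PangoCairo, lgi.cairo

local surface = cairo.ImageSurface.create(cairo.Format.ARGB32, 1, 1)
local cr = cairo.Context(surface)
local layout = PangoCairo.create_layout(cr)
layout:set_font_description(Pango.FontDescription.from_string("Sans 10"))
layout:set_text("Hello", -1)

for _, dpi in ipairs({ 96, 192 }) do
    -- This is the knob that set_resolution() in textbox.lua turns:
    PangoCairo.context_set_resolution(layout:get_context(), dpi)
    layout:context_changed()
    local _, logical = layout:get_pixel_extents()
    print(dpi, logical.width, logical.height) -- extents roughly double at 192
end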

ticki commented 7 years ago

Hmm. The last update seems to ignore .Xresources and manually rescale, making the UI disproportionately large.

psychon commented 7 years ago

@ticki You mean "last update of awesome"? Sorry, but that should get its own issue and is not related to this one. When you open a new issue: from what to what did you upgrade? You do mean "awesome's wibar", right (because awesome is not involved in, e.g., the font size in Firefox)? Also, please make clear what exactly you mean by "manually rescale".

timroes commented 7 years ago

As it appears to me, awesome currently only reads the Xft.dpi setting and uses it as the DPI for all screens, unless you call beautiful.xresources.set_dpi and set something else for a specific screen.

Having fiddled with mixed DPI settings for quite a while, I think one of the most stable mechanisms to determine a per-screen DPI is to use the information RANDR provides about the physical size of the output in relation to the resolution of that screen. I use that to calculate a per-screen DPI in my config.

If the screen list, geometry, or outputs change, the DPI values are recalculated and I pass them to beautiful.xresources.set_dpi for each screen.
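A minimal, untested sketch of that config-side mechanism; the DPI calculation from the RANDR millimetre sizes is as described above, while the choice of signals and the rounding are assumptions:

local awful = require("awful")
local beautiful = require("beautiful")

-- Derive a screen's DPI from the RANDR-reported physical height of its
-- output versus the pixel height of the screen's geometry.
local function update_dpi(s)
    for _, output in pairs(s.outputs) do
        if output.mm_height and output.mm_height > 0 then
            local dpi = s.geometry.height * 25.4 / output.mm_height
            beautiful.xresources.set_dpi(math.floor(dpi + 0.5), s)
            return
        end
    end
end

-- Recalculate whenever screens appear or their geometry changes.
awful.screen.connect_for_each_screen(update_dpi)
screen.connect_signal("property::geometry", update_dpi)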

Do you think it would make sense to have such a mechanism of calculating per-screen-dpi in the core itself?

The drawbacks of that approach that I've figured out so far are:

What do you think in general about using the RANDR information for dpi calculation and also about some of the above issues?

Also, thanks for all the work already done for (mixed) HiDPI screens; I could already get rid of a lot of my old config and have better visuals in my setups!

blueyed commented 7 years ago

@timroes We have this by now (via @Oblomov): https://github.com/awesomeWM/awesome/blob/2192e61f891c90420088863e52c9f8b7a924a716/lib/beautiful/xresources.lua#L91-L100

timroes commented 7 years ago

@blueyed ah sorry... I've again looked at my local lib files instead of at master. Shame on me for not noticing that. Looking into it, it seems this currently doesn't calculate a per-screen DPI, since it uses the overall resolution of the root window (which is the X server framebuffer size?) and its physical dimensions (which, if it is what I think, can be set via xrandr --fbmm WIDTHxHEIGHT?). That way you can only have one DPI for all screens, and - I had something similar in the beginning and found it even worse - in a mixed DPI setup you then have the wrong DPI on ALL monitors, since you end up with something like the average across outputs, which imho leads to slightly off graphics on every output. But I'm not sure if that is really what root.size() and root.size_mm() do.

psychon commented 7 years ago

@timroes You are right, this isn't per-screen, but only "when there is no Xft.dpi, fallback to something else than the hardcoded 96".

I remember someone trying to submit "compute per-screen DPI via RANDR outputs" as a pull request, but that code didn't handle multiple outputs correctly (if I remember correctly: the code ended up with the sum of the individual outputs' DPIs, which is severely wrong).

a possible solution would be to calculate the DPI of all outputs, then use either the lowest DPI - assuming that the lower-DPI output is more likely an external device (like a projector) that you want properly scaled graphics on - or the highest DPI - assuming that you want your graphics to look best on the highest-DPI output. I think using the lowest is what most people would expect.

Sounds like something that could be made configurable via a beautiful theme setting (and even "should this be done at all?" could be a beautiful theme setting where "nil" means "yes")...

Tentative (completely untested) patch that anyone is allowed to pick up and turn into a proper pull request (perhaps with some kind of test?):

diff --git a/lib/beautiful/xresources.lua b/lib/beautiful/xresources.lua
index ce6e78e..aa9552c 100644
--- a/lib/beautiful/xresources.lua
+++ b/lib/beautiful/xresources.lua
@@ -70,13 +70,31 @@ local function get_screen(s)
     return s and screen[s]
 end

+local function compute_dpi(s)
+    -- Guard: get_dpi() may be called without a screen ("global" DPI);
+    -- fall through to the Xft.dpi / xrdb fallback below in that case.
+    if not s then return end
+    local result = math.huge
+    local height_px = s.geometry.height
+    local mm_to_inch = 25.4
+    for _, output in pairs(s.outputs) do
+        local height_mm = output.mm_height
+        local dpi = height_px * mm_to_inch / height_mm
+        result = math.min(result, dpi)
+    end
+    if result ~= math.huge then
+        return round(result)
+    end
+end
+
 --- Get global or per-screen DPI value falling back to xrdb.
 -- @tparam[opt] integer|screen s The screen.
 -- @treturn number DPI value.
 function xresources.get_dpi(s)
     s = get_screen(s)
-    if dpi_per_screen[s] then
-        return dpi_per_screen[s]
+    local dpi = dpi_per_screen[s] or compute_dpi(s)
+    if dpi then
+        return dpi
     end
     if not xresources.dpi then
         -- Might not be present when run under unit tests

Note that this means that Xft.dpi is now completely ignored if RANDR is available.

actionless commented 7 years ago

Note that this means that Xft.dpi is now completely ignored if RANDR is available.

i am not sure it's a good idea to ignore a value which was explicitly set by the user in favour of an automatically detected value

also i've just tested that code on my display, which should produce something around 185, but instead i've got 96:

$ awesome-client "print(screen[1].geometry.width * 25.4 / screen[1].outputs['Virtual1'].mm_width)"

so it seems that, at least for me, merging that would break dpi-detection (and there are probably more setups where the randr value is not retrieved correctly and Xft.dpi is set)

timroes commented 7 years ago

I would agree with @actionless. Perhaps the Xft.dpi setting should have priority over auto-detection. It would also ensure better backward compatibility for users who have already specified this setting.

So the priority setting maybe should be:

  1. manually called beautiful.xresources.set_dpi
  2. setting got from Xft.dpi
  3. auto detected dpi via RANDR
  4. default dpi of 96
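A sketch of that lookup order as it might look inside get_dpi; the xft_dpi and randr_dpi helpers are placeholders standing in for the existing xrdb query and a RANDR-based calculation, not real functions in awesome:

local dpi_per_screen = {} -- values set via beautiful.xresources.set_dpi()

-- Placeholder lookups; in awesome these would be the Xft.dpi query
-- against xrdb and a RANDR-based calculation, respectively.
local function xft_dpi() return nil end
local function randr_dpi(s) return nil end

-- Proposed priority: explicit override > Xft.dpi > RANDR > default.
local function get_dpi(s)
    return dpi_per_screen[s] or xft_dpi() or randr_dpi(s) or 96
end

print(get_dpi("some-screen")) --> 96 with the stub lookups above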

What I also currently have - but I'm not sure if this makes sense in the core, it might be a bit too specific - is a factor to correct a single screen's auto-detected DPI. Why? Even though it correctly determines the DPI of my built-in laptop screen, I just find things a bit too large on that physically small screen when using the same heights as on an external 96 DPI display. So I can specify a factor in my config like dpi.eDP-1.xfactor = 0.8, which will only use 80% of the DPI for the output eDP-1. Maybe something like that could be configured via the beautiful theme, but maybe it's too much for the core. You could still solve this in your own config by using the set_dpi methods to apply the factor.
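A small config-side sketch of that correction factor, applied on top of whatever DPI was auto-detected; the dpi_factor table and the output name are illustrative:

local awful = require("awful")
local beautiful = require("beautiful")

-- Per-output correction factors (illustrative values).
local dpi_factor = { ["eDP-1"] = 0.8 }

awful.screen.connect_for_each_screen(function(s)
    for name in pairs(s.outputs) do
        local f = dpi_factor[name]
        if f then
            local dpi = beautiful.xresources.get_dpi(s) * f
            beautiful.xresources.set_dpi(math.floor(dpi + 0.5), s)
        end
    end
end)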

Oblomov commented 7 years ago

I agree. I would actually do:

Ideally, set_dpi should allow per-output DPI configuration too.

EDIT: BTW, I had started playing with mixed DPI on the mixed-dpi branch of my fork, but I've honestly had to stop working on it due to being very busy; I'm not sure how much of my work there still applies.

tvainika commented 7 years ago

How about hot-plugging monitors? Xft.dpi cannot be set per monitor, therefore a better solution would be to prefer the RANDR DPI over Xft.dpi. It would work automatically when the monitor reports its information correctly. And if the measurements in RANDR are incorrect, they can be overridden in the awesome config with set_dpi.

For example, my laptop has a hi-DPI screen and I often plug it into a low-DPI 24" monitor. There is no correct Xft.dpi for my use case because it cannot be set per monitor. I use Xft.dpi by changing it on the fly with xrdb -merge before launching terminals and other older software.

When trying to fix mixed/dynamic DPI issues, Xft.dpi should not be used for anything if it can be avoided, because it cannot be set to handle mixed/dynamic cases.

Oblomov commented 7 years ago

Setting Xft.dpi is a user choice, so it should always take precedence over any automatic information. In a mixed or dynamic DPI environment it makes little sense to set it at all, but if it's set, it's not up to the toolkit or window manager to decide its value should be ignored.

Oblomov commented 7 years ago

Today Debian Sid upgraded Awesome to 4.2 and I was reminded that the mixed-DPI situation is not fixed yet; in particular, even if I set the theme font to Terminus 8, the actual font used is twice as large. I suspect this is a side effect of having Xorg report a DPI of 192: if Awesome is using (directly or indirectly) Xft and then applying DPI scaling itself, the font is actually getting scaled twice. I'll have to look into this.

In the meantime I've had the opportunity to toy around with Qt's support for mixed DPI, and it's simply amazing. If you have a mixed-DPI setup, Qt applications, and Qt 5.6 or later (possibly Qt 5.9), do yourself a favor and set the environment variable QT_AUTO_SCREEN_SCALE_FACTOR=1. Assuming your font DPI is set to that of your primary monitor, things will "just work" out of the box (there are a few kinks, but overall the experience is very nice).

Essentially, what Qt does is this:

This has some very interesting consequences, because it allows users to control font scaling by manipulating the core DPI relative to the DPI of the primary output: if your core DPI is set to e.g. 192 and your primary output is loDPI, you'll get 2x font scaling (everywhere!), but if your primary display is HiDPI then you'll get 1x font scaling everywhere.
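A toy model of the net effect described above (an assumption about the behaviour, not actual Qt code): the per-output widget scale follows the output's own DPI, while the font scale is the core DPI divided by the primary output's DPI.

-- Returns the widget scale for a given output and the global font scale.
local function scales(core_dpi, primary_dpi, output_dpi)
    return output_dpi / 96, core_dpi / primary_dpi
end

-- Core DPI 192, loDPI (96) primary: 1x widgets there, 2x fonts everywhere.
print(scales(192, 96, 96))   --> 1.0  2.0
-- Core DPI 192, HiDPI (192) primary: 2x widgets there, 1x fonts everywhere.
print(scales(192, 192, 192)) --> 2.0  1.0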

I think that overall this is a very good strategy, because it allows both per-output widget scaling and independent user control over font scaling (e.g. via Xft.dpi). I would recommend awesome follow the same strategy.

This means:

Does this make sense to everyone else too?

psychon commented 7 years ago

Today Debian Sid upgraded Awesome to 4.2

Oh? Yay!

I'll have to look into this.

What output do you get for awesome-client 'local get_dpi, res = require("beautiful.xresources").get_dpi, {} res[0]=get_dpi() for s in screen do res[s.index] = get_dpi(s) end return require("gears.debug").dump_return(res)'? Index 0 is the "global DPI", the others are the DPI for that screen.

You can assign per-monitor DPIs via beautiful.xresources.set_dpi(42, screen[1]), but by default we do not set per-monitor DPIs (no one yet gave a good answer for "what happens in clone mode?"). I'd recommend doing this assignment quite early in your config.

Does this make sense to everyone else too?

Besides that, I don't have much of an opinion on this. I guess some people want to specify things in pixels while others want something based on meters (or inches). [That's the whole point of DPI, no?] Right now we are mostly doing pixels, except for fonts, where both are possible. (Plus, lots of stuff simulating meter-based sizes outside of the widget system, which means only a "global DPI" is possible, no "per-screen DPI".)

Edit: Relevant PR: #1204

Oblomov commented 7 years ago

Oh? Yay!

Yeah!

What output do you get for [...]

192 for each element in the table.

You can assign per-monitor DPIs via beautiful.xresources.set_dpi(42, screen[1]), but by default we do not set per-monitor DPIs

Ah that does explain it.

no one yet gave a good answer for "what happens in clone mode?"

In clone mode, I think we should go with the lowest computed DPI. If this choice isn't appropriate, the user can always override it as appropriate. Rationale: the clone setups I can think of are things such as projector screens and the like, where you want to avoid going HiDPI if possible (prefer legibility over screen real estate).

(Also, we may want to look at what Qt does in this case; I'll see if I can concoct an example.)

I'd recommend doing this assignment quite early in your config.

(Of course, this is something that shouldn't be necessary; ideally, if the user changes DPI settings, awesome should redraw all affected widgets.)

I guess some people want to specify things in pixels while others want something based on meters (or inches). [That's the whole point of DPI, no?]

Yes and no. I think that for most intents and purposes, people working in pixels are doing so in reference pixels (i.e. assuming a standard DPI of 96). Still, if a widget or something wants to use actual screen dimensions and coordinates, I think this can be arranged, with some specific API to ignore DPI scaling.

(Plus, lots of stuff simulating meter-based sizes outside of the widget system, which means only a "global DPI" is possible, no "per-screen DPI".)

The best approach for these is, I think, to work in reference pixels (i.e. unscaled), with all final dimensions being DPI-scaled at widget placement/render time.
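As a worked example of the reference-pixel idea (this mirrors the logic of beautiful.xresources.apply_dpi, generalized to an arbitrary target DPI):

-- Convert a length given in 96 DPI "reference pixels" into device pixels
-- for a screen with the given DPI, at placement/render time.
local function to_device_px(ref_px, dpi)
    return ref_px * dpi / 96
end

print(to_device_px(4, 96))  --> 4 (unchanged on a standard-DPI screen)
print(to_device_px(4, 192)) --> 8 (a 4px reference gap doubles at 192 DPI)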

glyh commented 1 year ago

May I query the status of this feature?