ocornut / imgui

Dear ImGui: Bloat-free Graphical User interface for C++ with minimal dependencies

Blurry font rendering when using SDL_Renderer backend #4768

Open · w0utert opened 2 years ago

w0utert commented 2 years ago

Version: git 65f4be1a Branch: master

Back-ends: imgui_impl_sdlrenderer.cpp + imgui_impl_opengl2.cpp
Compiler: Apple Clang
Operating System: macOS Monterey

When using the SDL_Renderer backend for SDL 2.0.17+, all ImGui text output is very blurry compared to when using the OpenGL 2 backend. It looks as if the SDL_Renderer backend renders the fonts at half size internally and upscales by 2x, resulting in blur and contrast loss.

Example:

SDL_Renderer backend:

[Screenshot 2021-11-27 at 13:53:25: blurry text with the SDL_Renderer backend]

OpenGL 2 backend:

[Screenshot 2021-11-27 at 14:22:14: crisp text with the OpenGL 2 backend]

My application strictly uses software rendering internally; I only use SDL for input handling and to blit the software-rendered framebuffer to a window. I have no use for any hardware acceleration besides the blit itself, no use for HiDPI, etc. I'm aware that the SDL_Renderer ImGui backend is not recommended, but for my application it is very useful for several reasons: it is by far the least amount of code for my purposes, it works on all platforms (macOS, Linux, Windows), and, most importantly, the latest macOS update broke OpenGL V-sync completely. This means the OpenGL 2 based version always runs at max framerate, while I need to be able to throttle it to the display rate. I can do that by inserting forced delays, but those are not accurate, which matters a lot for my application. Meanwhile SDL_Renderer works fine, as it is implemented using Metal on macOS, except for the blurry ImGui text.

It would be great if there were a workaround to increase the resolution at which text is rendered internally, or a fix that does this automatically, like the OpenGL 2 backend.

Relevant code snippets showing how I initialize the SDL_Renderer backend:

I'm creating the SDL window without the SDL_WINDOW_ALLOW_HIGHDPI flag, because the OpenGL 2 backend doesn't need it (it already does exactly what I want in terms of rendering without the flag), and the SDL_Renderer backend simply breaks when I enable it (text will be crisp but 2x as small and have positioning issues, and coordinates in SDL events do not match up with the screen anymore).

  SDL_Window * const window = SDL_CreateWindow(
    "maboroshi-16",
    SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
    WINDOW_WIDTH, WINDOW_HEIGHT,
    0);

  SDL_Renderer * const renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_PRESENTVSYNC);

  ImGui::CreateContext();
  ImGui_ImplSDL2_InitForSDLRenderer(window);
  ImGui_ImplSDLRenderer_Init(renderer);

  // ...

  // Upload the software-rendered framebuffer to a texture and draw it to the window
  SDL_Surface * const src_surface = SDL_CreateRGBSurfaceFrom(
    (void *) frame.image.data.data(),
    frame.image.width, frame.image.height,
    32, frame.image.width * 4,
    0x00FF0000, 0x0000FF00, 0x000000FF, 0xFF000000);

  SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, src_surface);
  SDL_FreeSurface(src_surface);
  SDL_RenderClear(renderer);
  SDL_RenderCopy(renderer, texture, nullptr, nullptr);
  SDL_DestroyTexture(texture);  // destroy the per-frame texture so it doesn't leak each frame

  // ...

  // ImGui overlay
  ImGui_ImplSDLRenderer_NewFrame();
  ImGui_ImplSDL2_NewFrame();
  ImGui::NewFrame();

  ImGui::Begin("Statistics");

  const platform::Machine::Statistics &stats = machine.getStatistics();

  ImGui::Text("%s", fmt::format("frame time: {0:#.2f} ms\nspeedup: {1:#.2f}x\ngpu time: {2:#.2f} ms\nrefresh rate: {3:#.2f}",
    stats.frameTime.count() / 1000.0, stats.speedup, stats.gpuTime.count() / 1000.0, refresh_rate).c_str());
  ImGui::End();

  ImGui::Render();

  ImGui_ImplSDLRenderer_RenderDrawData(ImGui::GetDrawData());

  SDL_RenderPresent(renderer);
ocornut commented 2 years ago

I'm creating the SDL window without the SDL_WINDOW_ALLOW_HIGHDPI flag, because the OpenGL 2 backend doesn't need it (it already does exactly what I want in terms of rendering without the flag), and the SDL_Renderer backend simply breaks when I enable it (text will be crisp but 2x as small and have positioning issues, and coordinates in SDL events do not match up with the screen anymore).

That's the problem you have to solve. If you create your window without the SDL_WINDOW_ALLOW_HIGHDPI flag and your Windows settings for scale is larger than 100% (very common) then Windows will scale the output, which is the result you are getting.

"text will be crisp but 2x as small"

That's technically correct, you may want to render the font at a different size based on DPI scale.

"and have positioning issues, and coordinates in SDL events do not match up with the screen anymore"

That's something that should be fixed.

w0utert commented 2 years ago

That's the problem you have to solve. If you create your window without the SDL_WINDOW_ALLOW_HIGHDPI flag and your Windows settings for scale is larger than 100% (very common) then Windows will scale the output, which is the result you are getting.

Interestingly, it seems to work opposite from what you describe: without SDL_WINDOW_ALLOW_HIGHDPI the output is upscaled and blurry, while with the flag enabled the text becomes crisp but 2x too small.

I'm using macOS, by the way, not Windows; I'm not sure if the HiDPI handling works differently there.

Another thing to note is that, from my perspective, I neither need nor want to enable HiDPI if it is not required. My application is a retro emulator, so I don't have any high-DPI assets and am fine with how things look when using the OpenGL 2 backend without HiDPI enabled (I would just leave the code like that if it weren't for the OpenGL V-sync issue on macOS).

"text will be crisp but 2x as small"

That's technically correct, you may want to render the font at a different size based on DPI scale.

If I can 'fix' the SDL_Renderer code by rendering the font at a higher scale (I'm using the default font by the way, there is no font loading/initialization code anywhere at this moment), that would be fine by me. I could check the DPI at runtime and scale accordingly. What is the best place to look for a code example that illustrates how to do that?
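Something along these lines is what I have in mind, as a rough sketch (the 13px default-font size and the way dpi_scale is derived here are assumptions on my part, using the window and renderer from my snippet above):

  // Derive a DPI scale from the drawable size vs. the logical window size,
  // then rebuild the default font at a proportionally larger size.
  int win_w, win_h, out_w, out_h;
  SDL_GetWindowSize(window, &win_w, &win_h);
  SDL_GetRendererOutputSize(renderer, &out_w, &out_h);
  const float dpi_scale = (float)out_w / (float)win_w;  // e.g. 2.0 on a Retina display

  ImGuiIO& io = ImGui::GetIO();
  ImFontConfig cfg;
  cfg.SizePixels = 13.0f * dpi_scale;  // the embedded default font is designed for 13px
  io.Fonts->Clear();
  io.Fonts->AddFontDefault(&cfg);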

That said, it confuses me that the behavior between the OpenGL 2 backend and the SDL_Renderer backend is different.

The OpenGL 2 backend does exactly what I expect: without HiDPI I get 'some font size' on-screen and it looks OK. If I enable HiDPI, I see the same on-screen font size, probably because ImGui just upscales the font texture 2x (making it blurry). This makes sense to me: HiDPI mode enables drawing all scalable elements at 2x, so if I want sharper text I need to render the font textures at a higher resolution as well. Meanwhile, the SDL_Renderer backend messes with the on-screen size of the text when I enable HiDPI, which is not what I would expect.

"and have positioning issues, and coordinates in SDL events do not match up with the screen anymore"

That's something that should be fixed.

Here's an example screenshot of the exact same code using the exact same ImGui window as in the original report, but with SDL_WINDOW_ALLOW_HIGHDPI enabled:

[Screenshot 2021-11-29 at 15:18:01: tiny window title, dialog text missing]

As you can see the window title is really tiny, but also the text inside the dialog is completely gone. I suspect this is because of positioning/clipping issues, because all mouse interactions are also completely broken, SDL window coordinates are not mapped correctly to the ImGui coordinates anymore. I can still move/resize the window if I hunt around the window until the cursor changes, so the problem really is in the coordinate transformations and not in the event handling code.

It still seems to me that, besides the positioning problems, something is also wrong with how SDL_Renderer handles font sizes when HiDPI is enabled, given the different behavior compared with the OpenGL 2 renderer. I would be willing to investigate further and see if I can fix this and make a PR for it, but I'm not sure what the expected behavior should be.

Does it make sense to file a separate bug ticket for this, or move this one?

w0utert commented 2 years ago

Update on this:

I tried debugging this to find out where the ImGui OpenGL 2 and SDL_Renderer back-ends diverge to produce the different scaling behavior, and found out that the issue is probably not related to ImGui at all.

Both the font atlas and the framebuffer sizes are identical between the OpenGL 2 and SDL_Renderer versions of my application. But when SDL_WINDOW_ALLOW_HIGHDPI is disabled, they are based on the virtual resolution reported by macOS, not the actual on-screen (hardware) pixel resolution, which is 2x the virtual resolution in my case.
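A quick way to see this, as a sketch (using the window and renderer from my earlier snippet; the 2x is my display's Retina factor, not a general constant):

  // Compare the logical window size with the drawable size actually backing
  // the window. Without SDL_WINDOW_ALLOW_HIGHDPI both report the virtual
  // resolution; with the flag, the drawable is 2x larger on my display.
  int win_w, win_h, draw_w, draw_h;
  SDL_GetWindowSize(window, &win_w, &win_h);
  SDL_GetRendererOutputSize(renderer, &draw_w, &draw_h);  // or SDL_GL_GetDrawableSize() in the OpenGL build
  SDL_Log("window %dx%d, drawable %dx%d", win_w, win_h, draw_w, draw_h);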

What is happening is that the default magnification filter used for an SDL OpenGL window appears to be GL_NEAREST, while the SDL_Renderer by default uses bilinear filtering. The nearest-neighbor filtering just so happens to produce exactly the effect I want to achieve, as it results in simple pixel doubling to get sharp but blocky fonts.

So @ocornut you were indeed correct that I need to fix this at a different level, either forcing the filtering mode used for the SDL renderer, or enabling SDL_WINDOW_ALLOW_HIGHDPI and finding out the cause of the positioning issues with the SDL_Renderer backend.

w0utert commented 2 years ago

either forcing the filtering mode used for the SDL renderer, or enabling SDL_WINDOW_ALLOW_HIGHDPI and finding out the cause of the positioning issues with the SDL_Renderer backend.

OK, that turned out to be much easier than I expected after all. I can just enable SDL_WINDOW_ALLOW_HIGHDPI and use SDL_RenderSetScale(renderer, 2, 2) to enable pixel-doubling. That fixes the positioning issues as well.
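For reference, the relevant lines now look like this (a minimal sketch; the hard-coded 2x matches my Retina display, and in general the factors should be derived from the drawable vs. window size):

  SDL_Window * const window = SDL_CreateWindow(
    "maboroshi-16",
    SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
    WINDOW_WIDTH, WINDOW_HEIGHT,
    SDL_WINDOW_ALLOW_HIGHDPI);  // HiDPI now enabled

  SDL_Renderer * const renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_PRESENTVSYNC);

  // Pixel-double all SDL_Renderer output so logical coordinates (including
  // ImGui's) keep matching what SDL reports for the window and its events.
  SDL_RenderSetScale(renderer, 2.0f, 2.0f);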

Maybe the SDL_Renderer backend should do this automatically when it detects HiDPI is enabled, setting the correct SDL_RenderSetScale factors based on the virtual vs actual window size?

PathogenDavid commented 2 years ago

Maybe the SDL_Renderer backend should do this automatically when it detects HiDPI is enabled, setting the correct SDL_RenderSetScale factors based on the virtual vs actual window size?

This sounds like it'd be undesirable for people using SDL_Renderer to render things other than a GUI. It might make sense to include it in the example though.

w0utert commented 2 years ago

This sounds like it'd be undesirable for people using SDL_Renderer to render things other than a GUI. It might make sense to include it in the example though.

Yes, agreed; this would affect all other rendering done through the SDL_Renderer, and would also prevent actually rendering the ImGui UI in true HiDPI using a higher-resolution font. So not desirable indeed.

A hint in the documentation and in the example would be nice, to potentially save someone else the trouble of figuring out how to get the behavior I want.

Apart from that, just enabling SDL_WINDOW_ALLOW_HIGHDPI and not doing anything else currently breaks the ImGui SDL_Renderer backend completely, mixing up virtual and actual window coordinates in rendering and events. I think that should still be fixed?

sridenour commented 2 years ago

What is happening is that the default magnification filter used for an SDL OpenGL window appears to be GL_NEAREST, while the SDL_Renderer by default uses bilinear filtering. The nearest-neighbor filtering just so happens to produce exactly the effect I want to achieve, as it results in simple pixel doubling to get sharp but blocky fonts.

This is, AFAIK, a macOS/Metal issue, not an SDL one. When the Metal layer's backing store isn't high DPI but the display is, it uses bilinear filtering to upscale it. You can change it to use nearest-neighbor instead by setting the magnification filter to nearest on the Metal layer, like this:

// AppleSpecificStuff.m
#import <QuartzCore/CAMetalLayer.h>
...
CAMetalLayer *layer = (__bridge CAMetalLayer *)SDL_RenderGetMetalLayer(renderer);
layer.magnificationFilter = kCAFilterNearest;

sridenour commented 2 years ago

Apart from that, just enabling SDL_WINDOW_ALLOW_HIGHDPI and not doing anything else currently breaks the ImGui SDL_Renderer backend completely, mixing up virtual and actual window coordinates in rendering and events. I think that should still be fixed?

On other backends, an orthographic projection matrix is created and supplied to the ImGui vertex shader when rendering, so vertex coordinates are projected through that matrix and virtual coordinates always map to the correct pixel coordinates regardless of DPI.

With SDL_Renderer the application can't supply a custom projection matrix, and its internal projection just uses the destination pixel size, so that doesn't happen. Perhaps ImGui's SDL_Renderer backend should call SDL_RenderSetScale() on its own to ensure the correct result (obviously saving and restoring the existing scale), maybe with a way to disable that in case the application is doing something unusual.
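Roughly, the idea inside the backend's render function would be something like this (a sketch of the shape only, not the actual imgui_impl_sdlrenderer code; draw_data is the ImDrawData* passed in for rendering):

  // Save the application's render scale, draw ImGui at the framebuffer
  // scale so logical coordinates land on the right pixels, then restore.
  float old_sx, old_sy;
  SDL_RenderGetScale(renderer, &old_sx, &old_sy);
  SDL_RenderSetScale(renderer, draw_data->FramebufferScale.x, draw_data->FramebufferScale.y);
  // ... submit ImGui draw lists using logical (virtual) coordinates ...
  SDL_RenderSetScale(renderer, old_sx, old_sy);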

differentprogramming commented 2 years ago

For Windows, I managed to turn off the automatic (blurry) DPI scaling by extracting the exe's manifest, declaring in it that the program handles DPI scaling itself, and then adding that manifest back to the project in Visual Studio 2019.

Steps:

1) Open the "x64 Native Tools Command Prompt for VS 2019".
2) Navigate to the directory containing the exe.
3) Run the mt tool: mt -inputresource:example_sdl_vulkan.exe;#1 -out:extracted.manifest
4) Move extracted.manifest to a source directory.
5) Edit it to insert the following before </assembly>:

  <application>
    <windowsSettings>
      <dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true</dpiAware>
      <dpiAwareness xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">PerMonitorV2</dpiAwareness>
    </windowsSettings>
  </application>

6) Add that manifest to the project with the "add existing item" menu entry.

Note: doing it this way, rather than programmatically, is apparently the recommended approach to avoid bugs.

I also managed to come up with code that queries the current display's mode when the window is first shown and when it is moved to a different display.

Then change your event-processing code. For instance, Dear ImGui with SDL needs the following change: in imgui_impl_sdl.cpp, add the following to bool ImGui_ImplSDL2_ProcessEvent(const SDL_Event* event), as the final test in the series of if statements under case SDL_WINDOWEVENT:

  else if (event->window.event == SDL_WINDOWEVENT_DISPLAY_CHANGED || event->window.event == SDL_WINDOWEVENT_SHOWN)
  {
      SDL_DisplayMode mode;
      int display;
      if (event->window.event == SDL_WINDOWEVENT_DISPLAY_CHANGED)
          display = event->window.data1;
      else
          display = SDL_GetWindowDisplayIndex(SDL_GetWindowFromID(event->window.windowID));
      if (0 == SDL_GetCurrentDisplayMode(display, &mode))
      {
          // Here, call user code that scales the fonts and UI based on the
          // contents of 'mode'. The following is just an example that picks
          // among a few pre-loaded fonts:
          int a = (mode.h < mode.w) ? mode.h : mode.w;
          int index = 1;
          if (a >= 1080) index += 2;
          if (a >= 1440) index += 2;
          if (a >= 2160) index += 2;
          ImGuiIO& io = ImGui::GetIO();
          io.FontDefault = io.Fonts->Fonts[index];
      }
  }

You'd think you could use SDL_SetEventFilter instead, but that seems to miss the first SDL_WINDOWEVENT_SHOWN.

In any case I think you can close the issue unless fixing similar problems on macOS is considered part of the same bug. This solves blurriness for Windows.

Also, I did this for the SDL + Vulkan example, but I'm assuming it would work the same for the other SDL examples.

yig commented 2 years ago

I'm not making a retro game; I do want proper high-DPI mode. The example in examples/example_sdl_sdlrenderer/main.cpp is broken on a Mac with a high-DPI display. Can this be fixed at the ImGui level? Does one of these workarounds work (for both input events and drawing)?

ligfx commented 1 year ago

I was able to solve this for my use case by changing imgui_impl_sdlrenderer.cpp to set the font texture scale mode to nearest-neighbor:

     // Upload texture to graphics system
     // (Bilinear sampling is required by default. Set 'io.Fonts->Flags |= ImFontAtlasFlags_NoBakedLines' or 'style.AntiAliasedLinesUseTex = false' to allow point/nearest sampling)
     bd->FontTexture = SDL_CreateTexture(bd->SDLRenderer, SDL_PIXELFORMAT_ABGR8888, SDL_TEXTUREACCESS_STATIC, width, height);
     if (bd->FontTexture == nullptr)
     {
         SDL_Log("error creating texture");
         return false;
     }
     SDL_UpdateTexture(bd->FontTexture, nullptr, pixels, 4 * width);
     SDL_SetTextureBlendMode(bd->FontTexture, SDL_BLENDMODE_BLEND);
-    SDL_SetTextureScaleMode(bd->FontTexture, SDL_ScaleModeLinear);
+    SDL_SetTextureScaleMode(bd->FontTexture, SDL_ScaleModeNearest);

     // Store our identifier
     io.Fonts->SetTexID((ImTextureID)(intptr_t)bd->FontTexture);

     return true;
 }

However, the comment "Bilinear sampling is required by default" gives me pause. I'm not sure what I'm breaking with this, though it works for me on macOS with both high-dpi and non-high-dpi renderers.

PathogenDavid commented 1 year ago

I'm not sure what I'm breaking with this

Dear ImGui's anti-aliasing assumes bilinear sampling is enabled (which is why the comment says to disable it if you enable nearest-neighbor.)
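For reference, the opt-outs that comment refers to look like this (both are standard Dear ImGui settings quoted in the comment above; the atlas flag needs to be set before the font atlas is built):

  ImGuiIO& io = ImGui::GetIO();
  io.Fonts->Flags |= ImFontAtlasFlags_NoBakedLines;  // don't bake anti-aliased lines into the atlas
  ImGui::GetStyle().AntiAliasedLinesUseTex = false;  // draw anti-aliased lines with geometry instead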